First of all, when we say that $X_n \sim \text{Unif}(0,X_{n-1})$, what does that mean, rigorously? Does it mean that for every $\omega \in \Omega$, $X_n(\omega)\sim \text{Unif}(0,X_{n-1}(\omega))$? This doesn't make much sense, since $X_n(\omega)$ is a number, not a random variable. Or does it mean that for every $\omega \in \Omega$ there is some auxiliary random variable $X'_n \sim \text{Unif}(0,X_{n-1}(\omega))$, and then $X_n(\omega) = X'_n(\omega)$? Once that is cleared up, I have another question:
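
For concreteness, here is how I would simulate the second (auxiliary-variable) reading — a minimal sketch, where the function name, seed, and parameters are just illustrative:

```python
import numpy as np

# Sketch of the "auxiliary variable" reading: at each step draw an
# independent U ~ Unif(0, 1) and set X_n = U * X_{n-1}, so that,
# conditionally on X_{n-1} = x, we have X_n ~ Unif(0, x).
rng = np.random.default_rng(0)

def sample_chain(n_steps, x0=1.0):
    x = [x0]
    for _ in range(n_steps):
        u = rng.uniform()      # auxiliary uniform, independent of the past
        x.append(u * x[-1])    # X_n | X_{n-1} ~ Unif(0, X_{n-1})
    return np.array(x)

print(sample_chain(5))
```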

Let's say that we have $X_n \sim \text{Unif}(0,X_{n-1})$ with $X_0=1$. I've been told that, defining $\mathcal A_n = \sigma[X_0, \ldots, X_{n}]$, $$\mathbb E(X_{n+1}|\mathcal A_n) = \mathbb E(X_{n+1}|X_n) = \mathbb E[\text{Unif}(0,X_{n})] = \frac{0+X_{n}}{2}.$$ Similarly, for $Z_{n+1} \sim \text{Poisson}(Z_n)$ and $\mathcal A_n$ defined analogously, I've been told that $$\mathbb E(Z_{n+1}|\mathcal A_n) = \mathbb E[\text{Poisson}(Z_{n})] = Z_n,$$ and also, for $Y_{n+1} \sim 2\,\text{Binom}(Y_n, p)$, that $$\mathbb E(Y_{n+1}|\mathcal A_n) = 2\,\mathbb E[\text{Binom}(Y_{n}, p)] = 2Y_n p.$$
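
As a sanity check (not a proof), the unconditional consequences of these identities can be verified by simulation: by the tower property they would imply $\mathbb E[X_n] = (1/2)^n$, $\mathbb E[Z_n] = Z_0$, and $\mathbb E[Y_n] = (2p)^n Y_0$. A minimal sketch, with illustrative parameter choices:

```python
import numpy as np

# If the displayed identities hold, the tower property gives
# E[X_n] = (1/2)^n, E[Z_n] = Z_0, and E[Y_n] = (2p)^n * Y_0.
# Check these unconditional consequences by Monte Carlo.
rng = np.random.default_rng(1)
reps, n = 200_000, 5
p, y0, z0 = 0.4, 10, 3

x = np.ones(reps)
z = np.full(reps, z0)
y = np.full(reps, y0)
for _ in range(n):
    x = rng.uniform(0.0, x)       # X_{k+1} | X_k ~ Unif(0, X_k)
    z = rng.poisson(z)            # Z_{k+1} | Z_k ~ Poisson(Z_k)
    y = 2 * rng.binomial(y, p)    # Y_{k+1} | Y_k ~ 2 * Binom(Y_k, p)

print(x.mean(), 0.5 ** n)            # both ~ (1/2)^n
print(z.mean(), z0)                  # both ~ Z_0
print(y.mean(), (2 * p) ** n * y0)   # both ~ (2p)^n * Y_0
```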

Question: What are the proofs of these three statements (using the modern, measure-theoretic approach to probability and properties of conditional expectation such as those listed here)? Is this sort of pattern generally true for random variables defined recursively off of one another like this?

The only conditional expectation property that I think could apply here is that $\mathbb E(Y|\mathcal D) =_\text{a.s.} \mathbb E[Y]$ when $\sigma(Y)$ and $\mathcal D$ are independent. However, in the cases above I don't think the relevant sigma-fields are independent, so I don't know how to go about proving the nice equations above.
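
(One property I've seen that might bridge the gap, though I'm not sure how to apply it rigorously here, is a "freezing" or substitution lemma: if $X$ is $\mathcal D$-measurable, $U$ is independent of $\mathcal D$, and $g$ is jointly measurable with $g(X,U)$ integrable, then $$\mathbb E[g(X,U)\,|\,\mathcal D] = h(X) \quad \text{a.s., where } h(x) = \mathbb E[g(x,U)].$$ For instance, if one could write $X_{n+1} = U_{n+1}X_n$ with $U_{n+1} \sim \text{Unif}(0,1)$ independent of $\mathcal A_n$, this would give $\mathbb E[X_{n+1}|\mathcal A_n] = X_n\,\mathbb E[U_{n+1}] = X_n/2$.)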
