First of all, when we say that $X_n \sim \text{Unif}(0,X_{n-1})$, what does that mean, rigorously? Does it mean that for every $\omega \in \Omega$, $X_n(\omega)\sim \text{Unif}(0,X_{n-1}(\omega))$? That can't be right, since $X_n(\omega)$ is a number, not a random variable. Or does it mean that for every $\omega \in \Omega$ there is some auxiliary random variable $X'_n \sim \text{Unif}(0,X_{n-1}(\omega))$, and we set $X_n(\omega) = X'_n(\omega)$? Once that is cleared up, I have another question:
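To make the second reading concrete, here is a minimal simulation sketch of the recursive construction I have in mind (assuming NumPy; the function name `sample_path` is just my own label). At each step a fresh, independent uniform draw is made on $(0, X_{n-1}(\omega))$:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_path(n_steps, x0=1.0):
    """One sample path: at each step, draw X_n uniformly on (0, X_{n-1})."""
    x = x0
    path = [x]
    for _ in range(n_steps):
        # fresh, independent uniform draw on (0, current value)
        x = rng.uniform(0.0, x)
        path.append(x)
    return path

print(sample_path(5))
```

I can simulate this just fine, so operationally I know what is meant; my question is what the corresponding measure-theoretic definition is.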
Let's say that we have $X_n \sim \text{Unif}(0,X_{n-1})$ with $X_0=1$. I've been told that, defining $\mathcal A_n = \sigma[X_0, \ldots, X_{n}]$, $$\mathbb E(X_{n+1}\mid\mathcal A_n) = \mathbb E(X_{n+1}\mid X_n) = \mathbb E[\text{Unif}(0,X_{n})] = \frac{0+X_{n}}{2} = \frac{X_n}{2}.$$ Similarly, for $Z_{n+1} \sim \text{Poisson}(Z_n)$, with $\mathcal A_n$ defined analogously, I've been told that $$\mathbb E(Z_{n+1}\mid\mathcal A_n) = \mathbb E[\text{Poisson}(Z_{n})] = Z_n,$$ and likewise, for $Y_{n+1} \sim 2\,\text{Binom}(Y_n, p)$, that $$\mathbb E(Y_{n+1}\mid\mathcal A_n) = 2\,\mathbb E[\text{Binom}(Y_{n}, p)] = 2 Y_n p.$$
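These identities at least check out numerically. Here is a quick Monte Carlo sanity check (a minimal sketch assuming NumPy; the fixed values `x`, `z`, `y`, `p` are arbitrary choices standing in for the "current" values $X_n$, $Z_n$, $Y_n$):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10**6  # Monte Carlo sample size

# hypothetical fixed "current" values standing in for X_n, Z_n, Y_n
x, z, y, p = 0.7, 3.0, 10, 0.25

# E[Unif(0, x)] should be x/2
print(rng.uniform(0.0, x, N).mean(), x / 2)
# E[Poisson(z)] should be z
print(rng.poisson(z, N).mean(), z)
# E[2 * Binom(y, p)] should be 2*y*p
print(2 * rng.binomial(y, p, N).mean(), 2 * y * p)
```

So the pattern "condition on the current value, then take the mean of the resulting distribution" clearly holds empirically; what I'm missing is the proof.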
Question: What are the proofs of these three statements (using the modern measure-theoretic approach and conditional expectation properties such as those listed here)? Is this sort of pattern true in general for random variables defined recursively from one another like this?
The only property of conditional expectation that I think might apply here is that $\mathbb E(Y\mid\mathcal D) \overset{\text{a.s.}}{=} \mathbb E[Y]$ if $\sigma(Y)$ and $\mathcal D$ are independent. However, in the cases above the sigma-fields are not independent ($X_{n+1}$ clearly depends on $X_n$), so I don't see how to prove the nice equations above.
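If the auxiliary-variable reading from my first question is legitimate, here is as far as I can get in the uniform case (my own attempt, assuming one may write $X_{n+1} = U_{n+1} X_n$ with $U_{n+1} \sim \text{Unif}(0,1)$ independent of $\mathcal A_n$): $$\mathbb E(X_{n+1}\mid\mathcal A_n) = \mathbb E(U_{n+1} X_n\mid\mathcal A_n) = X_n\,\mathbb E(U_{n+1}\mid\mathcal A_n) = X_n\,\mathbb E[U_{n+1}] = \frac{X_n}{2},$$ using "taking out what is known" and then independence. But I don't know whether that factorization of $X_{n+1}$ is justified by the definition, and I don't see an analogous multiplicative trick for the Poisson and binomial cases.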