I am reading Larry Wasserman's *All of Statistics*, and exercise 2 in chapter 6 asks for a proof of the following: given a sequence of random variables $ X_1, X_2, \dots $, show that $ X_n \xrightarrow{\text{QM}} b $ if and only if

$$ \begin{align} & \lim_{n \rightarrow \infty} \mathbb{E}(X_n) = b & \text{and } & & \lim_{n \rightarrow \infty} \mathbb{V}(X_n) = 0. \end{align} $$

I'm getting stuck proving the forward direction. I started by expanding the definition of quadratic mean convergence. By assumption, we have $$ \lim_{n \rightarrow \infty} \mathbb{E}(X_n-b)^2 = 0. $$

Then, by linearity of expectation, $$ \lim_{n \rightarrow \infty} \mathbb{E}(X_n-b)^2 = \lim_{n \rightarrow \infty} \left[ \mathbb{E}(X_n^2) - 2b\,\mathbb{E}(X_n) + b^2 \right] = 0. $$

This is where I get stuck. It seems like $ \mathbb{E}(X_n) $ must somehow converge to $ b $, but I don't see how to extract that from the expansion.

  • Just a hint: if $X_n \to b$ in $L^2$, then we can bound $\int |X_n - b| \, dP \leq \ldots$ Use Cauchy–Schwarz! Commented Dec 25, 2019 at 5:12
  • Also: your last line only holds if all of the limits exist and are finite. We don't know that $\mathbb{E}(X_n)$ exists! Commented Dec 25, 2019 at 5:27
  • Ah, I see. I have to check, but I think we can assume the limits exist (it's in the problem statement). Commented Dec 25, 2019 at 17:06
  • Maybe I'm just being dense, but I don't see how the Cauchy–Schwarz inequality helps with the inequality chain you started. The probability version of Cauchy–Schwarz that I'm familiar with is $ [\mathbb{E}(XY)]^2 \leq \mathbb{E}(X^2)\,\mathbb{E}(Y^2) $. How does that relate to $ \mathbb{E} \lvert X_n - b \rvert \leq \dots $? Commented Dec 26, 2019 at 20:49
  • Precisely. I'll write up a full solution when I can figure out the second part. Commented Dec 29, 2019 at 23:31

2 Answers


By Jensen's inequality (alternatively, this follows from noting $\operatorname{Var}(|X_n - b|) \geq 0$), $$\mathbb{E}(X_n - b)^2 \geq (\mathbb{E}|X_n - b|)^2,$$ so taking the limit as $n\to\infty$ of both sides gives $0 \geq \limsup_{n\to\infty} \mathbb{E} |X_n - b|$, and we also clearly have $\liminf_{n\to\infty} \mathbb{E} |X_n - b| \geq 0$ since the argument is nonnegative. Hence $\lim_{n\to\infty} \mathbb{E} |X_n - b| = 0$, and since $|\mathbb{E}(X_n) - b| = |\mathbb{E}(X_n - b)| \leq \mathbb{E}|X_n - b|$, it follows that $\lim_{n\to\infty} \mathbb{E}(X_n) = b.$

For the second part, use the lemma posted in this Stack Exchange post. In particular, since $b$ is a constant, it has variance $0$, so $\lim_{n\to\infty}\operatorname{Var}(X_n) = \operatorname{Var}(b) = 0$.
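As a quick numerical sanity check (not part of the proof), one can simulate a sequence satisfying the two limit conditions and watch the quadratic-mean distance shrink. The sequence below, $X_n = b + 1/n + Z/\sqrt{n}$ with $Z \sim N(0,1)$, is an assumed toy example, not anything from the book:

```python
import numpy as np

# Toy sequence (an assumption for illustration): X_n = b + 1/n + Z/sqrt(n),
# so E(X_n) = b + 1/n -> b and Var(X_n) = 1/n -> 0.
# Then E(X_n - b)^2 = Var(X_n) + [E(X_n) - b]^2 = 1/n + 1/n^2 -> 0.
rng = np.random.default_rng(0)
b = 2.0

for n in [1, 10, 100, 1000]:
    z = rng.standard_normal(500_000)
    x_n = b + 1.0 / n + z / np.sqrt(n)
    mse = np.mean((x_n - b) ** 2)  # Monte Carlo estimate of E(X_n - b)^2
    print(f"n={n:5d}  E(X_n - b)^2 ~ {mse:.5f}  (exact: {1/n + 1/n**2:.5f})")
```

The decomposition $\mathbb{E}(X_n - b)^2 = \operatorname{Var}(X_n) + [\mathbb{E}(X_n) - b]^2$ in the comment is exactly why the two conditions together are equivalent to QM convergence.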


The second part:

By the definition of variance,

$$\operatorname{Var}(X) = \mathbb{E}(X^2) - [\mathbb{E}(X)]^2,$$

so

$$\operatorname{Var}(X_n-b) = \mathbb{E}(|X_n-b|^2) - [\mathbb{E}(X_n-b)]^2$$

and, provided the limits on the right-hand side exist,

$$\lim_{n \to \infty}\operatorname{Var}(X_n-b) = \lim_{n \to \infty}\mathbb{E}(|X_n-b|^2) - \lim_{n\to \infty}[\mathbb{E}(X_n-b)]^2.$$

Convergence in quadratic mean tells us that

$$ \lim_{n\to \infty} \mathbb{E}(|X_n-b|^2) = 0,$$

and the first part of this problem showed that

$$\lim_{n \to \infty}\mathbb{E}|X_n-b| = 0,$$

so $\lim_{n\to\infty}[\mathbb{E}(X_n-b)]^2 = 0$ as well, since $|\mathbb{E}(X_n-b)| \leq \mathbb{E}|X_n-b|$. Therefore

$$\lim_{n \to \infty}\operatorname{Var}(X_n-b) = 0.$$
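To tie this back to $\operatorname{Var}(X_n)$ itself, one can add the observation (using only the fact that shifting by a constant does not change variance) that

```latex
\operatorname{Var}(X_n - b)
  = \mathbb{E}\bigl[\bigl((X_n - b) - \mathbb{E}(X_n - b)\bigr)^2\bigr]
  = \mathbb{E}\bigl[\bigl(X_n - \mathbb{E}(X_n)\bigr)^2\bigr]
  = \operatorname{Var}(X_n),
```

hence $\lim_{n \to \infty}\operatorname{Var}(X_n) = 0$, which is the statement asked for.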

