
If $X_n$ converges to $X$ in distribution, does it follow that $$\lim_{n \rightarrow \infty }E[|X_n|] = E[|X|]?$$ If not: suppose $X_1, X_2, \ldots$ are i.i.d. random variables with mean $\mu$ and finite variance. What is $$\lim_{n \rightarrow \infty}E\left[\left|\frac{1}{n}\sum_{i=1}^{n}X_i - \mu\right| \right]?$$

What I have tried: the variance is finite, so the central limit theorem applies, i.e. the centered and scaled sample means converge in distribution to $N(0,1)$; but I cannot relate this convergence in distribution to the first moment.


1 Answer


Convergence in distribution does not imply convergence of the first moment. For example, consider $X_n$ defined by \begin{align}P(X_n = 0) &= 1-\frac{1}{n}, \\ P(X_n = n) &= \frac{1}{n}. \end{align} Then $X_n \Rightarrow 0$ in distribution, but $E[X_n] = 1 \nrightarrow 0$. To prove the convergence in distribution, take $f$ a bounded continuous real function; then $$E[f(X_n)] = \Big(1-\frac{1}{n}\Big)\,f(0) + \frac{1}{n}\,f(n) \longrightarrow f(0) = E[f(0)],$$ since $f(n)/n \to 0$ by boundedness of $f$.

To get the convergence you want, first use the strong law of large numbers: $$\overline{X}_n := \frac{1}{n}\sum_{i=1}^{n}X_i \longrightarrow \mu \ \ \text{a.s.}$$ Then argue that the $\overline{X}_n$ are uniformly integrable, for instance because they are bounded in $L^2$: $$E\big[\overline{X}_n^2\big] = \mu^2 + \operatorname{Var}(\overline{X}_n) = \mu^2 + \frac{\sigma^2}{n} \le \mu^2 + \sigma^2.$$ The Vitali convergence theorem then gives $\overline{X}_n \to \mu$ in $L^1$, i.e. $E\left[\left|\overline{X}_n - \mu\right|\right] \to 0$, so the limit in question is $0$.
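Both halves of the argument can be illustrated numerically. The sketch below (names and the choice of Uniform$(0,1)$ for the i.i.d. sequence are my own, for illustration) computes the exact mean of the counterexample, which stays at $1$ while $P(X_n \neq 0) \to 0$, and Monte Carlo estimates $E[|\overline{X}_n - \mu|]$, which shrinks as $n$ grows:

```python
import random
from statistics import mean

# Counterexample: P(X_n = 0) = 1 - 1/n, P(X_n = n) = 1/n.
def mean_of_counterexample(n):
    # Exact expectation: (1 - 1/n) * 0 + (1/n) * n = 1 for every n.
    return (1 - 1/n) * 0 + (1/n) * n

def prob_nonzero(n):
    # P(X_n != 0) = 1/n -> 0, so X_n => 0 in distribution
    # even though E[X_n] = 1 for all n.
    return 1/n

# L^1 convergence of the sample mean: Monte Carlo estimate of
# E[|X_bar_n - mu|] for i.i.d. Uniform(0, 1) draws (mu = 0.5).
def l1_error(n, reps=2000, seed=0):
    rng = random.Random(seed)
    mu = 0.5
    return mean(
        abs(mean(rng.random() for _ in range(n)) - mu)
        for _ in range(reps)
    )
```

For instance, `l1_error(10)` is roughly ten times larger than `l1_error(1000)`, consistent with the $O(1/\sqrt{n})$ rate the finite variance provides, while `mean_of_counterexample(n)` is identically $1$.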

