
My question is about proving the Lyapunov CLT (every mean is $0$, $\delta = 1$). It is similar to this question but without any assumption about the random variables following Bernoulli distributions.

The idea is to use a Taylor expansion: $\psi_{X_1}(t) = 1 - \frac{1}{2}\mathbb{E}X_1^2t^2 - \frac{i}{6}\mathbb{E}X_1^3t^3 + o(t^3)$.

Like in the Wikipedia entry, define $s_n^2 = \sum_{i = 1}^n\sigma_i^2$. Then $\psi_{X_1}(\frac{t}{s_n}) = 1 - \frac{1}{2}\mathbb{E}X_1^2\frac{t^2}{s_n^2} - \frac{i}{6}\mathbb{E}X_1^3\frac{t^3}{s_n^3} + o(\frac{t^3}{s_n^3})$.

If Lyapunov's condition ($\delta = 1$) is satisfied, I see that, as $n \rightarrow \infty$, the cubic term becomes negligible, so $\psi_{X_1}(\frac{t}{s_n}) = 1 - \frac{1}{2}\mathbb{E}X_1^2\frac{t^2}{s_n^2} + o(\frac{t^3}{s_n^3})$.

Define $S_n = X_1 + \dots + X_n$. If the random variables were i.i.d., then after scaling to make each variance $1$, $\psi_{\frac{S_n}{\sqrt{n}}}$ involves raising a single Taylor expansion to the power of $n$. But since the random variables are not identically distributed, $\psi_{\frac{S_n}{s_n}}$ is a product of $n$ distinct factors. How do I manipulate it so that I can use $e^x = \lim_{n\rightarrow\infty}(1 + \frac{x}{n})^n$?

(How do you turn division by $s_n^2$ into division by $n$, which happens by squaring $\sqrt{n}$ in the i.i.d. case?)

  • Possibly you can rewrite $\prod (1-a_i t/n + O(t^2/n^2))$ as a Taylor series in a similar fashion as en.m.wikipedia.org/wiki/… Commented Nov 29, 2023 at 10:43
  • You may follow the outline in the proof of Theorem 27.2 in Probability and Measure (3rd edition) -- or any rigorous probability book that proves the Lindeberg CLT. The problem boils down to proving the asymptotic identity (after rescaling) $\prod_{k = 1}^n(1 - \frac{1}{2}t^2\sigma_k^2) + o(1) = e^{-t^2/2} + o(1)$, which can be established by Lyapunov's condition and the inequality $|z_1 \cdots z_m - w_1 \cdots w_m| \leq \sum_{k = 1}^m|z_k - w_k|$ when $|z_k| \leq 1, |w_k| \leq 1$. Commented Nov 29, 2023 at 13:14

1 Answer


The goal is to show that if $\{X_1, \ldots, X_n, \ldots\}$ is a sequence of independent random variables with zero means and finite third moments that satisfies \begin{align*} \lim_{n \to \infty} \frac{1}{s_n^3}\sum_{k = 1}^nE[|X_k|^3] = 0, \tag{1}\label{1} \end{align*} where $s_n^2 = \sum_{k = 1}^nE[X_k^2] =: \sum_{k = 1}^n\sigma_k^2$, then \begin{align*} \frac{S_n}{s_n} \to_d N(0, 1), \tag{2}\label{2} \end{align*} where $S_n = X_1 + \cdots + X_n$.
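Before the proof, a quick numerical sanity check (not part of the argument) may help fix ideas. The sketch below uses an arbitrarily chosen array of independent uniform variables; the scales $a_k$ and all other specifics are hypothetical choices made only for this demo. For $X_k \sim \mathrm{Uniform}[-a_k, a_k]$ one has $E[X_k] = 0$, $\sigma_k^2 = a_k^2/3$, and $E[|X_k|^3] = a_k^3/4$, so both the Lyapunov ratio in $\eqref{1}$ and the distribution of $S_n/s_n$ can be inspected directly:

```python
# Numerical illustration (a sketch, not the proof): independent,
# non-identically distributed X_k ~ Uniform[-a_k, a_k] with mean 0.
# Here sigma_k^2 = a_k^2 / 3 and E|X_k|^3 = a_k^3 / 4.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 2000, 5000

a = 1.0 + 0.5 * np.sin(np.arange(1, n + 1))   # hypothetical scales a_k in [0.5, 1.5]
s_n = np.sqrt(np.sum(a**2 / 3))               # s_n^2 = sum_k sigma_k^2

# Lyapunov ratio from (1) with delta = 1: sum_k E|X_k|^3 / s_n^3
lyapunov_ratio = np.sum(a**3 / 4) / s_n**3    # decays on the order of n^{-1/2}

X = rng.uniform(-a, a, size=(reps, n))        # reps independent copies of (X_1, ..., X_n)
Z = X.sum(axis=1) / s_n                       # standardized sums S_n / s_n

print(f"Lyapunov ratio: {lyapunov_ratio:.3f}")              # small for large n
print(f"mean(Z) = {Z.mean():.3f}, var(Z) = {Z.var():.3f}")  # close to 0 and 1
```

The empirical mean and variance of $Z$ should be close to $0$ and $1$, consistent with $\eqref{2}$; a histogram of `Z` would look approximately standard normal.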

To begin the proof, it is standard to first normalize $X_k$ as $Y_k := X_k/s_n$ (note that $Y_k$ implicitly depends on $n$, so $\{Y_k\}$ is really a triangular array), so that condition $\eqref{1}$ becomes \begin{align*} \lim_{n \to \infty} \sum_{k = 1}^nE[|Y_k|^3] = 0, \tag{3}\label{3} \end{align*} and the target $\eqref{2}$ becomes \begin{align*} Y_1 + \cdots + Y_n \to_d N(0, 1). \tag{4}\label{4} \end{align*} In addition, if we denote $E[Y_k^2]$ by $\tau_k^2$, then clearly $\sum_{k = 1}^n \tau_k^2 = 1$.

Denote the characteristic function of $Y_k$ by $\varphi_k(t) := E[e^{itY_k}]$, $k = 1, \ldots, n$. Since the characteristic function of $N(0, 1)$ is $e^{-t^2/2}$, Lévy's continuity theorem implies that $\eqref{4}$ is equivalent to \begin{align*} \lim_{n \to \infty} \prod_{k = 1}^n \varphi_k(t) = e^{-t^2/2} \tag{5}\label{5} \end{align*} for every $t \in \mathbb{R}$, which will be proved as follows.

To prove $\eqref{5}$, we need the following well-known inequality that bounds $\left|\varphi_k(t) - (1 - \frac{1}{2}t^2\tau_k^2)\right|$ (which is a corollary of the basic inequality $\left|e^{ix} - (1 + ix - \frac{1}{2}x^2)\right| \leq \min\left(|x|^2, \frac{1}{6}|x|^3\right)$): \begin{align*} \left|\varphi_k(t) - \left(1 - \frac{1}{2}t^2\tau_k^2\right)\right| \leq E\left[\min\left(|tY_k|^2, \frac{1}{6}|tY_k|^3\right)\right] \leq \frac{1}{6}|t|^3 E[|Y_k|^3]. \tag{6}\label{6} \end{align*}
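For completeness, here is a sketch of how $\eqref{6}$ follows from the basic inequality: substitute $x = tY_k$, take expectations, and use $E[Y_k] = 0$ to remove the linear term:

```latex
\begin{align*}
\left|\varphi_k(t) - \left(1 - \tfrac{1}{2}t^2\tau_k^2\right)\right|
&= \left|E\!\left[e^{itY_k} - \left(1 + itY_k - \tfrac{1}{2}t^2Y_k^2\right)\right]\right|
&& \text{(since } E[Y_k] = 0,\ E[Y_k^2] = \tau_k^2\text{)} \\
&\leq E\!\left[\left|e^{itY_k} - \left(1 + itY_k - \tfrac{1}{2}t^2Y_k^2\right)\right|\right]
&& \text{(triangle inequality for integrals)} \\
&\leq E\!\left[\min\!\left(|tY_k|^2, \tfrac{1}{6}|tY_k|^3\right)\right].
\end{align*}
```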

$\eqref{3}$ and $\eqref{6}$ together then imply \begin{align*} \sum_{k = 1}^n\left|\varphi_k(t) - \left(1 - \frac{1}{2}t^2\tau_k^2\right)\right| \to 0 \tag{7}\label{7} \end{align*} as $n \to \infty$.

Having made these preparations, $\eqref{5}$ follows from (justifications for each step can be found in the addendum): \begin{align*} & \left|\prod_{k = 1}^n \varphi_k(t) - e^{-t^2/2}\right| \\ =& \left|\prod_{k = 1}^n \varphi_k(t) - \prod_{k = 1}^ne^{-t^2\tau_k^2/2}\right| \tag{8.1}\label{8.1} \\ \leq & \left|\prod_{k = 1}^n \varphi_k(t) - \prod_{k = 1}^n\left(1 - \frac{1}{2}t^2\tau_k^2\right)\right| + \left|\prod_{k = 1}^n\left(1 - \frac{1}{2}t^2\tau_k^2\right) -\prod_{k = 1}^ne^{-t^2\tau_k^2/2}\right| \tag{8.2}\label{8.2} \\ \leq & \sum_{k = 1}^n\left|\varphi_k(t) - \left(1 - \frac{1}{2}t^2\tau_k^2\right)\right| + \sum_{k = 1}^n\left|\left(1 - \frac{1}{2}t^2\tau_k^2\right) - e^{-t^2\tau_k^2/2}\right| \tag{8.3}\label{8.3} \\ \to & 0. \tag{8.4}\label{8.4} \end{align*} This completes the proof.


Addendum

$\eqref{8.1}:$ This is because $\sum_{k = 1}^n\tau_k^2 = 1$, so that $\prod_{k = 1}^ne^{-t^2\tau_k^2/2} = e^{-\frac{1}{2}t^2\sum_{k = 1}^n\tau_k^2} = e^{-t^2/2}$.

$\eqref{8.2}:$ Triangle inequality.

$\eqref{8.3}:$ This is a consequence of the following inequality: if $z_1, \ldots, z_m$ and $w_1, \ldots, w_m$ are complex numbers of modulus at most $1$, then \begin{align*} |z_1 \cdots z_m - w_1 \cdots w_m| \leq \sum_{k = 1}^m|z_k - w_k|. \end{align*} Note that this inequality applies here because $|\varphi_k(t)| \leq 1$, $|e^{-t^2\tau_k^2/2}| \leq 1$, and, for fixed $t$ and all sufficiently large $n$, $|1 - \frac{1}{2}t^2\tau_k^2| \leq 1$ for every $k$. The last bound holds since $\max_{1 \leq k \leq n}\tau_k^2 \to 0$ as $n \to \infty$, which is implied by Lyapunov's condition $\eqref{3}$ (interestingly, the first inequality below is usually referred to as Lyapunov's inequality): \begin{align*} & (\tau_k^2)^{3/2} = (E[Y_k^2])^{3/2} \leq E[|Y_k|^3], \\ & \max_{1 \leq k \leq n}(\tau_k^2)^{3/2} \leq \sum_{k = 1}^n(\tau_k^2)^{3/2} \leq \sum_{k = 1}^nE[|Y_k|^3] \to 0. \end{align*}
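The product inequality itself has a one-step telescoping proof (sketch): write the difference as a telescoping sum and note that every factor multiplying $(z_k - w_k)$ has modulus at most $1$:

```latex
\begin{align*}
z_1 \cdots z_m - w_1 \cdots w_m
&= \sum_{k = 1}^m z_1 \cdots z_{k-1}\,(z_k - w_k)\,w_{k+1} \cdots w_m, \\
\left|z_1 \cdots z_m - w_1 \cdots w_m\right|
&\leq \sum_{k = 1}^m \underbrace{|z_1| \cdots |z_{k-1}|}_{\leq\, 1}\,
|z_k - w_k|\,\underbrace{|w_{k+1}| \cdots |w_m|}_{\leq\, 1}
\leq \sum_{k = 1}^m|z_k - w_k|.
\end{align*}
```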

$\eqref{8.4}:$ The first sum goes to $0$ by $\eqref{7}$, while the summand in the second sum is bounded by \begin{align*} \left|e^{-t^2\tau_k^2/2} - \left(1 - \frac{1}{2}t^2\tau_k^2\right)\right| &= \left|\frac{1}{2!}(-t^2\tau_k^2/2)^2 + \frac{1}{3!}(-t^2\tau_k^2/2)^3 + \cdots \right| \\ &\leq \frac{1}{4}(t^2\tau_k^2)^2\sum_{m = 2}^\infty\frac{(t^2\tau_k^2/2)^{m - 2}}{m!} \\ &\leq \frac{1}{4}(t^2\tau_k^2)^2e^{t^2\tau_k^2/2} \\ &\leq \frac{1}{4}t^4e^{t^2}\tau_k^2\max_{1 \leq j \leq n}\tau_j^2, \end{align*} hence the second sum is bounded by (using $\sum_{k = 1}^n\tau_k^2 = 1$ again) \begin{align*} \frac{1}{4}t^4e^{t^2}\max_{1 \leq j \leq n}\tau_j^2\sum_{k = 1}^n\tau_k^2 = \frac{1}{4}t^4e^{t^2}\max_{1 \leq j \leq n}\tau_j^2, \end{align*} which converges to $0$ as $n \to \infty$ by the relation shown in justifying $\eqref{8.3}$.

  • Why do the $Y_k$ share characteristic function $e^{-t^2/2}$ (before equation 5)? Commented Dec 2, 2023 at 0:59
  • @johnsmith Eq. 5 doesn't say the $Y_k$ share the same cf -- the cf of each $Y_k$ is $\varphi_k$. Eq. 5 just states the goal of the proof. Commented Dec 2, 2023 at 1:00
  • I mean this: "Since the characteristic function of $N(0,1)$ and $Y_k$ are $e^{-t^2/2}$". If we don't know the distribution of each $X_k$, why would we know the distribution of each $Y_k$? Commented Dec 2, 2023 at 3:18
  • @johnsmith You did not finish reading the complete sentence.... Commented Dec 2, 2023 at 3:27
  • That sentence means: denote the characteristic function of $Y_k$ by $\varphi_k$, so that proving $Y_1 + \cdots + Y_n \to_d N(0, 1)$ is equivalent to proving $\prod_{k = 1}^n \varphi_k(t) \to e^{-t^2/2}$. Is it clear now? (There is nothing I "deduced" about $Y_k$; it is just how I named the cf of $Y_k$.) Commented Dec 2, 2023 at 4:31
