Proof of the multivariate Central Limit Theorem
Lehmann and Casella's "Theory of Point Estimation" (2nd edition) states the multivariate Central Limit Theorem, but no proof is given.

Theorem 8.21 (Multivariate CLT) Let $\mathbf{X}_\nu = (X_{1\nu}, \dots, X_{r\nu})$ be iid with mean vector $\zeta = (\zeta_1, \dots, \zeta_r)$ and covariance matrix $\Sigma = \vert \vert \sigma_{ij} \vert \vert$, and let $\overline{X}_{in} = (X_{i1} + \dots + X_{in})/n$. Then $$[ \sqrt{n} (\overline{X}_{1n} - \zeta_1), \dots, \sqrt{n} (\overline{X}_{rn} - \zeta_r)]$$ tends in law to the multivariate normal distribution with mean vector $\mathbf{0}$ and covariance matrix $\Sigma$.
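My understanding is that the standard argument reduces the multivariate statement to the univariate CLT via the Cramér–Wold device, roughly as follows (a sketch, not the book's proof):

```latex
% For any fixed t \in \mathbb{R}^r, the scalars t^\top \mathbf{X}_\nu are iid
% with mean t^\top \zeta and variance t^\top \Sigma t, so the univariate CLT gives
\[
t^\top \sqrt{n}\,(\overline{\mathbf{X}}_n - \zeta)
  = \sqrt{n}\left(\frac{1}{n}\sum_{\nu=1}^{n} t^\top \mathbf{X}_\nu - t^\top \zeta\right)
  \xrightarrow{d} N\!\left(0,\; t^\top \Sigma\, t\right).
\]
% Since this holds for every t, the Cramér--Wold device yields
\[
\sqrt{n}\,(\overline{\mathbf{X}}_n - \zeta) \xrightarrow{d} N_r(\mathbf{0}, \Sigma).
\]
```

But that just pushes the work into proving the Cramér–Wold device itself.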

What would be its derivation?
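While not a derivation, the statement is easy to check empirically. A minimal simulation (assuming NumPy; the correlated exponential construction is my own choice, not from the book) draws iid non-normal vectors, forms $\sqrt{n}(\overline{\mathbf{X}}_n - \zeta)$ many times, and compares the sample moments with $\mathbf{0}$ and $\Sigma$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 5000

# iid Exp(1) building blocks, shape (reps, n, 2)
E = rng.exponential(1.0, size=(reps, n, 2))

# Correlated non-normal vectors: X = (E1, E1 + E2), so the components have
# mean zeta = (1, 2) and covariance Sigma = [[1, 1], [1, 2]].
X = np.stack([E[..., 0], E[..., 0] + E[..., 1]], axis=-1)
zeta = np.array([1.0, 2.0])
Sigma = np.array([[1.0, 1.0], [1.0, 2.0]])

# One vector sqrt(n) * (Xbar_n - zeta) per replication
Z = np.sqrt(n) * (X.mean(axis=1) - zeta)

print(Z.mean(axis=0))           # close to (0, 0)
print(np.cov(Z, rowvar=False))  # close to Sigma
```

With these sizes the empirical mean and covariance of $Z$ land within sampling error of $\mathbf{0}$ and $\Sigma$, as the theorem predicts.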

Asked by Incognito
