
Suppose $X_1, X_2, \ldots , X_n$ are $n$ independent random variables with the same probability distribution, with mean $\mu$ and variance $\sigma^2$. Let $$ \bar{X}=\frac{X_1+X_2+\cdots+X_n}{n} $$ I know the expected value is $\mu$ and the variance is $\frac{\sigma^2}{n}$, but I'm not sure how to prove it. Thank you in advance :)

Edit: I'm sorry. I'm aware of how to expand $E(\bar{X})$ and $Var(\bar{X})$ using the formulae. The part that is tripping me up is the last step, i.e. why $\frac{1}{n}(E(X_1) + E(X_2) + \cdots + E(X_n))$ simplifies to $\mu$, and similarly for the variance.


3 Answers


\begin{align} & \operatorname{var}\left( \frac{X_1+\cdots+X_n} n \right) \\[8pt] = {} & \frac 1 {n^2} \operatorname{var}(X_1+\cdots +X_n) \\[8pt] = {} & \frac 1 {n^2} \left( \operatorname{var}(X_1)+\cdots+\operatorname{var}(X_n) \right) \\[8pt] = {} & \frac 1 {n^2} \cdot n\sigma^2 = \frac{\sigma^2} n. \\[10pt] \operatorname E\left( \overline X \right) = {} & \operatorname E\left( \frac{X_1+\cdots+X_n} n \right) \\[8pt] = {} & \frac 1 n \left( \operatorname E(X_1+\cdots+X_n) \right) \\[10pt] = {} & \frac 1 n \left( \operatorname E(X_1) + \cdots + \operatorname E(X_n) \right) = \frac 1 n \cdot n\mu = \mu. \end{align}
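As a numerical sanity check (not part of the original derivation), a short Monte Carlo sketch can confirm that the sample mean of $n$ i.i.d. draws has mean $\mu$ and variance $\sigma^2/n$. The normal distribution, seed, and sample sizes below are arbitrary illustrative choices:

```python
import random
import statistics

# Draw many samples of size n from a distribution with known mu and sigma,
# and check that the sample means cluster around mu with variance near sigma^2/n.
random.seed(0)
mu, sigma, n, trials = 5.0, 2.0, 10, 100_000

xbars = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    xbars.append(sum(sample) / n)

mean_of_xbar = statistics.fmean(xbars)   # should be close to mu = 5.0
var_of_xbar = statistics.pvariance(xbars)  # should be close to sigma^2/n = 0.4
print(mean_of_xbar, var_of_xbar)
```

With 100,000 trials the estimates typically land within a few hundredths of the theoretical values.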

  • Sorry! I have clarified my confusion in the edit. – Commented May 10, 2021 at 5:03

\begin{align*} \mathbb E(\bar X)&= \mathbb E\left(\frac{\sum _{i=1}^nX_i}{n}\right)=\frac1n\mathbb E\left(\sum_{i=1}^nX_i\right)=\frac1n\sum_{i=1}^n\mathbb E(X_i)=\frac1n\cdot n\mathbb E(X_i)=\mathbb E(X_i)=\mu \\ \text{Var}(\bar X)&= \text{Var}\left(\frac{\sum_{i=1}^nX_i}{n}\right)=\frac{1}{n^2}\text{Var}\left(\sum_{i=1}^nX_i\right)=\frac1{n^2}\sum_{i=1}^n\text{Var}(X_i)=\frac1{n^2}\cdot n\sigma ^2=\frac{\sigma^2}{n} \end{align*}

Edit: For any random variables $A$ and $B$, $$\mathbb E(A+B)=\mathbb E(A)+\mathbb E(B).$$ If $A$ and $B$ are independent, $$\text{Var}(A+B)=\text{Var}(A)+\text{Var}(B).$$
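These two identities can also be checked numerically. Below is a small sketch (my own illustration, not from the answer) where $A$ and $B$ are independent draws from two arbitrarily chosen distributions:

```python
import random
import statistics

# Check E(A+B) = E(A) + E(B), and Var(A+B) = Var(A) + Var(B) for
# independent A and B. Distributions here are arbitrary choices.
random.seed(1)
trials = 100_000
a = [random.gauss(1.0, 1.0) for _ in range(trials)]    # A ~ N(1, 1)
b = [random.uniform(0.0, 6.0) for _ in range(trials)]  # B ~ Uniform(0, 6)
s = [x + y for x, y in zip(a, b)]

# Linearity of expectation holds exactly (up to float rounding), with
# or without independence.
print(statistics.fmean(s), statistics.fmean(a) + statistics.fmean(b))

# Variances add (approximately, in simulation) because A and B were
# drawn independently, so their sample covariance is near zero.
print(statistics.pvariance(s), statistics.pvariance(a) + statistics.pvariance(b))
```

Note the asymmetry: the expectation identity needs no independence, while the variance identity does (in general $\text{Var}(A+B)=\text{Var}(A)+\text{Var}(B)+2\,\text{Cov}(A,B)$).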

  • Sorry! I have clarified my confusion in the edit. – Commented May 10, 2021 at 5:03
  • @appleline made an edit. does this answer your question? – Commented May 10, 2021 at 5:05
  • So does that mean $\mu =E(X_1)+E(X_2)+\cdots+E(X_n)$ and $\sigma^2 = Var(X_1) + Var(X_2)+\cdots+ Var(X_n)$ are always true for all real number of $n$? – Commented May 10, 2021 at 5:13
  • 5201314, why not $n \mu$? or $n \sigma^2$? btw the 2nd equation is true if independent but only if as well? – Commented May 10, 2021 at 5:16
  • I thought you defined $\mu =\mathbb E(X_1)=\mathbb E(X_2)=\cdots =\mathbb E(X_n)$ in your problem? – Commented May 10, 2021 at 5:19

For any constant $c$, $E(cX)=cE(X)$. (An explanation is at the end.)

So then $$\begin{align} E(\bar{X})=E\left(\frac{X_1+\cdots+X_n}{n}\right) &=E\left(\frac{X_1}{n}+\cdots+\frac{X_n}{n}\right)\\ &=E\left(\frac{X_1}{n}\right)+\cdots+E\left(\frac{X_n}{n}\right)\\ &=\frac{1}{n}E(X_1)+\cdots+\frac{1}{n}E\left(X_n\right)&\text{using }c=\frac1n\\ &=\frac{1}{n}\mu+\cdots+\frac{1}{n}\mu\\ &=\frac{n}{n}\mu=\mu\\ \end{align}$$

The variance computation is similar, but the scaling rule is $\operatorname{Var}(cX)=c^2\operatorname{Var}(X)$. The squaring of $c$ makes $\frac{1}{n^2}$ the coefficient in the penultimate line, leaving $\frac{n}{n^2}=\frac{1}{n}$ as the coefficient of $\sigma^2$, i.e. $\operatorname{Var}(\bar{X})=\frac{\sigma^2}{n}$.


Why does $E(cX)=cE(X)$? Well, what is the definition of $E$? It should be something like: $$\sum_x x\cdot P(X=x)$$ or an integral version of that: $$\int_{\mathbb{R}} x\cdot p(x)\,dx$$ So $E(cX)$ is $$\sum_x cx\cdot P(X=x)\quad\text{or}\quad\int_{\mathbb{R}} cx\cdot p(x)\,dx$$ and the $c$ can be factored out of the sum or of the integral: $$c\sum_x x\cdot P(X=x)\quad\text{or}\quad c\int_{\mathbb{R}} x\cdot p(x)\,dx$$ which is $$c\cdot E(X).$$
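The scaling rule $E(cX)=cE(X)$ can also be seen numerically: scaling every draw by $c$ scales the empirical mean by $c$ exactly. A tiny sketch (my own illustration; the exponential distribution and $c=\frac{1}{10}$ are arbitrary choices):

```python
import random
import statistics

# Empirical check of E(cX) = c * E(X): the mean of the scaled draws
# equals c times the mean of the original draws (up to float rounding).
random.seed(2)
c = 1 / 10
xs = [random.expovariate(0.5) for _ in range(100_000)]  # E(X) = 2 for rate 0.5
scaled = [c * x for x in xs]
print(statistics.fmean(scaled), c * statistics.fmean(xs))  # the two agree
```

This is exactly the step used above with $c=\frac{1}{n}$.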

  • I'm sorry. Why is the expected value of any $X$ = the mean $\mu$? Similarly for variance – Commented May 10, 2021 at 5:30
  • @appleline "Suppose $X_1, X_2, \ldots , X_n$ are $n$ independent r.v.s, with the same probability distribution and with mean $\mu$ and variance $\sigma^2$." that's what you wrote? – Commented May 10, 2021 at 6:00
  • @appleline OK, $E(X)$ is defined as a sum or an integral as posted here. And $\mu$ is just defined to equal $E(X)$. When a distribution is presented with $\mu$ ahead of time, it is a requirement that should you calculate $E(X)$, it equals $\mu$. – Commented May 10, 2021 at 16:38
