I have the following problem:
The regression model $Y = \beta x + \varepsilon$ ($\varepsilon \sim N(0, \sigma^2)$) is a model that passes through the origin, meaning that $E(Y \mid x = 0) = 0$. You have $n$ independent observations $(X_i, Y_i), i = 1,\dots,n$ of this model. 1) What is the least squares estimator $\hat{\beta}$ of $\beta$? 2) What is the sampling distribution of $\hat{\beta}$?
1) For this question, the solution minimizes the sum of squared residuals: $$ \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - \beta x_i)^2 = L(\beta)$$ $$L'(\beta) = -2 \sum_{i=1}^{n} x_i(y_i -\beta x_i) = 0$$ Dividing by $-2$: $$\sum_{i=1}^{n} x_iy_i - \beta \sum_{i=1}^{n} x_i^2 = 0$$ $$\hat{\beta} = \dfrac{1}{\sum x_i^2}\sum x_iy_i$$
Now we have our estimator of $\beta$.
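As a quick numerical sanity check (not part of the original exercise; the data and names here are my own), the closed form $\hat{\beta} = \sum x_i y_i / \sum x_i^2$ should agree with a generic least-squares solver fitted without an intercept:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.uniform(1, 10, size=n)
beta_true, sigma = 2.5, 1.0
y = beta_true * x + rng.normal(0, sigma, size=n)

# Closed-form estimator for regression through the origin
beta_hat = np.sum(x * y) / np.sum(x ** 2)

# Same fit via NumPy's generic least-squares solver (no intercept column)
beta_lstsq = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]

print(beta_hat, beta_lstsq)  # the two should agree to machine precision
```

Both solve the same normal equation, so any difference is pure floating-point noise.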
When it comes to question 2) :
- First we want the expectation and variance of $y_i$, because they will be needed to find the distribution of $\hat{\beta}$. I'll spare the details of how we got them because I want to focus on $\hat{\beta}$:
$$E(y_i) = \beta x_i$$ $$var(y_i) = \sigma^2$$
- Secondly, we can compute the expectation and variance of $\hat{\beta}$ (which we will just call $\beta$ from now on).
$$ E(\beta) = \dfrac{1}{\sum x_i^2}\sum x_iE(y_i)$$ $$ E(\beta) = \dfrac{1}{\sum x_i^2}\sum x_i \beta x_i$$ $$ E(\beta) = \beta \dfrac{\sum x_i^2}{\sum x_i^2}$$ $$ E(\beta) = \beta$$
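The unbiasedness shown above can also be illustrated by simulation (a sketch with made-up numbers, keeping the design $x$ fixed across replications as the derivation assumes): the average of $\hat{\beta}$ over many samples should be very close to the true $\beta$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta_true, sigma = 30, 2.0, 1.0
x = rng.uniform(1, 5, size=n)  # fixed design across replications

# Draw many independent samples of y and compute beta_hat for each
reps = 20000
eps = rng.normal(0, sigma, size=(reps, n))
y = beta_true * x + eps
beta_hats = (y * x).sum(axis=1) / np.sum(x ** 2)

print(beta_hats.mean())  # close to beta_true, illustrating E(beta_hat) = beta
```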
I simply do not understand how $var(\beta)$ was found. I tried applying the definition $E[(\beta - E[\beta])^2]$, but I never arrive at the solution; I always end up with extra constants multiplying the $\sigma^2$ below. The correct answer is given below. It would be helpful if someone could show me step by step how to arrive at it. $$var(\beta) = \dfrac{\sum x_i^{2}}{\left(\sum x_i^{2}\right)^2}\,\sigma^2 = \dfrac{\sigma^2}{\sum x_i^{2}}$$
Finally we can state that $\beta \sim N\!\left(\beta, \dfrac{\sigma^2}{\sum x_i^{2}}\right)$.
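The claimed variance can be checked numerically (a sketch with made-up numbers; it assumes the answer simplifies to $\sigma^2/\sum x_i^2$, which follows from the independence of the $y_i$ when pulling the constant $1/\sum x_i^2$ out of the variance squared):

```python
import numpy as np

rng = np.random.default_rng(2)
n, beta_true, sigma = 40, 1.5, 2.0
x = rng.uniform(1, 4, size=n)              # fixed design
theory_var = sigma ** 2 / np.sum(x ** 2)   # claimed var(beta_hat)

# Monte Carlo: empirical variance of beta_hat over many samples
reps = 50000
eps = rng.normal(0, sigma, size=(reps, n))
beta_hats = ((beta_true * x + eps) * x).sum(axis=1) / np.sum(x ** 2)

print(beta_hats.var(), theory_var)  # the two should be close
```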