
Let $\mathbf{y_1, y_2}$ be standardized $N \times 1$ column vectors, such that $\mathbb{E}\mathbf{[y_1]}=\mathbb{E}\mathbf{[y_2]}=0$, $\text{Var}\mathbf{[y_1]}=\text{Var}\mathbf{[y_2]}=1$.

Furthermore, suppose that $\mathbf{y_1, y_2}$ are correlated such that $\text{Cor}(\mathbf{y_1, y_2}) = \frac{\mathbf{y_1^Ty_2}}{\sqrt{\mathbf{y_1^Ty_1}}\sqrt{\mathbf{y_2^Ty_2}}}=\frac{\mathbf{y_1^Ty_2}}{\sqrt{N}\sqrt{N}}=r$.

Then, if we regress each of them on another standardized $N \times 1$ column vector $\mathbf{x}$:

$$\hat{\beta_1} = \mathbf{(x^Tx)^{-1}x^Ty_1}$$ $$\hat{\beta_2} = \mathbf{(x^Tx)^{-1}x^Ty_2}$$
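Under the same standardization convention as above (assuming $\mathbf{x}$ is scaled so that $\mathbf{x^Tx}=N$, just like $\mathbf{y_1}$ and $\mathbf{y_2}$), these simplify to

$$\hat{\beta_1} = \frac{\mathbf{x^Ty_1}}{N}, \qquad \hat{\beta_2} = \frac{\mathbf{x^Ty_2}}{N}.$$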

It seems like there should be a relationship between $\hat{\beta_1}$ and $\hat{\beta_2}$, but I'm missing something in the proof. Ideally, I'd like to do something like:

$$\hat{\beta_1} = \mathbf{(x^Tx)^{-1}x^T(\frac{y_2y_2^T}{N})y_1}$$ $$\hat{\beta_1} = \mathbf{(x^Tx)^{-1}x^Ty_2(\frac{y_2^Ty_1}{N})}$$ $$\hat{\beta_1} = \hat{\beta_2}r$$

But that doesn't work, because $\mathbf{y_2y_2^T}/N$ is an $N \times N$ rank-1 symmetric matrix and therefore can't equal the identity.
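A quick numerical check (just a sketch of the setup above; the simulated data and variable names are only for illustration) confirms that $\hat{\beta_1} = \hat{\beta_2}\,r$ does not hold exactly in a given sample:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000

def standardize(v):
    # Center and scale so that v.mean() == 0 and v @ v == N,
    # matching the convention sqrt(y^T y) = sqrt(N) used above.
    v = v - v.mean()
    return v / v.std()

# A correlated pair y1, y2 and a separate regressor x
z1, z2, z3 = rng.standard_normal((3, N))
y1 = standardize(z1)
y2 = standardize(0.6 * z1 + 0.8 * z2)
x = standardize(z3)

r = (y1 @ y2) / N            # sample correlation of y1 and y2
b1 = (x @ y1) / (x @ x)      # OLS slope of y1 on x
b2 = (x @ y2) / (x @ x)      # OLS slope of y2 on x

print(r, b1, b2, r * b2)     # b1 and r * b2 differ in general
```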


1 Answer


Perhaps the easiest way to do this is to stack up $y_1$ and $y_2$ and construct an X-matrix of the form

$$\left[ \begin{array}{cc} \mathbf{x}&0\\ 0&\mathbf{x} \end{array} \right]$$

and similarly stack up the error terms and put together the corresponding covariance matrix. You can then form a GLS estimate of the coefficients.

You can work out their covariance fairly easily. The (well-known) result may surprise you.
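For concreteness, here is a minimal numerical sketch of that stacking in Python (the simulated data, the assumed error covariance `Sigma0`, and all variable names are only illustrative, not prescribed):

```python
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(1)
N = 500

# Simulated regressor and an assumed 2x2 error covariance
x = rng.standard_normal(N)
Sigma0 = np.array([[1.0, 0.7],
                   [0.7, 1.0]])
eps = rng.standard_normal((N, 2)) @ np.linalg.cholesky(Sigma0).T
beta_true = np.array([0.5, -0.3])
Y12 = x[:, None] * beta_true + eps        # columns are y1 and y2

# Stack: Y = [y1; y2], X = blockdiag(x, x), Cov(errors) = Sigma0 kron I_N
Y = Y12.T.reshape(-1)
X = block_diag(x[:, None], x[:, None])
Omega_inv = np.kron(np.linalg.inv(Sigma0), np.eye(N))

XtOiX = X.T @ Omega_inv @ X
beta_gls = np.linalg.solve(XtOiX, X.T @ Omega_inv @ Y)
cov_gls = np.linalg.inv(XtOiX)            # covariance of the GLS estimates

# Equation-by-equation OLS for comparison
beta_ols = np.array([x @ Y12[:, 0], x @ Y12[:, 1]]) / (x @ x)

print(beta_gls)   # matches beta_ols: both equations share the same regressor
print(beta_ols)
print(cov_gls)    # off-diagonal equals Sigma0[0, 1] / (x @ x)
```

In this sketch the GLS point estimates coincide with equation-by-equation OLS (the two equations share the same regressor), and the off-diagonal entry of `cov_gls` works out to $\sigma_{12}/\mathbf{x^Tx}$.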

  • Hmm, so I'm a bit new to GLS techniques. I'm going off here: homepage.ntu.edu.tw/~ckuan/pdf/et01/et_Ch4.pdf -- So $\Sigma_0$ in this case would be a matrix with first row $[\mathbf{y_1^Ty_1}, \; \mathbf{y_1^Ty_2}]$ and second row $[\mathbf{y_2^Ty_1}, \; \mathbf{y_2^Ty_2}]$? Commented Sep 23, 2014 at 22:34
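(For reference, as a sketch of the stacking in the answer rather than something stated in the thread: with the two equations stacked, the covariance of the stacked $2N \times 1$ error vector is the $2N \times 2N$ block matrix

$$\Sigma \;=\; \Sigma_0 \otimes I_N \;=\; \left[ \begin{array}{cc} \sigma_{11} I_N & \sigma_{12} I_N \\ \sigma_{21} I_N & \sigma_{22} I_N \end{array} \right],$$

where $\Sigma_0$ is the $2 \times 2$ error covariance across the two equations.)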
