I have a bunch of data that I fit a linear regression to, and now I need to find the variance of my slope. Is there an analytical way to get this?
If an example is necessary, consider this my data in R:
```r
x <- c(1:6)
y <- c(18, 14, 15, 12, 7, 6)
lm(y ~ x)$coefficients
```

So I have a slope estimate of -2.4, but I want to know the variance of that estimate.
After looking at previous questions, I've seen a few equations for estimating the slope parameter, but I'm a little confused about what the differences between equations are and what approach is valid for my problem.
For example, the answers in this question say that $\newcommand{\Var}{\rm Var}\newcommand{\slope}{\rm slope}\Var[\slope] = \frac{V[Y]}{\sum\left(\frac{x_i-\bar{x}}{\sum(x_i-\bar{x})^2}\right)}$.
This question says that $\Var[\slope] = \frac{V[Y]}{\sum(x_i-\bar{x})^2}$.
And if I look at the output in R (as a "check" mechanism), I'm given two other ways I could calculate the slope variance: one by squaring the reported standard error, the other by reading it off the coefficient covariance matrix. I feel like I'm missing something key, because all of these approaches give me similar (but not identical) answers.
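For concreteness, here are the two R "checks" I mean, plus the hand calculation $\hat\sigma^2/\sum(x_i-\bar{x})^2$ I've been comparing them to (where I'm using the residual variance $\hat\sigma^2 = \mathrm{SSE}/(n-2)$, which is my guess at what $V[Y]$ is supposed to mean in the equations above):

```r
x <- c(1:6)
y <- c(18, 14, 15, 12, 7, 6)
fit <- lm(y ~ x)

# Check 1: square the slope's reported standard error
se_slope <- summary(fit)$coefficients["x", "Std. Error"]
se_slope^2

# Check 2: the slope's diagonal entry of the coefficient covariance matrix
vcov(fit)["x", "x"]

# Hand calculation: residual variance over the sum of squared deviations of x
sigma2_hat <- sum(resid(fit)^2) / df.residual(fit)
sigma2_hat / sum((x - mean(x))^2)
```

These are the numbers I've been comparing against the equations quoted above.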