  • 3
    +1 Parsimony is something that often only makes sense in very specific contexts. There's no reason to be playing the bias vs. precision game if you have enough precision to do both. — Commented Oct 30, 2011 at 2:01
  • 3
    +1 for a great answer. But what if you have multicollinearity and removing a variable reduces it? (This isn't the case in the original question, but it often is in other data.) Isn't the resulting model often superior in all sorts of ways (reduced variance of the estimators, signs of coefficients more likely to reflect underlying theory, etc.), provided you still use the correct (original model) degrees of freedom? — Commented Feb 13, 2012 at 23:08
  • 4
    It is still better to include both variables. The only price you pay is an increased standard error when estimating one variable's effect adjusted for the other. Joint tests of the two collinear variables are very powerful, since there they combine forces rather than compete against one another. Also, if you want to delete a variable, the data are incapable of telling you which one to delete. — Commented Feb 13, 2012 at 23:30
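
The last comment's point can be seen in a small simulation (a sketch, not from the original thread; the data-generating process and all variable names here are illustrative assumptions). With two nearly collinear predictors, each slope's standard error is badly inflated, yet a joint F-test that both slopes are zero is highly significant, because the two variables "combine forces" in the joint test:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)          # x2 nearly collinear with x1
y = 1.0 + 0.5 * x1 + 0.5 * x2 + rng.normal(size=n)

# OLS fit with both predictors
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
df = n - X.shape[1]
s2 = resid @ resid / df                      # residual variance estimate
cov = s2 * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(cov))                   # slope SEs are inflated by collinearity

# Joint F-test of H0: both slope coefficients are zero,
# comparing the full model against the intercept-only model
rss_full = resid @ resid
rss_null = np.sum((y - y.mean()) ** 2)
F = ((rss_null - rss_full) / 2) / (rss_full / df)
```

Here `se[1]` and `se[2]` come out far larger than they would be for uncorrelated predictors (roughly 1/sqrt(n) in scale), while `F` is large and its p-value tiny: the data strongly support keeping the pair even though neither individual coefficient is precisely estimated.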