  • Welcome. Just a thought here. Depending on the structure of your random effects, there may already be some regularization in your mixed model that the lasso can't improve on. Commented Aug 27 at 16:01
  • 3
    $\begingroup$ "However, I have noticed that the full model is nearly always the best fitting model based on AIC " with 40k observations and only 10 variables, it is not really surprising that any variable selection method based on prediction performance/fit would keep them all in the model, particularly AIC with its lighter complexity penalty. How did you get glmnet to produce a model with nonzero coefs? Using CV to pick the penalty strength, or just manually adjusting it until you got the desired number of variables? $\endgroup$ Commented Aug 27 at 17:52
  • Thank you for your comments; they've given me lots to think about. @NathanWycoff, I used CV to pick the penalty strength for the lasso and then used glmmTMB to fit my GLMM. Is it better to stick with one method, then? So either choose the lasso and highlight the potential issues around it being designed for GLMs, or switch to something like the MuMIn package and use AIC? Commented Aug 27 at 19:41
  • 1
    $\begingroup$ @CjC Oh interesting; the CV-based lasso really did pick only some strict subset of parameters? That's surprising to me. Anyhow, you'll have to tell us just a bit more about your goals for us to give you a definite answer. Why do you want to do variable selection? In order to get a simpler model containing fewer terms? If so, this is in tension with the way you are currently performing variable selection, which is instead geared towards maximal predictive accuracy (glmnet+CV) or based on searching for a "true" model (AIC). $\endgroup$ Commented Aug 27 at 20:38
  • @NathanWycoff, thank you again for your comments. I have updated my question above with some further information and code, which hopefully explains my research aims a bit further. Ultimately, I want to find the most important variables for site selection and remove any unnecessary ones (especially given the large dataset). Currently I am just using ggpredict after model fitting, but the results may be used in the future to create a predictive map of suitable perch sites in the study area. Commented Aug 28 at 8:42
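
The CV-based penalty selection discussed above (R's `cv.glmnet`) can be illustrated with an analogous sketch in Python using scikit-learn's `LassoCV`; the synthetic data, sample sizes, and coefficient values below are illustrative assumptions, not the questioner's actual dataset:

```python
# Hedged sketch of CV-chosen lasso penalty strength, analogous to R's
# cv.glmnet. All data here is synthetic and illustrative only.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p = 1000, 10  # many observations, few predictors (as in the question)
X = rng.normal(size=(n, p))
beta = np.array([2.0, -1.5, 1.0, 0, 0, 0, 0, 0, 0, 0])  # only 3 true signals
y = X @ beta + rng.normal(size=n)

# 10-fold CV picks the penalty strength (alpha here plays the role of
# glmnet's lambda); coefficients shrunk exactly to zero are dropped.
fit = LassoCV(cv=10).fit(X, y)
kept = np.flatnonzero(fit.coef_)
print("chosen penalty:", fit.alpha_)
print("indices of nonzero coefficients:", kept)
```

With a strong signal-to-noise ratio and many observations, the CV-chosen penalty tends to be small, so weakly relevant variables often survive as well, which echoes the point above that fit-based selection with 40k observations and 10 variables will usually keep most or all of them.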