- I would be careful about the interpretation of non-Bayesian approaches as "Bayesian inference in disguise"; there are many cases where approaches are superficially the same (e.g. the same equations show up), but the inferences drawn are substantially different. – πr8, Jan 13, 2020 at 17:41
- Perhaps, though I would still argue that this is not really the issue. One of the key misconceptions driving this perception is the idea that Bayesian inference just means using MAP estimators instead of the MLE, which is not really accurate. One frequently hears the claim that the Lasso is "just Bayes with Laplace priors"; this is extremely untrue if one is computing posterior means instead of the MAP (see the sketch after these comments). – πr8, Jan 13, 2020 at 19:40
- @πr8 Sure, I totally agree with you. I'm trying to say that the non-Bayesian approaches to such problems are often quite strongly related to their Bayesian counterparts, and the distinction gets blurry. – Tim, Jan 13, 2020 at 20:47
- @Tim: To hardcore Bayesians, everything that works does so because it's similar to a Bayesian method. I'm not a hardcore Bayesian (i.e., I very rarely use Bayesian methods to analyze data), but I still mostly agree with that idea anyway. – Cliff AB, Jan 14, 2020 at 17:51
- @πr8 I think you're alluding to the fact that Bayesian inference isn't only about using MAP estimators these days, and that this has been a historical focus because MAP is easier to compute. If I understand your comment, you're saying the Lasso cannot be considered Bayesian when something like Hamiltonian Monte Carlo is used to obtain the full posterior, if you then report the posterior mean rather than the mode (as in MAP)? Does it "become" Bayesian again if you look at the mode rather than the mean, even under HMC? I've no idea without a lot more thought! – Mooks, Jan 15, 2020 at 11:40
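To make the Lasso-versus-Laplace-prior point in the comments above concrete, here is a minimal sketch in Python. The simulated data, the parameter values, and the function names are illustrative assumptions, not taken from the thread. For a single coefficient with a Gaussian likelihood and a Laplace prior, the posterior mode (MAP) is the soft-thresholded least-squares estimate, i.e. exactly the Lasso solution, and can be exactly zero; the posterior mean, computed here by brute-force numerical integration, is shrunk towards zero but is never exactly zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: one weak predictor, so the Laplace penalty can shrink beta to zero.
# (n, true_beta, sigma, b and the grid below are illustrative choices, not from the thread.)
n, true_beta, sigma = 50, 0.1, 1.0
x = rng.normal(size=n)
y = true_beta * x + rng.normal(scale=sigma, size=n)

b = 0.05  # Laplace prior scale: smaller b means stronger shrinkage

def neg_log_posterior(beta):
    # Gaussian likelihood (known sigma) plus Laplace(0, b) prior, up to an additive constant.
    return np.sum((y - beta * x) ** 2) / (2.0 * sigma**2) + np.abs(beta) / b

# Posterior mode (MAP): for this scalar problem it is the soft-thresholded
# least-squares solution, i.e. the Lasso estimate with penalty lam = sigma^2 / b.
xty, xtx = x @ y, x @ x
lam = sigma**2 / b
beta_map = np.sign(xty) * max(np.abs(xty) - lam, 0.0) / xtx

# Posterior mean: brute-force numerical integration on a grid (fine in one dimension).
grid = np.linspace(-1.0, 1.0, 20001)
log_post = -np.array([neg_log_posterior(beta) for beta in grid])
weights = np.exp(log_post - log_post.max())  # unnormalised posterior density on the grid
weights /= weights.sum()                     # uniform grid spacing cancels when normalising
beta_mean = np.sum(grid * weights)

print(f"MAP / Lasso estimate: {beta_map:.4f}")   # typically exactly 0 with this strong penalty
print(f"Posterior mean:       {beta_mean:.4f}")  # shrunk towards 0, but not exactly 0
```

The grid-based posterior is used only because the example is one-dimensional; in higher dimensions one would instead use MCMC (e.g. HMC, as mentioned in the comments), and the same mode-versus-mean distinction applies.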