The main thing they have in common is that both cross-entropy loss and MSE can be derived as negative log-likelihoods. That is it. Not more, not less. So stop hand-waving and start with the likelihood function. Commented Feb 17 at 18:32
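
To spell out the derivation the comment points to (the notation $f_\theta$, $\hat y_i = f_\theta(x_i)$, and $\sigma^2$ below is illustrative, not the commenter's): assuming i.i.d. Gaussian noise on a real-valued target, the negative log-likelihood reduces to a scaled sum of squared errors,

$$-\log \prod_{i=1}^n \mathcal{N}\!\left(y_i \mid \hat y_i, \sigma^2\right) = \frac{1}{2\sigma^2} \sum_{i=1}^n \left(y_i - \hat y_i\right)^2 + \frac{n}{2}\log\!\left(2\pi\sigma^2\right),$$

so for fixed $\sigma^2$, maximizing the likelihood is the same as minimizing MSE. Assuming instead a Bernoulli target with $\hat y_i = p_\theta(y_i = 1 \mid x_i)$,

$$-\log \prod_{i=1}^n \hat y_i^{\,y_i} \left(1 - \hat y_i\right)^{1 - y_i} = -\sum_{i=1}^n \left[\, y_i \log \hat y_i + (1 - y_i)\log\!\left(1 - \hat y_i\right) \right],$$

which is exactly the binary cross-entropy loss. The two losses differ only in the conditional distribution assumed for the target.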