
  • Ah, sorry. None of the errors are correlated. Commented Jul 30, 2021 at 1:15
  • 1
    $\begingroup$ Hi: This sounds bayesian to me. You can assume priors for $\mu$ and $\sigma$. Then, you have to assume some likelihood for the data given $\mu$ and $\sigma$. Then, given the data, you can derive the posterior given the likelihood and the prior. So, if you make nice priors, the posterior should have all the of the parameters in it at once. Often a normal is used as a prior for the mean and a gamma (IIRC) is used for the variance. Using these leads to nice posteriors but I forget what they are. Any decent bayesian text ( lancaster, gelman and hill, zellner ) should provide the gory details. $\endgroup$ Commented Jul 30, 2021 at 2:35
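The "nice posteriors" the commenter is gesturing at are the standard conjugate results for a normal likelihood with unknown mean and variance. Below is a minimal sketch of that update, assuming a Normal-Inverse-Gamma prior on $(\mu, \sigma^2)$ (equivalently, a gamma prior on the precision, which is likely what the "gamma (IIRC)" refers to). The function name `normal_inverse_gamma_posterior` and the default hyperparameters `mu0`, `kappa0`, `alpha0`, `beta0` are illustrative choices, not anything specified in the thread.

```python
import numpy as np

def normal_inverse_gamma_posterior(x, mu0=0.0, kappa0=1.0, alpha0=1.0, beta0=1.0):
    """Conjugate update for Normal data with unknown mean and variance.

    Prior: sigma^2 ~ InvGamma(alpha0, beta0) and mu | sigma^2 ~ N(mu0, sigma^2 / kappa0).
    Returns the posterior hyperparameters (mu_n, kappa_n, alpha_n, beta_n).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    xbar = x.mean()
    ss = np.sum((x - xbar) ** 2)  # within-sample sum of squares

    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    alpha_n = alpha0 + n / 2.0
    beta_n = beta0 + 0.5 * ss + (kappa0 * n * (xbar - mu0) ** 2) / (2.0 * kappa_n)
    return mu_n, kappa_n, alpha_n, beta_n


if __name__ == "__main__":
    # Simulated data just to exercise the update; not from the original question.
    rng = np.random.default_rng(0)
    data = rng.normal(loc=5.0, scale=2.0, size=50)
    mu_n, kappa_n, alpha_n, beta_n = normal_inverse_gamma_posterior(data)

    # Posterior mean of mu, and posterior mean of sigma^2 (valid when alpha_n > 1).
    print("E[mu | data]      =", mu_n)
    print("E[sigma^2 | data] =", beta_n / (alpha_n - 1.0))
```

With this conjugate choice the joint posterior over $(\mu, \sigma^2)$ is again Normal-Inverse-Gamma with the updated hyperparameters, and the marginal posterior for $\mu$ is a Student-$t$; the texts named in the comment (Lancaster; Gelman and Hill; Zellner) work through those derivations.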