  • $\begingroup$ Thanks, I had managed to get the answer; I was calculating the Fisher information with $n$ observations. $\endgroup$ Commented May 15, 2015 at 11:53
  • $\begingroup$ You mention that the variance equals the Cramer-Rao bound, but you never derive it. How does one arrive at the CR bound? $\endgroup$ Commented Feb 10, 2019 at 18:25
  • $\begingroup$ I'm trying to understand what this means in practical terms. Am I understanding correctly that ${2 \sigma^4 \over n}$ is telling you how well you know the variance of your measurement? $\endgroup$ Commented Apr 2, 2019 at 23:51
  • $\begingroup$ @olliepower $\frac{2\sigma^4}{n}$ is the minimal variance of any unbiased estimator of the parameter $\sigma^2$, regardless of which estimator you choose. This is the very best you can hope for: no unbiased estimator will have less variance than this. Luckily our estimator attains this minimal variance, and not more! So we are in good shape, as we have a really good estimator for that parameter. $\endgroup$ Commented Feb 13, 2020 at 19:58
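As a sanity check on the discussion above (this simulation is not from the thread), one can verify numerically that the estimator $\hat{\sigma}^2 = \frac{1}{n}\sum_i (x_i - \mu)^2$ with known mean $\mu$ has variance matching the Cramér–Rao bound $\frac{2\sigma^4}{n}$. The parameter values below are arbitrary choices for illustration.

```python
import numpy as np

# Check that Var(sigma2_hat) is close to the Cramer-Rao bound 2*sigma^4/n,
# where sigma2_hat = (1/n) * sum((x_i - mu)^2) and mu is known.
rng = np.random.default_rng(0)

mu, sigma, n, reps = 0.0, 2.0, 50, 200_000  # arbitrary example values

# Draw `reps` independent samples of size n and compute the estimator on each.
x = rng.normal(mu, sigma, size=(reps, n))
sigma2_hat = ((x - mu) ** 2).mean(axis=1)

empirical_var = sigma2_hat.var()   # Monte Carlo variance of the estimator
cr_bound = 2 * sigma ** 4 / n      # Cramer-Rao lower bound for sigma^2

print(f"empirical variance of estimator: {empirical_var:.4f}")
print(f"Cramer-Rao bound 2*sigma^4/n:    {cr_bound:.4f}")
```

With these values the bound is $2 \cdot 2^4 / 50 = 0.64$, and the Monte Carlo estimate of the estimator's variance should agree with it to within sampling noise, illustrating that this estimator is efficient when $\mu$ is known.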