- I think you have to be more specific about what you are looking for: entropy is, after all, as "statistical" a measure as variance, so the maximum entropy distribution is a perfectly good statistical description. It seems to me you have to go outside statistics to come up with a "justification". – seanv507, Aug 1, 2013 at 9:48
- Seanv: I agree that entropy, as a statistical functional, is just as "statistical" as variance, expected value, skew, etc. However, taking the mean and standard deviation as examples, these have purely probabilistic interpretations via Markov's and Chebyshev's inequalities, and ultimately via the central limit theorems; intuitively, they arise as long-run averages (for the mean) and RMS error (for the standard deviation). I should perhaps rephrase my question as "Probabilistic interpretation of maximum entropy distributions". – Annika, Aug 1, 2013 at 12:24
- Annika, the maximum entropy distribution has the following interpretation: if $X_1,X_2,\dots$ are i.i.d. random variables, then the conditional probability $P(\cdot \mid X_1+\dots+X_n=na)\to P^*(\cdot)$ as $n\to \infty$, where $P^*$ is the maximum entropy distribution from the set $\{P:\mathbb{E}_P X=a\}$. See also ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1056374&tag=1 – Ashok, Aug 1, 2013 at 12:45
- Thanks Ashok. I'll take a look at that paper in more detail. This seems like a specific case of maximizing entropy for a given mean, but I am still curious what the operation of maximizing the Shannon entropy is doing mathematically such that the above result holds. Is it effectively minimizing the maximum density, or the average concentration, of the probability measure? – Annika, Aug 1, 2013 at 15:55
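The constrained maximization Ashok describes can be made concrete numerically. A minimal sketch, assuming a finite support $\{1,\dots,6\}$ (a die) and the mean constraint $\mathbb{E}_P X = 4.5$: the Lagrange-multiplier solution is the exponentially tilted family $p_i \propto e^{\lambda x_i}$, and $\lambda$ can be found by bisection so the constraint is met. The support, target mean, and solver are illustrative choices, not from the linked paper.

```python
import math

support = list(range(1, 7))  # finite support {1,...,6}
target_mean = 4.5            # constraint E[X] = 4.5

def tilted(lam):
    """Exponential tilting p_i proportional to exp(lam * x_i), normalized."""
    w = [math.exp(lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

def mean(p):
    return sum(x * pi for x, pi in zip(support, p))

# The tilted mean is increasing in lam, so bisect on lam until E[X] = 4.5.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = (lo + hi) / 2
    if mean(tilted(mid)) < target_mean:
        lo = mid
    else:
        hi = mid

p = tilted((lo + hi) / 2)  # maximum entropy distribution under the constraint
entropy = -sum(pi * math.log(pi) for pi in p)
```

Consistent with the comment above, the resulting weights increase geometrically toward the high outcomes (the distribution is tilted upward to meet the mean 4.5), and its entropy is strictly below $\log 6$, the unconstrained maximum attained by the uniform distribution.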