  • $\begingroup$ The most common notion of entropy is Shannon entropy. The Shannon entropy $H(x)$ of a value $x$ that occurs with probability $\Pr[x]$ is $H(x) = -\log_2(\Pr[x])$. Related questions: crypto.stackexchange.com/q/378/6961 and crypto.stackexchange.com/q/700/6961 $\endgroup$ Commented Sep 18, 2013 at 3:32
  • $\begingroup$ There are two issues with Shannon entropy: (1) it is only defined for a probability distribution, not for an individual string; (2) Shannon entropy and average key strength aren't exactly the same thing if the probability distribution isn't uniform. $\endgroup$ Commented Sep 18, 2013 at 11:32
  • $\begingroup$ I've made some progress here. When I have a full answer, I will post it (especially given all the great help I've gotten). I'm currently investigating Ping Li's work here: stanford.edu/group/mmds/slides2010/Li.pdf Ideally, though, I'd do something based on NIST's work here: csrc.nist.gov/groups/ST/toolkit/rng/documentation_software.html $\endgroup$ Commented Sep 23, 2013 at 22:16
  • $\begingroup$ Can you tell us more? Why do you want to measure the amount of entropy? What do you know about the source of the byte stream? The answer is going to depend heavily on these details, so the more you can tell us, the more likely we are to be able to give you a good answer. This is not a simple subject with a simple one-line answer... $\endgroup$ Commented Oct 1, 2013 at 22:49
  • $\begingroup$ No, I can't. I really want to simply estimate entropy. $\endgroup$ Commented Oct 3, 2013 at 17:25
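
To make the quantities discussed in these comments concrete, here is a minimal Python sketch (my own illustration, not taken from any of the linked references) that computes the empirical Shannon entropy and min-entropy per byte of a stream, under the assumption that the bytes are i.i.d. samples from an unknown distribution. A real entropy assessment, such as the NIST tooling linked above, is considerably more involved than counting byte frequencies.

    import math
    from collections import Counter

    def shannon_entropy_per_byte(data: bytes) -> float:
        # Empirical Shannon entropy H = -sum_i p_i * log2(p_i), in bits per byte,
        # treating the bytes as i.i.d. samples (an assumption, not a given).
        counts = Counter(data)
        n = len(data)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def min_entropy_per_byte(data: bytes) -> float:
        # Min-entropy H_min = -log2(max_i p_i), in bits per byte; this is the
        # more conservative figure usually wanted for key strength.
        counts = Counter(data)
        n = len(data)
        return -math.log2(max(counts.values()) / n)

    if __name__ == "__main__":
        uniform = bytes(range(256)) * 4              # perfectly uniform toy stream
        print(shannon_entropy_per_byte(uniform))     # 8.0
        print(min_entropy_per_byte(uniform))         # 8.0
        skewed = b"\x00" * 900 + bytes(range(100))   # heavily biased toy stream
        print(shannon_entropy_per_byte(skewed))      # roughly 1.1
        print(min_entropy_per_byte(skewed))          # roughly 0.15

The uniform toy stream scores the full 8 bits per byte on both measures, while the skewed stream illustrates the second comment's point: its Shannon entropy is still around 1.1 bits per byte, but its min-entropy, the measure that matters for key strength, is only about 0.15 bits per byte.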