12

I understand that entropy is the number of bits needed to encode a set of messages. However, I don't understand what min-entropy is and how it is related to entropy.

Consider a simple password case: if a password is 100 random bits, is the min-entropy also 100?

2
  • 1
    If I understand Wikipedia correctly (which is admittedly quite difficult), min-entropy is always $\leq$ Shannon entropy. Commented Nov 8, 2018 at 15:14
  • Password entropy is notoriously difficult to estimate and therefore not a good example. It's highly unlikely that a human-chosen password can be measured to contain 100 bits of entropy; you'd simply have to drop the maths and guess based on heuristics and star signs. A better example is any electro-mechanically generated non-uniform distribution. Commented Jul 31, 2019 at 13:41

1 Answer

19

There are three measures to consider: min-entropy, Shannon entropy, and max-entropy. (Plus a few more definitions, but let's focus on these.) For a given number of outcomes, all of these measures are greatest when each outcome occurs with equal probability. (In that case all three are equal.)

$$\text{min-entropy} \leq \text{Shannon entropy} \leq \text{max-entropy}$$

Min-entropy describes the unpredictability of an outcome determined solely by the probability of the most likely result. This is a conservative measure. It's good for describing passwords and other non-uniform distributions of secrets.

$$\text{min-entropy} = -\log_2{(p_{\text{max}})}$$

Say you have an algorithm which produces an 8-digit numeric password. If the password 00000000 occurs 50% of the time, and the remaining $10^8 - 1$ passwords occur with equal probability, then the Shannon entropy is about $14.3$ bits, but the min-entropy is exactly $1$ bit, which is $-\log_2{(0.5)}$.
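As a quick sanity check of those numbers, here is a minimal Python sketch of that hypothetical distribution:

```python
import math

n = 10**8                 # 8-digit numeric passwords: 00000000 .. 99999999
p_max = 0.5               # "00000000" occurs half the time
p_other = 0.5 / (n - 1)   # the remaining passwords are equally likely

# Shannon entropy: sum over all outcomes of -p_i * log2(p_i)
shannon = -p_max * math.log2(p_max) - (n - 1) * p_other * math.log2(p_other)

# Min-entropy: -log2 of the most likely outcome's probability
min_entropy = -math.log2(p_max)

print(round(shannon, 1), round(min_entropy, 1))  # 14.3 1.0
```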

Min-entropy corresponds to the best chance of success in predicting a person's password in one guess: that success probability is exactly $p_\text{max} = 2^{-\text{min-entropy}}$.


Shannon entropy is defined as $$\sum_{i=1}^{n}{-p_i\log_2(p_i)}$$ for a probability distribution $p$ with $n$ possible outcomes. Shannon entropy describes the average unpredictability of the outcomes of a system.

It also measures how much information a system conveys on average: Shannon entropy is a lower bound on the average size, in bits per message, achievable by any lossless compression algorithm designed specifically with the distribution $p$ in mind.


Max-entropy, also called Hartley entropy, is defined solely by the number of possible outcomes $n$: it equals $\log_2(n)$. It's not particularly useful in cryptography or for passwords. It measures the number of bits you would need in order to assign a distinct bit pattern to every possible outcome.
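To put the three definitions side by side, here is a minimal Python sketch (the distribution `p` is just an illustrative choice, not anything from the question):

```python
import math

def min_entropy(p):
    # Determined solely by the most likely outcome
    return -math.log2(max(p))

def shannon_entropy(p):
    # Average unpredictability: sum of -p_i * log2(p_i)
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def max_entropy(p):
    # Hartley entropy: depends only on the number of outcomes
    return math.log2(len(p))

p = [0.5, 0.25, 0.125, 0.125]  # illustrative non-uniform distribution
print(min_entropy(p), shannon_entropy(p), max_entropy(p))
# 1.0 1.75 2.0, i.e. min-entropy <= Shannon entropy <= max-entropy
```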


The three quantities are always the same for a uniform distribution. This is because $p_i = p_\text{max} = \frac{1}{n}$ for every $i$.

$$-\log_2(p_\text{max}) = -\log_2\left(\frac{1}{n}\right) = \log_2(n)$$

$$\sum_{i=1}^{n}{-p_i\log_2(p_i)} = n\left(-\frac{1}{n}\log_2\left(\frac{1}{n}\right)\right) = -\log_2\left(\frac{1}{n}\right) = \log_2(n)$$
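A quick numeric check of this equality (the choice $n = 16$ is arbitrary):

```python
import math

n = 16
p = [1 / n] * n                                 # uniform distribution
min_e = -math.log2(max(p))                      # -log2(1/n)
shannon = -sum(pi * math.log2(pi) for pi in p)
max_e = math.log2(n)
print(min_e, shannon, max_e)                    # 4.0 4.0 4.0
```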

This is why passwords picked from a uniform distribution (using a secure RNG) are said to be stronger than human-generated passwords: humans are biased in which passwords they choose, so their password distribution is non-uniform.

4
  • measure are greatest or equal? Commented Nov 8, 2018 at 16:47
  • @kelalaka Regarding the third sentence? Greatest (maximized) AND equal (to each other). Commented Nov 8, 2018 at 16:49
  • Yes, the third sentence Commented Nov 8, 2018 at 16:50
  • As the formulas show, all definitions are independent of the encoding - they only depend on the probabilities and the number of different events with non-zero probability. If only 10 values are possible, it doesn't matter how many bits your encoding actually uses (except that you need at least 4 bits - although no measure will give full 4 bits). Commented Jul 31, 2019 at 15:24
