
I have a password with a prescribed format:

111 + (2 random lowercase letters) + (1 random digit)

As you can see, it starts with a fixed value of 111, followed by two random lower case letters, and then lastly a random digit. Now, I would like to calculate the password entropy. To do so, I am calculating:

Fixed value 111 $\rightarrow \log_2(1) = 0\text{ bits}$
Two random lowercase letters $\rightarrow \log_2(26)\times2 = 9.4\text{ bits}$
One random digit $\rightarrow \log_2(10) = 3.3\text{ bits}$

This results in a total password entropy of $0 + 9.4 + 3.3 = 12.7\text{ bits}$.
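The calculation above can be reproduced with a minimal Python sketch (standard library only):

```python
import math

# Entropy contribution of each independently chosen component,
# assuming every choice is made uniformly at random.
fixed_prefix = math.log2(1)       # "111" is constant: 0 bits
two_letters = 2 * math.log2(26)   # two lowercase letters: ~9.4 bits
one_digit = math.log2(10)         # one digit: ~3.3 bits

total = fixed_prefix + two_letters + one_digit
print(round(total, 1))  # 12.7
```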

Is this approach to calculate the password entropy correct, or am I missing something somewhere?


2 Answers


Yes, the computation is correct: the passwords have an entropy of $\log_2(26^2\cdot10)\approx 12.7$ bits, which means they are weaker than a randomly chosen 13-bit secret key.

$\endgroup$
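As a sanity check, the full password space can be enumerated directly (an illustrative sketch using only the standard library):

```python
import math
from itertools import product
from string import ascii_lowercase, digits

# Enumerate every password matching the format: "111" + 2 letters + 1 digit.
passwords = ["111" + a + b + d
             for a, b, d in product(ascii_lowercase, ascii_lowercase, digits)]

print(len(passwords))                       # 6760 possible passwords
print(round(math.log2(len(passwords)), 2))  # 12.72 bits
```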
  • Assuming all 6,760 passwords are equally probable. This is an upper limit on the entropy; if the passwords came from a non-uniform distribution, it would be lower. Commented May 4, 2019 at 2:39
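The comment's point can be sketched with the Shannon entropy formula; the skewed distribution below is purely hypothetical, just to show that any non-uniform choice of passwords gives less than the 12.72-bit upper limit:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 6760  # size of the password space
uniform = [1 / n] * n
print(round(shannon_entropy(uniform), 2))  # 12.72 bits: the upper limit

# Hypothetical skew: one password is chosen half the time,
# the rest uniformly over the remainder. Entropy drops well below 12.72.
skewed = [0.5] + [0.5 / (n - 1)] * (n - 1)
print(round(shannon_entropy(skewed), 2))
```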

The calculation is correct for the given information source.

I'm providing a second answer to emphasize that it is only meaningful to talk about the entropy of an "information source", that is, the mechanism from which data is generated.

Since the example given is a rule for generating passwords (insecure ones), it makes sense to discuss its entropy. Had it been one specific password string, the notion of entropy would be meaningless.

