> If I have a random key accumulating $N$ bits of entropy, I’d expect to brute-force it in the order of $2^N$ trials.

That's the right order of magnitude. For a uniform distribution over $2^N$ keys, each rank $0,1,\dots,2^N-1$ of failed trials before the right guess is equally likely, so the expected number of failed trials is $(2^N-1)/2$. For a non-uniform distribution, it can be less by a small factor, or more by a much larger factor. I assume $N$ is the Shannon entropy in bits.


The method leading to the least expected number of failed trials is to make guesses from the most probable key to the least probable. For a key of $k=4$ binary symbols, with symbol 1 having probability $0\le p\le 1/2$ and symbol 0 having probability $1-p$, a suitable order for the $2^k$ candidates is

0000 0001 0010 0100 1000 0011 0101 0110 1001 1010 1100 0111 1011 1101 1110 1111 
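
That order is easy to generate mechanically. Here is a minimal Python sketch of my own (not part of the original answer): a candidate with $w$ ones has probability $p^w(1-p)^{k-w}$, which is non-increasing in $w$ for $p\le1/2$, so sorting by the number of ones yields a most-probable-first order.

```python
from itertools import product

def guess_order(k: int) -> list[str]:
    """All k-bit candidates, most probable first when symbol 1 has
    probability p <= 1/2: the probability p**w * (1-p)**(k-w) of a
    candidate with w ones is non-increasing in w, so sorting by the
    count of ones gives a most-probable-first order."""
    cands = ["".join(bits) for bits in product("01", repeat=k)]
    return sorted(cands, key=lambda c: c.count("1"))

print(" ".join(guess_order(4)))
# -> 0000 0001 0010 0100 1000 0011 0101 0110 1001 1010 1100 0111 1011 1101 1110 1111
```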

The expected number of failed trials before hitting the right key is $$\begin{array}{lll} &(1+2+3+4)\,p(1-p)^3\\ \quad+&(5+6+7+8+9+10)\,p^2(1-p)^2\\ \quad+&(11+12+13+14)\,p^3(1-p)\\ \quad+&15\,p^4\\ =&10p+15p^2-10p^3\end{array}$$ Note: to extend to $0\le p\le 1$, change $p$ to $\min(p,1-p)$.
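
The closed form can be sanity-checked numerically; a minimal sketch (again my addition, reusing `guess_order` from the snippet above): the candidate at rank $r$ in the guess order costs $r$ failed trials when it is the right key, so the expectation is $\sum_r r\cdot\Pr[\text{key at rank }r]$.

```python
def expected_failed_trials(k: int, p: float) -> float:
    """Expected number of wrong guesses before the right key, when
    guessing in the order produced by guess_order(k)."""
    prob = lambda c: p ** c.count("1") * (1 - p) ** (k - c.count("1"))
    return sum(rank * prob(c) for rank, c in enumerate(guess_order(k)))

p = 0.3
# Agrees with the closed form 10p + 15p^2 - 10p^3 derived above.
assert abs(expected_failed_trials(4, p) - (10*p + 15*p**2 - 10*p**3)) < 1e-12
```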

Plotting that expected number of failed trials against the entropy-based approximation $(2^N-1)/2$, with Shannon entropy (in bits) $N=-k\,(p\log_2(p)+(1-p)\log_2(1-p))$:

[Plot: expected number of failed trials vs $p$, for $k=4$]
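
The quantity being compared against can be computed the same way (a sketch under the same definitions, not from the original answer; the plots themselves are not reproduced here):

```python
import math

def entropy_estimate(k: int, p: float) -> float:
    """Entropy-based approximation (2**N - 1)/2, with N the Shannon
    entropy in bits of the k-symbol key (requires 0 < p < 1)."""
    n = -k * (p * math.log2(p) + (1 - p) * math.log2(1 - p))
    return (2 ** n - 1) / 2

for p in (0.1, 0.25, 0.5):
    print(f"p={p}: actual={expected_failed_trials(4, p):.3f}, "
          f"estimate={entropy_estimate(4, p):.3f}")
# At p = 0.5 both equal 7.5, the uniform case (2**4 - 1)/2.
```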

Thus for $k=4$ the entropy-based approximation is fine, and always errs by excess. But it turns out that this changes for more bits, e.g. for $k=32$:

[Plot: expected number of failed trials vs $p$, for $k=32$]

Plotting the ratio of the actual value to the entropy-based estimate for various $k$ (with log scales):

[Plot: ratio actual/estimate vs $p$, for various $k$]

As apparent in these examples, the actual expected number of failed trials can get arbitrarily higher than the Shannon entropy estimate suggests (e.g. for $k=128$, $p=1/21$, it's more than $120{,}000{,}000$ times higher). But whatever the distribution of the keys, it never gets much lower: as pointed out by kodlu, answering my reformulation in more mathematical terms on Math-SE, a lower bound of essentially $1/2$ on that ratio is given in J.L. Massey's Guessing and entropy, in proceedings of ISIT 1994.
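
For reference, Massey's bound as I recall it (my paraphrase, so treat the exact form as an assumption rather than a quote): for a discrete random variable $X$ with Shannon entropy $H(X)\ge2$ bits, an optimal guesser needs on average $$G(X)\ \ge\ 2^{H(X)-2}+1$$ guesses. The expected number of failed trials $G(X)-1$ is thus at least $2^{H(X)-2}$, i.e. essentially half of the approximation $(2^{H(X)}-1)/2\approx2^{H(X)-1}$ used above; that's where the $1/2$ comes from.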
