Source Link
user13741
Entropy of two concatenated random values

In this Intel blog posting, the author claims:

The amount of work required to brute-force predict a random value that has n bits of entropy is $O(2^n)$. If you concatenate two values together, the entropy required to brute force the result becomes only $2^n + 2^n = 2^{n+1}$. By combining two n-bit random values, each with n bits of entropy, in this manner you get a random number with effectively only one additional bit of entropy.

Is this correct? My intuition says that concatenating two independent values, each with $n$ bits of entropy, should give $2n$ bits of entropy. On the other hand, since the two values are independent, an attacker could perhaps brute-force each one separately, and I can kind of see why that would make the effective work equivalent to only $n+1$ bits. Can anyone explain this?
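To make the two intuitions concrete, here is a small Python experiment of my own (not from the blog post), with a deliberately tiny $n$. If the attacker must guess the whole $2n$-bit concatenation at once, the worst case is $2^{2n}$ guesses; if each $n$-bit half can be verified independently, the worst case drops to $2^n + 2^n = 2^{n+1}$ guesses:

```python
import secrets

n = 8  # bits per half (tiny, just for illustration)
a = secrets.randbits(n)
b = secrets.randbits(n)
target = (a << n) | b  # concatenation of the two halves

# Case 1: only the whole 2n-bit value can be verified.
# Worst case: 2^(2n) guesses.
guesses_whole = 0
for x in range(2 ** (2 * n)):
    guesses_whole += 1
    if x == target:
        break

# Case 2: each n-bit half can be verified on its own,
# so the attacker brute-forces them one after the other.
# Worst case: 2^n + 2^n = 2^(n+1) guesses.
guesses_halves = 0
for x in range(2 ** n):
    guesses_halves += 1
    if x == a:
        break
for x in range(2 ** n):
    guesses_halves += 1
    if x == b:
        break

print(guesses_whole, "<=", 2 ** (2 * n))
print(guesses_halves, "<=", 2 ** (n + 1))
```

So whether the result behaves like $2n$ bits or like $n+1$ bits seems to hinge on whether the attacker can check the two halves independently.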

Thanks