Entropy of two concatenated random values

In this Intel blog post, the author claims:

The amount of work required to brute-force predict a random value that has n bits of entropy is $O(2^n)$. If you concatenate two values together, the work required to brute-force the result becomes only $2^n + 2^n = 2^{n+1}$. By combining two n-bit random values, each with n bits of entropy, in this manner you get a random number with effectively only one additional bit of entropy.

Is this correct? My intuition says the concatenation should have $2n$ bits of entropy. On the other hand, since the two values are independent, perhaps an attacker could brute-force each half separately, which would explain the $n+1$ figure. Can anyone explain this?
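
To make the counting concrete, here is a toy brute-force sketch of how I understand the claim, assuming the attacker can verify each half independently against known outputs. The helper names and parameters are mine, just for illustration:

```python
import os

n = 8  # bits per half; small enough that exhaustive search is instant

# Hypothetical secret: two independent n-bit random values, concatenated.
a = int.from_bytes(os.urandom(1), "big")
b = int.from_bytes(os.urandom(1), "big")

def joint_search(a, b):
    """Attacker can only test full 2n-bit guesses: up to 2^(2n) tries."""
    tries = 0
    for ga in range(2 ** n):
        for gb in range(2 ** n):
            tries += 1
            if (ga, gb) == (a, b):
                return tries

def split_search(a, b):
    """Attacker can test each half on its own: up to 2^n + 2^n = 2^(n+1) tries."""
    tries = 0
    for ga in range(2 ** n):
        tries += 1
        if ga == a:
            break
    for gb in range(2 ** n):
        tries += 1
        if gb == b:
            break
    return tries

print(f"joint search: {joint_search(a, b)} tries (worst case {2 ** (2 * n)})")
print(f"split search: {split_search(a, b)} tries (worst case {2 ** (n + 1)})")
```

If the halves can only be attacked jointly, the worst case is $2^{2n}$ guesses; if each half can be attacked separately, it drops to $2^{n+1}$, which seems to be the blog's point.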

Thanks

user13741