I would guess there is a threshold at which the device changes (weakens?) the algorithm for producing "random" data so that you don't run out completely, namely skimping on "real" random data and relying on a CSPRNG instead.

I just asked a related question and then found the answer on the `rngd` man page:

The `rng-tools` service invokes the program `/usr/sbin/rngd`. Looking that up in the Ubuntu documentation, it can be seen to have a parameter:

> -W n, --fill-watermark=nnn
> Once we start doing it, feed entropy to random-device until at
> least fill-watermark bits of entropy are available in its
> entropy pool (default: 2048). Setting this too high will cause
> rngd to dominate the contents of the entropy pool. Low values
> will hurt system performance during entropy starves. Do not set
> fill-watermark above the size of the entropy pool (usually 4096
> bits).
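As a sketch of how to watch this in practice: on Linux the kernel exposes its current entropy estimate and the pool size under `/proc/sys/kernel/random`, so you can compare the pool level against the 2048-bit default watermark quoted above (the paths are standard, but the actual watermark in effect depends on how your `rngd` is configured):

```shell
# Read the kernel's current entropy estimate (in bits)
avail=$(cat /proc/sys/kernel/random/entropy_avail)

# Read the total size of the entropy pool (usually 4096 bits on older kernels)
pool=$(cat /proc/sys/kernel/random/poolsize)

echo "entropy available: $avail / $pool bits"

# rngd's default --fill-watermark is 2048: once it starts feeding
# entropy, it tops the pool up toward that level
if [ "$avail" -lt 2048 ]; then
    echo "below the default fill watermark"
else
    echo "at or above the default fill watermark"
fi
```

Note that on recent kernels (5.6 and later) the entropy accounting was reworked, so `entropy_avail` may report a constant value rather than a fluctuating pool level.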

My question and answer [are here][1]  

 [1]: https://unix.stackexchange.com/posts/440653