I ran tests with a Stopwatch, 100,000 iterations each:
System.Random rnd = new System.Random(); if (rnd.Next(2) == 0) trues++;
CPUs like integers, so the Next(2) method was faster than the double-based comparison: 3,700 versus 7,500 ms, which is quite substantial. Random numbers can also be a bottleneck: I created around 50 every frame in Unity, and even with a tiny scene that noticeably slowed down my system, so I was also hoping to find a cheap way to create a random bool. I also tried
if (System.DateTime.Now.Millisecond % 2 == 0) trues++;
but the static DateTime.Now call was even slower at 9,600 ms. Worth a shot. Finally, I skipped the comparison entirely and only generated the 100,000 random values, to make sure the int vs. double comparison itself did not influence the elapsed time, but the result was pretty much the same.
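For completeness, this is roughly what the Stopwatch harness looked like (a reconstructed sketch, not the exact test code; the NextDouble() < 0.5 variant is my assumption for the double-based comparison):

using System;
using System.Diagnostics;

class RandomBoolBenchmark
{
    const int Iterations = 100000;

    static void Main()
    {
        Random rnd = new Random();
        int trues = 0;

        // Variant 1: integer-based, Next(2) returns 0 or 1.
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < Iterations; i++)
            if (rnd.Next(2) == 0) trues++;
        sw.Stop();
        Console.WriteLine("Next(2):            " + sw.ElapsedMilliseconds + " ms (" + trues + " trues)");

        // Variant 2: double-based comparison.
        trues = 0;
        sw.Restart();
        for (int i = 0; i < Iterations; i++)
            if (rnd.NextDouble() < 0.5) trues++;
        sw.Stop();
        Console.WriteLine("NextDouble() < 0.5: " + sw.ElapsedMilliseconds + " ms (" + trues + " trues)");

        // Variant 3: DateTime.Now parity (cheap-looking, but the static call dominates).
        trues = 0;
        sw.Restart();
        for (int i = 0; i < Iterations; i++)
            if (DateTime.Now.Millisecond % 2 == 0) trues++;
        sw.Stop();
        Console.WriteLine("DateTime.Now % 2:   " + sw.ElapsedMilliseconds + " ms (" + trues + " trues)");
    }
}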
Another idea: use NextBytes to pre-populate a byte array, use BitArray to turn that into a collection of booleans, and retrieve those booleans from a Queue until it's emptied, then repeat the process. With this method you're only using the randomizer once per refill, so any overhead it creates only happens when you refill the queue. This could be useful when dealing with a secure random number generator rather than the regular Random class.

Looking at the implementation of NextBytes, it's surprisingly slow: each byte is generated with
buffer[i] = (byte)(this.InternalSample() % 256);
I'm assuming that's what you're talking about - that they could have taken that random integer and split it into 3 bytes, populating the byte array with about a third of the work. I wonder if there was a reason for that or if it was just an oversight by the developers.
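A minimal sketch of that queue approach (class name and buffer size are my own choices; for secure values you'd fill the buffer with a cryptographic RNG's GetBytes instead of Random.NextBytes):

using System;
using System.Collections;
using System.Collections.Generic;

class RandomBoolQueue
{
    readonly Random rnd = new Random();
    readonly Queue<bool> queue = new Queue<bool>();
    readonly byte[] buffer = new byte[128]; // 128 bytes -> 1024 booleans per refill (size is arbitrary)

    public bool NextBool()
    {
        if (queue.Count == 0)
            Refill();
        return queue.Dequeue();
    }

    void Refill()
    {
        // One call to the randomizer fills the whole buffer...
        rnd.NextBytes(buffer);
        // ...and BitArray exposes every bit of it as a boolean.
        BitArray bits = new BitArray(buffer);
        foreach (bool bit in bits)
            queue.Enqueue(bit);
    }
}

Usage is just randomBoolQueue.NextBool() wherever a coin flip is needed; the randomizer itself is only touched on refills.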