Suppose we define a class of algorithms that is allowed to sample i.i.d. Bernoulli bitstrings of arbitrary length and use these to generate outputs. If we allow algorithms like this, then intuitively, "white noise" images like this one are easy to generate: once randomness is permitted, they have a very short description, even though any particular sample has essentially maximal Kolmogorov complexity. Namely, we just run a for loop width × height times and pick black/white uniformly at random on each iteration.
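To make the for-loop construction concrete, here is a minimal sketch in Python (the function name `white_noise` and the use of a seeded `random.Random` are my own illustrative choices): the generating program is only a few lines long, yet a typical output string is incompressible.

```python
import random

def white_noise(width, height, seed=None):
    """Generate a width x height binary "white noise" image.

    The program itself is tiny (low description length, given access
    to random bits), even though a typical output has essentially
    maximal Kolmogorov complexity.
    """
    rng = random.Random(seed)
    # One fair Bernoulli bit per pixel: 0 = black, 1 = white.
    return [[rng.randint(0, 1) for _ in range(width)]
            for _ in range(height)]

image = white_noise(8, 8)
```

Any fixed image of that size is one of the 2^(width·height) equally likely outputs, which is exactly why "the class of outputs of this short program" feels simple while each individual output does not.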
I've searched online for "stochastic algorithmic complexity" and some other keywords, but haven't found anything relevant. Is there a standard term I can search for to read about this idea: defining an algorithmic complexity measure on objects (or classes of objects) using algorithms that are allowed to use randomness?