I periodically get erroneous results when I run a program I have written. Instead of printing a float, I get this: -1.#IO. What does this result indicate? A divide by zero?
In my experience, if the result were an overflow, it would still print as a normal number, just an incorrect one (perhaps some large negative value).
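For reference, here is a minimal sketch (not my actual code, and I'm assuming MSVC's CRT here) of how special floating-point values get printed as text tokens rather than numbers. My understanding is that when the precision is narrowed, the letters of the token get truncated and "rounded" like digits, which is how -1.#IND could come out as -1.#IO:

    #include <cstdio>

    int main() {
        // volatile prevents the compiler from folding the division away
        // (and from rejecting a constant divide-by-zero) at compile time
        volatile double zero = 0.0;

        double inf = 1.0 / zero;   // nonzero / 0 -> infinity
        double ind = zero / zero;  // 0 / 0 -> the "indeterminate" NaN

        std::printf("%f\n", inf);    // prints 1.#INF00 on MSVC
        std::printf("%f\n", ind);    // prints -1.#IND00 on MSVC
        std::printf("%.3f\n", ind);  // narrowed precision can yield -1.#IO
        return 0;
    }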
Here is what I am getting most of the time:

Here is what happens about every 6-7 runs:

If I run the program with a constant seed (one that happens to produce correct results), everything works fine indefinitely. However, if I use a random seed, like time(0), the program tends to blow up at run time.
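As a starting point, I was thinking of sprinkling checks like the one below through the suspect computations so the program fails loudly the moment a value goes non-finite (the helper name is just illustrative):

    #include <cmath>
    #include <cstdio>
    #include <cstdlib>

    // Abort as soon as a value is NaN or +/-infinity, reporting where
    // it happened, so the offending computation is easy to pin down.
    static double checked(double x, const char *where) {
        if (!std::isfinite(x)) {  // catches NaN and both infinities
            std::fprintf(stderr, "non-finite value at %s\n", where);
            std::abort();
        }
        return x;
    }

Then a call site would look like double r = checked(a / b, "ratio in step 3");. Is that a sensible way to track this down, or is there a better approach?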
In short, has anyone experienced this result before? Just looking for a starting point to dive into the code.