I'm reading through the excellent Ruby on Rails Tutorial and have encountered the following code.
```ruby
if 0 then true else false end
```

The above returns `true` and illustrates how, unlike many languages (C being the obvious example), Ruby treats 0 as true. Rather than dismiss the behavior as idiosyncratic, I assume there is a good reason for this significant departure from convention. Python, for instance, treats 0 as `False`, just as one would expect.
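For context, here is a quick sketch (my own, not from the tutorial) checking which values Ruby considers falsy; as far as I can tell, only `false` and `nil` are falsy, and everything else, including 0, the empty string, and the empty array, is truthy:

```ruby
# Only false and nil are falsy in Ruby; every other value is truthy.
[0, "", [], 0.0, false, nil].each do |value|
  puts "#{value.inspect} is #{value ? 'truthy' : 'falsy'}"
end
# 0 is truthy
# "" is truthy
# [] is truthy
# 0.0 is truthy
# false is falsy
# nil is falsy
```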
In short, what is the rationale for designing Ruby to treat 0 as true?