IMHO `if (n < 0) n = 0;` makes it perfectly clear that afterwards `n >= 0`, so a comment stating this would violate the DRY principle, and an assertion I would consider downright silly or confusing: did the author run into some strange compiler/optimizer bug at this point?

I consider the variant using `goto` harder to understand than the plain `if`. This may simply be the effect of not being used to seeing `goto`s in code; but then again, most of my (current and future) coworkers aren't used to them either, so they probably feel the same. Thus even if I got used to `goto`, it would make the code harder to maintain in the long run.

> It seems to me that "don't use gotos!" is one of those dogmas like "don't use multiple returns!" that stem from a time where the real problem were functions of hundreds or even thousands of lines of code.

All the legacy programs I have seen so far contain many, many functions of hundreds (or sometimes even thousands) of lines of code, so I am very happy that at least they don't contain `goto`s :-) You are right that in a small, clean method a `goto` can't do much harm; however, if you try to draw a fuzzy line like "you may use `goto` in functions shorter than *n* lines", it will inevitably be abused by "clever" developers. Not to mention that functions tend to grow over time: what do you do when your originally short and clean function bloats to double its size? Do you remove the `goto`s then, or refactor? Will your successor a few years down the line remove the `goto`s, or refactor the code too? OTOH Dijkstra's rule is clear, and has much less potential for abuse.

Last but not least, your function's name does not express its intent; renaming it to e.g. `make_non_negative` would leave no doubt about what it does. YMMV.
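For concreteness, here is a minimal sketch of the two variants I have in mind, using the `make_non_negative` name suggested above (the `goto` version is my reconstruction of the shape being debated, not the original code):

```c
/* Plain `if`: the post-condition n >= 0 is obvious at a glance,
   so neither a comment nor an assertion is needed. */
int make_non_negative(int n) {
    if (n < 0) n = 0;
    return n;
}

/* Hypothetical goto variant: same effect, but the reader must
   trace the jump to convince themselves of the post-condition. */
int make_non_negative_goto(int n) {
    if (n >= 0) goto done;
    n = 0;
done:
    return n;
}
```

Both return the same result for every input; the difference is purely how quickly a maintainer can verify that.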