I have the following code:
```csharp
decimal? a = 2m;
decimal? b = 2m;
decimal c = a ?? 1m * b ?? 1m;
```

Since both `a` and `b` have been filled in, I'm expecting `c` to be 4.
However, the result I actually get is 2, as if `b` were taken as 1 instead of 2.
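My only guess is that it has something to do with how the expression is grouped. Here is a small comparison I tried (the parenthesized version is a sketch of what I *expected* the compiler to do, not necessarily how it actually parses the original):

```csharp
using System;

decimal? a = 2m;
decimal? b = 2m;

// With explicit parentheses this evaluates to 4, which matches my expectation.
decimal withParens = (a ?? 1m) * (b ?? 1m);

// The original expression still evaluates to 2.
decimal original = a ?? 1m * b ?? 1m;

Console.WriteLine($"{withParens} vs {original}"); // prints: 4 vs 2
```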
Does anyone know the reason behind this behaviour?