How come dividing two 32-bit int numbers as (int / int) returns 0, but if I use Decimal.Divide() I get the correct answer? I'm by no means a C# guy.
int is an integer type; dividing two ints performs an integer division, i.e. the fractional part is truncated since it can't be stored in the result type (also int!). Decimal, by contrast, has got a fractional part. By invoking Decimal.Divide, your int arguments get implicitly converted to Decimals.
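For instance, here is a minimal sketch of the difference (the values 7 and 2 are just illustrative):

int a = 7;
int b = 2;

Console.WriteLine(a / b);                // 3   -- integer division, fraction truncated
Console.WriteLine(Decimal.Divide(a, b)); // 3.5 -- both ints converted to decimal first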
You can enforce non-integer division on int arguments by explicitly casting at least one of the arguments to a floating-point type, e.g.:
int a = 42;
int b = 23;
double result = (double)a / b; // casting a to double forces floating-point division: 1.8260869565217392...
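Note that the cast has to be applied to an operand, not to the finished division; by the time you cast the result, the fractional part is already gone. A quick sketch of that pitfall, reusing a and b from above:

double truncated = (double)(a / b); // 1.0 -- integer division happens first, then the cast
double correct = (double)a / b;     // 1.8260869565217392... -- cast first, then floating-point division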