It's probably a simple problem, but I can't fix it.
I am dividing two integers:
finishedGameFinalScore = [score integerValue];
CGFloat interval = 2/finishedGameFinalScore;
NSLog(@"interval = %f",interval);
The log prints 0.000000.
Is there a limit on decimal places? I need to preserve the decimal part of the result.
Thanks, Shani
The reason your code doesn't work is that you're dividing an integer by another integer, and only then storing the already-truncated result in a float.
So you have 2 (an integer) and some other number (also an integer). You divide 2 by this number, which is probably greater than 2. Let's say it's 3.
Integer sees 2/3 and he's like "0.66666667? Pshh, no one ever needs anything after the decimal point anyway." So he truncates it. You just have 0.
Then Integer hands the number to Mr. float, and Mr. float is super happy to get a number! He's all like "Yay, a 0! I'm going to add ALL OF THE SIGNIFICANT DIGITS." And that's how you end up with 0.000000.
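Concretely, here's a minimal sketch of that sequence, using the hypothetical score of 3 from above:

NSInteger finishedGameFinalScore = 3;           // hypothetical score, as in the walkthrough
CGFloat interval = 2 / finishedGameFinalScore;  // 2/3 is integer division, truncated to 0 before the assignment
NSLog(@"interval = %f", interval);              // prints "interval = 0.000000"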
So yeah, just cast to a float first. Or even a double!
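A minimal sketch of the fix, assuming finishedGameFinalScore is an NSInteger as in the question; either a floating-point literal or an explicit cast makes the whole division happen in floating point:

NSInteger finishedGameFinalScore = 3;                          // hypothetical score
CGFloat interval = 2.0 / finishedGameFinalScore;               // 2.0 forces floating-point division
CGFloat interval2 = 2.0f / (CGFloat)finishedGameFinalScore;    // or cast the divisor explicitly
NSLog(@"interval = %f, interval2 = %f", interval, interval2);  // prints 0.666667 for both

Either form works because as soon as one operand is a floating-point type, the division is evaluated in floating point instead of being truncated.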