Surprise!
I have a variable declared like this:
NSInteger index = 0;
I'm comparing it with a view's subviews count (which returns an NSUInteger) like this:
if ((index - 1) <= [[currentmonth subviews] count]) {
    NSLog(@"true");
} else {
    NSLog(@"false");
}
This always prints false. But if I do this instead:
if ((index - 1) <= 42) {
    NSLog(@"true");
} else {
    NSLog(@"false");
}
it always prints true.
I suspect this happens because we can't directly compare an NSInteger with an NSUInteger. Is that correct?
I caught this issue in a working solution that relies on this logic, yet the comparison never behaves as expected.
I've found this: NSUInteger vs NSInteger, int vs unsigned, and similar cases.
That answer gives a good explanation of this:
You should also be aware of integer conversion rules when dealing with NSUInteger vs. NSInteger:
The following fragment, for example, prints 0 (false) although you'd expect it to print 1 (true):
NSInteger si = -1;
NSUInteger ui = 1;
printf("%d\n", si < ui);
The reason is that si is implicitly converted to an unsigned integer before the comparison!
See CERT's Secure Coding site for an in-depth discussion around these 'issues' and how to solve them.