Why does the following C code produce real numbers only ranging between 0 and 1 (e.g. 0.840188, 0.394383, etc.) for double a, b, when the value of RAND_MAX appears to be 0.000000? Shouldn't RAND_MAX set the maximum value for the number generated by the rand() function?
    #include <stdio.h>
    #include <stdlib.h>

    int main()
    {
        double a, b, c;
        for (int i = 0; i < 100; i++) {
            a = (double)rand() / (double)RAND_MAX;
            b = (double)rand() / (double)RAND_MAX;
            c = a - b;
            printf("iteration: %d values a=%f, b=%f, c=%f, RAND_MAX=%f\n", i, a, b, c, RAND_MAX);
        }
        return 0;
    }
RAND_MAX is an integral constant, but you are printing it using the %f specifier (which is for double), which is undefined behavior (and in your case happens to print 0). Use %d and you'll see the actual value of RAND_MAX.
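As a minimal sketch of the fix (your program unchanged except for the format specifier; %d assumes RAND_MAX has type int, which it does on common implementations):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        double a, b, c;
        for (int i = 0; i < 100; i++) {
            /* rand() returns an int in [0, RAND_MAX], so a and b land in [0.0, 1.0] */
            a = (double)rand() / (double)RAND_MAX;
            b = (double)rand() / (double)RAND_MAX;
            c = a - b;
            /* %d matches the integral type of RAND_MAX */
            printf("iteration: %d values a=%f, b=%f, c=%f, RAND_MAX=%d\n",
                   i, a, b, c, RAND_MAX);
        }
        return 0;
    }

That division is also why a and b always fall between 0 and 1: rand() never returns more than RAND_MAX, so the quotient never exceeds 1.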
By the way, pretty much any decent compiler will warn you about that invalid printf if you crank the warnings high enough. Make sure to do so; C and C++ have enough traps already, so at least let the compiler help you when it can.
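For instance, with GCC or Clang (demo.c here is a placeholder for your source file):

    gcc -Wall -Wextra demo.c

-Wall enables -Wformat, which checks printf arguments against the format string and flags the %f/int mismatch at compile time.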