I have this code snippet:
Random rand = new Random();
int chance = rand.Next(1, 101);
if (chance <= 25) // probability of 25%
{
Console.WriteLine("You win");
}
else
{
Console.WriteLine("You lose");
}
My question is: does this really give the player a 25% chance of winning?
Edit:
I just wrote this:
double total = 0;
double prob = 0;
Random rnd = new Random();
for (int i = 0; i < 100; i++)
{
    int chance = rnd.Next(1, 101);
    if (chance <= 25) prob++;
    total++;
}
Console.WriteLine(prob / total);
Console.ReadKey();
And the result is highly inaccurate: from run to run it ranges from about 0.15 to 0.3.
But when I do more checks (changing i < 100 to i < 10000), it's much more accurate.
Why is this? Why aren't 100 checks enough for it to be accurate?
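To make the run-to-run variation easier to see, here's a rough sketch (the loop counts are ones I picked arbitrarily) that repeats the 100-draw experiment ten times and prints each run's win rate:
Random rnd = new Random();
for (int run = 0; run < 10; run++)
{
    // One experiment: 100 draws, counting how many fall in the winning range.
    int wins = 0;
    for (int i = 0; i < 100; i++)
    {
        if (rnd.Next(1, 101) <= 25) wins++;
    }
    Console.WriteLine((double)wins / 100);
}
The ten printed numbers come out noticeably different from each other, which matches the 0.15 to 0.3 spread I'm seeing.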
This is very easy to check for yourself:
Random rand = new Random();
int yes = 0;
const int iterations = 10000000;
for (int i = 0; i < iterations; i++)
{
    if (rand.Next(1, 101) <= 25)
    {
        yes++;
    }
}
Console.WriteLine((float)yes / iterations);
The result:
0.2497914
The conclusion: Yes, yes it is.
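You can also see it without simulating at all: Next(1, 101) returns an integer from 1 to 100 (the upper bound is exclusive), each value equally likely, and exactly 25 of those 100 values satisfy <= 25. A minimal sketch that just counts the favourable outcomes (it assumes a using System.Linq; directive):
// Count how many of the 100 possible outcomes of Next(1, 101) pass the win check.
int favourable = Enumerable.Range(1, 100).Count(n => n <= 25);
Console.WriteLine(favourable / 100.0); // prints 0.25
So the theoretical probability is exactly 25/100, and the simulation above simply confirms it.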
Edit: Just for fun, the LINQy version:
Random rand = new Random();
const int iterations = 10000000;
int sum = Enumerable.Range(1, iterations)
    .Count(i => rand.Next(1, 101) <= 25);
Console.WriteLine(sum / (float)iterations);