I just found this algorithm to compute the greatest common divisor in my lecture notes:
public static int gcd( int a, int b ) {
    while (b != 0) {
        final int r = a % b;
        a = b;
        b = r;
    }
    return a;
}
So r is the remainder of a divided by b (the mod). Then b is assigned to a, the remainder is assigned to b, and once b reaches 0, a is returned. I can't for the life of me see how this works!
And then, apparently this algorithm doesn't work for all cases, and this one must be used instead:
public static int gcd( int a, int b ) {
    final int gcd;
    if (b != 0) {
        final int q = a / b;
        final int r = a % b; // a == r + q * b AND r == a - q * b.
        gcd = gcd( b, r );
    } else {
        gcd = a;
    }
    return gcd;
}
I don't understand the reasoning behind this. I generally get recursion and am good at Java but this is eluding me. Help please?
The Wikipedia article contains an explanation, but it's not easy to find immediately (also, a procedure plus a proof don't always answer the question "why it works").
Basically it comes down to the fact that for two integers a, b (assuming a >= b and b > 0), it is always possible to write a = bq + r where 0 <= r < b.
If d = gcd(a, b) then we can write a = ds and b = dt. So we have ds = qdt + r. Since the left-hand side is divisible by d, the right-hand side must also be divisible by d. And since qdt is divisible by d, the conclusion is that r must also be divisible by d. Running the same argument in the other direction, any common divisor of b and r also divides a = bq + r, so gcd(a, b) = gcd(b, r).
To summarise: we have a = bq + r where 0 <= r < b, and a, b and r are all divisible by gcd(a, b).
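As a sanity check on that identity, here's a small sketch (the values 252 and 105 are just illustrative) confirming that the pair (b, r) has the same gcd as (a, b):

```java
public class GcdIdentity {
    // gcd via recursion, same idea as the question's second version (condensed).
    static int gcd(int a, int b) {
        return b == 0 ? a : gcd(b, a % b);
    }

    public static void main(String[] args) {
        int a = 252, b = 105;
        int r = a % b;                 // 252 = 2 * 105 + 42, so r == 42
        System.out.println(gcd(a, b)); // prints 21
        System.out.println(gcd(b, r)); // prints 21 as well: (b, r) has the same gcd
    }
}
```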
Since a >= b > r, we have two cases:
If r = 0, then b divides a exactly, so gcd(a, b) = b and we are done.
If r > 0, then gcd(a, b) = gcd(b, r), and we can apply the same reduction to the smaller pair (b, r).
Why is this a reduction? Because r < b. So we are dealing with numbers that are definitely smaller. This means that we only have to apply this reduction a finite number of times before we reach r = 0.
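You can watch the reduction shrink by counting how many steps it takes until r = 0. A sketch (the input values are just examples; consecutive Fibonacci numbers are a classic worst case for step count, by Lamé's theorem):

```java
public class GcdSteps {
    // Counts how many reduction steps (a, b) -> (b, a % b) occur until b == 0.
    static int steps(int a, int b) {
        int count = 0;
        while (b != 0) {
            final int r = a % b;
            a = b;
            b = r;
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(steps(252, 105)); // 3 reductions
        System.out.println(steps(89, 55));   // 9 reductions: consecutive Fibonacci numbers
    }
}
```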
Now, r = a % b, which hopefully explains the code you have.
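To tie it back to the loop, here's the iterative version with a trace line added, so you can see each (a, b, r) reduction step (input values chosen for illustration):

```java
public class GcdTrace {
    // Iterative Euclidean algorithm, printing each reduction step.
    static int gcd(int a, int b) {
        while (b != 0) {
            final int r = a % b; // remainder of a divided by b
            System.out.println("a=" + a + ", b=" + b + ", r=" + r);
            a = b;               // the pair (a, b) becomes (b, r) ...
            b = r;               // ... which has the same gcd
        }
        return a;                // b == 0, so a is the gcd
    }

    public static void main(String[] args) {
        System.out.println("gcd = " + gcd(252, 105)); // gcd = 21
    }
}
```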