I want to compute the square root of two to 3 million digits.
I am aware of Newton-Raphson, but I don't have much of a clue how to implement it in C or C++ because of the lack of big-integer support. Can somebody point me in the right direction?
If anybody knows how to do it in Python (I'm a beginner), I would appreciate that as well.
You could try using the mapping:

a/b -> (a+2b)/(a+b),

starting with a = 1, b = 1. This converges to sqrt(2); in fact, the successive fractions are the convergents of its continued fraction expansion.
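For instance, here is a minimal sketch of that iteration in Python, whose built-in integers are arbitrary precision, so a and b can grow without overflow:

```python
# A few steps of the map a/b -> (a+2b)/(a+b), starting from a = b = 1.
a, b = 1, 1
for _ in range(10):
    a, b = a + 2 * b, a + b

print(a, b)               # 8119 5741
print(a * a - 2 * b * b)  # -1: a/b is within 1/b^2 of sqrt(2)
```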
Now the key point: this step can be represented as a matrix multiplication (similar to Fibonacci).
If a_n and b_n are the numerator and denominator at the nth step, then
\begin{pmatrix} 1 & 2 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} a_n \\ b_n \end{pmatrix} = \begin{pmatrix} a_{n+1} \\ b_{n+1} \end{pmatrix}
which now gives us
\begin{pmatrix} 1 & 2 \\ 1 & 1 \end{pmatrix}^n \begin{pmatrix} a_1 \\ b_1 \end{pmatrix} = \begin{pmatrix} a_{n+1} \\ b_{n+1} \end{pmatrix}
Thus if the 2x2 matrix is A, we need to compute A^n, which can be done by repeated squaring and uses only integer arithmetic (so you don't have to worry about precision issues).
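Here is a sketch of that repeated-squaring computation in Python (the helper names mat_mult and mat_pow are mine, not a library API):

```python
def mat_mult(X, Y):
    """Multiply two 2x2 integer matrices stored as nested tuples."""
    return (
        (X[0][0] * Y[0][0] + X[0][1] * Y[1][0],
         X[0][0] * Y[0][1] + X[0][1] * Y[1][1]),
        (X[1][0] * Y[0][0] + X[1][1] * Y[1][0],
         X[1][0] * Y[0][1] + X[1][1] * Y[1][1]),
    )

def mat_pow(M, n):
    """M**n by repeated squaring: O(log n) matrix multiplications."""
    result = ((1, 0), (0, 1))  # 2x2 identity
    while n:
        if n & 1:
            result = mat_mult(result, M)
        M = mat_mult(M, M)
        n >>= 1
    return result

A = ((1, 2), (1, 1))
An = mat_pow(A, 10)
a = An[0][0] + An[0][1]  # A^10 applied to (a_1, b_1) = (1, 1)
b = An[1][0] + An[1][1]
print(a, b)  # 8119 5741, matching ten steps of the map
```

For the full run the matrix entries become multi-million-digit integers; Python's bignums handle that transparently, and in C or C++ you would reach for a library such as GMP.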
Also note that the a/b you get will always be in reduced form (as gcd(a,b) = gcd(a+2b, a+b)), so if you are thinking of using a fraction class to represent the intermediate results, don't!
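You can spot-check that invariant directly:

```python
from math import gcd

a, b = 1, 1
for _ in range(500):
    a, b = a + 2 * b, a + b
    assert gcd(a, b) == 1  # the fractions never need reducing
```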
Since the nth denominator grows like (1+sqrt(2))^n, to get 3 million digits you would likely need to compute up to the 3671656th term.
Note that even though you are looking for roughly the 3.6-millionth term, repeated squaring lets you compute the nth term in O(log n) multiplications and additions.
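To actually extract decimal digits from the final a/b, a single big-integer division does it: scale the numerator by a power of ten first. A self-contained sketch (the function name is mine; for a multi-million-digit run you would swap the plain loop for mat_pow above):

```python
def sqrt2_digits(n_digits, n_terms):
    """Digits of sqrt(2) from the convergent after n_terms steps.
    n_terms must be large enough that 1/b^2 < 10**-n_digits."""
    a, b = 1, 1
    for _ in range(n_terms):  # use mat_pow instead for very large n_terms
        a, b = a + 2 * b, a + b
    return (a * 10 ** n_digits) // b  # floor of sqrt(2) * 10^n_digits

print(sqrt2_digits(50, 100))
# 141421356237309504880168872420969807856967187537694
```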
Also, this approach can easily be parallelized, unlike iterative methods such as Newton-Raphson.