Prime Sieve in Haskell

sheep · Aug 2, 2012 · Viewed 9.8k times

I'm very new to Haskell and I'm just trying to find the sum of the first 2 million primes. I'm trying to generate the primes using a sieve (I think the sieve of Eratosthenes?), but it's really really slow and I don't know why. Here is my code.

sieve (x:xs) = x:(sieve $ filter (\a -> a `mod` x /= 0) xs)
ans = sum $ takeWhile (<2000000) (sieve [2..])

Thanks in advance.

Answer

Daniel Fischer · Aug 2, 2012

It is very slow because the algorithm is a trial division that doesn't stop at the square root.

If you look closely at what the algorithm does, you see that for each prime p, those of its multiples that have no smaller prime divisor are removed from the list of candidates (the multiples with smaller prime divisors were removed in earlier steps).

So each number is divided by all primes until either it is removed as a multiple of its smallest prime divisor, or, if it is prime, it appears at the head of the list of remaining candidates.

For the composite numbers, that isn't particularly bad, since most composite numbers have small prime divisors, and in the worst case, the smallest prime divisor of n doesn't exceed √n.

But the primes are divided by all smaller primes, so before the kth prime is recognised as prime, it has been divided by all k-1 smaller primes. If there are m primes below the limit n, the work needed to verify that all of them are prime is

(1-1) + (2-1) + (3-1) + ... + (m-1) = m*(m-1)/2

divisions. By the Prime number theorem, the number of primes below n is asymptotically n / log n (where log denotes the natural logarithm). The work to eliminate the composites can be crudely bounded by n * √n divisions, so for n that is not too small, it is negligible in comparison to the work spent on the primes.

For the primes to two million, the Turner sieve needs roughly 10^10 divisions. Additionally, it needs to deconstruct and reconstruct a lot of list cells.
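To put a number on that, we can plug m into the m*(m-1)/2 formula above. This is a quick check, assuming the standard value pi(2*10^6) = 148933 for the prime-counting function:

```haskell
-- Number of primes below 2*10^6 (the known value of the
-- prime-counting function: pi(2*10^6) = 148933).
m :: Integer
m = 148933

-- Divisions the Turner sieve spends on the primes alone: m*(m-1)/2.
-- Comes out at about 1.1 * 10^10.
turnerPrimeDivisions :: Integer
turnerPrimeDivisions = m * (m - 1) `div` 2
```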

A trial division that stops at the square root,

isPrime n = go 2
  where
    go d
      | d*d > n        = True
      | n `rem` d == 0 = False
      | otherwise      = go (d+1)

primes = filter isPrime [2 .. ]

would need fewer than 1.9*10^9 divisions (a brutal estimate assuming every isPrime n check ran all the way to √n - in fact it takes only 179492732 divisions, because composites are generally cheap)(1) and far fewer list operations. Additionally, this trial division is easily improved by skipping even numbers (except 2) as candidate divisors, which halves the number of required divisions.
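A sketch of that even-skipping variant (the names isPrimeOdd and primesOdd are ad hoc):

```haskell
-- Trial division that treats 2 specially and then tries only odd
-- divisors 3, 5, 7, ..., roughly halving the number of divisions.
isPrimeOdd :: Int -> Bool
isPrimeOdd n
  | n < 2     = False
  | n == 2    = True
  | even n    = False
  | otherwise = go 3
  where
    go d
      | d * d > n      = True   -- no divisor up to sqrt n: prime
      | n `rem` d == 0 = False
      | otherwise      = go (d + 2)

primesOdd :: [Int]
primesOdd = 2 : filter isPrimeOdd [3, 5 ..]
```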

A sieve of Eratosthenes doesn't need any divisions and uses only O(n * log (log n)) operations, so it is quite a bit faster:

primeSum.hs:

module Main (main) where

import System.Environment (getArgs)
import Math.NumberTheory.Primes

main :: IO ()
main = do
    args <- getArgs
    let lim = case args of
                (a:_) -> read a
                _     -> 1000000
    print . sum $ takeWhile (<= lim) primes

And running it for a limit of 10 million:

$ ghc -O2 primeSum && time ./primeSum 10000000
[1 of 1] Compiling Main             ( primeSum.hs, primeSum.o )
Linking primeSum ...
3203324994356

real    0m0.085s
user    0m0.084s
sys     0m0.000s
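The primeSum program above relies on Math.NumberTheory.Primes (provided by the arithmoi package). If that library isn't at hand, a minimal self-contained array-based sieve, using the array package that ships with GHC, could look like this (a sketch, not tuned for speed):

```haskell
import Data.Array.Unboxed (UArray, accumArray, assocs)

-- Sieve of Eratosthenes over an unboxed Bool array: start with every
-- number in [2 .. lim] flagged True ("maybe prime"), then set the flag
-- of every multiple p*p, p*p+p, ... to False. For simplicity this marks
-- multiples of every candidate p <= sqrt lim, not just of the primes.
sieveTo :: Int -> [Int]
sieveTo lim = [p | (p, True) <- assocs flags]
  where
    flags :: UArray Int Bool
    flags = accumArray (\_ new -> new) True (2, lim)
              [ (m, False)
              | p <- [2 .. isqrt lim]
              , m <- [p * p, p * p + p .. lim] ]
    isqrt :: Int -> Int
    isqrt = floor . sqrt . (fromIntegral :: Int -> Double)
```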

We let the trial division run only to 1 million (fixing the type as Int):
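The file tdprimeSum.hs is not shown in the answer; a sketch consistent with the description (the square-root-bounded trial division from above, with the type fixed as Int) might be:

```haskell
module Main (main) where

import System.Environment (getArgs)

-- Trial division that stops at the square root, as shown earlier,
-- with the type fixed as Int.
isPrime :: Int -> Bool
isPrime n = go 2
  where
    go d
      | d * d > n      = True
      | n `rem` d == 0 = False
      | otherwise      = go (d + 1)

main :: IO ()
main = do
    args <- getArgs
    let lim = case args of
                (a:_) -> read a
                _     -> 1000000 :: Int
    print . sum $ filter isPrime [2 .. lim]
```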

$ ghc -O2 tdprimeSum && time ./tdprimeSum 1000000
[1 of 1] Compiling Main             ( tdprimeSum.hs, tdprimeSum.o )
Linking tdprimeSum ...
37550402023

real    0m0.768s
user    0m0.765s
sys     0m0.002s

And the Turner sieve only to 100000:
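Likewise, tuprimeSum.hs is not shown; a sketch consistent with the description, wrapping the question's Turner sieve in the same driver, might be:

```haskell
module Main (main) where

import System.Environment (getArgs)

-- The Turner "sieve" from the question: keeps the head and filters
-- out its multiples from the tail by trial division.
sieve :: [Int] -> [Int]
sieve (x:xs) = x : sieve (filter (\a -> a `mod` x /= 0) xs)
sieve []     = []

main :: IO ()
main = do
    args <- getArgs
    let lim = case args of
                (a:_) -> read a
                _     -> 100000 :: Int
    print . sum $ takeWhile (<= lim) (sieve [2 ..])
```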

$ ghc -O2 tuprimeSum && time ./tuprimeSum 100000
[1 of 1] Compiling Main             ( tuprimeSum.hs, tuprimeSum.o )
Linking tuprimeSum ...
454396537

real    0m2.712s
user    0m2.703s
sys     0m0.005s

(1) The brutal estimate is

2000000
   ∑ √k ≈ 4/3*√2*10^9
 k = 1

evaluated to two significant digits. Since most numbers are composites with a small prime factor - half the numbers are even and take only one division - that vastly overestimates the number of divisions required.
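A quick numeric sanity check of that closed form (the names bruteSum and closedForm are ad hoc):

```haskell
-- Sum of sqrt k for k = 1 .. 2000000, computed directly.
bruteSum :: Double
bruteSum = sum [sqrt k | k <- [1 .. 2000000]]

-- The footnote's closed-form approximation, 4/3 * sqrt 2 * 10^9.
closedForm :: Double
closedForm = 4 / 3 * sqrt 2 * 1e9
```

The two values agree to well within a tenth of a percent.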

A lower bound for the number of required divisions would be obtained by considering primes only:

   ∑ √p ≈ 2/3*N^1.5/log N
 p < N
p prime

which, for N = 2000000, gives roughly 1.3*10^8. That is the right order of magnitude, but it underestimates by a nontrivial factor (a factor that decreases slowly to 1 for growing N, and is never greater than 2 for N > 10).

Besides the primes, the squares of primes and the products of two close primes also require the trial division to go up to (nearly) √k, and hence, being sufficiently numerous, contribute significantly to the overall work.

The number of divisions needed to treat the semiprimes is, however, bounded by a constant multiple of

N^1.5/(log N)^2

so for very large N it becomes negligible relative to the cost of treating primes. But in the range where trial division is feasible at all, they still contribute significantly.