O(n log log n) time complexity

chrismanderson · Mar 5, 2011 · Viewed 11.4k times

I have a short program here:

Given any n:
i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[i] * b[k];
        k = k * k;
    }
    i++;
}

The asymptotic running time of this is O(n log log n). Why is this the case? I get that the entire program will run at least n times, but I'm not sure where the log log n comes from. The inner loop depends on k * k, so it will clearly stop before k reaches n, and it would just be O(n log n) if k were doubled (rather than squared) each time. But how do you figure out that the answer is log log n?
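To make the comparison concrete, one can count the inner-loop iterations directly for both update rules (a rough sketch; the helper names are only illustrative and not part of the program above):

#include <stdio.h>

/* Inner-loop iterations when k doubles each time: about log2(n). */
static int count_doubling(long n) {
    int iters = 0;
    for (long k = 2; k < n; k = k * 2)
        iters++;
    return iters;
}

/* Inner-loop iterations when k is squared each time: about log2(log2(n)). */
static int count_squaring(long n) {
    int iters = 0;
    for (long k = 2; k < n; k = k * k)
        iters++;
    return iters;
}

int main(void) {
    long n = 1L << 16;                            /* n = 65536 */
    printf("doubling: %d\n", count_doubling(n));  /* 15 iterations */
    printf("squaring: %d\n", count_squaring(n));  /*  4 iterations */
    return 0;
}

For n = 2^16, the doubling version does 15 iterations, roughly log2 n, while the squaring version does only 4, roughly log2 log2 n.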

Answer

Saeed Amiri · Mar 5, 2011

For a mathematical proof, the running time of the inner loop can be written as:

T(n) = T(sqrt(n)) + 1

W.l.o.g. assume 2^(2^(t-1)) <= n <= 2^(2^t), so that t - 1 <= log log n <= t (logs base 2). Note that 2^(2^t) = 2^(2^(t-1)) * 2^(2^(t-1)). Unrolling the recurrence:

T(2^(2^t)) = T(2^(2^(t-1))) + 1 = T(2^(2^(t-2))) + 2 = ... = T(2^(2^0)) + t

Since T is non-decreasing,

T(2^(2^(t-1))) <= T(n) <= T(2^(2^t)) = T(2^(2^0)) + t = O(1) + t

==> O(1) + (log log n) - 1 <= T(n) <= O(1) + (log log n) + 1 ==> T(n) = Theta(log log n).
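As a concrete instance of the recurrence (an illustrative example, not part of the proof itself), take n = 2^(2^4) = 65536, i.e. t = 4:

T(65536) = T(256) + 1 = T(16) + 2 = T(4) + 3 = T(2) + 4 = O(1) + 4
log log 65536 = log 16 = 4        (logs base 2)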

The outer loop runs n times, so the total time is O(n log log n).

Why does the inner loop satisfy T(n) = T(sqrt(n)) + 1? Look at when the inner loop breaks: it exits once k >= n, which means that just before the last squaring k was at least sqrt(n), and two squarings before the exit k was still below sqrt(n). In other words, once k passes sqrt(n), at most one or two more squarings take it past n, so the running time satisfies T(sqrt(n)) + 2 ≥ T(n) ≥ T(sqrt(n)) + 1.
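This relation can also be checked empirically by counting the inner-loop iterations for n and for sqrt(n) (a small sketch, not part of the original answer; it uses doubles only to avoid integer overflow for large n):

#include <math.h>
#include <stdio.h>

/* Iterations of the inner loop: k = 2, then k = k * k, until k >= n. */
static int inner_iters(double n) {
    int iters = 0;
    for (double k = 2; k < n; k = k * k)
        iters++;
    return iters;
}

int main(void) {
    /* Compare T(n) with T(sqrt(n)) + 1 for a few values of n. */
    double ns[] = {1e3, 1e6, 1e12, 1e24};
    for (int i = 0; i < 4; i++) {
        double n = ns[i];
        printf("n = %.0e  T(n) = %d  T(sqrt(n)) + 1 = %d\n",
               n, inner_iters(n), inner_iters(sqrt(n)) + 1);
    }
    return 0;
}

For these values of n the two counts agree, consistent with the bounds above.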