Using big-O to prove N^2 is O(2^N)

Timothy Leung · Jul 12, 2012 · Viewed 27.5k times

I can clearly see that N^2 is bounded by c·2^N, but how do I prove it using the formal definition of big-O? I can prove it easily by mathematical induction.

Here is my attempt. By definition, there exist a constant C and an n0 such that f(n) <= C·g(n) for all n > n0, where f(n) = n^2 and g(n) = 2^n.

Should I take the log of both sides and solve for C?

And one more question, about the Fibonacci sequence: I want to solve the recurrence relation for the following code.

int fib(int n){
   if(n<=1) return n;                      // base cases: fib(0) = 0, fib(1) = 1
   else return fib(n-1) + fib(n-2);        // two recursive calls plus one addition
}

The recurrence is:

T(n) = T(n-1) + T(n-2) + c // where c accounts for the constant-time addition
so
T(n) = T(n-2) + 2T(n-3) + T(n-4) + 3c 

and one more step

T(n) = T(n-3) + 3T(n-4) + 3T(n-5) + T(n-6) + 7c

Then I started to get lost trying to form the general equation. The pattern of coefficients looks somewhat like Pascal's triangle (see the counting sketch below):

T(n) = T(n-i) + aT(n-i-1) + bT(n-i-2) + ... + kT(n-2i) + C
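
To see the pattern concretely, I put together a small counting sketch (the FibCallCounter class and the calls counter are just illustrative names, not part of the code above). It counts how many times fib() gets invoked, and that count satisfies the same shape of recurrence, C(n) = C(n-1) + C(n-2) + 1:

    // Sketch only: counts calls to fib() so the recursion tree can be inspected.
    public class FibCallCounter {
        static long calls = 0;

        static int fib(int n) {
            calls++;                            // one unit of work per invocation
            if (n <= 1) return n;
            return fib(n - 1) + fib(n - 2);
        }

        public static void main(String[] args) {
            for (int n = 5; n <= 30; n += 5) {
                calls = 0;
                int result = fib(n);
                System.out.printf("fib(%d) = %d, calls = %d%n", n, result, calls);
            }
        }
    }

Running it, the call counts grow at the same exponential rate as the Fibonacci numbers themselves.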

Answer

aioobe · Jul 12, 2012

As you point out, to see if f(x) ∈ O(g(x)) you need to find...

  • ...some c > 0 and
  • ...some x0

such that f(x) < c·g(x) for all x > x0.

In this case, you can pick c = 1 and x0 = 4. What you need to prove is that

    x^2 < 2^x for all x > 4

At this point you can take the log of both sides (this preserves the direction of the inequality, since if log(x) > log(y) then x > y). Assuming you're using the base-2 log, you get the following:

    log(x^2) < log(2^x)

and by standard laws of logarithms, you get

    2·log(x) < x·log(2)

Since log(2) = 1, this can be written as

    2·log(x) < x

If we set x = 4, we get

    2·log(4) = 4

and since x grows faster than log(x) (the difference x − 2·log(x) is strictly increasing for x ≥ 3), we know that 2·log(x) < x holds for all x > 4, which is exactly what we needed.
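
If you want a quick numerical sanity check of that inequality (not a replacement for the argument above), here is a small sketch; the GrowthCheck name is mine, and it simply tests x^2 < 2^x for integer x from 5 up to 62, where 2^x still fits in a long:

    // Sanity check only: verifies x^2 < 2^x for integer x in [5, 62].
    public class GrowthCheck {
        public static void main(String[] args) {
            for (int x = 5; x <= 62; x++) {
                long square = (long) x * x;
                long power  = 1L << x;          // 2^x as a long shift, valid up to x = 62
                if (square >= power) {
                    System.out.println("inequality fails at x = " + x);
                    return;
                }
            }
            System.out.println("x^2 < 2^x holds for every integer x in [5, 62]");
        }
    }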