In a previous problem, I showed (hopefully correctly) that f(n) = O(g(n)) implies lg(f(n)) = O(lg(g(n))), with sufficient conditions (e.g., lg(g(n)) >= 1, f(n) >= 1, and sufficiently large n).
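As a quick numeric sanity check of that earlier result (just a sketch, not a proof), here's a tiny script using an assumed sample pair f(n) = n^2 and g(n) = n^3, which satisfies those conditions once n >= 2:

```python
import math

# Assumed sample pair with f(n) = O(g(n)): f(n) = n^2, g(n) = n^3.
# Both are >= 1 and lg(g(n)) >= 1 for n >= 2, matching the conditions above.
def f(n): return n**2
def g(n): return n**3

for n in [4, 16, 256, 65536]:
    print(n, math.log2(f(n)) / math.log2(g(n)))  # stays at 2/3, a constant
```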
Now, I need to prove OR disprove that f(n) = O(g(n)) implies 2^(f(n)) = O(2^(g(n))).
Intuitively, this makes sense, so I figured I could prove it with help from the previous theorem. I noticed that f(n) can be rewritten as lg(2^(f(n))) and that g(n) is just lg(2^(g(n))), which got me excited... this is taking the log base 2 of both sides of what I want to prove, and it simplifies things a lot!
But I'm pretty sure this won't work. Just because lg(2^(f(n))) = O(lg(2^(g(n)))) does not necessarily mean that 2^(f(n)) = O(2^(g(n)))... that's backwards from the previous theorem (which said "implies", not "if and only if").
Do I need to try this proof another way, or can I actually go off of what I have (at least as a starter)?
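To convince myself that the converse really can fail in general, I tried an assumed pair a(n) = n^2, b(n) = n: the log-side big-O holds, but the raw ratio blows up (a quick sketch):

```python
import math

# Assumed pair: a(n) = n^2, b(n) = n.
# lg(a(n)) / lg(b(n)) = 2*lg(n) / lg(n) = 2, so lg(a(n)) = O(lg(b(n))),
# yet a(n) / b(n) = n is unbounded, so a(n) is NOT O(b(n)).
for n in [4, 16, 256, 65536]:
    log_ratio = math.log2(n**2) / math.log2(n)  # constant 2
    raw_ratio = n**2 / n                        # equals n, grows without bound
    print(n, log_ratio, raw_ratio)
```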
**Speaking of other ways:** maybe I could just argue that raising 2 to some g(n) that is "above" an f(n) will still keep it higher? It almost feels like a common-sense argument, but maybe I'm missing something important...
**Oh, oops!** I forgot to add that f(n) and g(n) are asymptotically positive. By our textbook definition, this means that they are "positive for all sufficiently large n."
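For reference, writing that definition out in quantifier form (my transcription, not the book's):

$$\exists\, n_0 \ \text{such that}\ f(n) > 0 \ \text{and}\ g(n) > 0 \quad \text{for all } n \ge n_0.$$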
Well, it's not even true to begin with. Let's say algorithm A takes 2n steps and algorithm B takes n steps. Then their ratio is a constant, so 2n = O(n). But the ratio of 2^(2n) to 2^n is 2^n, which is not a constant, so what you said doesn't hold.
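A quick sketch to see those ratios numerically (using the same 2n and n from above):

```python
# f(n) = 2n (algorithm A) vs. g(n) = n (algorithm B).
# f(n) / g(n) = 2 is constant, so f(n) = O(g(n)),
# but 2^(2n) / 2^n = 2^n, which grows without bound: 2^f is NOT O(2^g).
for n in [1, 5, 10, 20, 40]:
    print(n, (2 * n) / n, 2**(2 * n) // 2**n)  # constant 2 vs. exploding 2^n
```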