(log n)^k = O(n)? For k greater than or equal to 1.
My professor presented us with this statement in class, but I am not sure what it means for a function to have a time complexity of O(n). Even statements like n^2 = O(n^2) confuse me: how can a function f(x) have a run time complexity?
As for the statement itself, why does it equal O(n) rather than O((log n)^k)?
(log n)^k = O(n)?
Yes. The definition of big-Oh is that a function f(n) is in O(g(n)) if there exist positive constants N and c such that for all n > N: f(n) <= c*g(n). In this case f(n) is (log n)^k and g(n) is n, so inserting those into the definition gives: "there exist positive constants N and c such that for all n > N: (log n)^k <= c*n". This is true because log n grows more slowly than any positive power of n; in particular log n = o(n^(1/k)), and raising both sides to the k-th power gives (log n)^k = o(n). So (log n)^k is in O(n).
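As a quick numeric sketch of how the definition plays out, here is a check of f(n) = (log n)^k against g(n) = n over a sample range. The constants c and N below are illustrative choices for this example, not values fixed by the definition:

```python
import math

def holds(k, c, N, samples):
    """Return True if (log n)^k <= c*n for every sampled n > N."""
    return all(math.log(n) ** k <= c * n for n in samples if n > N)

samples = range(2, 100_000)

# For k = 2, c = 1 and N = 1 already work: (log n)^2 <= n everywhere.
print(holds(2, c=1, N=1, samples=samples))    # True

# For k = 3 with c = 1, the inequality fails for small n (around n = 20),
# so N = 1 is not a valid witness...
print(holds(3, c=1, N=1, samples=samples))    # False

# ...but pushing N past the crossover point makes it hold from then on.
print(holds(3, c=1, N=100, samples=samples))  # True
```

This illustrates why the definition quantifies over "all n > N" rather than all n: the inequality only needs to hold eventually, and either a larger c or a larger N can absorb the small-n exceptions.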
how can a function f(x) have a run time complexity
It doesn't. Nothing about big-Oh notation is specific to run-time complexity. Big-Oh is a notation to classify the growth of functions. Often the functions we're talking about measure the run-time of certain algorithms, but we can use big-Oh to talk about arbitrary functions.