Are there any cases where you would prefer O(log n) time complexity to O(1) time complexity? Or O(n) to O(log n)?

Do you have any examples?
There can be many reasons to prefer an algorithm with a higher big-O time complexity over one with a lower complexity:
- big-O hides the constant factors: an algorithm that takes 10^5 steps is better from a big-O point of view than one that takes 1/10^5 * log(n) steps (O(1) vs O(log n)), but for any realistic n the second one will perform better (a numeric sketch follows this list). For example, the best known complexity for matrix multiplication is O(n^2.373), but the constant is so high that no computational library (to my knowledge) uses it.
- big-O only matters when you work on big data. If you need to sort an array of three numbers, it matters very little whether you use an O(n*log(n)) or an O(n^2) algorithm.
- sometimes the advantage of a lower time complexity is negligible. For example, there are data structures (a van Emde Boas tree over integer keys, for instance) that find an item in O(log log n), but there is also the binary search tree, which finds the same in O(log n). Even for a huge n = 10^20 the difference is negligible, as the arithmetic below shows.
- time complexity is not everything. Imagine an algorithm that runs in O(n^2) time and requires O(n^2) memory. It might be preferable to one with O(n^3) time and O(1) space when n is not really big: you can wait out a long computation, but I highly doubt you can find RAM big enough to run the quadratic-memory algorithm on large inputs (some back-of-the-envelope numbers follow below).
- insertion sort has O(n^2) time complexity, worse than quicksort or mergesort, but as an online algorithm it can efficiently sort a list of values as they are received (as user input), where most other algorithms can only efficiently operate on a complete list of values (a minimal sketch follows at the end).
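
To make the constant-factor point concrete, here is a quick numeric sketch in Python. The cost functions are the hypothetical ones from the first bullet, not real algorithms:

```python
import math

# Hypothetical cost models from the constant-factor bullet above:
# a 10^5-step O(1) algorithm vs. a (1/10^5) * log2(n) O(log n) algorithm.
def cost_o1(n):
    return 1e5                     # constant time, but a huge constant

def cost_olog(n):
    return 1e-5 * math.log2(n)     # logarithmic time, but a tiny constant

for n in (10**3, 10**9, 10**20):
    print(f"n={n}: O(1) cost {cost_o1(n):g}, O(log n) cost {cost_olog(n):g}")
# The O(log n) algorithm stays ahead until log2(n) exceeds 10^10,
# i.e. until n is around 2**(10**10), far beyond any real input size.
```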
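The O(log log n) vs O(log n) gap at n = 10^20 is just arithmetic:

```python
import math

n = 10**20
print(math.log2(n))             # ~66.4: steps for an O(log n) lookup
print(math.log2(math.log2(n)))  # ~6.1:  steps for an O(log log n) lookup
# Only a factor of ~11, which constant factors and cache behaviour
# can easily erase in practice.
```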
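For the time/space trade-off, some rough numbers (the per-entry size and the operation rate are assumptions picked for round figures):

```python
# Back-of-the-envelope only: assume 1 byte per entry (optimistic) and
# 10^9 simple operations per second (also a hypothetical round number).
for n in (10**4, 10**5, 10**6):
    mem_gb = n**2 / 10**9    # O(n^2) memory, in gigabytes
    time_s = n**3 / 10**9    # O(n^3) running time, in seconds
    print(f"n={n}: quadratic memory ~{mem_gb:g} GB, cubic time ~{time_s:g} s")
# n=10^4: 0.1 GB and ~17 minutes -> either algorithm is usable.
# n=10^6: ~1000 GB of RAM -> the quadratic-memory algorithm hits the
# hard wall described above, however long you are willing to wait.
```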
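And a minimal sketch of the online-sorting point. It uses Python's bisect module rather than a hand-written insertion sort, but the behaviour is the same: each arriving value is placed into its sorted position immediately:

```python
import bisect

sorted_values = []

def receive(value):
    """Insert an incoming value at its sorted position: O(log n) to find
    the spot, O(n) to shift elements, O(n^2) over n arrivals."""
    bisect.insort(sorted_values, value)

for v in [5, 1, 4, 1, 5, 9, 2, 6]:    # values arriving one at a time
    receive(v)
    print(sorted_values)               # the list is sorted after every arrival
# Quicksort or mergesort would need the complete list up front; here every
# prefix is already sorted the moment the next value arrives.
```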