In algorithmic analysis, little-o notation is used to state formally that one function grows strictly more slowly than another.
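For reference, a sketch of the usual textbook definitions (the limit form assumes g(n) is eventually positive):

```latex
% Big-O: f is bounded above by g up to a constant factor, for large n.
f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 : \forall n \ge n_0,\ |f(n)| \le c \cdot g(n)

% Little-o: the bound holds for EVERY constant c > 0, however small.
f(n) = o(g(n)) \iff \forall c > 0,\ \exists\, n_0 : \forall n \ge n_0,\ |f(n)| < c \cdot g(n)

% Equivalently, when g(n) > 0 for large n:
f(n) = o(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0
```

The key contrast: big-O requires the inequality for *some* constant c, while little-o requires it for *every* constant c, which forces the ratio f(n)/g(n) to vanish.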
What is the difference between Big-O notation O(n) and Little-O notation o(n)?
algorithm time-complexity big-o asymptotic-complexity little-o

Notice that I am asking for little-o here (see similar question here) - for big O it's clearly wrong - …
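A numeric sketch of the distinction (illustration only, not from the original question): little-o means the ratio f(n)/g(n) tends to 0, so n = o(n²) holds, while 2n is O(n) but not o(n) because the ratio stays constant.

```python
# Illustrate f(n) = o(g(n))  <=>  f(n)/g(n) -> 0 as n grows.
def ratios(f, g, ns):
    """Return f(n)/g(n) for each sample point n."""
    return [f(n) / g(n) for n in ns]

ns = [10, 100, 1000, 10_000]

# n is o(n^2): the ratio n / n^2 = 1/n shrinks toward 0.
print(ratios(lambda n: n, lambda n: n * n, ns))   # [0.1, 0.01, 0.001, 0.0001]

# 2n is O(n) but NOT o(n): the ratio 2n / n is constant at 2,
# so it never approaches 0, even though 2n <= c*n for c = 2.
print(ratios(lambda n: 2 * n, lambda n: n, ns))   # [2.0, 2.0, 2.0, 2.0]
```

The same check works for any candidate pair: if you can find a positive constant that the ratio never drops below, the little-o claim fails even when the big-O claim holds.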