When can an algorithm have square root(n) time complexity?

vaibhav9225 · Oct 18, 2015 · Viewed 39.9k times

Can someone give me an example of an algorithm that has O(sqrt(n)) time complexity? What does square root time complexity even mean?

Answer

a3.14_Infinity · Oct 18, 2015
  • Square root time complexity means that the algorithm requires O(sqrt(N)) evaluations, where N is the size of the input.
  • As an example of an algorithm that takes O(sqrt(n)) time, consider Grover's algorithm: a quantum algorithm for searching an unsorted database of n entries in O(sqrt(n)) time.
  • Let us take an example to understand how we can arrive at O(sqrt(N)) runtime complexity for a given problem. This is going to be elaborate, but it is interesting to understand. (The following example, in the context of answering this question, is taken from Coding Contest Byte: The Square Root Trick; it is an interesting problem with an interesting trick for arriving at O(sqrt(n)) complexity.)

    • Given an array A containing n elements, implement a data structure for point updates and range sum queries.

      • update(i, x) -> A[i] := x (Point Update Query)
      • query(lo, hi) -> returns A[lo] + A[lo+1] + .. + A[hi]. (Range Sum Query)
    • The naive solution uses an array. It takes O(1) time for an update (array-index access) and O(hi - lo) = O(n) for the range sum (iterating from start index to end index and adding up).

    • A more efficient solution splits the array into slices of length k and stores the slice sums in an array S (a sketch of how S can be built follows the worked example below).
    • The update takes constant time, because we have to update the value in A and the corresponding slice sum in S. In update(6, 5), with k = 4, we change A[6] to 5, which means updating S[1] to keep S consistent.
    • The range-sum query is the interesting part. The elements of the first and last slice (only partially contained in the queried range) have to be traversed one by one, but for slices completely contained in the range we can use the values in S directly and get a performance boost.
    • In query(2, 14), with k = 4, we get:

 query(2, 14) = A[2] + A[3] + (A[4] + A[5] + A[6] + A[7]) + (A[8] + A[9] + A[10] + A[11]) + A[12] + A[13] + A[14] ;
 query(2, 14) = A[2] + A[3] + S[1] + S[2] + A[12] + A[13] + A[14] ;
 query(2, 14) = 0 + 7 + 11 + 9 + 5 + 2 + 0;
 query(2, 14) = 34;
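
  • To make the role of S concrete, here is a minimal sketch of how the slice-sum array could be precomputed. This helper (build_slices) is not from the original post; it only assumes that S[j] should hold the sum of slice j of length k:

  def build_slices(A, k):
      # S[j] = A[j*k] + A[j*k + 1] + ... + A[j*k + k - 1]
      # (the last slice may be shorter if k does not divide len(A)).
      return [sum(A[start:start + k]) for start in range(0, len(A), k)]
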
  • The code for update and query is:

  def update(S, A, i, k, x):
      # Adjust the sum of the slice containing index i, then store the new value.
      S[i // k] = S[i // k] - A[i] + x
      A[i] = x

  def query(S, A, lo, hi, k):
      s = 0
      i = lo
      # Section 1: sum elements of A one by one until i reaches a slice
      # boundary (starting part, not covered by a whole slice).
      while i % k != 0 and i <= hi:
          s += A[i]
          i += 1
      # Section 2: add whole slices directly from S (intermediary part).
      while i + k <= hi:
          s += S[i // k]
          i += k
      # Section 3: sum the remaining elements of A one by one (ending part).
      while i <= hi:
          s += A[i]
          i += 1
      return s
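
  • As a quick usage example (the array values and k = 4 below are hypothetical, not from the original post; build_slices is the helper sketched earlier):

  A = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8]   # n = 12
  k = 4                                      # slice length
  S = build_slices(A, k)                     # S = [9, 22, 21]

  print(query(S, A, 2, 9, k))   # 4 + 1 + 5 + 9 + 2 + 6 + 5 + 3 = 35
  update(S, A, 6, k, 10)        # A[6]: 2 -> 10, so S[1]: 22 -> 30
  print(query(S, A, 2, 9, k))   # 35 + 8 = 43
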
  • Let us now determine the complexity.
  • Each query takes, on average:
    • Section 1 takes k/2 time on average (you iterate over at most about k elements of the first slice).
    • Section 2 takes n/k time, essentially the number of slices spanned.
    • Section 3 takes k/2 time on average (you iterate over at most about k elements of the last slice).
    • So, in total, we get k/2 + n/k + k/2 = k + n/k time.
  • This is minimized for k = sqrt(n): sqrt(n) + n/sqrt(n) = 2*sqrt(n). (By the AM-GM inequality, k + n/k >= 2*sqrt(n), with equality exactly when k = sqrt(n).)
  • So we get O(sqrt(n)) time complexity per query.
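
  • If you want to convince yourself numerically that k = sqrt(n) is the sweet spot, here is a small self-contained check (not from the original post) that evaluates the cost expression k + n/k from the analysis above for a few slice lengths:

  import math

  n = 10000
  for k in [10, 50, math.isqrt(n), 200, 1000]:
      cost = k + n / k          # approximate per-query operation count
      print(k, cost)
  # The minimum, 200.0 = 2*sqrt(n), occurs at k = 100 = sqrt(n).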