
BLOG ON TIME COMPLEXITY OF ALGORITHMS

What is Time Complexity?

Time complexity is the amount of time an algorithm takes to complete its execution. It is estimated by summing the time taken by all of the algorithm's elementary operations. Several notations are used to represent time complexity:

(1) Big Oh, O

(2) Omega, Ω

(3) Theta, Θ

But the most commonly used notation for time complexity is Big O notation. It is an asymptotic notation used to represent the time complexity of an algorithm. There are three types of time complexity (illustrated with a small linear-search sketch after the list below):

  1. Worst Case Time Complexity
  2. Best Case Time Complexity
  3. Average Case Time Complexity
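As a quick illustration (a hypothetical example, not part of the original post), consider a linear search for a value in an array: the best case is finding it at the first position, the worst case is not finding it at all, and the average case lies somewhere in between. A minimal sketch in Java:

static int linearSearch(int[] arr, int target)
{
    // Best case: target is at index 0, so only one comparison is made     -> O(1)
    // Worst case: target is absent, so all n elements are compared        -> O(n)
    // Average case: about n/2 comparisons for a random position, still O(n)
    for (int i = 0; i < arr.length; i++)
    {
        if (arr[i] == target)
        {
            return i;        // found at position i
        }
    }
    return -1;               // not found
}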

BIG O NOTATION: Big O notation is used to measure the complete execution of an algorithm, and it is normally used to represent the worst-case time complexity. Big O notation gives an upper limit of a given function and is written as f(n) = O(g(n)).

If we plot f(n) and c*g(n), n’ is the point where the two graphs intersect and beyond which c*g(n) stays above f(n). Formally, f(n) = O(g(n)) means there exist positive constants c and n’ such that 0 <= f(n) <= c*g(n) for all n >= n’.
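For example (a worked check, not from the original post): for f(n) = 2*n^2 + 4*n we can take g(n) = n^2, c = 3 and n’ = 4. Since 4*n <= n^2 whenever n >= 4, we get 2*n^2 + 4*n <= 3*n^2 for all n >= 4, so f(n) = O(n^2).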

Now, let us analyze the rate of growth of a couple of functions built from elementary terms:

———–>>>>

  • f(n) = 2*n^2 + 4*n
  • f(1) = 2*1^2 + 4*1 = 2 + 4 = 6
  • f(2) = 2*2^2 + 4*2 = 8 + 8 = 16
  • f(3) = 2*3^2 + 4*3 = 18 + 12 = 30
  • f(4) = 2*4^2 + 4*4 = 32 + 16 = 48

From the above analysis we can see that n^2 is the leading term of this function, and its nature is quadratic. So the order of this function is n^2, because the leading term dominates the growth as n increases.
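One way to see the leading term dominate (an illustrative Java sketch, not part of the original post) is to print the ratio f(n)/n^2 for growing n; it settles toward the constant factor 2:

public class QuadraticGrowth
{
    public static void main(String[] args)
    {
        // f(n) = 2*n^2 + 4*n ; the ratio f(n)/n^2 approaches 2 as n grows
        for (int n = 1; n <= 1000000; n *= 10)
        {
            double f = 2.0 * n * n + 4.0 * n;
            System.out.println("n=" + n + "  f(n)=" + f + "  f(n)/n^2=" + f / ((double) n * n));
        }
    }
}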

———–>>>>

  • f(n) = 2*n*log n + 4*n + 6
  • f(1) = 2*1*log 1 + 4*1 + 6 = 2*1*0 + 4 + 6 = 10
  • f(2) = 2*2*log 2 + 4*2 + 6 = 2*2*1 + 8 + 6 = 18
  • f(3) = 2*3*log 3 + 4*3 + 6 = 2*3*1.584963 + 12 + 6 = 27.509778
  • f(4) = 2*4*log 4 + 4*4 + 6 = 2*4*2 + 16 + 6 = 38

From the above calculations we can observe that the leading term of this function is n*log n (log base 2), and its nature is linearithmic. So the order of this function is n*log n, because the leading term dominates.
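The same kind of check works here (again just an illustrative sketch, not from the original post): the ratio f(n)/(n*log2 n) also settles toward the constant factor 2 as n grows:

public class LinearithmicGrowth
{
    public static void main(String[] args)
    {
        // f(n) = 2*n*log2(n) + 4*n + 6 ; the ratio f(n)/(n*log2(n)) approaches 2
        for (int n = 2; n <= 1000000; n *= 10)
        {
            double log2n = Math.log(n) / Math.log(2);    // log base 2
            double f = 2.0 * n * log2n + 4.0 * n + 6.0;
            System.out.println("n=" + n + "  f(n)=" + f + "  ratio=" + f / (n * log2n));
        }
    }
}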

In this way, we can find the time complexity of different algorithms. A few worked examples follow.

———*************——————-

public static void main(String args[])
{
    System.out.println("Hello Universe ");    // takes a constant time t1 to execute
}

f(n) = t1

f(n) = t1 * n^0

So, f(n) = O(n^0) = O(1)

———*************——————-

sum = 0;                             // takes a constant time t1

for(int i=0 ; i<n ; i++)
{
    sum = sum + i;                   // takes a constant time t2
}

f(n) = t1 + t2*n

So, f(n) = O(n^1) = O(n)

———*************——————-

public static void main(String args[])
{
    int m = 10, n = 10;                              // m and n can be any input sizes
    int sum = 0;                                     // takes constant time t1

    for(int i=0; i<m; i++)
    {
        for(int j=0; j<n; j++)
        {
            System.out.println("Hello Universe ");   // takes constant time t2
            sum = sum + n;                           // takes constant time t3
        }
    }
}

expression = m * (n * (t2 + t3)) + t1

expression = (m*n) * (t2 + t3) + t1

When m = n:

f(n) = (n^2) * (t2 + t3) + t1

So, f(n) = O(n^2)

———*************——————-

public static void main(String args[])
{
    int n = 16;                        // n can be any input size
    int sum = 0;                       // takes a constant time t1

    for(int i=1 ; i<=n ; i *= 2)       // i doubles on every iteration
    {
        sum = sum + i;                 // takes a constant time t2
    }
}

Consider the for loop: i doubles on every iteration, so after k iterations i has been multiplied by 2*2*2*…*2 (k times). The loop ends when

2^k = n

k = log n (base 2)

So, f(n) = k*t2 + t1

f(n) = (log n)*t2 + t1

Time complexity: f(n) = O(log n) (base 2)
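We can confirm this count empirically (an illustrative sketch, not from the original post) by counting the iterations of the doubling loop and comparing with log2(n); the count tracks log2(n) up to a small constant:

public class LogLoopCount
{
    public static void main(String[] args)
    {
        for (int n = 2; n <= 1024; n *= 2)
        {
            int iterations = 0;
            for (int i = 1; i <= n; i *= 2)
            {
                iterations++;                        // one pass of the doubling loop
            }
            double log2n = Math.log(n) / Math.log(2);
            System.out.println("n=" + n + "  iterations=" + iterations + "  log2(n)=" + log2n);
        }
    }
}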

Now we'll look at some sorting algorithms and their worst-case time complexities.

  • Insertion Sort – O(n^2),
  • Selection Sort – O(n^2),
  • Bubble Sort – O(n^2),
  • Quick Sort – O(n^2),
  • Merge Sort – O(n*log n)

Comparison of Worst Case Time Complexity of given algorithms

Comparing the growth of n^2 with n*log n (see the numeric sketch after this list):

  • When n=2 then   n^2 = 4    >   n*log n = 2
  • When n=3 then   n^2 = 9    >   n*log n ≈ 4.75
  • When n=4 then   n^2 = 16   >   n*log n = 8
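A short Java sketch (illustrative, not from the original post) prints both quantities for a few sizes and shows the gap widening as n grows:

public class CompareGrowth
{
    public static void main(String[] args)
    {
        // Compare n^2 with n*log2(n); the gap widens quickly as n grows
        int[] sizes = {2, 3, 4, 10, 100, 1000, 100000};
        for (int n : sizes)
        {
            double nSquared = (double) n * n;
            double nLogN = n * (Math.log(n) / Math.log(2));
            System.out.println("n=" + n + "  n^2=" + nSquared + "  n*log2(n)=" + nLogN);
        }
    }
}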

Here a question arises: why are we considering the worst-case time complexity? Because we cannot be sure in advance which case will occur for a given input, so we plan for the worst case.

So, by the above comparison, we can say that Merge Sort is better than the other four sorting algorithms in the worst case.
