ScaleFree

DS & A (Algorithm Analysis - Asymptotic Analysis)

| Asymptotic Analysis - CHAPTER 4 |

In the field of Computer Science, Asymptotic Analysis is the branch that deals with measuring algorithm efficiency. In Asymptotic Analysis, we estimate how the time (or space) required by an algorithm increases with the size of the problem. Typically, we focus on the time required by an algorithm (or a program), but at times we need to consider the required space too.

Time Complexity 

Time Complexity is the quantification of the time taken by an algorithm or a program to execute as a function of the size of input/problem. It is generally denoted by T(n), where n is the size of the input/problem and T is the time taken by the algorithm to run as a function of n. 

i.e. Time complexity = T(n) = function(n) 

Note: T(n) is always positive

How to calculate the Time Complexity?


In order to calculate the time complexity of an algorithm, we first need to determine the basic operations required by the algorithm to process an input/problem of a certain size. What counts as a "basic operation" and as the "size" generally depends on the algorithm being analyzed. Adding, multiplying, or comparing two numbers are examples of basic operations.

Note: The time required to complete a basic operation is always independent of the values of its operands. For some problems, such as sorting, the size of the input is the size of the problem, but that is not true for every problem.

But something like finding the product of all the elements in an array (say, A[n]) is not a basic operation, because the cost of the operation depends on n, the size of the array, i.e. the size of the data structure.
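To make the distinction concrete, here is a hypothetical sketch (the function names are ours, not from the text): comparing two numbers costs one basic operation regardless of their values, while taking the product of an array of size n costs n basic operations.

```cpp
#include <cassert>
#include <vector>

// Basic operation: a single comparison, cost independent of operand values.
bool isGreater(int a, int b) {
    return a > b;
}

// Not a basic operation: the number of multiplications grows with n,
// the size of the array (i.e. the size of the data structure).
long long productOfArray(const std::vector<int>& A) {
    long long product = 1;
    for (int x : A) {   // executes A.size() times
        product *= x;   // one basic operation per element
    }
    return product;
}
```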


Example

Consider the following code:
  int ans = 0;                 // Assignment operation
  for (int i = 0; i < 10; i++)
  {
      ans++;                   // Increment operation
  }

So, what will be the time complexity of the above code? Let's figure it out.

As we can see, the above code has two basic operations, namely the assignment operation and the increment operation. So, in simple words,

T(n) = Time Complexity of Assignment operation + 10 x (Time Complexity of Increment operation)

Why 10 x (Time Complexity of Increment operation)?


The Time Complexity of the increment operation is multiplied by 10 because the increment operation executes inside a for loop that runs 10 times, which means the increment operation executes 10 times.


And now all we have to do is put the values for the Time Complexity of the operations into the equation T(n). But we never calculated the Time Complexity of the operations. So, how do we find it? Actually, we don't really need to calculate the Time Complexity of the basic operations because, by definition, basic operations require constant time to execute. So, let's assume:
  • Time Complexity of Assignment operation = C1
  • Time Complexity of Increment operation = C2
Therefore, T(n) = C1 + 10C2 = C (which is a constant).
So, we can say that T(n) has a constant running time.
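One way to convince yourself that the code above does a constant amount of work is to instrument it with an operation counter (a sketch; the counter is added by us and is not part of the original code):

```cpp
#include <cassert>

// Returns the number of basic operations performed by the example code:
// 1 assignment plus 10 increments, no matter what input we are given.
int countBasicOps() {
    int ops = 0;
    int ans = 0;              // assignment operation
    ops++;
    for (int i = 0; i < 10; i++) {
        ans++;                // increment operation
        ops++;
    }
    return ops;               // always 1 + 10 = 11
}
```

Because the loop bound is the fixed number 10 rather than a problem size n, the count never changes, which is exactly what "constant running time" means.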

Drawback

I know that, by now, you might be wondering what the use is of knowing that an algorithm has constant Time Complexity. But remember what we learned in our previous chapter, "Algorithm Evaluation".



The major purpose of knowing the Time Complexity of the code above is to determine the efficiency of the code, but only with respect to some other code.

In fact, this is the very drawback of Asymptotic Analysis: it is an estimating technique. It can tell us that one algorithm is "faster" than another, but it does not tell us anything about the relative merits of two algorithms with equivalent growth rates.

Note: Notice that I considered one algorithm to be faster than the other, because it is meaningless to compare algorithms with equivalent efficiency; Asymptotic Analysis does not tell us anything about the relative merits of such algorithms.


Another Example

Consider the following code:

  int var = 0;                 // Assignment operation
  for (int i = 0; i < n; i++)
  {
      for (int j = 0; j < n; j++)
      {
          var++;               // Increment operation
      }
  }

So, what's the Time Complexity of this code fragment?

It's T(n) = C1 + C2n² = C2n² (since C2n² >> C1 for large values of n).
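We can check the n² count empirically by instrumenting the nested loops with a counter (again a sketch, with the counter added by us):

```cpp
#include <cassert>

// Counts the increment operations performed by the nested loops
// for a given problem size n.
long long countIncrements(int n) {
    long long ops = 0;
    int var = 0;                    // assignment operation (the C1 term)
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            var++;                  // increment operation
            ops++;
        }
    }
    (void)var;                      // var itself is not the point here
    return ops;                     // n * n increments (the C2*n^2 term)
}
```

Doubling n quadruples the count, which is the signature of quadratic growth.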

Growth Rate


The growth rate of an algorithm is the rate at which the cost of the algorithm increases with respect to the size of the input/problem.


  COST      GROWTH RATE
  n         Linear
  log n     Logarithmic
  n²        Quadratic
  2ⁿ        Exponential
  n!        Factorial

In the field of Computer Science, algorithms with Linear and Logarithmic growth rates are usually preferred. Algorithms with a Quadratic growth rate are acceptable at times too. But something like an Exponential growth rate must really be avoided.
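A quick sketch of why exponential growth must be avoided: evaluating the cost functions from the table at a modest problem size already shows how fast the gap opens (these are exact operation counts, not measured running times):

```cpp
#include <cassert>

// Cost of each growth rate from the table, evaluated at problem size n.
long long linearCost(long long n)      { return n; }
long long quadraticCost(long long n)   { return n * n; }
long long exponentialCost(long long n) { return 1LL << n; }  // 2^n

// At n = 20: linear = 20, quadratic = 400, exponential = 1,048,576.
// Doubling n to 40 multiplies the quadratic cost by only 4, but the
// exponential cost by another factor of about a million.
```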


Growth Rate Graph



