Performance Analysis and Asymptotic Notations | Big-Oh | Big-Omega | Theta Notation


Do you know how to analyze the performance of an algorithm?

Performance Analysis:

Performance analysis of an algorithm is the process of estimating the time and space required by that algorithm. There are three cases of analysis:

1. Worst Case:

In the worst case, an algorithm performs the maximum number of steps on the input data of size n.

2. Best Case: 

In the best case, an algorithm performs the minimum number of steps on input data of size n.

3. Average Case:

In the average case, an algorithm performs an average number of steps on input data of n elements.
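To make the three cases concrete, here is a minimal Python sketch (linear_search is an illustrative helper, not from the original post) that counts how many comparisons a linear search performs:

    def linear_search(arr, target):
        """Return (index of target or -1, number of comparisons made)."""
        steps = 0
        for i, value in enumerate(arr):
            steps += 1
            if value == target:
                return i, steps
        return -1, steps

    data = [3, 7, 1, 9, 4]
    print(linear_search(data, 3))   # best case: target is first, 1 step
    print(linear_search(data, 4))   # worst case: target is last, n = 5 steps
    print(linear_search(data, 8))   # worst case: target absent, all 5 checked

On average, a successful search inspects about (n + 1)/2 elements, which is why the average case still grows linearly with n.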


Space Complexity:-

The total amount of computer memory required by an algorithm to complete its execution is called the space complexity of that algorithm. It includes both program space and data space.

Time Complexity:-

The time complexity of an algorithm is the total amount of time required by an algorithm to complete its execution.
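As a hedged illustration of the difference between the two (these helper functions are hypothetical, not from the original post): summing a list takes time proportional to the input size n but only a constant amount of extra space, while building a list of prefix sums needs extra space that also grows with n.

    def total(values):
        # time grows linearly with n (one pass over the input);
        # extra space is a single accumulator (constant)
        s = 0
        for v in values:
            s += v
        return s

    def prefix_sums(values):
        # time grows linearly with n, and so does the extra space,
        # because the output list grows with the input
        out = []
        running = 0
        for v in values:
            running += v
            out.append(running)
        return out

    print(total([1, 2, 3, 4]))        # 10
    print(prefix_sums([1, 2, 3, 4]))  # [1, 3, 6, 10]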

 

Asymptotic Notations: 

Using asymptotic analysis, we can evaluate the performance of an algorithm in terms of input size. We can calculate how the time and space taken by an algorithm increases with the input size.

One more thing: asymptotic analysis is not perfect, but at present it is the best available way to analyze an algorithm, since it gives a good idea of how the algorithm behaves as the input grows. The asymptotic notation of an algorithm is a mathematical representation of its complexity.

Let ‘f’ and ‘g’ be two non-negative functions, and let ‘n’ be the input size. Then we can define the three most common asymptotic bounds as follows:

  1. Big-Oh (O) Notation (Upper Bounding Function)

  2. Big-Omega (Ω) Notation (Lower Bounding Function)

  3. Theta (Θ) Notation (Order Function)

1. Big-Oh (O) Notation:

This notation gives an upper bound on the given function (algorithm). Generally, it is written as:

f(n) = O(g(n))

That means, for larger values of ‘n’, g(n) is an upper bound on f(n) (up to the constant factor c).

 


Where, 

0 ≤ f(n) ≤ c·g(n)

For all, 

n ≥ n₀, c > 0, n₀ ≥ 1

Generally, we can discard the lower values of ‘n’, because the rate of growth at small ‘n’ is not what matters. ‘n₀’ is the point from which onward the values of ‘n’ show the actual rate of growth of the algorithm; below n₀ the rate of growth could be different. Here, n₀ is called the ‘threshold’ for the given function.
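For example, take the hypothetical function f(n) = 3n + 2. Since 3n + 2 ≤ 4n whenever n ≥ 2, we can pick c = 4 and n₀ = 2 to show f(n) = O(n). A quick numeric check in Python (the constants are our own choice, not the only valid ones):

    # Hypothetical example: f(n) = 3n + 2 is O(n) with c = 4, n0 = 2,
    # because 3n + 2 <= 4n for every n >= 2.
    f = lambda n: 3 * n + 2
    g = lambda n: n
    c, n0 = 4, 2
    print(all(f(n) <= c * g(n) for n in range(n0, 1000)))  # True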

 

2. Big-Omega (Ω) Notation:

This notation gives a lower bound on the given function (algorithm). Generally, it is written as:

f(n) = Ω(g(n))

That means, for larger values of n, g(n) is a lower bound on f(n) (up to the constant factor c).


Where, 

0 ≤ c·g(n) ≤ f(n)

For all, 

n ≥ n₀, c > 0, n₀ ≥ 1
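Continuing the same hypothetical example, f(n) = 3n + 2 ≥ 3n for every n ≥ 1, so f(n) = Ω(n) with c = 3 and n₀ = 1:

    # Hypothetical example: f(n) = 3n + 2 is Ω(n) with c = 3, n0 = 1.
    f = lambda n: 3 * n + 2
    g = lambda n: n
    c, n0 = 3, 1
    print(all(c * g(n) <= f(n) for n in range(n0, 1000)))  # True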


3. Theta (Θ) Notation:

Now, this notation tells us that the upper and lower bounds of a given function (algorithm) coincide. The running time of an algorithm always lies between its lower bound and its upper bound, so if the upper bound (O) and the lower bound (Ω) give the same rate of growth, the theta (Θ) notation has that rate of growth as well.


 

f(n) = O(g(n)) and f(n) = Ω(g(n))  ⟹  f(n) = Θ(g(n))

Where, 

0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n)

For all, 

n ≥ n₀, c₁ > 0, c₂ > 0, n₀ ≥ 1
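Combining the two previous (hypothetical) bounds on f(n) = 3n + 2: with c₁ = 3, c₂ = 4, and n₀ = 2 we have 3n ≤ 3n + 2 ≤ 4n for all n ≥ 2, so f(n) = Θ(n):

    # Hypothetical example: f(n) = 3n + 2 is Θ(n) with c1 = 3, c2 = 4, n0 = 2.
    f = lambda n: 3 * n + 2
    g = lambda n: n
    c1, c2, n0 = 3, 4, 2
    print(all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000)))  # True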

You may wonder: aren't there two more bounding notations, called little-oh (o) and little-omega (ω)? There are; they give upper and lower bounds that are not asymptotically tight, and they are covered in a separate post.

Properties of Asymptotic Notations:

1. Transitivity:



f(n) = O(g(n)) and g(n) = O(h(n)) then f(n) = O(h(n))

f(n) = Θ(g(n)) and g(n) = Θ(h(n)) then f(n) = Θ(h(n))

f(n) = Ω(g(n)) and g(n) = Ω(h(n)) then f(n) = Ω(h(n))


2. Reflexivity: 



f(n) = O(f(n)); f(n) = Θ(f(n)); f(n) = Ω(f(n))

3. Symmetry:



f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))


4. Transpose Symmetry:


 

f(n) = O(g(n)) if and only if g(n) = Ω(f(n))


5. If f(n) is in O(k·g(n)) for any constant k > 0, then f(n) is in O(g(n)).

Example:

If f(n) = O(5n³) then we can write f(n) = O(n³)

6. If f₁(n) is O(g₁(n)) and f₂(n) is O(g₂(n)), then f₁(n) + f₂(n) is in O(max(g₁(n), g₂(n))).

Example:

If f₁(n) = O(n³) and f₂(n) = O(n²)

then f₁(n) + f₂(n) = O(max(n³, n²)) = O(n³)

7. If f₁(n) is O(g₁(n)) and f₂(n) is O(g₂(n)), then f₁(n)·f₂(n) is in O(g₁(n)·g₂(n)).

Example:

If f₁(n) = O(n³) and f₂(n) = O(n²)

then f₁(n)·f₂(n) = O(n³·n²) = O(n⁵)
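The sum rule corresponds to running two phases one after the other, and the product rule to nesting one inside the other. A small Python sketch (the loop bodies are placeholders that just count iterations):

    def sequential(n):
        # Two independent phases, one after the other:
        # O(n^3) + O(n^2) = O(max(n^3, n^2)) = O(n^3)
        count = 0
        for _ in range(n):            # O(n^3) phase
            for _ in range(n):
                for _ in range(n):
                    count += 1
        for _ in range(n):            # O(n^2) phase
            for _ in range(n):
                count += 1
        return count

    def nested(n):
        # One phase nested inside the other: O(n^3) * O(n^2) = O(n^5)
        count = 0
        for _ in range(n):
            for _ in range(n):
                for _ in range(n):    # O(n^3) outer work ...
                    for _ in range(n):
                        for _ in range(n):   # ... times O(n^2) inner work
                            count += 1
        return count

    print(sequential(4))  # 4**3 + 4**2 = 80
    print(nested(3))      # 3**5 = 243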

 

Commonly Used Rate of Growths:


Increasing order of growth of time complexity (top to bottom):


Time Complexity | Name
O(1)            | Constant
O(log n)        | Logarithmic
O(n)            | Linear
O(n log n)      | Linear Logarithmic
O(n²)           | Quadratic
O(n³)           | Cubic
O(nᵏ)           | Polynomial (k > 3)
O(cⁿ)           | Exponential (c is a constant)

Comparison of Functions:


The decreasing order of growth of time complexity is:

2^(2^n) > n! > 4ⁿ > n² > n log n > log(n!) > n > 2^(log n) > √(log n) > log(log n) > 1

If you want to plot these functions and compare their growth, a graphing tool like Desmos works well.
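Alternatively, here is a small Python spot-check of the ordering at n = 16 (natural logarithms assumed; at small n some of these inequalities are close, so treat this as an illustration, not a proof):

    import math

    n = 16
    print("2^(2^n) has", len(str(2 ** (2 ** n))), "digits")  # dwarfs the rest
    for name, value in [
        ("n!", math.factorial(n)),
        ("4^n", 4 ** n),
        ("n^2", n ** 2),
        ("n log n", n * math.log(n)),
        ("log(n!)", math.log(math.factorial(n))),
        ("n", n),
        ("2^(log n)", 2 ** math.log(n)),
        ("sqrt(log n)", math.sqrt(math.log(n))),
        ("log(log n)", math.log(math.log(n))),
    ]:
        print(f"{name:>11}: {value:.2f}")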

 


Conclusion:

This post is extremely important for understanding algorithms. Why? Because for every algorithm we will be concerned about its time and space complexity, and that complexity is expressed in terms of asymptotic notations.

Resources: Wikipedia | Desmos | SlideShare

If you find any problem related to this article, please comment below or contact me here.

Written by manorinfinity: Complex Problem Solver, Out-loud Thinker, an Outstanding Writer, and a very curious human being.
