Time Complexity Order
Let’s learn about various commonly used complexity functions along with their usage.
Algorithms’ time complexity
The following table shows the big-O representations of different time complexity functions.
Name | Notation |
---|---|
Constant | $O(1)$ |
Logarithmic | $O(\log n)$ |
Linear | $O(n)$ |
Quadratic | $O(n^2)$ |
Polynomial | $O(n^c)$, where $c$ is a constant greater than 1 |
Exponential | $O(c^n)$, where $c$ is a constant greater than 1 |
Factorial or N-power-N | $O(n!)$ or $O(n^n)$ |
The time taken by certain algorithms to run varies dramatically with the size of the input. Some algorithms take seconds or even minutes to run on huge input, whereas others may take days to complete their execution. To understand how the growth rate changes with input size for different functions, the following table presents the approximate number of steps required to run an algorithm:
Growth rate of functions
$n$ | $O(1)$ | $O(\log n)$ | $O(n)$ | $O(n \log n)$ | $O(n^2)$ | $O(n^3)$ | $O(2^n)$ |
---|---|---|---|---|---|---|---|
$10$ | 1 | 3 | $10$ | $30$ | $10^2$ | $10^3$ | $10^3$ |
$10^2$ | 1 | 6 | $10^2$ | $6 \times 10^2$ | $10^4$ | $10^6$ | $10^{30}$ |
$10^3$ | 1 | 9 | $10^3$ | $9 \times 10^3$ | $10^6$ | $10^9$ | $10^{300}$ |
$10^4$ | 1 | 13 | $10^4$ | $13 \times 10^4$ | $10^8$ | $10^{12}$ | $10^{3000}$ |
$10^5$ | 1 | 16 | $10^5$ | $16 \times 10^5$ | $10^{10}$ | $10^{15}$ | $10^{30000}$ |
$10^6$ | 1 | 19 | $10^6$ | $19 \times 10^6$ | $10^{12}$ | $10^{18}$ | $10^{300000}$ |
Growth of functions
Let’s look at these growth rates in more detail so that we can apply them easily in later lessons. The size of the input is represented as $n$.
Constant function
An algorithm runs in constant time if the number of steps required to execute it does not change with the input size. The complexity of such algorithms is represented as $O(1)$. For example (see the sketch after this list):
- Push and pop operations of a stack.
- Returning the first element of a list.
- Getting an element from a hash table.
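A minimal sketch of these constant-time operations in Python (the variable names are illustrative, and the hash table lookup is $O(1)$ on average):

```python
# Constant-time operations: each takes the same number of steps
# regardless of how large the underlying collection is.

stack = [4, 8, 15]
stack.append(16)        # push onto a stack: O(1)
top = stack.pop()       # pop from a stack: O(1)

items = [7, 2, 9, 5]
first = items[0]        # returning the first element of a list: O(1)

ages = {"alice": 30, "bob": 25}
age = ages["alice"]     # getting an element from a hash table: O(1) on average
```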
Linear time
If the execution time of an algorithm is directly proportional to its input size, the algorithm runs in linear time. It is represented as $O(n)$. Some examples of linear time, sketched in code after the note below, are:
- Searching for an element in an array.
- Traversal of a linked list to find the maximum or minimum element.
Note: When we need to visit every node in a data structure to complete a task, the complexity is $O(n)$.
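A short sketch of a linear-time scan, using a plain Python list to stand in for the traversed data structure:

```python
def find_max(values):
    """Linear scan: every element is visited exactly once, so the
    number of steps grows in direct proportion to the input size, O(n)."""
    maximum = values[0]
    for value in values[1:]:
        if value > maximum:
            maximum = value
    return maximum

print(find_max([3, 11, 4, 9]))  # 11
```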
Logarithmic time
If the execution time of an algorithm is proportional to the logarithm of the input size, it is said to run in logarithmic time. It is represented as $O(\log n)$. At each stage of the algorithm, a large portion (e.g., half) of the input is pruned away without being traversed. Binary search, sketched below, is a common example of logarithmic time.
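A minimal binary search over a sorted list, assuming the input is already sorted:

```python
def binary_search(sorted_values, target):
    """Binary search: each comparison halves the remaining search range,
    so at most about log2(n) comparisons are needed, i.e., O(log n)."""
    low, high = 0, len(sorted_values) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_values[mid] == target:
            return mid
        if sorted_values[mid] < target:
            low = mid + 1   # discard the left half
        else:
            high = mid - 1  # discard the right half
    return -1  # target not found

print(binary_search([2, 5, 8, 12, 16, 23], 16))  # 4
```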
Quadratic time
If the execution time of an algorithm is proportional to the square of the input size, it is said to run in quadratic time. It is represented as $O(n^2)$. In such algorithms, each element is compared to all other elements. For example, in two nested loops, if the outer loop runs $n$ times and the inner loop also runs $n$ times, then the body of the inner loop executes $n^2$ times.
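A small sketch of the nested-loop pattern described above; it simply counts how many times the inner body runs:

```python
def count_comparisons(values):
    """Nested loops: the inner loop runs n times for each of the n
    iterations of the outer loop, so the body executes n * n = n^2 times."""
    n = len(values)
    comparisons = 0
    for i in range(n):
        for j in range(n):
            comparisons += 1  # e.g., compare values[i] with values[j]
    return comparisons

print(count_comparisons([1, 2, 3, 4]))  # 16 comparisons for n = 4
```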
Exponential time
Exponential time is represented as $O(2^n)$. Usually, such algorithms generate all possible subsets of the input elements. A common example is computing the power set of a set.
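A sketch of one way to build a power set, which makes the doubling behind $O(2^n)$ visible:

```python
def power_set(values):
    """Generates all 2^n subsets of the input, so both the running time
    and the size of the output are exponential in n, i.e., O(2^n)."""
    subsets = [[]]
    for value in values:
        # Every existing subset is copied once with the new value added,
        # doubling the number of subsets at each step.
        subsets += [subset + [value] for subset in subsets]
    return subsets

print(power_set([1, 2, 3]))  # 2^3 = 8 subsets
```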
Factorial time $O(n!)$
In such algorithms, all possible permutations of the elements of the input data are generated. Finding all permutations of a list is an example of a factorial time algorithm. It is represented as $O(n!)$.
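A recursive sketch that generates all permutations of a list, illustrating where the $n!$ comes from:

```python
def permutations(values):
    """Generates all n! orderings of the input: for each of the n choices
    of first element, it permutes the remaining n - 1 elements."""
    if len(values) <= 1:
        return [values]
    result = []
    for i, value in enumerate(values):
        rest = values[:i] + values[i + 1:]
        for perm in permutations(rest):
            result.append([value] + perm)
    return result

print(permutations([1, 2, 3]))  # 3! = 6 orderings
```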