Time Complexity of Algorithms

Learn about fast vs. slow algorithms and big-O notation for estimating the time complexity of algorithms.

Fast vs. slow algorithms

Real computers require a certain amount of time to perform an operation such as addition, subtraction, or testing the conditions in a while loop. A supercomputer might take $10^{-10}$ seconds to perform an addition, while a calculator might take $10^{-5}$ seconds. Suppose we had a computer that took $10^{-10}$ seconds to perform an elementary operation such as addition, and we knew how many operations a particular algorithm would perform.

We could estimate the running time of the algorithm simply by taking the product of the number of operations and the time per operation. However, computers are constantly improving, leading to a decreasing time per operation, so our notion of the running time would soon be outdated.
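To make this concrete, here is a minimal sketch of that estimate, computed as the product of the number of operations and the time per operation. The operation count is an invented value for illustration; the two speeds are the ones quoted above.

```python
# Hypothetical estimate: running time = (number of operations) * (time per operation).
operations = 10**12       # assumed operation count, invented for this example

supercomputer = 1e-10     # seconds per operation (from the text)
calculator = 1e-5         # seconds per operation (from the text)

print(f"Supercomputer: {operations * supercomputer:,.0f} seconds")  # 100 seconds
print(f"Calculator:    {operations * calculator:,.0f} seconds")     # 10,000,000 seconds
```

The same algorithm looks fast on one machine and hopelessly slow on another, which is exactly why a machine-independent measure is preferable.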

Rather than computing an algorithm's running time on every computer, we rely on the total number of operations that the algorithm performs to describe its running time, since this count is an attribute of the algorithm and not an attribute of the computer we happen to be using.

Unfortunately, determining how many operations an algorithm will perform is not always easy. If we do know how to compute the number of basic operations an algorithm performs, then we have a basis for comparing it against a different algorithm that solves the same problem. Rather than tediously counting every multiplication and addition, we can make this comparison by understanding, at a high level, how each algorithm's operation count grows as the size of the input increases.
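As one hypothetical way to see such growth in practice (not a technique from the text), we can instrument a simple quadratic algorithm with a counter and watch how the count changes as the input doubles:

```python
def count_pairwise_comparisons(values):
    """Count the equality tests made when examining every pair of
    elements, a simple quadratic algorithm used only for illustration."""
    comparisons = 0
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            comparisons += 1              # one basic operation per pair
            _ = values[i] == values[j]    # the comparison being counted
    return comparisons

# Doubling the input size roughly quadruples the count, the hallmark of
# quadratic growth:
for n in (10, 20, 40):
    print(n, count_pairwise_comparisons(list(range(n))))  # 45, 190, 780
```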

Suppose an algorithm $A$ performs $n^2$ operations on an input of size $n$, and an algorithm $B$ solves the same problem in $3n + 2$ operations. Which algorithm, $A$ or $B$, is faster? Although $A$ may be faster than $B$ for some small $n$ (for example, for $n$ between 1 and 3), $B$ will become faster for large $n$ (for example, for all $n \geq 4$). See the figure below. Since $f(n) = n^2$ is, in some sense, a faster-growing function than $g(n) = n$ with respect to $n$, the constants 3 and 2 in $3n + 2$ do not affect the competition between the two algorithms for large $n$. We refer to $A$ as a quadratic algorithm and to $B$ as a linear algorithm and say that $A$ is less efficient than $B$ because it performs more operations to solve the same problem when $n$ is large. Therefore, we will often be somewhat imprecise when we count the operations of an algorithm: the behavior of algorithms on small inputs does not matter.
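As a quick check in code, tabulating the two operation counts from this example shows exactly where the crossover happens:

```python
# Compare the operation counts of algorithm A (n^2) and algorithm B (3n + 2).
for n in range(1, 8):
    a, b = n**2, 3 * n + 2
    faster = "A" if a < b else "B"
    print(f"n={n}: A={a:2d}, B={b:2d} -> {faster} performs fewer operations")
# A wins for n = 1, 2, 3; B wins for every n >= 4.
```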
