The Big-O Notation
Have a look at the runtime of different algorithms.
To figure out how long a program would take to run on a real computer, we would need to know things like the speed of the computer, the system architecture, the compiler being used, details of the memory hierarchy, and so on. Therefore, carefully estimating the running time is a rather difficult task. Moreover, in practice, we might not even know some of these details. That is why computer scientists use the big-O notation to estimate the running time of an algorithm without knowing anything about all these details!
Big-O notation: Comparing running times
Consider two algorithms and denote by $T_1(n)$ and $T_2(n)$ their running times on an input of size $n$. We say that $T_1$ grows no faster than $T_2$ if there exists a constant $c$ such that for every positive integer $n$, $T_1(n) \le c \cdot T_2(n)$ (equivalently, $T_1(n)/T_2(n) \le c$). In this case, we write $T_1 = O(T_2)$ or $T_1 \preceq T_2$. The notation $O$ is a standard one, whereas some learners find the notation $\preceq$ to be more intuitive.
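As a small illustration of this definition (not part of the lesson's original code), the sketch below checks numerically whether one running time stays within a constant factor of another over a range of input sizes. The helper name grows_no_faster, the sample functions, and the tested range are our own choices, and such a finite check is only a sanity check, not a proof.

import numpy as np

def grows_no_faster(t1, t2, c, n_max=10 ** 5):
    # Check T1(n) <= c * T2(n) for n = 1, ..., n_max (a finite sanity check, not a proof).
    n = np.arange(1, n_max + 1)
    return bool(np.all(t1(n) <= c * t2(n)))

# For example, 3n + 7 <= 10n for every positive integer n,
# so the check below reports True (with the constant c = 10).
print(grows_no_faster(lambda n: 3 * n + 7, lambda n: n, c=10))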
To give an example, let $f(n) = 4n\log_2(n)$ and $g(n) = n^2$. To visualize the order of growth of the two considered functions, let's plot them for $1 \le n \le 30$. Here is a simple Python script for generating this plot that you can use as a template for plotting any other functions.
Run the code below to generate a graph.
import matplotlib.pyplot as plt
import numpy as np

n = np.linspace(1, 30)
plt.plot(n, 4 * n * np.log2(n), label=r'$4n\log_2(n)$')
plt.plot(n, n ** 2, label='$n^2$')
plt.legend()
plt.savefig('output/to.png')
As the picture reveals, $n^2 \le 4n\log_2(n)$ for $2 \le n \le 16$, but then the two functions switch. Indeed, $\log_2(n) \le n$ for all $n \ge 1$. In particular,
$4n\log_2(n) \le 4n \cdot n = 4n^2$
for all $n \ge 1$. We conclude that $4n\log_2(n) = O(n^2)$ (equivalently, $4n\log_2(n) \preceq n^2$).
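To double-check the switch point and the bound above numerically, here is a tiny sketch (the sampled values of $n$ are our own choice):

import numpy as np

for n in [2, 8, 16, 32, 100]:
    f, g = 4 * n * np.log2(n), n ** 2
    print(n, f <= g, f <= 4 * g)
# 4n*log2(n) <= n^2 fails for n = 2 and n = 8 but holds from n = 16 on,
# while 4n*log2(n) <= 4n^2 holds for every sampled n.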
Now, let $f(n) = 2n^2 + 5n + 3$ and $g(n) = n^2$. On one hand, $f(n)$ is larger than $g(n)$ for all positive $n$. On the other hand, since $5n \le 5n^2$ and $3 \le 3n^2$ for all $n \ge 1$,
$2n^2 + 5n + 3 \le 2n^2 + 5n^2 + 3n^2 = 10n^2$
for all positive integers $n$. That is, $f(n)$ is at most ten times larger than $g(n)$. We conclude that $f$ grows no faster than $g$ and write $2n^2 + 5n + 3 = O(n^2)$ or $2n^2 + 5n + 3 \preceq n^2$.
Run the code below to generate a graph.
import matplotlib.pyplot as plt
import numpy as np

n = np.linspace(1, 100)
plt.plot(n, 2 * n ** 2 + 5 * n + 3, label='$2n^2+5n+3$')
plt.plot(n, n ** 2, label='$n^2$')
plt.plot(n, 10 * n ** 2, label='$10n^2$')
plt.legend(loc='upper left')
plt.savefig('output/to.png')
We can also plot the fraction $\frac{2n^2+5n+3}{n^2}$. Run the code below to generate a graph.
import matplotlib.pyplot as plt
import numpy as np

n = np.linspace(1, 100)
plt.plot(n, (2 * n ** 2 + 5 * n + 3) / (n ** 2))
plt.savefig('output/to.png')
This plot shows that (at least, in the considered range $1 \le n \le 100$) the fraction is at most $10$ and, in fact, approaches $2$ as $n$ grows.
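The same behavior can be checked without a plot by evaluating the fraction at a few growing values of $n$ (the sample values below are our own choice):

for n in [10, 100, 1000, 10 ** 6]:
    print(n, (2 * n ** 2 + 5 * n + 3) / n ** 2)
# The printed values decrease towards 2: 2.53, 2.0503, 2.005003, ...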
Now, let's compare $11n$ and $2n^2+5n+3$. First, let's look at their plots. Run the code below to generate a graph.
import matplotlib.pyplot as plt
import numpy as np

n = np.linspace(1, 100)
plt.plot(n, 11 * n, label='$11n$')
plt.plot(n, 2 * n ** 2 + 5 * n + 3, label='$2n^2+5n+3$')
plt.legend()
plt.savefig('output/to.png')
We notice that both functions grow (as $n$ grows), but $11n$ grows slower. This can be made formal as follows.
For two functions $f$ and $g$, we say that $f$ grows slower than $g$ and write $f = o(g)$ or $f \prec g$, if the fraction $\frac{f(n)}{g(n)}$ goes to zero as $n$ grows.
Exercise break: Plot the fraction $\frac{11n}{2n^2+5n+3}$ to ensure that it goes to zero as $n$ grows.
Of course, if $f \prec g$ (equivalently, $f = o(g)$), then also $f \preceq g$ (equivalently, $f = O(g)$): if the fraction $\frac{f(n)}{g(n)}$ goes to zero, then in particular it is bounded from above by some constant $c$. In plain English: if $f$ grows slower than $g$, then certainly $f$ grows no faster than $g$.
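If you would like to verify such statements numerically (for instance, for the exercise break above), a minimal sketch along the following lines can help; the helper name ratio and the sample values of $n$ are our own choices.

def ratio(f, g, values):
    # Print f(n) / g(n) for a few growing values of n.
    for n in values:
        print(n, f(n) / g(n))

# The fraction 11n / (2n^2 + 5n + 3) shrinks towards zero as n grows,
# so 11n grows slower than 2n^2 + 5n + 3 (and hence also no faster than it).
ratio(lambda n: 11 * n, lambda n: 2 * n ** 2 + 5 * n + 3, [10, 100, 1000, 10 ** 5])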