Introduction to Asymptotic Analysis and Big O

Learn to classify the running time complexity of algorithms.

We have seen that the time complexity of an algorithm can be expressed as a polynomial. To compare two algorithms, we can compare their respective polynomials. However, the analysis performed in the previous lessons is somewhat cumbersome and would become intractable for the larger algorithms that we tend to encounter in practice.
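
As a quick illustration (with made-up running times rather than ones derived in the earlier lessons), suppose one algorithm takes T_1(n) = 2n^2 + 3n + 1 steps and another takes T_2(n) = 100n + 5 steps on an input of size n. For n = 10, T_1(10) = 231 beats T_2(10) = 1,005, but for n = 1,000, T_1(1,000) = 2,003,001 is far worse than T_2(1,000) = 100,005. Which polynomial is "smaller" depends on n, and checking this term by term for every pair of algorithms quickly gets tedious.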

Asymptotic analysis

One observation that helps us is that we only need to worry about large input sizes. If the input size is really small, how bad can a poorly designed algorithm get, right? Mathematicians have a tool for this sort of analysis called asymptotic notation. Asymptotic notation compares two functions, say, f(n) and g(n), for very large values of n. This fits in nicely with our need to compare algorithms for very large input sizes.
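
To see this numerically, here is a minimal sketch in Python (the two running-time polynomials T_1 and T_2 are the made-up ones from above, not functions analyzed in this course) that tabulates both for growing values of n:

```python
# Minimal sketch: tabulate two hypothetical running-time polynomials
# for growing input sizes n. t1 is quadratic and t2 is linear, so t1
# eventually exceeds t2 even though it is smaller for small n.

def t1(n):
    return 2 * n ** 2 + 3 * n + 1   # hypothetical quadratic running time

def t2(n):
    return 100 * n + 5              # hypothetical linear running time

for n in (1, 10, 100, 1_000, 10_000):
    print(f"n={n:>6}  t1(n)={t1(n):>12,}  t2(n)={t2(n):>12,}  t1 <= t2? {t1(n) <= t2(n)}")
```

The quadratic T_1 overtakes T_2 somewhere between n = 10 and n = 100, and from that point on the gap only widens; this is why the behavior at small n can safely be ignored.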

Big O notation

One of the asymptotic notations is the Big O notation. A function f(n) is considered O(g(n)), read as "big oh" of g(n), if there exists some positive real constant c and an integer n_0 ≥ 0, such that the following inequality holds for all n ≥ n_0:

f(n) ≤ c·g(n)

The following graph shows a plot of a function f(n) and c·g(n) that demonstrates this inequality.
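
As a concrete check of the definition (a standard textbook-style example, not one carried over from the earlier lessons), take f(n) = 2n^2 + 3n + 1 and g(n) = n^2. For every n ≥ 1 we have 3n ≤ 3n^2 and 1 ≤ n^2, so

2n^2 + 3n + 1 ≤ 2n^2 + 3n^2 + n^2 = 6n^2

The inequality f(n) ≤ c·g(n) therefore holds with c = 6 and n_0 = 1, which is exactly what the definition requires to conclude that f(n) is O(n^2).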
