...

Introduction to Asymptotic Analysis and Big O

In this lesson, we will learn about asymptotic notation, an important tool for analyzing algorithms.

We have seen that the time complexity of an algorithm can be expressed as a polynomial. To compare two algorithms, we can compare their respective polynomials. However, the analysis performed in the previous lessons is somewhat cumbersome and becomes intractable for the larger algorithms we tend to encounter in practice.

Asymptotic Analysis

One observation that helps us is that we only need to worry about large input sizes. If the input size is really small, how bad can a poorly designed algorithm get, right? Mathematicians have a tool for this sort of analysis called asymptotic notation. Asymptotic notation compares two functions, say, f(n) and g(n), for very large values of n. This fits in nicely with our need to compare algorithms for very large input sizes.
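To see why only large inputs matter, compare f(n) = 100n and g(n) = n² (an illustrative pair, not taken from the lesson). For n = 10, f(10) = 1000 is much larger than g(10) = 100. But for n = 1000, f(1000) = 100,000 while g(1000) = 1,000,000. Beyond n = 100, the quadratic function dominates the linear one, no matter how large the constant on the linear term.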

Big O Notation

One of the asymptotic notations is the Big O notation. A function f(n) is considered O(g(n)), read as "big oh of g(n)", if there exists some positive real constant c and an integer n_0 > 0, such that the following inequality holds for all n ≥ n_0:

f(n) ≤ c·g(n)
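As a quick illustration (this example is not from the original lesson), take f(n) = 3n + 4 and g(n) = n. Then f(n) is O(n): choosing c = 4 and n_0 = 4, we have

3n + 4 ≤ 4n for all n ≥ 4,

since 4n − (3n + 4) = n − 4 ≥ 0 whenever n ≥ 4. (For n = 3, the inequality fails: 13 > 12.)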

The following graph shows a plot of a function f(n) and c·g(n) that demonstrates this inequality.

Note that the above inequality does not have to hold for all n. For n < n_0, f(n) is allowed to exceed c·g(n), but for all values of n beyond n_0, f(n) is not allowed to exceed c·g(n).
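A tiny numerical check can make this concrete. The sketch below is purely illustrative (Python is an assumption; the lesson does not specify a language), and it simply tests the inequality for the example above over a range of inputs:

```python
# Illustrative sketch (not from the lesson): check that f(n) <= c * g(n) for all n >= n0.

def f(n):
    return 3 * n + 4   # example function: f(n) = 3n + 4

def g(n):
    return n           # candidate bound: g(n) = n

c, n0 = 4, 4           # constants chosen in the example above

for n in range(1, 1001):
    if n >= n0:
        # Beyond n0, the inequality must hold for f(n) to be O(g(n)).
        assert f(n) <= c * g(n), f"bound violated at n = {n}"

print("f(n) <= 4 * g(n) for every tested n >= 4, consistent with f(n) being O(n).")
```

Note that the check is skipped for n < n0, matching the definition: the bound is only required beyond n_0.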

What good is this? It tells us that for very large values of n, f(n) will be at most within a constant factor of g(n) ...