Big O and Big Omega Notations
Discusses the big O and big Ω notations with examples
In the previous lesson, we defined the Θ notation. In this lesson, we'll discuss the big O and big Ω notations. The reader may notice the use of the adjective "big" with these notations and rightly suspect that there exist complementary small o and small ω notations, which we'll discuss in the next lesson.
Big O
Big O is expressed as O(g(n)) and is pronounced "big oh of g of n". We observed that Θ provides both an asymptotically tight upper bound and an asymptotically tight lower bound. Big O provides only an asymptotic upper bound, which may not necessarily be tight.
Big O is the most commonly discussed notation when talking about algorithms in industry settings or in interviews. You'll rarely see Θ or the other notations discussed, and for good reason: if we are guaranteed that an algorithm will perform no worse than a certain threshold, and that threshold is acceptable for the problem at hand, then knowing its best-case performance or finding a very tight upper bound may not be productive.
Formal Definition
Similar to Θ, we define O(g(n)) as a set of functions. A function f(n) is a member of this set if it satisfies the following condition:

0 ≤ f(n) ≤ c·g(n)

where c is a positive constant and the inequality holds for all n greater than or equal to a positive threshold n0.
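As a quick sanity check of this definition, the sketch below numerically verifies the inequality for a hypothetical f(n) = 3n² + 10n with g(n) = n², using c = 4 and n0 = 10 as one valid (not unique) choice of constants. It only samples values of n rather than proving anything, but it shows how c and n0 work together.

# Numeric sanity check of the big O definition (illustration, not a proof).
# f(n) = 3n^2 + 10n and g(n) = n^2 are hypothetical example functions;
# c = 4 and n0 = 10 are one choice of constants that satisfies the inequality,
# since 3n^2 + 10n <= 4n^2 whenever n >= 10.

def f(n):
    return 3 * n ** 2 + 10 * n

def g(n):
    return n ** 2

c, n0 = 4, 10

for n in range(n0, n0 + 1000):
    assert 0 <= f(n) <= c * g(n), f"inequality fails at n = {n}"

print("0 <= f(n) <= c*g(n) holds for every sampled n >= n0")

Any larger c (or larger n0) would work just as well; the definition only asks for one such pair of constants to exist.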
Explanation
If f(n) = Θ(g(n)), then it also follows that f(n) = O(g(n)), since an asymptotically tight bound is, in particular, an upper bound.
The notation means that the function f(n) is bounded from above to within a constant factor of g(n).
Note that the inequality requires f(n) to be nonnegative (greater than or equal to 0) once n crosses n0.
The set O(g(n)) may contain other functions that also satisfy the inequality, and there can be several values of c for which the inequality holds, as the sketch below illustrates.
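Here, f(n) = 2n + 3 and g(n) = n are hypothetical example functions; both (c = 3, n0 = 3) and (c = 5, n0 = 1) satisfy the inequality, so either pair of constants witnesses f(n) = O(g(n)).

# Two different constant choices witnessing f(n) = O(g(n)) for the same pair
# of (hypothetical) functions f(n) = 2n + 3 and g(n) = n.

def f(n):
    return 2 * n + 3

def g(n):
    return n

for c, n0 in [(3, 3), (5, 1)]:
    holds = all(0 <= f(n) <= c * g(n) for n in range(n0, n0 + 1000))
    print(f"c = {c}, n0 = {n0}: inequality holds for all sampled n -> {holds}")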
Insertion Sort
Previously, we came up with the following f(n) for the insertion sort algorithm:
...
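For readers who want to experiment, below is a minimal Python sketch of the standard insertion sort. The shift counter is only an illustrative proxy for f(n), not the exact expression derived earlier; on a reverse-sorted input of size n it performs n(n-1)/2 shifts, consistent with a quadratic, i.e., O(n²), upper bound.

# Standard insertion sort with a counter for element shifts (an illustrative
# proxy for the running-time function f(n), not the exact expression above).
def insertion_sort(arr):
    shifts = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Move elements larger than key one position to the right.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            shifts += 1
            j -= 1
        arr[j + 1] = key
    return shifts

# Worst case: a reverse-sorted input of size n triggers n*(n-1)/2 shifts,
# which grows on the order of n^2, i.e., the running time is O(n^2).
print(insertion_sort(list(range(10, 0, -1))))  # prints 45 for n = 10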