Functions in Asymptotic Notation
When we use asymptotic notation to express the rate of growth of an algorithm's running time in terms of the input size n, it's good to bear a few things in mind.
Let's start with something easy. Suppose that an algorithm takes a constant amount of time, regardless of the input size. For example, if you were given an array that is already sorted into increasing order and you had to find the minimum element, it would take constant time, since the minimum element must be at index 0. Because the running time is within some constant factor of 1, we say that such an algorithm runs in Θ(1) time.
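To make this concrete, here is a minimal Python sketch (the function name find_minimum is my own, not from the course) of a constant-time minimum lookup on an array assumed to be sorted in increasing order:

```python
def find_minimum(sorted_array):
    """Return the minimum of an array sorted in increasing order.

    Performs a single index access, so it runs in constant, Theta(1),
    time regardless of len(sorted_array).
    """
    if not sorted_array:
        raise ValueError("array must be non-empty")
    # In an increasing-order array, the smallest element is always first.
    return sorted_array[0]

# The call does the same single step whether the array has 4 elements
# or a million.
print(find_minimum([3, 7, 9, 12]))        # 3
print(find_minimum(list(range(10**6))))   # 0
```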
Now suppose an algorithm took Θ(log₁₀ n) time. You could also say that it took Θ(log₂ n) time: whenever the base of the logarithm is a constant, it doesn't matter which base we use in asymptotic notation, because changing the base multiplies the running time by only a constant factor.
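The reason is the change-of-base identity for logarithms, which a short derivation makes clear:

```latex
\log_{10} n \;=\; \frac{\log_2 n}{\log_2 10}
```

Since 1/log₂ 10 is a constant (about 0.301), log₁₀ n and log₂ n differ only by a constant factor, and asymptotic notation ignores constant factors, so Θ(log₁₀ n) and Θ(log₂ n) describe the same rate of growth.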