Analyzing Algorithms

Learn about the different techniques used to analyze algorithms, such as asymptotic analysis and worst-case analysis.

It’s not enough just to write down an algorithm and say, “Behold!” We must also convince our audience (and ourselves!) that the algorithm actually does what it’s supposed to do and that it does so efficiently.

Correctness

In some application settings, it is acceptable for programs to behave correctly most of the time, on all “reasonable” inputs. Not in this course: we require algorithms that are always correct, for all possible inputs. Moreover, we must prove that our algorithms are correct; trusting our instincts, or trying a few test cases, isn’t good enough. Sometimes correctness is truly obvious. On the other hand, “obvious” is all too often a synonym for “wrong.” Most of the algorithms we discuss in this course require real work to be proven correct. In particular, correctness proofs usually rely on induction, which is an incredibly useful proof technique.
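
To make the shape of such an argument concrete, here is a small hypothetical sketch (not from the course text): a recursive summation function in Python, with the induction argument outlined in comments.

    def total(values):
        # Claim: total(values) returns the sum of the numbers in values.
        # Proof sketch, by induction on len(values):
        #   Base case: an empty list returns 0, the empty sum.
        #   Inductive step: assume the claim holds for every shorter list. Then
        #   total(values[1:]) equals sum(values[1:]) by the induction hypothesis,
        #   so values[0] + total(values[1:]) is the sum of the whole list.
        if not values:
            return 0
        return values[0] + total(values[1:])

    print(total([3, 1, 4, 1, 5]))  # 14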

Of course, before we can formally prove that our algorithm does what it’s supposed to do, we have to formally describe what it’s supposed to do.

Running time

The most common way of ranking different algorithms for the same problem is by how quickly they run. Ideally, we want the fastest possible algorithm for any particular problem. In many application settings, it is acceptable for programs to run efficiently most of the time on all “reasonable” inputs. Not in this course—we require algorithms that always run efficiently, even in the worst case.

But how do we measure running time? As a specific example, how long does it take to sing the song BottlesOfBeer(n)? This is obviously a function of the input value n, but it also depends on how quickly someone can sing. Some singers might take ten seconds to sing a verse; others might take twenty. Technology widens the possibilities even further. Dictating the song over a telegraph using Morse code might take a full minute per verse. Downloading an mp3 over the Web might take a tenth of a second per verse. Duplicating the mp3 in a computer’s main memory might take only a few microseconds per verse.

What’s important here is how the singing time changes as n grows. Singing BottlesOfBeer(2n) requires about twice as much time as singing BottlesOfBeer(n), no matter what technology is being used. This is reflected in the asymptotic singing time Θ(n).
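
To illustrate that this scaling factor is independent of the technology, here is a hypothetical Python sketch (not from the course text) that models the song as one verse per count plus a closing verse, at some fixed cost per verse; whatever that cost is, doubling n roughly doubles the total singing time.

    def singing_time(n, seconds_per_verse):
        # Assumes one verse for each count from n down to 1, plus a closing verse.
        verses = n + 1
        return verses * seconds_per_verse

    for technology, pace in [("telegraph", 60.0), ("mp3 download", 0.1)]:
        ratio = singing_time(200, pace) / singing_time(100, pace)
        print(technology, round(ratio, 2))  # roughly 2, regardless of technology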

We can measure time by counting how many times the algorithm executes a certain instruction or reaches a certain milestone in the “code.” For example, we might notice that the word “beer” is sung three times in every verse of “Bottles of Beer,” so the number of times you sing “beer” is a good indication of the total singing time. For this question, we can give an exact answer: BottlesOfBeer(n) mentions beer exactly 3n + 3 times.
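
The pseudocode for BottlesOfBeer is not reproduced in this excerpt, so the lyric below is an assumption based on the standard song (n counting verses plus a closing couplet); under that assumption, a quick Python check agrees with the 3n + 3 count.

    def bottles_of_beer_lyrics(n):
        # Assumed lyric: each of the n main verses mentions "beer" three times,
        # and the closing couplet mentions it three more times.
        lines = []
        for i in range(n, 0, -1):
            lines.append(f"{i} bottles of beer on the wall, {i} bottles of beer,")
            lines.append(f"Take one down, pass it around, {i - 1} bottles of beer on the wall.")
        lines.append("No bottles of beer on the wall, no bottles of beer,")
        lines.append(f"Go to the store, buy some more, {n} bottles of beer on the wall.")
        return "\n".join(lines)

    n = 10
    print(bottles_of_beer_lyrics(n).count("beer"), 3 * n + 3)  # both print 33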

Incidentally, there are lots of songs with quadratic singing time. This one is probably familiar to most English speakers:

Algorithm


    NDaysOfChristmas(gifts[2 .. n]):
      for i ← 1 to n
        Sing “On the ith day of Christmas, my true love gave to me”
        for j ← i down to 2
          Sing “j gifts[j],”
        if i > 1
          Sing “and”
        Sing “a partridge in a pear tree.”
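
To see where the quadratic bound comes from, here is a hypothetical Python transcription of the pseudocode above that counts Sing operations instead of singing them: verse i triggers roughly i + 2 of them, so the total grows like n²/2, which is Θ(n²).

    def n_days_of_christmas_sings(n):
        # Count the Sing operations in the pseudocode instead of singing them.
        sings = 0
        for i in range(1, n + 1):
            sings += 1                      # "On the ith day of Christmas..."
            for j in range(i, 1, -1):
                sings += 1                  # "j gifts[j],"
            if i > 1:
                sings += 1                  # "and"
            sings += 1                      # "a partridge in a pear tree."
        return sings

    for n in (100, 200, 400):
        print(n, n_days_of_christmas_sings(n))  # roughly quadruples as n doubles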
