We use big-Θ notation to asymptotically bound the growth of a running time to within constant factors above and below. Sometimes, though, we want to bound only from above. For example, although the worst-case running time of binary search is Θ(lg n), it would be incorrect to say that binary search runs in Θ(lg n) time in all cases. What if we find the target value upon the first guess? Then it runs in Θ(1) time. The running time of binary search is never worse than Θ(lg n), but it's sometimes better. It would be convenient to have a form of asymptotic notation that means "the running time grows at most this much, but it could grow more slowly." We use "big-O" notation for just such occasions.
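To make the contrast concrete, here is a minimal Python sketch (the array, the function name, and the guess counter are ours, not part of the text) that counts how many guesses binary search makes, so you can see the Θ(1) best case next to the roughly lg n worst case:

```python
def binary_search(array, target):
    """Return (index, guesses); index is -1 if target is absent."""
    low, high = 0, len(array) - 1
    guesses = 0
    while low <= high:
        guesses += 1
        mid = (low + high) // 2
        if array[mid] == target:
            return mid, guesses          # could succeed on the very first guess
        elif array[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1, guesses                   # made about lg n guesses before giving up

numbers = list(range(1024))              # n = 1024, so lg n = 10
print(binary_search(numbers, 511))       # middle element: found in 1 guess
print(binary_search(numbers, 0))         # near the worst case: 10 guesses
```

Running this shows that the same algorithm on the same input size can take 1 guess or 10 guesses, which is exactly why a bound from above only is useful here.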
If a running time is O(f(n)), then for large enough n, the running time is at most k⋅f(n) for some constant k. Here's how to think of a running time that is O(f(n)): once n is at or above some threshold, the curve k⋅f(n) lies on or above the running time.
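As a numeric illustration of the definition (the running time T(n) = 3 lg n + 5 below is a made-up example, not one from the text), a short Python check can exhibit one valid choice of constant k and threshold n₀:

```python
import math

# Hypothetical running time T(n) = 3*lg(n) + 5 (our example, not from the text).
def T(n):
    return 3 * math.log2(n) + 5

# Saying T(n) is O(lg n) just means: there exist a constant k and a threshold n0
# such that T(n) <= k * lg(n) for every n >= n0.
k, n0 = 4, 32                            # one valid choice: 3*lg n + 5 <= 4*lg n once lg n >= 5
assert all(T(n) <= k * math.log2(n) for n in range(n0, 100_000))
print(f"T(n) <= {k}*lg(n) holds for every n checked from n = {n0} up")
```

The particular pair (k = 4, n₀ = 32) is not special; any larger k or threshold would also witness the bound, which is why big-O statements only claim "at most this fast-growing, up to a constant factor."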