34G. Summary

An analysis of an algorithm tells you how much time (or sometimes how much memory) the algorithm uses in terms of the size n of its input. Usually we are not concerned with constant factors, so we express time using the notation O(f(n)) or Θ(f(n)). Roughly, Θ(f(n)) means some constant times f(n), and O(f(n)) means no more than some constant times f(n). (A constant must really be a constant; it cannot depend on n.)
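For reference, here is a sketch of the usual formal definitions behind those rough descriptions; the constants c, c1, c2 and the threshold n0 are the standard names for such bounds and are not taken from this section.

    % f(n) = O(g(n)): f is eventually bounded above by a constant multiple of g.
    f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 : \forall n \ge n_0,\ f(n) \le c\, g(n)

    % f(n) = Theta(g(n)): f is eventually squeezed between two constant multiples of g.
    f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 : \forall n \ge n_0,\ c_1\, g(n) \le f(n) \le c_2\, g(n)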

Finding the length of a null-terminated string or a linked list of length n takes time Θ(n) because you must visit each member once, and it takes a constant amount of time to visit one member. On the other hand, doing a binary search of a sorted array of size n only takes time Θ(log(n)).
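Both facts are easy to see in code. Here is a minimal sketch in C; the node type for the linked list is assumed for illustration, since the section does not define one.

    #include <stddef.h>

    /* A minimal singly linked list node, assumed here for illustration. */
    struct node {
        int data;
        struct node *next;
    };

    /* Theta(n): visits each member exactly once, doing constant work per member. */
    size_t listLength(const struct node *p)
    {
        size_t n = 0;
        while (p != NULL) {
            n++;
            p = p->next;
        }
        return n;
    }

    /* Theta(log(n)): each comparison halves the portion of the sorted array
       that could still hold x.  Returns an index where x occurs, or -1. */
    long binarySearch(const int a[], long n, int x)
    {
        long lo = 0, hi = n - 1;
        while (lo <= hi) {
            long mid = lo + (hi - lo) / 2;   /* written this way to avoid overflow */
            if (a[mid] == x)
                return mid;
            else if (a[mid] < x)
                lo = mid + 1;
            else
                hi = mid - 1;
        }
        return -1;
    }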

It is important to get a rough understanding of how rapidly particular functions grow as n grows.

Functions n, n², n³, etc. grow faster as the exponent increases. Function log(n) grows very slowly as n grows. Here is a table to illustrate. The values of log(n) are approximate.

     n           n²                 n³                      n⁴  log(n)
    10          100               1000                  10,000       3
   100       10,000          1,000,000             100,000,000       7
  1000    1,000,000      1,000,000,000       1,000,000,000,000      10
10,000  100,000,000  1,000,000,000,000  10,000,000,000,000,000      13
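As a quick check, a short C program can reproduce these rows; this sketch uses log2 from math.h (the base-2 logarithm), which matches the approximate log(n) values shown once rounded to the nearest integer.

    #include <stdio.h>
    #include <math.h>

    /* Prints n, n^2, n^3, n^4, and log2(n) (rounded) for the sizes in the
       table above.  Link with -lm on most systems. */
    int main(void)
    {
        double sizes[] = { 10.0, 100.0, 1000.0, 10000.0 };
        for (int i = 0; i < 4; i++) {
            double n = sizes[i];
            printf("%8.0f %12.0f %18.0f %24.0f %4.0f\n",
                   n, n * n, n * n * n, n * n * n * n, round(log2(n)));
        }
        return 0;
    }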

Some problems require looking at every item in the input at least once. For example, to sort a list, you must examine every member; if you skipped one, it could end up out of order. Such problems take time at least proportional to n, just to look at each item once.

Suppose someone claims to have figured out how to sort a list of length n in time Θ(log(n)). That is clearly impossible: not only is log(n) < n, but for every constant c > 0 and all sufficiently large n, log(n) < cn. A Θ(log(n)) algorithm would not even have time to look at each member of the list once.
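One standard way to see that last inequality, sketched here in LaTeX, uses L'Hôpital's rule (the derivative of the base-2 logarithm of n is 1/(n ln 2)):

    \lim_{n \to \infty} \frac{\log(n)}{cn}
      = \frac{1}{c} \lim_{n \to \infty} \frac{\log(n)}{n}
      = \frac{1}{c} \lim_{n \to \infty} \frac{1/(n \ln 2)}{1}   % L'Hopital's rule
      = 0

Since the ratio tends to 0, it eventually drops below 1, so log(n) < cn for all sufficiently large n.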