Algorithm Analysis

The performance of a program usually depends on the algorithm used, the input it is given, and the hardware and software environment it runs in.

An algorithm is a step-by-step procedure for solving a problem in a finite amount of time. Time efficiency indicates how fast an algorithm runs; space efficiency deals with the extra space the algorithm requires. Other analyses include correctness, robustness, and maintainability.

The running time of an algorithm typically grows with the input size. Average-case time efficiency is often difficult to determine, so we usually focus on worst-case analysis: it is easier to perform, and the worst-case guarantee is crucial to many applications.

We can use experiments to measure an algorithm's running time as a function of the input size. But there are limitations with this method:

  1. The algorithm must first be implemented.
  2. Only a limited set of inputs can be tried, and they may not be representative.
  3. Times measured in different hardware and software environments cannot be compared directly.
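Such a timing experiment can be sketched as follows (a minimal Python sketch; the helper name `running_time` is my own, and the built-in `sorted` stands in for the algorithm under test):

```python
import time

def running_time(func, n):
    """Measure the wall-clock time of func on an input of size n."""
    data = list(range(n, 0, -1))          # a simple reverse-ordered input
    start = time.perf_counter()
    func(data)
    return time.perf_counter() - start

# Example: timing Python's built-in sort at several input sizes.
for n in [1_000, 10_000, 100_000]:
    print(n, running_time(sorted, n))
```

Note that the numbers printed depend entirely on the machine and interpreter used, which is exactly the limitation the theoretical approach below avoids.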

The preferred way is to analyze the running time of an algorithm theoretically, characterizing the running time as a function of the input size N. This method allows us to evaluate the time efficiency of an algorithm independently of the hardware and software environment.

General steps of the asymptotic analysis of an algorithm:

  1. Decide on a parameter N indicating the input size.
  2. Count the primitive operations the algorithm executes as a function of N.
  3. Characterize the order of growth of that function.

It is obvious that almost all algorithms take longer to run on larger inputs. We usually use a parameter, N, to indicate the input size of an algorithm. Sometimes several parameters together define the input size.

Primitive operations include:

  1. assigning a value to a variable;
  2. performing an arithmetic operation;
  3. comparing two values;
  4. indexing into an array;
  5. calling a method and returning from a method.

Treating all primitive operations as taking the same constant time and ignoring differences in the hardware and software environment may change the estimated running time of an algorithm by a constant factor, but it does not alter the growth rate of the running-time function.
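As an illustrative sketch (the function name and the exact operations counted are my own choices), here is how one might count primitive operations while finding the maximum of a list:

```python
def max_with_count(values):
    """Find the maximum while counting primitive operations
    (comparisons and assignments) executed."""
    ops = 0
    current = values[0]          # 1 assignment
    ops += 1
    for v in values[1:]:
        ops += 1                 # one comparison per remaining element
        if v > current:
            current = v
            ops += 1             # assignment when a new maximum is found
    return current, ops
```

For an input of size n the count falls between n and 2n, so whatever constant factor we assign to each operation, the growth rate of the total is linear.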

For large input size N, it is the function's order of growth that matters:
N      logN   N      NlogN   N^2     N^3     2^N    N!
10     3      10     33      100     1K      1K     3.6M
32     5      32     160     1K      32K     4B     large
64     6      64     384     4K      256K    16BB   large
10^3   10     10^3   10^4    10^6    10^9    --     --
10^6   20     10^6   10^7    10^12   10^18   --     --

As a comparison, 16 billion billion seconds is roughly 500 billion years. The Earth's age is about 4.5 billion years, and the universe's age is about 14 billion years.
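The conversion behind that comparison is a one-liner (assuming a 365.25-day year):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # about 3.16e7 seconds

seconds = 16e18                             # 16 billion billion seconds (~2^64)
years = seconds / SECONDS_PER_YEAR
print(f"{years:.3g} years")                 # on the order of 5e11 (500 billion) years
```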

Asymptotic Notations

The Ο notation denotes a class of functions: the growth rates of all functions in the same class are the same. The Ο families provide a convenient way to analyze algorithms because they let us focus on the big picture rather than the details.

Ο Rules

Rules for the Ο analysis of the running time of an algorithm:

  1. Drop lower-order terms and constant factors.
  2. Consecutive statements: add the running times.
  3. Loops: multiply the running time of the body by the number of iterations; nested loops multiply together.
  4. Conditionals: count the test plus the more expensive branch.
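The loop rule can be illustrated with a short sketch (the function `pair_sums` is a made-up example, not from the notes):

```python
def pair_sums(values):
    """Two nested loops over n elements: the O(1) body runs n * n times,
    so by the product rule the running time is O(n^2)."""
    total = 0
    for x in values:        # n iterations
        for y in values:    # n iterations each
            total += x + y  # O(1) body
    return total
```

Dropping constant factors, the n * n executions of the constant-time body dominate, giving the quadratic class from the table above.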

Ο analysis for recursive algorithms: express the running time as a recurrence relation, then solve the recurrence, for example by repeated substitution or by the Master Theorem.
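As a hedged sketch of this process, consider recursive binary search (my own example). Each call does constant work and recurses on half the range, giving the recurrence T(n) = T(n/2) + O(1), which solves to O(log n):

```python
def binary_search(sorted_values, target, lo=0, hi=None):
    """Recursive binary search over a sorted list.
    Each call does O(1) work and recurses on half the range:
    T(n) = T(n/2) + O(1), which solves to O(log n)."""
    if hi is None:
        hi = len(sorted_values) - 1
    if lo > hi:
        return -1                      # base case: empty range
    mid = (lo + hi) // 2
    if sorted_values[mid] == target:
        return mid
    if sorted_values[mid] < target:
        return binary_search(sorted_values, target, mid + 1, hi)
    return binary_search(sorted_values, target, lo, mid - 1)
```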

The Master Theorem:

T(n) = c,              if n < d
T(n) = aT(n/b) + f(n), if n >= d

  1. If there is a small constant ε > 0 such that f(n) is Ο(n^(log_b(a) - ε)), then T(n) is Θ(n^(log_b(a))).
  2. If there is a constant k >= 0 such that f(n) is Θ(n^(log_b(a)) log^k(n)), then T(n) is Θ(n^(log_b(a)) log^(k+1)(n)).
  3. If there are small constants ε > 0 and δ < 1 such that f(n) is Ω(n^(log_b(a) + ε)) and af(n/b) <= δf(n) for n >= d, then T(n) is Θ(f(n)).
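As a worked sketch of case 2 (my own example, not from the notes): merge sort satisfies T(n) = 2T(n/2) + Θ(n), so a = 2, b = 2, n^(log_b(a)) = n, and f(n) = Θ(n log^0(n)). Case 2 with k = 0 gives T(n) = Θ(n log n).

```python
def merge_sort(values):
    """Merge sort: T(n) = 2*T(n/2) + Theta(n).
    With a = 2, b = 2 we get n^(log_b(a)) = n and f(n) = Theta(n * log^0 n),
    so case 2 of the Master Theorem (k = 0) gives T(n) = Theta(n log n)."""
    if len(values) <= 1:
        return values                  # base case: n < d
    mid = len(values) // 2
    left = merge_sort(values[:mid])    # a = 2 subproblems of size n/b = n/2
    right = merge_sort(values[mid:])
    merged, i, j = [], 0, 0            # f(n) = Theta(n): merge the two halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```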