When examining computational problems, we frequently wish to compare different solutions to each other in terms of efficiency, as well as to consider theoretical limits on the efficiency attainable.

To do so, we require a few basic tools and techniques, as outlined below.

Time and space efficiency of algorithms


Why time trials are inadequate


Why worry about algorithm efficiency?


The basics of asymptotic notation

The rationale for using asymptotic notation is twofold: first, as the value of N gets large, the contribution of constant factors becomes relatively unimportant when comparing the behaviour of two or more algorithms; second, some functions show very erratic behaviour over particular ranges of data, so an overall view gives a clearer picture than we would get from using a specific set of values for N.

With respect to the first point, some functions grow much more quickly than others once the value of N becomes large.

For instance, N^2 grows much more quickly than lg(N).

Given sufficiently large values of N, even the addition of extra terms has relatively little impact: e.g. N^2 - N would still grow much more quickly than 100*lg(N).
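
As a quick illustration (a hedged sketch, not part of the original notes), the short C program below prints N^2 - N alongside 100*lg(N) for a few values of N: the logarithmic expression starts out larger, but by N = 100 the quadratic one has overtaken it, and the gap widens rapidly from there.

#include <stdio.h>
#include <math.h>

int main(void)
{
   long N;
   for (N = 10; N <= 1000000; N *= 10) {
      double quad = (double)N * N - N;        /* N^2 - N   */
      double logt = 100.0 * log2((double)N);  /* 100*lg(N) */
      printf("N = %8ld   N^2 - N = %14.0f   100*lg(N) = %8.2f\n",
             N, quad, logt);
   }
   return 0;
}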

For this reason, we will discuss the order of functions: when comparing functions, we look only at the largest relevant non-constant term.

Thus we will group functions into orders: large sets of functions that have approximately the same rate of growth.

To be precise, we will say a function f(N) is in the order O(g(N)) if there exist a constant C and a threshold N0 such that f(N) ≤ Cg(N) for all N ≥ N0, i.e. for all sufficiently large values of N. For example, 3N^2 + 10000N ≤ 4N^2 whenever N ≥ 10000, so 3N^2 + 10000N is in O(N^2).

Here are a few examples of functions in different orders:
O(1):
   0.00001
   1
   1000000
O(lg(N)):
   lg(N) / 17
   100000 lg(N)
   100 lg(N) + 99999
   1
O(N):
   N + 1/N
   10000 N + 99999 lg(N)
   N/100000
   99999999
O(N^2):
   3N^2 + 10000 N + lg(N) + 900
   30000000 N^2 + 10000 N^1.9
   1000 lg(N)
   N
O(2^N):
   2^N + N^2 + √N + N + lg(N) + 1
   2^N + 1.5^N + N^100
   0.00001
Observe that the "smaller" orders are completely contained within the larger ones.

When we express the running times of two algorithms as functions f(N) and g(N), we will usually want to show that one algorithm is better (or worse, or no better, etc.) than the other.

We use asymptotic complexity to provide the broad categories that group functions of "equivalent efficiency".

Examples:


What are the elementary operations?

To actually compute the efficiency of a specific algorithm, we need to define what we use as indicators of running time: what are the fundamental operations performed, and how many of them are performed on a problem of size N?
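
One common choice is to treat the key comparison as the elementary operation and simply count how many are performed. The sketch below is illustrative only (the counter and routine name are not from these notes): it instruments a straightforward scan of an array of size N with such a counter, so running it reports exactly N comparisons.

#include <stdio.h>

long comparisons = 0;   /* running count of key comparisons */

/* count how many entries of arr[0..n-1] equal key,
   tallying one comparison per element examined */
int count_occurrences(int arr[], int n, int key)
{
   int i, count;
   count = 0;
   for (i = 0; i < n; i++) {
      comparisons++;               /* arr[i] compared against key */
      if (arr[i] == key) count++;
   }
   return count;
}

int main(void)
{
   int data[8] = {3, 1, 4, 1, 5, 9, 2, 6};
   int hits = count_occurrences(data, 8, 1);
   printf("found %d occurrence(s) using %ld comparisons\n", hits, comparisons);
   return 0;
}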

How do you measure problem size?


Example: binary search

Suppose we are given an array of N integers, sorted in increasing order, and a search routine that determines whether a specific key occurs anywhere in the array between two given positions (lower and upper).

Below we give one version which uses a linear search through the data, and two different versions which use a binary search, one iterative and one recursive:


int lin_search(int arr[], int key, int lower, int upper)
// return the position of the key if found
// otherwise return -1
{
   int pos;
   pos = lower;
   while ((pos <= upper) && (arr[pos] != key)) pos++;
   if (pos <= upper) return(pos);
   else return(-1);
}

int it_bsearch(int arr[], int key, int lower, int upper)
// return the position of the key if found
// otherwise return -1
{
   int low, upp, mid, found;
   found = 0;  low = lower; upp = upper;
   
   while ((low <= upp) && (found == 0)) {
      mid = low + (upp - low) / 2;   /* midpoint; avoids overflow of low + upp */
      if (arr[mid] == key) found = 1;
      else {
         if (arr[mid] < key) low = mid + 1;
         else upp = mid - 1;
      }
   }

   if (found == 1) return(mid);
   else return(-1);
}

int rec_bsearch(int arr[], int key, int lower, int upper)
// return the position of the key if found
// otherwise return -1
{
   int mid;
   if (lower > upper) return(-1);
   mid = lower + (upper - lower) / 2;   /* midpoint; avoids overflow of lower + upper */
   if (arr[mid] == key) return(mid);
   if (arr[mid] < key) return(rec_bsearch(arr, key, mid+1, upper));
   else return(rec_bsearch(arr, key, lower, mid-1)); 
}
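
As a usage sketch (assuming the three routines above are compiled together in one file; the test data and names here are illustrative only), the driver below searches a small sorted array with each routine and prints the position each one reports.

#include <stdio.h>

int main(void)
{
   int arr[8]  = {2, 3, 5, 7, 11, 13, 17, 19};   /* sorted, N = 8 */
   int keys[4] = {2, 11, 19, 4};                 /* 4 is not present */
   int i;

   for (i = 0; i < 4; i++) {
      int lin = lin_search(arr, keys[i], 0, 7);
      int itb = it_bsearch(arr, keys[i], 0, 7);
      int rec = rec_bsearch(arr, keys[i], 0, 7);
      printf("key %2d: linear %2d   iterative %2d   recursive %2d\n",
             keys[i], lin, itb, rec);
   }
   return 0;
}

All three routines report the same positions (and -1 for the missing key); the difference between them lies only in how many elements each must examine as N grows.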