Determining the complexity of a given piece of code

In general, there is no way to determine the complexity of a given function.

Warning! Wall of text incoming!

1. There are very simple algorithms for which no one knows whether they even halt.

There is no algorithm that can decide whether a given program halts when run on a given input. Calculating the computational complexity is an even harder problem, since we not only need to prove that the algorithm halts, but also how fast it does so.

//The Collatz conjecture states that the sequence generated by the following
// algorithm always reaches 1, for any initial positive integer. It has been
// an open problem since Collatz posed it in 1937.
function col(n){
    if (n == 1){
        return 0;
    }else if (n % 2 == 0){ //even
        return 1 + col(n/2);
    }else{ //odd
        return 1 + col(3*n + 1);
    }
}
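
To give a feel for how unpredictable this is (both counts are easy to verify by running the function):

col(6);  //returns 8:  6 -> 3 -> 10 -> 5 -> 16 -> 8 -> 4 -> 2 -> 1
col(27); //returns 111, even though 27 is a tiny input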

2. Some algorithms have weird and off-beat complexities

A general “complexity-determining scheme” would easily get too complicated because of algorithms like these:

//The Ackermann function. One of the first examples of a non-primitive-recursive algorithm.
function ack(m, n){
    if(m == 0){
        return n + 1;
    }else if( n == 0 ){
        return ack(m-1, 1);
    }else{
        return ack(m-1, ack(m, n-1));
    }
}

function f(n){ return ack(n, n); }

//f(1) = 3
//f(2) = 7
//f(3) = 61
//f(4) takes longer than your wildest dreams to terminate.

3. Some functions are very simple but will confuse many kinds of static analysis attempts

//McCarthy's 91 function. Try guessing what it does without
// running it or reading the Wikipedia page ;)
function f91(n){
    if(n > 100){
        return n - 10;
    }else{
        return f91(f91(n + 11));
    }
}
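
If you do give in and run it, a quick tabulation makes the answer obvious (spoiler alert – the window 95..105 is just an arbitrary range around the interesting boundary):

//Tabulate f91 around n = 100; the pattern gives the game away.
for(var n = 95; n <= 105; n++){
    console.log(n + " -> " + f91(n));
}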

That said, we still need a way to find the complexity of stuff, right? For loops are a simple and common pattern. Take your initial example:

for(var i = 0; i < N; i++){
    for(var j = 0; j < i; j++){
        console.log("something"); //any O(1) operation
    }
}

Since the body of the inner loop is O(1), the time complexity of the algorithm is determined by how many times that line runs. Well, as your TA mentioned, we find this by looking at the combinations in this case. The inner loop runs (0 + 1 + … + (N-1)) times in total, which is N*(N-1)/2.

Since we disregard constant factors, we get O(N^2).
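
If you don't trust the algebra, a throwaway sketch can count the iterations empirically and check them against the formula (the choice of N = 1000 is arbitrary):

//Count how many times the O(1) line actually runs.
var N = 1000;
var count = 0;
for(var i = 0; i < N; i++){
    for(var j = 0; j < i; j++){
        count++; //stands in for the O(1) body
    }
}
console.log(count === N*(N-1)/2); //prints true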

Now for the trickier cases we can get more mathematical. Try to define a function whose value represents how long the algorithm takes to run, given the size N of the input. Often we can construct a recursive version of this function directly from the algorithm itself, so calculating the complexity becomes the problem of putting bounds on that function. We call this function a recurrence.

For example:

function fib_like(n){
    if(n <= 1){
        return 17;
    }else{
        return 42 + fib_like(n-1) + fib_like(n-2);
    }
}

It is easy to see that the running time, in terms of N, is given by:

T(N) = 1                    if N <= 1
T(N) = T(N-1) + T(N-2)      otherwise

Well, T(N) is just the good-old Fibonacci function. We can use induction to put some bounds on that.

For example, let's prove by induction that T(N) <= 2^N for all N (i.e., T(N) is O(2^N)):

  • base case: N = 0 or N = 1
    T(0) = 1 <= 1 = 2^0
    T(1) = 1 <= 2 = 2^1
  • inductive case (N > 1):
    T(N) = T(N-1) + T(N-2)
    Applying the inductive hypothesis to T(N-1) and T(N-2):
    T(N) <= 2^(N-1) + 2^(N-2)
    so
    T(N) <= 2^(N-1) + 2^(N-1)
         <= 2^N

(we can try doing something similar to prove the lower bound too)
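
You can also sanity-check a guessed bound numerically before attempting the proof. Here is a throwaway sketch (memoized so that evaluating the recurrence doesn't itself take exponential time; the cutoff of 40 is arbitrary):

//Evaluate T(N) directly from its recurrence and compare against 2^N.
var memo = {0: 1, 1: 1};
function T(n){
    if(!(n in memo)){
        memo[n] = T(n-1) + T(n-2);
    }
    return memo[n];
}
for(var n = 0; n <= 40; n++){
    console.log(T(n) <= Math.pow(2, n)); //prints true for every n
}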

In most cases, having a good guess on the final runtime of the function will allow you to easily solve recurrence problems with an induction proof. Of course, this requires you to be able to guess first – only lots of practice can help you here.

And as a final note, I would like to point out the Master theorem, the only commonly used rule for more difficult recurrence problems that I can think of right now. Use it when you have to deal with a tricky divide-and-conquer algorithm.
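
For example, mergesort splits its input in half, recurses on both halves, and does O(N) work to merge the results, giving the recurrence

T(N) = 2*T(N/2) + N

and the Master theorem immediately tells us this is O(N log N), with no guessing or induction required.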


Also, in your “if case” example, I would solve that by cheating and splitting it into two separate loops that don't have an if inside.

for (int i = 0; i < n; i++) {
    if (i % 2 == 0) {
        for (int j = i; j < n; j++) { ... }
    } else {
        for (int j = 0; j < i; j++) { ... }
    }
}

Has the same runtime as

for (int i = 0; i < n; i += 2) {
    for (int j = i; j < n; j++) { ... }
}

for (int i = 1; i < n; i += 2) {
    for (int j = 0; j < i; j++) { ... }
}

And each of the two parts is easily seen to be O(N^2): the first runs (n + (n-2) + (n-4) + …) iterations and the second (1 + 3 + 5 + …), each of which sums to roughly n^2/4, for a total that is also O(N^2).
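
If “easily seen” sounds hand-wavy, here is a throwaway sketch (the function names and the test value are mine, picked arbitrarily) verifying that the split preserves the exact iteration count:

//Count the inner-statement executions of the if-version and the
//split version, and check that they agree.
function countOriginal(n){
    var c = 0;
    for(var i = 0; i < n; i++){
        if(i % 2 == 0){
            for(var j = i; j < n; j++){ c++; }
        }else{
            for(var j = 0; j < i; j++){ c++; }
        }
    }
    return c;
}

function countSplit(n){
    var c = 0;
    for(var i = 0; i < n; i += 2){
        for(var j = i; j < n; j++){ c++; }
    }
    for(var i = 1; i < n; i += 2){
        for(var j = 0; j < i; j++){ c++; }
    }
    return c;
}

console.log(countOriginal(1000) === countSplit(1000)); //prints true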

Note that I used a nice trick to get rid of the “if” here. There is no general rule for doing so, as the Collatz algorithm example shows.
