With large programs, or ones dealing with huge amounts of data, it is useful to check performance by measuring how fast a program works. However, since computers run so fast and the amount of data processed is so large, it is more important to know the rate at which a program's running time grows than the actual speed it runs at. The measurement used is called Big O. The O stands for order, as in order of magnitude.
For example, let's say you want to count the number of seats in a rectangular theater. One way is to go seat by seat in each row and count them; let's say the count is 100. Another way is to count the rows, then count the columns, and finally multiply the two: 5 rows times 20 columns gives 100. A third way: suppose by chance the seats are numbered, and you are in the far corner. The seat says 20E, so there are 100 seats, multiplying 20 by 5 (E being the fifth letter of the alphabet).
As you can see, each process gets the same answer, but each takes a different amount of time: the first takes the longest and the last is the fastest. Now imagine that the theater has 1000 seats. The first process takes a really long time, the second only a bit longer than before, and the third takes the same time as ever. The Big O value of each process, however, does not change, because the rate at which its running time grows with the size of the theater is still the same.
The best way to learn Big O is to do it, but first there are some ground terms and rules.
Now for examples.
Code | Order |
---|---|
A = 144 * x * y; | O(1) |
int grades[classSize]; int sum = 0; for(int i=0; i<classSize; i++) { cin >> grades[i]; sum += grades[i]; } int ave = sum/classSize; cout << ave; | First there are two simple declarations, an array and an integer, which gives us O(2). Inside the loop are two more operations, another O(2). The loop executes these classSize times, so we add the inside times classSize to our preexisting order: O(classSize*2 + 2). Finally there are two other simple steps: O(2classSize + 4). Now we reduce to O(classSize). (Drop the lower-order 4 and the constant factor 2.) |
for(i=1; i<n; i++) for(j=0; j<i; j++) if(a[i]==a[j]) count++; | First work inside out. The if block is a simple O(1), since in any case only an increment is done. The inner for loop runs i times, so it becomes O(i*1). The outer loop is done n-1 times, so the total work is the sum 1 + 2 + 3 + ... + (n-1). That has a simple closed form, .5n(n-1), which gives us O(.5n&sup2; - .5n), and that reduces to O(n&sup2;). |
int h=0; while(n>1) //n is positive { h++; n = n/2; } | Here is a tricky one. First there is a simple assignment, O(1). Next is a while loop with two operations inside: O(1 + ?*2). Now what about the "?". Every pass through the loop halves n until it is one or less. So, working backwards, the loop executes x times where 2^x == n. That makes x = log&sub2; n, so we get O(1 + 2log&sub2; n). Since logarithms of different bases differ only by a constant factor, we can write this as O(2log n + 1), which reduces to O(log n). |
if(num != 0) return num; else //order O(2^n) reallyHighOrder(num); | This is an exaggeration of a best- and worst-case scenario. The best case is when num is not zero. All the function does then is return it; that is a simple constant operation with O(1). The worst case is when num is zero. If that's true, then the really-high-order function is called. It has order O(2^n), which is really slow, far slower than even a quadratic. In practice, exponential orders are mostly of theoretical interest: they fit the same pattern of functional degrees as the other orders, and they mark algorithms too slow to be usable, although some programs feel like they have that order. |