Let's say we have two algorithms. We input n numbers to each of them, and the result pops out at different times depending on which algorithm we're using.
The first algorithm (Algorithm A) runs in 3n³ time, or n-cubed time. Thus, if we enter 10 numbers, it will take 3(10)(10)(10) = 3,000 instructions to run through the algorithm.
The second algorithm (Algorithm B) runs in 19,500,000n time, or linear time. If we enter 10 numbers into B, it will take 195,000,000 instructions to execute the algorithm.
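To make the arithmetic concrete, here is a minimal Python sketch of the two instruction-count formulas (the function names are ours, purely for illustration):

```python
def instructions_a(n):
    # Algorithm A: 3 * n^3 instructions (cubic time)
    return 3 * n ** 3

def instructions_b(n):
    # Algorithm B: 19,500,000 * n instructions (linear time)
    return 19_500_000 * n

print(instructions_a(10))  # 3000
print(instructions_b(10))  # 195000000
```

At n = 10 the cubic algorithm looks like the easy winner; the interesting question is what happens as n grows.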
OK, let's assume Algorithm A is running on a Cray-1 supercomputer (programmed in Fortran), and Algorithm B is running on an old TRS-80 (programmed in BASIC).
Algorithm A (Cray-1, Fortran):  3n³ nanoseconds
Algorithm B (TRS-80, BASIC):    19,500,000n nanoseconds

Cray-1 is a trademark of Cray Research, Inc. TRS-80 is a trademark of Tandy Corporation.
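Given the two timing formulas above, we can ask where the machines trade places: setting 3n³ = 19,500,000n gives n² = 6,500,000, so n ≈ 2,550. A small sketch (the function names are ours) to check the break-even point:

```python
import math

def cray_ns(n):
    # Algorithm A on the Cray-1: 3 n^3 nanoseconds
    return 3 * n ** 3

def trs80_ns(n):
    # Algorithm B on the TRS-80: 19,500,000 n nanoseconds
    return 19_500_000 * n

# Break-even: 3 n^3 = 19,500,000 n  =>  n = sqrt(6,500,000) ≈ 2,549.5
crossover = math.sqrt(19_500_000 / 3)
print(round(crossover))                    # ~2550
print(cray_ns(2_549) < trs80_ns(2_549))    # True: below break-even the Cray-1 wins
print(cray_ns(2_550) > trs80_ns(2_550))    # True: above it the TRS-80 wins
```

Below roughly 2,550 inputs the supercomputer's raw speed carries the day; beyond that, the linear algorithm on the humble TRS-80 pulls ahead, and the gap only widens as n grows.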
If that doesn't convince you, how about this?
Time        |             |             |          |
n=10        |             |             |          |
n=1,000     |             |             |          |
n=100,000   | 3.3 seconds | 1.3 minutes | 1.5 days | 108 years
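To get a feel for how such times scale, here is a small sketch (our own helper, applied to the 3n³ and 19,500,000n nanosecond formulas from the first table) that converts raw nanosecond counts into human-scale units:

```python
def human(ns):
    """Convert a nanosecond count to a rough human-readable string."""
    seconds = ns / 1e9
    for unit, size in [("years", 365 * 86400), ("days", 86400),
                       ("hours", 3600), ("minutes", 60), ("seconds", 1),
                       ("milliseconds", 1e-3), ("microseconds", 1e-6)]:
        if seconds >= size:
            return f"{seconds / size:.1f} {unit}"
    return f"{ns} nanoseconds"

for n in (10, 1_000, 100_000):
    a = human(3 * n ** 3)          # Algorithm A on the Cray-1
    b = human(19_500_000 * n)      # Algorithm B on the TRS-80
    print(f"n={n:>7,}: A = {a}, B = {b}")
```

By n = 100,000, Algorithm A on the Cray-1 needs about 35 days, while Algorithm B on the TRS-80 finishes in roughly half an hour.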