How can one test time complexity "experimentally"?

Could it be done by keeping a counter to see how many iterations an algorithm goes through, or does the time duration need to be recorded?


The currently accepted answer won't give you any theoretical estimation, unless you are somehow able to fit the experimentally measured times with a function that approximates them. This answer gives you a manual technique to do that and fills that gap.

You start by guessing the algorithm's theoretical complexity function. Then you experimentally measure the actual complexity (number of operations, time, or whatever you find practical) for increasingly larger problems.

For example, say you guess an algorithm is quadratic. Measure, say, the time, and compute the ratio of the measured time to your guessed function (n^2):

for (int n = 5; n <= 10000; n++) {   // n: problem size
  long start = System.nanoTime();
  executeAlgorithm(n);               // the algorithm under test
  long end = System.nanoTime();
  long totalTime = end - start;

  double ratio = (double) totalTime / ((double) n * n);
}

As n moves towards infinity, this ratio...

  • Converges to zero? Then your guess is too low. Repeat with something bigger (e.g. n^3).
  • Diverges to infinity? Then your guess is too high. Repeat with something smaller (e.g. n log n).
  • Converges to a positive constant? Bingo! Your guess is on the money (or at least approximates the theoretical complexity for n values as large as those you tried).

Basically this uses the definition of big-O notation: f(x) = O(g(x)) iff there exist constants c and x0 such that f(x) <= c * g(x) for all x > x0. Here f(x) is the actual cost of your algorithm, g(x) is the guess you put, and c is a constant. So you are experimentally estimating the limit of f(x)/g(x); if your guess hits the real complexity, this ratio converges to the constant c.
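
For a concrete, runnable version of the loop above, here is a small Java sketch. The workload quadraticWork is a hypothetical stand-in for executeAlgorithm, and the doubling range of n is an illustrative choice, not part of the original answer:

import java.util.Locale;

public class RatioTest {

    static long sink; // accumulates results so the JIT cannot discard the work

    // Hypothetical stand-in for the algorithm under test: deliberately O(n^2)
    static long quadraticWork(int n) {
        long acc = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                acc += i ^ j;
        return acc;
    }

    public static void main(String[] args) {
        for (int n = 1000; n <= 16000; n *= 2) {
            long start = System.nanoTime();
            sink += quadraticWork(n);
            long elapsed = System.nanoTime() - start;

            // ratio = f(n) / g(n); it should level off near a constant
            // if the guess g(n) = n^2 matches the real complexity
            double ratio = (double) elapsed / ((double) n * n);
            System.out.printf(Locale.ROOT,
                "n = %6d  time = %10d ns  time/n^2 = %.4f%n",
                n, elapsed, ratio);
        }
    }
}

On typical hardware the printed ratio should settle near a constant after the first couple of (JIT warm-up) iterations; swap in an O(n^3) workload and the same column should keep growing instead.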


Algorithm complexity is defined as (something like):

the number of operations the algorithm does as a function of its input size.

So you need to try your algorithm with various input sizes (e.g. for a sort, try sorting 10 elements, 100 elements, etc.) and count each operation the algorithm does (e.g. assignments, increments, mathematical operations).
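
As a sketch of what such counting can look like, here is a hedged Java example; insertion sort and "comparisons" as the counted operation are illustrative assumptions:

import java.util.Random;

public class OperationCount {

    static long comparisons; // incremented on every element comparison

    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0) {
                comparisons++;            // count this comparison
                if (a[j] <= key) break;   // insertion point found
                a[j + 1] = a[j];          // shift larger element right
                j--;
            }
            a[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        Random rnd = new Random(42);
        for (int n = 10; n <= 10000; n *= 10) {
            int[] a = rnd.ints(n).toArray();
            comparisons = 0;
            insertionSort(a);
            // for random input the count should grow roughly like n^2
            System.out.printf("n = %6d  comparisons = %12d%n", n, comparisons);
        }
    }
}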

This will give you a good "theoretical" estimation.
If you want real-life numbers, on the other hand, use profiling.


As others have mentioned, the theoretical time complexity is a function of the number of CPU operations done by your algorithm. In general, processor time should be a good approximation of that, modulo a constant. But the real run time may vary for a number of reasons, such as:

  1. Processor pipeline flushes
  2. Cache misses
  3. Garbage collection
  4. Other processes on the machine

Unless your code systematically triggers some of these effects, a large enough number of statistical samples should give you a fairly good idea of your algorithm's time complexity from the observed runtimes.
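
One common way to collect those statistical samples is to repeat each measurement and take the median, which is robust against outliers caused by GC pauses or other processes. A minimal sketch, assuming a hypothetical workload function and an arbitrary sample count:

import java.util.Arrays;

public class MedianTiming {

    static long sink; // keeps the JIT from discarding the measured work

    // Hypothetical algorithm under test; linear work, for illustration only
    static long workload(int n) {
        long acc = 0;
        for (int i = 0; i < n; i++) acc += i;
        return acc;
    }

    // Runs the workload `samples` times and returns the median run time
    static long medianRunTimeNanos(int n, int samples) {
        long[] times = new long[samples];
        for (int s = 0; s < samples; s++) {
            long start = System.nanoTime();
            sink += workload(n);
            times[s] = System.nanoTime() - start;
        }
        Arrays.sort(times);        // the median absorbs outliers from GC
        return times[samples / 2]; // pauses, cache effects, other processes
    }

    public static void main(String[] args) {
        for (int n = 1000; n <= 1000000; n *= 10) {
            System.out.printf("n = %8d  median time = %8d ns%n",
                              n, medianRunTimeNanos(n, 31));
        }
    }
}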


The best way would be to actually count the number of "operations" performed by your algorithm. The definition of "operation" can vary: for an algorithm such as quicksort, it could be the number of comparisons of two numbers.

You could measure the time taken by your program to get a rough estimate, but various factors could cause this value to differ from the actual mathematical complexity.


Yes.

You can track both: the actual running time and the number of iterations.


Might I suggest using the ANTS profiler? It will provide this kind of detail while you run your app on "experimental" data.
