
What kind of algorithms have shortened running time for longer inputs? [duplicate]

This question already has answers here: Closed 13 years ago.

Possible Duplicate:

are there any O(1/n) algorithms?

I've been reading up on various algorithms recently, and have gotten very used to seeing things with O(some combination of n, n^2 and log n). It seems pretty normal for algorithms to increase in running time with more input, so this doesn't really bother me, but are there many well-known algorithms that decrease in running time with increased input? Or are there other algorithms with something like, say, periodic running time based on input length?


This is a useless example, but it at least answers your question: a program that generates the alphabet a-z based on its input. If you give it one letter it needs to generate 25 more letters; if you give it a-y as input it only needs to generate one more. That way the program runs longer for smaller input and shorter for larger input.
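A minimal sketch of that idea in Python (the function name and exact behaviour are my illustration, not part of the original answer):

```python
import string

def complete_alphabet(prefix: str) -> str:
    """Generate the letters of a-z that are not already in the input."""
    given = set(prefix)
    # The more letters the caller supplies, the fewer are left to produce.
    return "".join(c for c in string.ascii_lowercase if c not in given)

print(complete_alphabet("a"))                           # 25 letters still to generate
print(complete_alphabet("abcdefghijklmnopqrstuvwxy"))   # just "z"
```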


The Boyer-Moore string-search algorithm has a shorter expected running time for longer search patterns, because a longer pattern lets it skip further ahead in the text on each mismatch.
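For intuition, here is a simplified sketch in the spirit of Boyer-Moore (actually the Horspool variant, which keeps only the bad-character rule); it is an illustration rather than the full algorithm:

```python
def horspool_search(text: str, pattern: str) -> int:
    """Return the index of the first occurrence of pattern in text, or -1.

    On a mismatch the search can jump ahead by up to len(pattern) characters,
    so longer patterns mean fewer comparisons per unit of text.
    """
    m, n = len(pattern), len(text)
    if m == 0:
        return 0
    # Bad-character table: how far to slide for the text character that is
    # aligned with the last position of the pattern.
    shift = {c: m - 1 - i for i, c in enumerate(pattern[:-1])}
    i = 0
    while i <= n - m:
        if text[i:i + m] == pattern:
            return i
        i += shift.get(text[i + m - 1], m)
    return -1

print(horspool_search("here is a simple example", "example"))  # 17
```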


Though we usually think of the n in O(n) as the input size, it is more accurate to think of it as the problem size: larger problems are harder. Note also that big-O complexity is about worst-case complexity.

Example: consider a sudoku solver. Assuming a regular 9x9 grid, the input for this solver is a set of (position, number) pairs. How much work the solver has to do depends on the input: if no pairs or all pairs are given, a solution is found easily. Still, the complexity of the sudoku solver is expressed as a monotonically increasing function, based on the most difficult input one can find.
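A rough backtracking sketch (mine, not part of the original answer) makes the point concrete: the recursion only branches over the empty cells, so a grid with many givens is solved almost immediately, while the worst-case bound is set by the hardest, nearly empty grids.

```python
def valid(grid, r, c, v):
    """Check whether value v may be placed at row r, column c."""
    if any(grid[r][j] == v for j in range(9)):
        return False
    if any(grid[i][c] == v for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != v for i in range(3) for j in range(3))

def solve(grid):
    """Solve a 9x9 sudoku in place; grid is a list of 9 lists of ints, 0 = empty."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for v in range(1, 10):
                    if valid(grid, r, c, v):
                        grid[r][c] = v
                        if solve(grid):
                            return True
                        grid[r][c] = 0   # undo and try the next value
                return False             # no value fits here: backtrack
    return True                          # no empty cell left: solved
```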


You can never go below O(N) if you have to read the whole input, but it depends very much on the algorithm: many algorithms need to read all of the input, others don't. If you only do a binary search (or your data is stored in a B-tree, as in databases), you can go as low as O(log N).
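For reference, a standard iterative binary search over a sorted list; each step halves the remaining range, so only O(log N) of the N elements are ever examined:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if it is absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1        # target is in the upper half
        else:
            hi = mid - 1        # target is in the lower half
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 11))  # 4
```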


An algorithm, to be useful, usually has to process all of the input it is given, so running time is proportional to the input in some way. You can get different answers depending on how you look at it (length of input vs. value of input, sometimes), but I can't think of any algorithm that actually gets faster as the input gets longer.


The closest an algorithm can come to decreasing run time on a growing data set is to grow at a decreasing rate, as in O(log N). If we have data sets A and B, and all we know is that A is smaller than B, then B will take at least as long as A to run.


If you are talking about the empirical, measured time of an algorithm, e.g. the time taken to process each input element, that may be true (caching and other environmental conditions can influence it). But it does not hold for the asymptotic running time (big-O notation), since that specifies an upper bound on the running time over N elements.
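A rough way to see the difference (the routine and numbers here are purely illustrative assumptions): a function with a fixed setup cost plus linear work shows a falling measured time per element as N grows, even though its total running time is still O(N).

```python
import time

def process(items):
    # Hypothetical routine: a fixed setup cost plus O(1) work per element.
    time.sleep(0.01)                  # stand-in for setup / cache-warming overhead
    return sum(x * x for x in items)

for n in (1_000, 10_000, 100_000):
    start = time.perf_counter()
    process(range(n))
    per_element = (time.perf_counter() - start) / n
    print(f"n={n:>7}: {per_element * 1e6:.3f} microseconds per element")
```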

