
Algorithm complexity

开发者 https://www.devze.com 2023-03-01 08:43 Source: web
The intersection of two arrays can be found by sorting both arrays and then doing a single linear pass to record the common elements. This takes O(nlogn) + O(nlogn) + O(n) time.

Alternatively, you could compare each element of the first array with each element of the second array, which gives O(n^2) run-time complexity.

How can I prove the first approach is better than the second?
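For concreteness, the two approaches described above can be sketched as follows (a minimal Python sketch; the function names are my own):

```python
def intersect_sorted(a, b):
    """Sort both arrays, then walk them in lockstep: O(nlogn) overall."""
    a, b = sorted(a), sorted(b)          # O(nlogn) + O(nlogn)
    i = j = 0
    common = []
    while i < len(a) and j < len(b):     # O(n) linear merge step
        if a[i] == b[j]:
            common.append(a[i])
            i += 1
            j += 1
        elif a[i] < b[j]:
            i += 1
        else:
            j += 1
    return common

def intersect_naive(a, b):
    """Compare every element of a with every element of b: O(n^2)."""
    return [x for x in a if x in b]      # `in` is itself a linear scan
```

Note that the sorted version returns the common elements in sorted order, while the naive version preserves the order of the first array.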


First of all, O(nlogn) + O(nlogn) + O(n) doesn't make much sense, since O(f) is a set of functions, not a number.

What you mean is O(nlogn + nlogn + n), which can be shown to be equal to O(nlogn). Just look at what it means for a function f to belong to the set O(g):

f is an element of O(g) iff there exist c>0 and x0 such that for all x>x0:
|f(x)| <= c*|g(x)|

By setting f=nlogn and g=nlogn+nlogn+n, it follows that f is in O(g); the reverse inclusion holds as well (g = 2nlogn + n <= 3nlogn for n >= 2, so take c=3), and hence O(nlogn) == O(nlogn + nlogn + n).
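As a quick sanity check on the nontrivial direction of that equality, the constants c = 3 and x0 = 2 (my own choice of witnesses) can be verified numerically:

```python
import math

# Check that g(n) = 2*n*log(n) + n is bounded by 3*n*log(n) for n >= 2,
# i.e. g is in O(nlogn) with c = 3, x0 = 2.
# (Base-2 logs here; changing the base only changes the constant.)
for n in range(2, 10**6, 997):
    f = n * math.log2(n)
    g = 2 * n * math.log2(n) + n
    assert g <= 3 * f
```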


It's important to remember that Big-O notation describes asymptotic behavior: what matters is how the running time grows as the input size n becomes very large.

O(N) + O(N) is equivalent to O(2N) which is equivalent to O(N)

Thus, O(nlogn) + O(nlogn) + O(n) is merely O(nlogn)

and nlogn grows strictly slower than n^2, so O(nlogn) is a proper subset of O(n^2).
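To see the gap concretely, the ratio n^2 / (nlogn) grows without bound, so the O(n^2) algorithm eventually loses by any constant factor (a small illustrative script):

```python
import math

# The ratio n^2 / (n * log2 n) keeps growing, so nlogn eventually
# beats n^2 no matter what the hidden constants are.
for n in (10, 1_000, 1_000_000):
    ratio = n**2 / (n * math.log2(n))
    print(f"n = {n:>9}: n^2 / (n log n) = {ratio:,.1f}")
```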


O(nlogn) + O(nlogn) + O(n) = O(nlogn)

The first algorithm's running time is bounded by c*nlogn for some constant c > 0, and the second performs on the order of n^2 comparisons, i.e. at least d*(n^2) for some constant d > 0. So it suffices to show that, for all sufficiently large n,

c*nlogn < d*(n^2)

Dividing both sides by c*n gives

logn < (d/c)*n

You can prove in many ways that there is an N beyond which logn < (d/c)*n is always true.

You could even draw it.
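Or you could find the crossover point by brute force. A small sketch, with an arbitrarily chosen slope d/c (the value 0.001 is my own example, not from the question):

```python
import math

# For any d/c > 0, log n < (d/c)*n holds for all sufficiently large n.
# Even with a tiny slope, a linear scan finds the crossover point N.
slope = 0.001  # d/c, deliberately small for illustration
n = 2
while math.log(n) >= slope * n:
    n += 1
# Since (log n)/n is decreasing for n > e, the inequality keeps
# holding for every larger n once we pass this point.
print("log n < (d/c)*n for all n >=", n)
```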

