
merge sort space

https://www.devze.com 2023-03-26 04:30 (Source: Web)

In a top-down merge sort the recursive functions are called in this fashion:

void mergesort(Item a[], int l, int r) {
    if (r <= l) return;
    int m = (r+l)/2;
    mergesort(a, l, m);
    mergesort(a, m+1, r);
    merge(a, l, m, r);
}

It is given in the textbook that the space complexity of this strategy is O(n). But if we look at the recursion closely: first, we are passing a pointer to the array in the recursive calls, not a copy. Second, the recursion merges in postorder, combining child subarrays back up into their parents, so at any moment there are only O(log n) frames on the stack. So how is the space complexity O(n) in spite of in-place merging techniques?


So how is it that space complexity is O(n) in spite of having in-place merging techniques?

Because the implementation given in your book probably doesn't use an in-place merging technique. If an O(1)-space, O(n log n)-time sort is required, heapsort is usually preferred to merge sort, since it is much simpler. An O(1)-space merge only really makes sense when you're sorting lists... and then it is easy to do. Merge sort on, e.g., a linked list can run in O(1) extra space and O(n log n) time.

The fundamental misunderstanding here seems to be this: complexities apply to algorithms, not to the problems they solve. I can write an O(n^3) merge sort if I want... that doesn't mean my algorithm isn't O(n^3), and it says nothing about your O(n log n) merge sort. This is a little different from computational complexity theory, where we talk about, e.g., problems being in P: a problem is in P if there is a polynomial-time algorithm for it. However, problems in P can also be solved by non-polynomial-time algorithms, and if you think about it, it's trivial to construct such an algorithm. The same goes for space complexities.


You are right that the space taken up by the recursive calls is O(log n).

But the space taken by the array itself is O(n).

The total space complexity is O(n) + O(log n).

This is O(n), because n + log n is bounded above by 2n for n ≥ 1.


How are you even going to store n items in log n space? That doesn't make sense. If you're sorting n items, O(n) space is the best you're going to get.


Since you are not allocating any space inside the mergesort function besides constants, the space complexity of that function alone is O(lg n) for the recursion stack. But your merge procedure will allocate memory in the array case, so keeping that in mind it becomes O(lg n) + O(n) = O(n). If you use a linked list you can avoid the scratch space inside the merge procedure, arriving at O(lg n) at best.
