Algorithmic riddle: sequence with random access, insertion and removal

Describe a data structure where:

  • Any item is indexed by an integer value, as in an array
    • an integer indexes a single value
    • the integers used to index items are contiguous: they go from 1 to n without holes
  • Getting the item at position i (i.e., the item associated with the integer i) should be as fast as possible
    • random access
  • Inserting a new item at position i should be as fast as possible
    • this right-shifts every item from position i onwards
  • Removing an item at position i should also be as fast as possible

EDIT: a little thing I forgot about: item indices can only be shifted by adding/removing an item; they cannot be swapped arbitrarily.

In this description n is the size of the structure (i.e., how many items it contains), and i is a generic integer (1 <= i <= n), of course.
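
For concreteness, the required operations can be written down as a small interface (the names here are mine, not part of the riddle; positions are 1-based, as above):

interface PositionalSequence<T> {
    T get(int i);             // random access: the item at position i
    void insert(int i, T x);  // right-shifts the items at positions i..n
    T remove(int i);          // left-shifts the items at positions i+1..n
    int size();               // n, the number of items currently stored
}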


I heard this from a guy I met at my faculty. I don't know whether it's an interview question, an exam question, just a riddle or what, but I guess it could be any of those.

If I recall correctly (but hey, it was before December 24th), he said such a data structure could be implemented either with O(sqrt n) insertion/removal and O(1) access time, or with O(log n) for every operation.


EDIT: Some correct answers have been given. Read them if you don't want to think about this problem any more.


Well, any kind of self-balancing binary search tree (e.g., a red-black tree) will provide all three operations in O(log n). The C++ map mentioned elsewhere, typically a red-black tree, is one example (if I haven't forgotten my C++ completely).

As for indexing (the get operation), there's a standard trick: store in each node the number of nodes in its left subtree. Now we can locate the element at position i in O(log n) time like this:

Data get(Node root, int i) {
    if (i <= root.leftCount) {
        // positions 1..leftCount live in the left subtree
        return get(root.left, i);
    } else if (i == root.leftCount + 1) {
        // the root itself sits at position leftCount + 1
        return root.data;
    } else {
        // skip the left subtree and the root, then search the right subtree
        return get(root.right, i - root.leftCount - 1);
    }
}

Obviously, each time an element is added or removed, the leftCount values have to be updated, but that only needs to be done for O(log n) nodes (think about which nodes count the removed one in their left subtree: only the ones directly between it and the root).
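
To make the bookkeeping concrete, here is a sketch of the downward pass that adjusts leftCount while inserting a new item at position i. It is a plain, unbalanced BST insert (a red-black tree would additionally rebalance, and each rotation only needs the leftCount of the two rotated nodes recomputed), and it assumes the same hypothetical Node fields (left, right, leftCount, data) used by get above:

Node insert(Node root, int i, Data value) {
    if (root == null) {
        // reached the insertion point: create a leaf holding the new item
        Node n = new Node();
        n.data = value;
        n.leftCount = 0;
        return n;
    }
    if (i <= root.leftCount + 1) {
        // the new item ends up somewhere in the left subtree, so that subtree grows by one
        root.leftCount++;
        root.left = insert(root.left, i, value);
    } else {
        // skip the left subtree and the root, insert into the right subtree
        root.right = insert(root.right, i - root.leftCount - 1, value);
    }
    return root;
}

Only the nodes on the path from the root to the insertion point are touched, which is the O(log n) cost mentioned above.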


A skip list can do insertion/removal/index lookup each in O(log n): http://en.wikipedia.org/wiki/Skip_list#Indexable_skiplist
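
A minimal sketch of that positional lookup, assuming (as on the linked page) that every forward link stores its width, i.e. how many bottom-level positions it spans, and that insertion/removal keep those widths up to date:

final class IndexableSkipList {
    static final class Node {
        Object value;
        Node[] forward;   // forward[k]: next node on level k
        int[] width;      // width[k]:   how many positions the level-k link jumps over
        Node(int levels) { forward = new Node[levels]; width = new int[levels]; }
    }

    Node head;            // sentinel node sitting "before" position 1
    int levels;           // number of levels currently in use

    // Expected O(log n): returns the item at 1-based position i.
    Object get(int i) {
        Node node = head;
        for (int k = levels - 1; k >= 0; k--) {
            // take the widest jumps that do not overshoot position i
            while (node.forward[k] != null && i >= node.width[k]) {
                i -= node.width[k];
                node = node.forward[k];
            }
        }
        return node.value;
    }
}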

For the O(n^1/2) insert/remove and O(1) index, I assume something a bit like a C++ deque, but consisting of n^1/2 contiguous blocks of n^1/2 elements each (+/-1 on each square root, of course).

To remove, you have to shuffle down part of the block containing the removed element (O(n^1/2)), and then shuffle down the whole of the higher blocks, which you'd do by adjusting an offset for each block (at most O(n^1/2) blocks) rather than actually moving anything. To insert, you may have to reallocate a block (O(n^1/2)) and adjust offsets. In both cases you may also have to shift one or two elements off the start/end of some blocks to the end/start of the previous/next block, again at most O(n^1/2) times, to maintain the invariant that no two blocks differ in size by more than 1, except for the last, which is used to take up the slack. Finally, you sometimes have to create or destroy a block.

Lookup is O(1) because you can get to within one block of the element you're looking for with a single division, then consult the stored offsets for one or two blocks to actually find it.

Don't know what that's called, though, or whether I've missed some important detail that means it doesn't work as I've described it.
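
For what it's worth, here is a sketch of a simplified variant of the scheme above: every block except the last holds exactly B items, each block is a small circular buffer so a single element can be handed to a neighbouring block in O(1), and an insertion or removal ripples one element through at most n/B blocks. The class and all names are mine, indices are 0-based, and it is only meant to illustrate the O(sqrt n) insert/remove with O(1) access, not to be a polished implementation:

import java.util.ArrayList;

// "Tiered" sequence: ceil(n / B) blocks, every block except the last holding
// exactly B items. With B around sqrt(n), insert/remove cost O(sqrt n) and get costs O(1).
class TieredSequence<T> {

    // Fixed-capacity circular buffer: O(1) push/pop at both ends, O(1) get,
    // O(B) insert/remove at an arbitrary offset.
    private static final class Block {
        final Object[] a;
        int head = 0, size = 0;
        Block(int cap) { a = new Object[cap]; }
        Object get(int j)        { return a[(head + j) % a.length]; }
        void pushFront(Object x) { head = (head + a.length - 1) % a.length; a[head] = x; size++; }
        void pushBack(Object x)  { a[(head + size) % a.length] = x; size++; }
        Object popFront()        { Object x = a[head]; head = (head + 1) % a.length; size--; return x; }
        Object popBack()         { size--; return a[(head + size) % a.length]; }
        void insertAt(int j, Object x) {           // shift the tail right by one slot
            for (int k = size; k > j; k--) a[(head + k) % a.length] = a[(head + k - 1) % a.length];
            a[(head + j) % a.length] = x; size++;
        }
        Object removeAt(int j) {                   // shift the tail left by one slot
            Object x = a[(head + j) % a.length];
            for (int k = j; k < size - 1; k++) a[(head + k) % a.length] = a[(head + k + 1) % a.length];
            size--; return x;
        }
    }

    private final int B;                                    // block size, ideally ~sqrt(n)
    private final ArrayList<Block> blocks = new ArrayList<>();
    private int count = 0;

    TieredSequence(int blockSize) { B = blockSize; }

    int size() { return count; }

    // O(1): every block before the last holds exactly B items,
    // so a single division finds the right block.
    @SuppressWarnings("unchecked")
    T get(int i) { return (T) blocks.get(i / B).get(i % B); }

    // O(B + n/B): one O(B) shift inside a block, then at most one item is
    // passed from each block to the next to restore the size invariant.
    void insert(int i, T x) {
        if (blocks.isEmpty() || blocks.get(blocks.size() - 1).size == B)
            blocks.add(new Block(B + 1));                   // one spare slot for the overflow
        blocks.get(i / B).insertAt(i % B, x);
        for (int k = i / B; blocks.get(k).size > B; k++)    // ripple the overflow towards the tail
            blocks.get(k + 1).pushFront(blocks.get(k).popBack());
        count++;
    }

    // O(B + n/B): symmetric to insert; each later block lends its first item
    // to the previous one.
    @SuppressWarnings("unchecked")
    T remove(int i) {
        Object x = blocks.get(i / B).removeAt(i % B);
        for (int k = i / B; k + 1 < blocks.size(); k++)
            blocks.get(k).pushBack(blocks.get(k + 1).popFront());
        if (blocks.get(blocks.size() - 1).size == 0)
            blocks.remove(blocks.size() - 1);
        count--;
        return (T) x;
    }
}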
