Assume we have three arrays of length N which contain arbitrary numbers of type long. Then we are given a number M (of the same type) and our mission is to pick three numbers A, B and C, one from each array (in other words, A should be picked from the first array, B from the second one and C from the third), so that the sum A + B + C = M.
Question: could we pick all three numbers and end up with time complexity of O(N^2)?
Illustration:
Arrays are:
1) 6 5 8 3 9 2
2) 1 9 0 4 6 4
3) 7 8 1 5 4 3
And the M we've been given is 19. Then our choice would be 8 from the first array, 4 from the second and 7 from the third.
This can be done in O(1) space and O(N^2) time.
First let's solve a simpler problem:
Given two arrays A and B, pick one element from each so that their sum is equal to a given number K.
Sort both arrays, which takes O(N log N).
Take pointers i and j so that i points to the start of array A and j points to the end of B.
Find the sum A[i] + B[j] and compare it with K:
- if A[i] + B[j] == K, we have found the pair A[i] and B[j]
- if A[i] + B[j] < K, we need to increase the sum, so increment i
- if A[i] + B[j] > K, we need to decrease the sum, so decrement j
This process of finding the pair after sorting takes O(N).
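A minimal Python sketch of this pair search, assuming both arrays are already sorted (the name find_pair is just illustrative, not from the original answer):

def find_pair(A, B, K):
    """Given sorted lists A and B, return (a, b) with a + b == K, or None."""
    i, j = 0, len(B) - 1
    while i < len(A) and j >= 0:
        s = A[i] + B[j]
        if s == K:
            return A[i], B[j]       # found the pair
        elif s < K:
            i += 1                  # sum too small: advance the left pointer
        else:
            j -= 1                  # sum too large: pull back the right pointer
    return None                     # no pair sums to K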
Now let's take the original problem. We've got a third array now, call it C.
So the algorithm now is:
foreach element x in C
    find a pair A[i], B[j] from A and B such that A[i] + B[j] = M - x
end for
The outer loop runs N times, and for each run we do an O(N) operation, making the entire algorithm O(N^2).
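A sketch of the full algorithm, reusing the find_pair helper from the sketch above; sorting is done in place so the extra space stays O(1):

def find_triple(A, B, C, M):
    """Return (a, b, c), one from each list, with a + b + c == M, or None."""
    A.sort()                            # O(N log N), in place
    B.sort()
    for x in C:                         # N iterations ...
        pair = find_pair(A, B, M - x)   # ... of an O(N) scan -> O(N^2) overall
        if pair is not None:
            return pair[0], pair[1], x
    return None

# With the arrays and M = 19 from the question this returns one valid
# triple, e.g. (3, 9, 7), not necessarily the (8, 4, 7) shown above.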
You can reduce it to a similar problem with two arrays, which is kinda famous and has a simple O(n) solution (involving iterating from both ends).
- Sort all arrays.
- Try each number A from the first array once.
- Find if the last two arrays can give us numbers B and C such that B + C = M - A.
Steps 2 and 3 multiplied give us O(n^2) complexity.
The other solutions are already better, but here's my O(n^2) time and O(n) memory solution anyway.
Insert all elements of array C into a hash table (time complexity O(n), space O(n)). Take all pairs (a, b), a from A and b from B (time complexity O(n^2)). For each pair, check if M - (a + b) exists in the hash table (expected complexity O(1) per query).
So the overall time complexity is O(n^2), with a space complexity of O(n) for the hash table.
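A short Python sketch of this hash-table idea, with a set standing in for the hash table (all names are illustrative):

def find_triple_hash(A, B, C, M):
    """Hash C, then scan all (a, b) pairs and look up the missing value."""
    in_c = set(C)                       # O(n) time, O(n) space
    for a in A:                         # O(n^2) pairs in total
        for b in B:
            if M - (a + b) in in_c:     # expected O(1) lookup
                return a, b, M - (a + b)
    return None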
Hash the last list. The time taken to do this is O(N) on that particular list but this will be added to the next phase.
The next phase is to create a "matrix" of the sums of the first two rows, and for each sum look up in the hash whether its matching number is there. Creating the matrix is O(N*N) whilst looking up in the hash is constant time.
1. Store A[i] + B[j] for every pair (i, j) in another collection D, organized as a hash data structure. The complexity of this step is O(N*N).
construct a hash named D
for i = 1 to n
    for j = 1 to n
        insert A[i] + B[j] into D
2. For each C[i] in the array C, find if M - C[i] exists in D. The complexity of this step is O(N).
for i = 1 to n
    check if M - C[i] is in D
I have a solution. Insert all elements from one of the lists into a hash table; this takes O(n) time.
Once that is complete, you form all the pairs from the remaining two arrays and check whether M minus their sum is present in the hash table.
Because hash lookup is constant time, we get quadratic time in total.
Using this approach you save the time spent on sorting.
Another idea is that if you know the max size of each element, you can use a variation of bucket sort and do it in O(n log n) time.
At the cost of O(N^2) space, but still using O(N^2) time, one could handle four arrays: compute all possible sums from the first two arrays and all possible residues from the last two, sort the two lists (possible in linear time since the values are all of type 'long', whose number of bits is independent of N), and then see if any sum equals any residue.
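A sketch of this four-array variant under the same assumptions; note that Python's built-in sort is used here instead of the linear-time sort on longs mentioned above, so this version is O(N^2 log N) rather than strictly O(N^2):

def four_array_sum(A, B, C, D, M):
    """Is there a + b + c + d == M, one element taken from each array?"""
    sums = sorted(a + b for a in A for b in B)             # all N^2 pair sums
    residues = sorted(M - (c + d) for c in C for d in D)   # all N^2 residues
    i, j = 0, 0
    # Merge-style scan of two sorted lists: any common value means a match.
    while i < len(sums) and j < len(residues):
        if sums[i] == residues[j]:
            return True
        elif sums[i] < residues[j]:
            i += 1
        else:
            j += 1
    return False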
Sorting all 3 arrays and using binary search seems a better approach. Once the arrays are sorted, one should definitely go for binary search rather than linear search, which takes log(n) rather than n per lookup.
A hash table is also a viable option.
The combination of hash and sort can bring the time down, but at the cost of O(N^2) space.
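One way to read this suggestion, as a rough sketch: sort C and, for every (a, b) pair, binary-search C for the value that completes M, giving O(N^2 log N) overall:

import bisect

def find_triple_bsearch(A, B, C, M):
    """For each (a, b) pair, binary-search C for the value that completes M."""
    C = sorted(C)                              # O(N log N)
    for a in A:                                # O(N^2) pairs overall
        for b in B:
            need = M - a - b
            k = bisect.bisect_left(C, need)    # O(log N) per lookup
            if k < len(C) and C[k] == need:
                return a, b, need
    return None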
I have another solution with O(N^2) time complexity and O(N) additional space complexity.
First, sort the three arrays; this step is O(N*log(N)). Then, for each element in A, create two arrays V = Ai + B and W = Ai + C (Ai is the current element). Ai + B means that each element of the new array V is the element in that position in B plus Ai (the current element in A). W = Ai + C is similar.
Now, since both V and W are sorted, look for a pair, one element from V and one from W, whose sum is M + Ai (because Ai is counted twice). This can be done in O(N) with the same two-pointer scan as above.
Therefore, the total complexity is O(N^2).
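A sketch of this construction in Python (variable names loosely follow the answer; the pair search is the same two-pointer scan as in the first answer):

def find_triple_offset(A, B, C, M):
    """For each Ai, search the shifted arrays V = Ai + B and W = Ai + C."""
    A, B, C = sorted(A), sorted(B), sorted(C)    # O(N log N)
    for ai in A:
        V = [ai + b for b in B]                  # sorted, since B is sorted
        W = [ai + c for c in C]                  # sorted, since C is sorted
        target = M + ai                          # ai is counted twice
        i, j = 0, len(W) - 1
        while i < len(V) and j >= 0:             # O(N) two-pointer scan
            s = V[i] + W[j]
            if s == target:
                return ai, V[i] - ai, W[j] - ai  # recover b and c
            elif s < target:
                i += 1
            else:
                j -= 1
    return None                                  # O(N^2) time, O(N) extra space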
Sort the three arrays. Then initialize three indices:
- i pointing to the first element of A,
- j pointing to the last element of B, and
- k pointing to the first element of C.
While i, j, k are within the limits of their respective arrays A, B, C:
- If A[i]+B[j]+C[k] == M, return.
- If A[i]+B[j]+C[k] < M, increment i if A[i] <= C[k], otherwise increment k.
- If A[i]+B[j]+C[k] > M, decrement j.
Which should run in O(n).
How about:
for a in A
    for b in B
        hash a + b
for c in C
    if M - c is in the hash
        print the matching a, b and c
The idea is to hash all possible pair sums from A and B. Next, for every element in C, see if the residual sum is present in the hash.