I want to write the fastest possible algorithm for multiplying two numbers. Each number has up to around 1,000,000 digits and is stored in a string.
Does anyone have advice on this problem? I'm looking for a really fast solution.
You should convert your string to a binary representation of the number. After that, one of the fastest multiplication algorithms I know of is Karatsuba's.
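To give an idea of what Karatsuba's recursion looks like in practice, here is a minimal sketch in C++ that works directly on the decimal digits of the input strings (the function and variable names are just for illustration, not any particular library). It splits each operand in half, makes three recursive multiplications instead of four, and defers carry propagation to the very end. For million-digit inputs you would still want a faster representation (see the base-10^9 packing below) or a library such as GMP, but the structure of the algorithm is the same:

```cpp
#include <algorithm>
#include <cstdint>
#include <string>
#include <vector>

// Karatsuba on coefficient vectors (least-significant digit first).
// Coefficients are kept as int64_t so carries can be deferred to the end.
// Assumes a and b have the same length n, with n a power of two.
std::vector<int64_t> karatsuba(const std::vector<int64_t>& a,
                               const std::vector<int64_t>& b) {
    size_t n = a.size();
    std::vector<int64_t> res(2 * n, 0);
    if (n <= 32) {                       // small inputs: plain schoolbook
        for (size_t i = 0; i < n; ++i)
            for (size_t j = 0; j < n; ++j)
                res[i + j] += a[i] * b[j];
        return res;
    }
    size_t k = n / 2;
    std::vector<int64_t> a0(a.begin(), a.begin() + k), a1(a.begin() + k, a.end());
    std::vector<int64_t> b0(b.begin(), b.begin() + k), b1(b.begin() + k, b.end());
    std::vector<int64_t> as(k), bs(k);
    for (size_t i = 0; i < k; ++i) { as[i] = a0[i] + a1[i]; bs[i] = b0[i] + b1[i]; }

    std::vector<int64_t> z0 = karatsuba(a0, b0);   // low halves
    std::vector<int64_t> z2 = karatsuba(a1, b1);   // high halves
    std::vector<int64_t> z1 = karatsuba(as, bs);   // (a0+a1)*(b0+b1)
    for (size_t i = 0; i < z0.size(); ++i) z1[i] -= z0[i] + z2[i];   // middle term

    for (size_t i = 0; i < z0.size(); ++i) res[i]         += z0[i];
    for (size_t i = 0; i < z1.size(); ++i) res[i + k]     += z1[i];
    for (size_t i = 0; i < z2.size(); ++i) res[i + 2 * k] += z2[i];
    return res;
}

// Multiply two non-negative decimal strings.
std::string multiply_decimal(const std::string& x, const std::string& y) {
    size_t n = 1;
    while (n < std::max(x.size(), y.size())) n <<= 1;   // pad to a power of two
    std::vector<int64_t> a(n, 0), b(n, 0);
    for (size_t i = 0; i < x.size(); ++i) a[i] = x[x.size() - 1 - i] - '0';
    for (size_t i = 0; i < y.size(); ++i) b[i] = y[y.size() - 1 - i] - '0';

    std::vector<int64_t> c = karatsuba(a, b);
    std::string out;
    int64_t carry = 0;
    for (size_t i = 0; i < c.size(); ++i) {              // propagate carries once at the end
        int64_t cur = c[i] + carry;
        out.push_back(char('0' + cur % 10));
        carry = cur / 10;
    }
    while (out.size() > 1 && out.back() == '0') out.pop_back();
    std::reverse(out.begin(), out.end());
    return out;
}
```

This version copies sub-vectors at each level for clarity; a serious implementation would work in place and pack several digits per word rather than one.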
Just to expand on Pablo's answer, suppose each number is a string 1000008 decimal digits long. You could convert that into 111112 nine-digit decimal numbers, each stored in a UInt32, and run your multiplication algorithm on those. (Note you will need a UInt64 to hold the result of multiplying two UInt32 sections, so you may want a 64-bit machine.) That should give you a speedup over working one base-10 digit at a time of roughly 9^2 for schoolbook multiplication or 9^log2(3) for Karatsuba, depending on the algorithm.
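As a rough sketch of that packing step (the names here are mine, purely for illustration), the conversion from a decimal string into base-10^9 limbs and a plain schoolbook multiply over those limbs might look like this; each partial product of two UInt32 limbs is accumulated in a UInt64:

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Pack a decimal string into base-10^9 limbs, least significant limb first.
std::vector<uint32_t> to_limbs(const std::string& s) {
    std::vector<uint32_t> limbs;
    for (int i = (int)s.size(); i > 0; i -= 9) {
        int start = i >= 9 ? i - 9 : 0;               // take up to 9 digits at a time
        limbs.push_back((uint32_t)std::stoul(s.substr(start, i - start)));
    }
    return limbs;
}

// Schoolbook multiply over base-10^9 limbs; each partial product of two
// uint32_t limbs needs a uint64_t, which is where a 64-bit machine helps.
std::vector<uint32_t> multiply_limbs(const std::vector<uint32_t>& a,
                                     const std::vector<uint32_t>& b) {
    const uint64_t BASE = 1000000000ull;              // 10^9
    std::vector<uint64_t> acc(a.size() + b.size(), 0);
    for (size_t i = 0; i < a.size(); ++i)
        for (size_t j = 0; j < b.size(); ++j) {
            uint64_t cur = acc[i + j] + (uint64_t)a[i] * b[j];
            acc[i + j + 1] += cur / BASE;             // carry into the next limb
            acc[i + j] = cur % BASE;
        }
    std::vector<uint32_t> result(acc.size());
    uint64_t carry = 0;
    for (size_t k = 0; k < acc.size(); ++k) {         // final carry pass
        uint64_t cur = acc[k] + carry;
        result[k] = (uint32_t)(cur % BASE);
        carry = cur / BASE;
    }
    return result;
}
```

To print the result you would write the most significant limb as-is and zero-pad every other limb to 9 digits. The inner loops here are still O(n^2) over the limbs; for million-digit inputs you would replace them with Karatsuba (as above) or an FFT-based method, or just hand the limbs to a library like GMP.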