I'm looking for a fast and efficient radix sort implementation for a Dictionary/KeyValuePair collection, if possible in C# (but not mandatory). The key is an integer between 1 000 000 and 9 999 999 999. The number of values varies from 5 to several thousand. At the moment I'm using LINQ's OrderBy, which I believe uses quicksort. Performance is really important for me, and I would like to test whether a radix sort would be faster. I have only found array implementations. Of course I could try writing it myself, but because I'm new to this topic I doubt my version would be the fastest and most efficient one. ;-) Thank you.
Rene
Have you tested your code to determine that the LINQ-based sort is the bottleneck in your program? LINQ's sort is pretty darned quick. For example, the code below times the sorting of a dictionary that contains from 1,000 to 10,000 items. The average, over 1,000 runs, is on the order of 3.5 milliseconds.
static void DoIt()
{
    // Requires: using System; using System.Collections.Generic;
    //           using System.Diagnostics; using System.Linq;
    int NumberOfTests = 1000;
    Random rnd = new Random();
    TimeSpan totalTime = TimeSpan.Zero;
    for (int i = 0; i < NumberOfTests; ++i)
    {
        // fill the dictionary with a random number of items
        int DictionarySize = rnd.Next(1000, 10000);
        var dict = new Dictionary<int, string>();
        while (dict.Count < DictionarySize)
        {
            int key = rnd.Next(1000000, 9999999);
            if (!dict.ContainsKey(key))
            {
                dict.Add(key, "x");
            }
        }

        // Okay, sort
        var sw = Stopwatch.StartNew();
        var sorted = (from kvp in dict
                      orderby kvp.Key
                      select kvp).ToList();
        sw.Stop();
        totalTime += sw.Elapsed;
        Console.WriteLine("{0:N0} items in {1:N6} ms", dict.Count, sw.Elapsed.TotalMilliseconds);
    }
    Console.WriteLine("Total time = {0:N6} ms", totalTime.TotalMilliseconds);
    Console.WriteLine("Average time = {0:N6} ms", totalTime.TotalMilliseconds / NumberOfTests);
}
Note that the reported average includes the JIT time (the first time through the loop, which takes approximately 35 ms).
While it's possible that a good radix sort implementation would improve your sorting performance, I suspect your optimization efforts would be better spent somewhere else.
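If you do want to experiment, here's a minimal sketch of a byte-at-a-time LSD radix sort over an array of KeyValuePair<long, TValue>. This is my own untested sketch (the class and method names are made up), not a tuned implementation; note that keys up to 9 999 999 999 don't fit in an int, so the key type here is long.

static class RadixSorter
{
    // LSD (least-significant-digit-first) radix sort, one byte per pass.
    // Keys are assumed non-negative; the question's range (1 000 000 to
    // 9 999 999 999) fits in 34 bits, so "shift < 40" would also suffice.
    public static KeyValuePair<long, TValue>[] Sort<TValue>(KeyValuePair<long, TValue>[] items)
    {
        var buffer = new KeyValuePair<long, TValue>[items.Length];
        for (int shift = 0; shift < 64; shift += 8)
        {
            // Count how many keys fall into each of the 256 buckets.
            var counts = new int[256];
            foreach (var kvp in items)
                counts[(int)((kvp.Key >> shift) & 0xFF)]++;

            // Turn the counts into starting offsets (exclusive prefix sums).
            int offset = 0;
            for (int b = 0; b < 256; ++b)
            {
                int c = counts[b];
                counts[b] = offset;
                offset += c;
            }

            // Stable scatter into the buffer, then swap the two arrays.
            foreach (var kvp in items)
                buffer[counts[(int)((kvp.Key >> shift) & 0xFF)]++] = kvp;

            var tmp = items;
            items = buffer;
            buffer = tmp;
        }
        // After an even number of passes the sorted data ends up back in the
        // array the caller passed in (which this method overwrites).
        return items;
    }
}

Given a Dictionary<long, string> named dict, you'd call it with something like var sorted = RadixSorter.Sort(dict.ToArray());. I'd still benchmark it against the LINQ version on your real data before switching, since each pass touches every element twice and the dictionary-to-array copy isn't free either.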