I have a large amount of real-time data that needs to be processed as fast as possible.
The data comes in from multiple network connection threads.
All network threads pass the data to a shared function that performs some translation and interpretation on the received data, and then stores the information in a ConcurrentDictionary, object by object.
The problem is that the number of objects stored in this dictionary may reach 150K, and fetching an object in order to update it takes much longer than is acceptable.
public class MyObject
{
    System.Timers.Timer LostTimer = new System.Timers.Timer();
    public int ID;
    public DateTime UpdateTime;

    public MyObject()
    {
        LostTimer.Interval = 20000;
        LostTimer.Elapsed += LostTimer_Elapsed;
        LostTimer.Enabled = true;
    }

    void LostTimer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
    {
        // Lost = last update is older than the 20-second grace period
        if (UpdateTime < DateTime.Now.AddSeconds(-20))
            Console.WriteLine(ID + " Lost...");
    }
}
public class MyClass
{
    public MyClass() { }

    private ConcurrentDictionary<int, MyObject> Objects = new ConcurrentDictionary<int, MyObject>();

    void NetworkThread1DataReceived(eventArgs e)
    {
        Translate(e.Data);
    }

    void Translate(string[] data)
    {
        Task.Factory.StartNew(() =>
        {
            Parallel.ForEach(data, s =>
            {
                int id = int.Parse(s);
                MyObject o = null;
                Objects.TryGetValue(id, out o);
                if (o == null)
                {
                    o = new MyObject();
                    o.ID = id;
                    o.UpdateTime = DateTime.Now;
                    Objects.TryAdd(id, o);
                }
                else
                {
                    o.UpdateTime = DateTime.Now;
                }
            });
        });
    }
}
Now, with more than 30K objects, it reports objects as lost.
The logic is that I subtract the object's grace period from the current system time and compare the result with that object's last update time. Do you think this kind of thread-safe collection (the dictionary) cannot handle this amount of data, and that read/write access delays are causing objects to be reported as lost?
Before this I was using List<MyObject>
with lock(obj){}
to handle multi-threaded access to this shared memory, but that failed past 10K objects. After changing it to ConcurrentDictionary (.NET's built-in thread-safe dictionary), it works well up to 30K.
My target is 150K. Can I reach it with this logic?
So, for each object you add (30K objects) you create a timer. That's 30,000 active timers.
I think this creates a lot of overhead.
If this is just for logging/auditing, you should do it with a single timer, possibly keeping a separate list/dictionary of the objects you want to check.
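A sketch of that single-timer approach, assuming one sweep per second over the same ConcurrentDictionary (the LostMonitor/FindLost names are illustrative, not from the original code):

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Timers;

public class MyObject
{
    public int ID;
    public DateTime UpdateTime;
}

public class LostMonitor
{
    private readonly ConcurrentDictionary<int, MyObject> _objects;
    private readonly Timer _sweepTimer = new Timer(1000); // one timer for all objects

    public LostMonitor(ConcurrentDictionary<int, MyObject> objects)
    {
        _objects = objects;
        _sweepTimer.Elapsed += (sender, e) =>
        {
            foreach (int id in FindLost())
                Console.WriteLine(id + " Lost...");
        };
        _sweepTimer.Enabled = true;
    }

    // One pass over the dictionary replaces the per-object timers:
    // an object is "lost" when its last update is older than the grace period.
    public List<int> FindLost()
    {
        DateTime cutoff = DateTime.Now.AddSeconds(-20); // 20-second grace period
        var lost = new List<int>();
        foreach (var pair in _objects)
            if (pair.Value.UpdateTime < cutoff)
                lost.Add(pair.Value.ID);
        return lost;
    }
}
```

With 150K entries a single sweep is one cheap enumeration per tick instead of 150K timer callbacks competing for the thread pool.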