I need a linked list that can have items added at both ends. The list will hold data to be shown in a trend viewer. As there is a huge amount of data, I need to show data before it is completely read, so what I want to do is read a block of data that I know has already been written, and while I read that block have two threads filling both ends of the collection:
I thought of using LinkedList, but the documentation says it does not support this scenario. Any ideas on something in the Framework that can help me, or will I have to develop my own list from scratch?
Thanks in advance.
EDIT: The main idea of the solution is to do it without locking anything, because I'm reading a piece of the list that is not going to be changed while writing happens at other places. I mean, the read thread will only read one chunk (from A to B), a section that has already been completely written. When it finishes, and other chunks have been completely written, the reader will read those chunks while the writers write new data.
See the updated diagram:
If you are on .NET 4 you can use two ConcurrentQueue instances: one for the left side and one for the right side.
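A minimal sketch of that idea, assuming the samples are double values (leftSide and rightSide are just illustrative names):

using System.Collections.Concurrent;

// One queue per side; the writers only enqueue, the reader only dequeues.
ConcurrentQueue<double> leftSide = new ConcurrentQueue<double>();
ConcurrentQueue<double> rightSide = new ConcurrentQueue<double>();

// Writer threads:
leftSide.Enqueue(1.0);    // sample that belongs before the data already shown
rightSide.Enqueue(42.0);  // sample that belongs after the data already shown

// Reader thread drains whatever has been written so far:
double value;
while (rightSide.TryDequeue(out value))
{
    // hand 'value' to the trend viewer
}

ConcurrentQueue<T> handles the synchronization internally, so the reader and the writers never have to take an explicit lock.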
If I understand correctly, you have a linked list to which you are adding data at the beginning and the end. You are never adding or removing anywhere else. If this is the case, you do not have to worry about threading, since the other threads will never interfere with the section you are reading.
Simply do something like this:
//Everything between first and last is thread safe since
//the other threads only add before and after.
LinkedListNode<object> first = myList.First;
LinkedListNode<object> current = first;
LinkedListNode<object> last = myList.Last;
bool done = false;
if (current == null) return; //empty list
do
{
    //do stuff with current.Value
    if (current == last) done = true;
    current = current.Next;
} while (!done && current != null);
After you are done with this section, you can do the same with two more sections: from the new myList.First to first, and from last to the new myList.Last.
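A minimal sketch of that follow-up pass for the appended side, reusing the first and last nodes captured above (newLast and node are illustrative names; the prepended side works the same way, walking from the new myList.First towards first):

// Anything a writer appended during the first pass now sits after 'last'.
LinkedListNode<object> newLast = myList.Last;   // snapshot of the current end
LinkedListNode<object> node = last.Next;        // first node added after our pass

while (node != null)
{
    //do stuff with node.Value

    if (node == newLast) break;   // stop at the snapshot; writers may keep appending
    node = node.Next;
}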
You could use the linked list and just use normal .NET threading constructs like the lock keyword in order to protect access to the list.
Any custom list you developed would probably do something like that anyway.
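A minimal sketch of that approach, assuming double samples and a small wrapper class (TrendBuffer and its members are invented names):

using System.Collections.Generic;

// All access to the shared list goes through one lock object.
class TrendBuffer
{
    private readonly LinkedList<double> samples = new LinkedList<double>();
    private readonly object sync = new object();

    public void AddLeft(double value)  { lock (sync) samples.AddFirst(value); }
    public void AddRight(double value) { lock (sync) samples.AddLast(value); }

    // The reader copies a snapshot so the lock is held only briefly.
    public List<double> Snapshot()
    {
        lock (sync) return new List<double>(samples);
    }
}

Copying a snapshot keeps the lock short, at the cost of an extra allocation on every read.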
I would recommend considering another approach, with a single data structure to persist the incoming data; that way you keep the order of the incoming data messages. For instance, you can use a blocking queue; this SO post has a nice example: Creating a blocking Queue in .NET?
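A minimal sketch of the single-queue idea, assuming .NET 4 is available and using BlockingCollection<double> (which is backed by a ConcurrentQueue by default, so arrival order is preserved):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

BlockingCollection<double> incoming = new BlockingCollection<double>(boundedCapacity: 10000);

// Writer task:
Task writer = Task.Factory.StartNew(() =>
{
    for (int i = 0; i < 100; i++)
        incoming.Add(i);          // blocks if the bounded queue is full
    incoming.CompleteAdding();    // signal that no more data will arrive
});

// Reader: GetConsumingEnumerable blocks until data arrives or adding completes.
foreach (double value in incoming.GetConsumingEnumerable())
{
    Console.WriteLine(value);     // e.g. push the value to the trend viewer
}

writer.Wait();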
Why not use the LinkedList class?
The documentation says it's not thread-safe, so you have to synchronize access to the list yourself, but you have to do that with any data structure accessed by multiple threads.
Performance should be quite good; here is what MSDN says about inserting nodes at any position:
LinkedList provides separate nodes of type LinkedListNode, so insertion and removal are O(1) operations.
You just have to guard the read and insert operations with the lock construct.
EDIT
OK, I think I understand what you want. You want a list-like data structure that is split into chunks of items. You want to write and read chunks of items independently, without locking the whole list.
I suggest using a LinkedList to hold your chunks of data items. The chunks themselves can be represented as simple List instances, or as LinkedList instances as well.
You only have to lock access to the global LinkedList.
Now your writer threads each fill a private List with n items at a time. When a writer is finished, it locks the LinkedList and adds its private list of data items to it.
The reader thread locks the LinkedList, reads one chunk, and releases the lock. It can then process n data items without holding the lock.
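A minimal sketch of that chunk scheme, with invented names (ChunkedTrendData, PublishFront, PublishBack, TakeChunk) and List<double> chunks inside the outer LinkedList:

using System.Collections.Generic;

// The outer LinkedList holds finished chunks. Every access to it goes through
// one short lock, and a published chunk is never modified again, so its items
// can be processed without any locking.
class ChunkedTrendData
{
    private readonly LinkedList<List<double>> chunks = new LinkedList<List<double>>();
    private readonly object sync = new object();

    // A writer fills a private List<double> of n items without locking,
    // then publishes it at the front or back with a brief lock.
    public void PublishFront(List<double> chunk) { lock (sync) chunks.AddFirst(chunk); }
    public void PublishBack(List<double> chunk)  { lock (sync) chunks.AddLast(chunk); }

    // The reader takes one published chunk under the lock and processes its
    // n items after releasing it. Here chunks are consumed from the front;
    // the same pattern works from the back via Last/RemoveLast.
    public List<double> TakeChunk()
    {
        lock (sync)
        {
            if (chunks.Count == 0) return null;
            List<double> chunk = chunks.First.Value;
            chunks.RemoveFirst();
            return chunk;
        }
    }
}

Whether the reader removes a chunk or only remembers how far it has read is a design choice; removing it keeps the bookkeeping in this sketch minimal.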