I'm dealing with a hardware resource that can only handle one command at a time. I'm going to expose some of its API functions via a web interface, so obviously there's a good chance more than one command will be sent at a time. I have decided that queuing these commands as they're submitted is the best way to ensure serial processing.
I'm planning on implementing the queue in a static class. The web app's code-behind will add a command by calling the method corresponding to the command it wants. I want the calling method to wait until it gets the output of its command, so no async magic is required.
Am I doing this right? Is there a better way?
How do I start implementing the queue in C# (I usually work with Java)? I assume I'll need some sort of Event to signal a job has been added, and a Handler to initiate processing of the queue...
I'm using .NET Framework 4.
You can use the ConcurrentQueue class for your implementation and have a dedicated thread process items in the queue.
For the waiting part you can use an AutoResetEvent: each producer passes an event instance to the singleton class along with the request, then calls WaitOne(), which blocks until the processor signals that processing is complete by calling Set().
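Something along these lines is a minimal sketch of that approach, assuming .NET 4. The HardwareCommand, CommandProcessor, and ExecuteOnHardware names are placeholders I've made up, not part of any existing API:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Illustrative sketch only: one request object per command, carrying its own event.
public class HardwareCommand
{
    public string Input;                                  // whatever the command needs
    public string Output;                                 // filled in by the processor
    public AutoResetEvent Done = new AutoResetEvent(false);
}

public static class CommandProcessor
{
    private static readonly ConcurrentQueue<HardwareCommand> queue =
        new ConcurrentQueue<HardwareCommand>();
    private static readonly AutoResetEvent workAvailable = new AutoResetEvent(false);

    static CommandProcessor()
    {
        // Dedicated background thread that drains the queue one command at a time.
        var worker = new Thread(ProcessLoop) { IsBackground = true };
        worker.Start();
    }

    // Called from the web code-behind; blocks until this command has been processed.
    public static string Execute(string input)
    {
        var cmd = new HardwareCommand { Input = input };
        queue.Enqueue(cmd);
        workAvailable.Set();   // wake the worker
        cmd.Done.WaitOne();    // block the caller until the worker signals completion
        return cmd.Output;
    }

    private static void ProcessLoop()
    {
        while (true)
        {
            workAvailable.WaitOne();
            HardwareCommand cmd;
            while (queue.TryDequeue(out cmd))
            {
                cmd.Output = ExecuteOnHardware(cmd.Input);  // serial access to the device
                cmd.Done.Set();                             // release the waiting caller
            }
        }
    }

    private static string ExecuteOnHardware(string input)
    {
        // Placeholder for the real hardware call.
        return "result for " + input;
    }
}
```

The code-behind would then just call CommandProcessor.Execute(...) and block until the device has finished its command; because there is a single worker thread, commands reach the hardware strictly one at a time.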
Sounds like a good approach EXCEPT: use the generic Queue<T> collection class. Do not write your own! You would be reinventing a well-built wheel.
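If you prefer the plain generic Queue<T>, a lock plus Monitor.Wait/Pulse gives the same producer/consumer behaviour. A rough sketch under that assumption (SimpleCommandQueue and Run are invented names):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Illustrative only: the built-in Queue<T> guarded by a lock, with one worker thread.
public static class SimpleCommandQueue
{
    private static readonly Queue<Action> pending = new Queue<Action>();
    private static readonly object sync = new object();

    static SimpleCommandQueue()
    {
        var worker = new Thread(ProcessLoop) { IsBackground = true };
        worker.Start();
    }

    private static void ProcessLoop()
    {
        while (true)
        {
            Action job;
            lock (sync)
            {
                // Sleep until a producer pulses, then take exactly one job.
                while (pending.Count == 0)
                    Monitor.Wait(sync);
                job = pending.Dequeue();
            }
            job();  // only this thread ever talks to the hardware
        }
    }

    public static T Run<T>(Func<T> hardwareCall)
    {
        T result = default(T);
        var done = new ManualResetEvent(false);
        lock (sync)
        {
            pending.Enqueue(delegate { result = hardwareCall(); done.Set(); });
            Monitor.Pulse(sync);
        }
        done.WaitOne();  // caller blocks until its own command has run
        return result;
    }
}
```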