I'm writing a server in Python that needs to take requests from clients, queue the requests, execute them one at a time, and then tell each client that its particular request has been processed.
Currently I've approached it with a TCP socket server; however, I'm not sure how to make it so that only one request from the queue is executed at a time.
The way I would like for it to look:
Client1 -> (a) -> Server
Client2 -> (b) -> Server
Client3 -> (c) -> Server
Server makes queue |a, b, c|
Execute a first. Done? Tell Client 1
Execute b second. Done? Tell Client 2
Execute c third. Done? Tell Client 3
From what I understand, if I have the server recv the client's request, execute it, and respond, that may happen at the same time in different threads. I only want one thread executing all the tasks (because I anticipate many tasks coming in, and it would be slow if everything ran at the same time). How do I accomplish that?
There are tons of ways to skin it, but a solution is going to look something like the layout below:
Client -> Client-Mediator (TCP Port) <--> Server Mediator -> (ServerQ) <- Task Process
The flow would be like this:
Client Process:
- Client creates a client mediator on a tcp socket.
- Sends whatever info it needs over the port.
- Server Mediator receives the request
- Creates a response Q for the Task Process
- Places the request on the Server Q (command + responseQ)
- Waits for a response on the responseQ
- No response after X amount of time? Time out.
- Once the response comes, read it and send it back over the TCP port.
Server Process:
- Reads from Server Q.
- Processes command
- Write the response to the response Q
Components involved
Client - Simple process that sends requests for tasks to be completed.
Client-Mediator - Creates a connection to the server process.
Server-Mediator - Accepts a client request for task processing, enqueues tasks and waits for response.
Task Process - Reads from ServerQ and waits for a task to come in.
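For concreteness, here is a minimal sketch of that layout using Python's standard socketserver, threading, and queue modules. This is not code from the answer: handle_command(), the 4096-byte read, the 30-second timeout, and the port number are all placeholder assumptions.

```python
# Sketch of the mediator/queue layout: Server Mediator enqueues
# (command + responseQ) on the ServerQ; the Task Process works them
# off one at a time and replies on the per-request responseQ.
import queue
import socketserver
import threading

server_q = queue.Queue()          # the ServerQ: (command, response_q) tuples


def handle_command(command):
    # Placeholder for whatever work the task actually does.
    return b"done: " + command


def task_process():
    """Task Process: pulls one request at a time off the ServerQ."""
    while True:
        command, response_q = server_q.get()
        result = handle_command(command)   # the actual task logic
        response_q.put(result)             # hand the result back to the mediator


class ServerMediator(socketserver.BaseRequestHandler):
    """Server Mediator: one instance per client connection."""

    def handle(self):
        command = self.request.recv(4096)
        response_q = queue.Queue()                 # per-request response Q
        server_q.put((command, response_q))        # enqueue command + responseQ
        try:
            result = response_q.get(timeout=30)    # wait for the Task Process
        except queue.Empty:
            result = b"timeout"                    # no response after X time
        self.request.sendall(result)


if __name__ == "__main__":
    threading.Thread(target=task_process, daemon=True).start()
    with socketserver.ThreadingTCPServer(("localhost", 9999), ServerMediator) as srv:
        srv.serve_forever()
```

Because there is only one Task Process thread reading the ServerQ, requests are executed strictly one at a time even though each client connection gets its own mediator thread.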
Okay, so what Nix said was right, but I wasn't sure how to actually make that happen (my question was about how to go about actually building this).
As it turns out, I had to start two threads: one that executes tasks from the queue, and the other being the main server handler. The server handler spawns a thread for each new connection, and the client blocks after sending a request (if the request was successfully queued). This means the queue needs to be thread-safe / protected with a semaphore or mutex; in Python, the multiprocessing.Queue class handles that for you. Whenever a task finishes executing, the execution thread does a notifyAll(), which wakes all sleeping threads so they can check whether their requested task is done. I use a condition variable for that.
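A compressed sketch of that arrangement follows, assuming Python 3's socketserver, threading, and queue modules. It swaps the multiprocessing.Queue mentioned above for queue.Queue (everything here lives in one process), and execute(), the done_tasks dict, the buffer size, and the port are illustrative placeholders rather than the original code.

```python
# Two threads of interest: a single worker that executes queued tasks one at
# a time, and per-connection handler threads that block on a condition
# variable until notify_all() signals that something finished.
import queue
import socketserver
import threading

task_q = queue.Queue()          # thread-safe queue of (task_id, payload)
done_tasks = {}                 # task_id -> result, guarded by done_cond
done_cond = threading.Condition()


def execute(payload):
    return b"processed: " + payload          # placeholder task logic


def worker():
    """The single execution thread: runs queued tasks one at a time."""
    while True:
        task_id, payload = task_q.get()
        result = execute(payload)
        with done_cond:
            done_tasks[task_id] = result
            done_cond.notify_all()           # wake every waiting handler thread


class Handler(socketserver.BaseRequestHandler):
    """One handler thread per connection; blocks until its task is done."""

    def handle(self):
        payload = self.request.recv(4096)
        task_id = object()                   # unique token for this request
        task_q.put((task_id, payload))
        with done_cond:
            # Sleep until the worker signals, then check whether *our*
            # task is the one that finished.
            while task_id not in done_tasks:
                done_cond.wait()
            result = done_tasks.pop(task_id)
        self.request.sendall(result)


if __name__ == "__main__":
    threading.Thread(target=worker, daemon=True).start()
    with socketserver.ThreadingTCPServer(("localhost", 9999), Handler) as srv:
        srv.serve_forever()
```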