Python: Django, how to own a long-running process?

https://www.devze.com 2023-02-19 17:03 Source: Web
So I have a background process that I need to expose/control as a web service. I have wrapped the process to be able to accept commands via a pipe, but now I am trying to find out how to control it.

Requirements are as follows:

  1. Need to be able to start the process via the web
  2. Need to be able to send commands
  3. Need to be able to return results from commands
  4. The process, once started, stays alive until killed

I think the main question is: how do I get Django to own the process? Own in the sense of keeping a valid reference to the pipe for future communication with the background process. Right now it's something along these lines (just an example):

    from multiprocessing import Pipe

    if __name__ == '__main__':
        to_process_pipe, process_pipe = Pipe()
        node = PFacade(process_pipe)  # PFacade wraps the background process
        node.start()

        to_process_pipe.send(['connect'])
        print(to_process_pipe.recv())

        node.killed = True
        node.join()

I think I need a better way to communicate, because I am not sure how I could store the Pipe in Django.


And please, if you are going to respond with "use Celery", please give me a good explanation of how.


OK, so you want a process to be up and running and accepting commands from the Django workers?

In such a case Celery will not be a good solution, as it does not provide communication with a task after it is spawned.

IMHO a good solution would be a daemon (implemented as a Django management command) with an infinite main loop, some sleep between runs, listening for commands from a specific queue.

For the communication, kombu/django-kombu will work well (it was originally part of Celery).
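The shape of that daemon's main loop can be sketched without a broker. In a real management command the queue would be a kombu/django-kombu queue; here a stdlib `queue.Queue` stands in, and the function name, the `'stop'` sentinel, and the poll interval are all illustrative assumptions, not kombu API:

```python
import queue
import time

def daemon_loop(commands, results, poll_interval=0.05):
    """Infinite main loop of the daemon: pull commands off a queue,
    act on them, push results back.  `commands`/`results` stand in
    for broker-backed queues; 'stop' is a hypothetical sentinel that
    ends the otherwise infinite loop."""
    while True:
        try:
            cmd = commands.get_nowait()
        except queue.Empty:
            time.sleep(poll_interval)  # some sleep between runs
            continue
        if cmd == 'stop':
            break
        results.put('handled: %s' % cmd)
```

In the real management command this loop would live inside `handle()`, and the Django views would publish commands onto the broker queue instead of a local `queue.Queue`.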


My final solution was to write a custom "mailbox" based on pidbox.Mailbox. Their implementation was horribly broken, but the algorithm was solid.
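The core of that algorithm is simple: each message names a method and its arguments, and the mailbox dispatches it to a handler object and returns the result. A minimal sketch of the idea (the class and handler names here are illustrative, not the real pidbox API):

```python
class Mailbox:
    """Minimal pidbox-style mailbox: a message is a dict naming a
    method and its keyword arguments; dispatch looks the method up
    on a handler object and returns whatever it returns."""

    def __init__(self, handler):
        self.handler = handler

    def dispatch(self, message):
        method = getattr(self.handler, message['method'])
        return method(**message.get('arguments', {}))

class NodeHandler:
    """Stand-in for the wrapped background process: each command the
    process understands becomes a method here."""

    def connect(self, host='localhost'):
        return 'connected to %s' % host

mailbox = Mailbox(NodeHandler())
reply = mailbox.dispatch({'method': 'connect', 'arguments': {'host': 'db1'}})
```

The reply can then be published back on a reply queue, which is how results make it back to the web side.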

I basically stood up a REST API hosted via Django and then had that REST API send a message to an AMQP queue (a QPID implementation).

I then had a process that sits, monitors the queues, and passes along any commands as they come in.

It worked well and was pretty awesome when it came together.
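End to end, the flow is: web view publishes a command, the long-lived monitor process consumes it and publishes a reply. A runnable sketch of that flow, with threads and stdlib queues standing in for the Django view, the monitor process, and the AMQP/QPID queues (all names and the timeout are illustrative):

```python
import threading
import queue

# Request/reply queues stand in for the two AMQP queues.
requests, replies = queue.Queue(), queue.Queue()

def monitor():
    """The long-lived process: sits on the command queue and answers.
    None is a hypothetical shutdown sentinel."""
    while True:
        cmd = requests.get()
        if cmd is None:
            break
        replies.put('ok: %s' % cmd)

def rest_view(command, timeout=1.0):
    """What the Django REST view does: enqueue the command, then wait
    for the daemon's reply."""
    requests.put(command)
    return replies.get(timeout=timeout)

worker = threading.Thread(target=monitor, daemon=True)
worker.start()
result = rest_view('status')
requests.put(None)
worker.join()
```

With real AMQP queues the view and the monitor run in separate processes (even separate machines), but the request/reply shape stays the same.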


Maybe Celery (an asynchronous task queue/job queue based on distributed message passing) fits the bill.
