
How to have Django give an HTTP response before continuing on to complete a task associated with the request?

Source: https://www.devze.com, 2023-03-18 05:54

In my Django piston API, I want to yield/return an HTTP response to the client before calling another function that will take quite some time. How do I make the yield give an HTTP response containing the desired JSON, and not a string relating to the creation of a generator object?

My piston handler method looks like so:

def create(self, request):
    data = request.data 

    # ... other operations ...

    incident.save()
    response = rc.CREATED
    response.content = {"id":str(incident.id)}
    yield response
    manage_incident(incident)

Instead of the response I want, like:

   {"id":"13"}

The client gets a string like this:

 "<generator object create at 0x102c50050>"
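That string is just the `repr()` of the generator object: because the view contains a `yield`, calling it returns a generator rather than running its body, and the framework coerces that object to a string for the response body. A minimal plain-Python sketch of the same effect (no Django involved):

```python
def create():
    # same shape as the view: a function with a yield in it
    yield '{"id": "13"}'

result = create()   # calling a generator function returns a generator object,
                    # without executing any of the function body
body = str(result)  # str() of a generator is its repr, not its yielded values
print(body)         # something like "<generator object create at 0x...>"
```

Nothing before the first `yield` runs until something iterates the generator, which is why the client sees the repr instead of the JSON.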

EDIT:

I realise that using yield was the wrong way to go about this. In essence, what I am trying to achieve is that the client receives a response right away, before the server moves on to the time-costly manage_incident() call.


This doesn't have anything to do with generators or yielding, but I've used the following code and decorator to have things run in the background while returning the client an HTTP response immediately.

Usage:

@postpone
def long_process():
    ...  # do things

def some_view(request):
    long_process()
    return HttpResponse(...)

And here's the code to make it work:

import atexit
import Queue
import threading

from django.core.mail import mail_admins


def _worker():
    while True:
        func, args, kwargs = _queue.get()
        try:
            func(*args, **kwargs)
        except Exception:
            import traceback
            details = traceback.format_exc()
            mail_admins('Background process exception', details)
        finally:
            _queue.task_done()  # so we can join at exit

def postpone(func):
    def decorator(*args, **kwargs):
        _queue.put((func, args, kwargs))
    return decorator

_queue = Queue.Queue()
_thread = threading.Thread(target=_worker)
_thread.daemon = True
_thread.start()

def _cleanup():
    _queue.join()   # so we don't exit too soon

atexit.register(_cleanup)
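On Python 3 the `Queue` module is named `queue`, and `mail_admins` needs a configured Django project, so here is a self-contained port of the same idea (error reporting swapped for a plain traceback print):

```python
import atexit
import queue
import threading
import traceback

_queue = queue.Queue()

def _worker():
    while True:
        func, args, kwargs = _queue.get()
        try:
            func(*args, **kwargs)
        except Exception:
            traceback.print_exc()  # mail_admins(...) in the Django version
        finally:
            _queue.task_done()     # so we can join at exit

def postpone(func):
    def decorator(*args, **kwargs):
        _queue.put((func, args, kwargs))  # enqueue instead of calling
    return decorator

threading.Thread(target=_worker, daemon=True).start()
atexit.register(_queue.join)  # so we don't exit before queued work finishes

# usage: the decorated call returns immediately; work runs on the worker thread
results = []

@postpone
def long_process(x):
    results.append(x * 2)

long_process(21)  # returns at once
_queue.join()     # joined here only to demonstrate the result
print(results)    # [42]
```

The `task_done()`/`join()` pairing is what lets the `atexit` hook wait for in-flight work instead of dropping it when the process exits.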


Perhaps you could do something like this (be careful though):

import threading
def create(self, request):
    data = request.data 
    # do stuff...
    t = threading.Thread(target=manage_incident,
                         args=(incident,))
    t.daemon = True
    t.start()
    return response

Has anyone tried this? Is it safe? My guess is that it's not, mostly because of concurrency issues, but also because if you get a lot of requests you might end up with a lot of threads running at once (since each may run for a while). Still, it might be worth a shot.

Otherwise, you could just add the incident that needs to be managed to your database and handle it later via a cron job or something like that.
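The cron approach boils down to: persist the pending work, respond immediately, and let a scheduled job sweep up unmanaged incidents later. A sketch of that pattern using sqlite3 in place of the Django ORM (the `managed` flag and table name are illustrative, not from the question):

```python
import sqlite3

# hypothetical schema: incidents are saved with managed=0, handled later by cron
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE incident (id INTEGER PRIMARY KEY, managed INTEGER DEFAULT 0)")

def create_view():
    # the request handler: save the incident and respond right away
    cur = db.execute("INSERT INTO incident (managed) VALUES (0)")
    db.commit()
    return {"id": str(cur.lastrowid)}

def manage_pending():
    # what the cron job (e.g. a management command) would run periodically
    rows = db.execute("SELECT id FROM incident WHERE managed = 0").fetchall()
    for (incident_id,) in rows:
        # ... the time-costly manage_incident(incident_id) work goes here ...
        db.execute("UPDATE incident SET managed = 1 WHERE id = ?", (incident_id,))
    db.commit()
    return len(rows)

response = create_view()  # client gets {"id": "1"} immediately
print(response)
print(manage_pending())   # later: processes the one pending incident
```

The trade-off versus threads is latency (work waits for the next cron tick) in exchange for durability: a queued incident survives a server restart.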

I don't think Django is built either for concurrency or very time consuming operations.

Edit

Someone has tried it; it seems to work.

Edit 2

These kinds of things are often better handled by background jobs. The Django Background Tasks library is nice, but there are others, of course.


You've turned your view into a generator thinking that Django will pick up on that fact and handle it appropriately. Well, it won't.

def create(self, request):
    return HttpResponse(real_create(request))

EDIT:

Since you seem to be having trouble... visualizing it...

def stuff():
    print(1)
    yield 'foo'
    print(2)

for i in stuff():
    print(i)

output:

1
foo
2
