I'm porting one of my projects from C# and am having trouble solving a multithreading issue in Python. The problem relates to a long-lived HTTP request, which is expected (the request will respond when a certain event occurs on the server). Here's the summary:
I send the request using urllib2 on a separate thread. When the request returns or times out, the main thread is notified. This works fine. However, there are cases where I need to abort this outstanding request and switch to a different URL. There are four solutions that I can consider:
- Abort the outstanding request. C# has WebRequest.Abort(), which I can call cross-thread to abort the request. Python's urllib2.Request appears to be a pure data class, in that instances only store request information; responses are not connected to Request objects. So I can't do this.
- Interrupt the thread. C# has Thread.Interrupt(), which will raise a ThreadInterruptedException in the thread if it is in a wait state, or the next time it enters such a state. (Waiting on a monitor and file/socket I/O are both waiting states.) Python doesn't seem to have anything comparable; there does not appear to be a way to wake up a thread that is blocked on I/O.
- Set a low timeout on the request. On a timeout, check an "aborted" flag. If it's false, restart the request. (A rough sketch of this follows the list.)
- Similar to option 3, add an "aborted" flag to the state object so that when the request does finally end in one way or another, the thread knows that the response is no longer needed and just shuts itself down.
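For concreteness, here is roughly what option 3 would look like; the URL, the timeout value, and the flag handling are placeholders for illustration, not code from my project:

import socket
import urllib2

aborted = False  # the main thread sets this when the request is no longer needed

def long_poll(url):
    while not aborted:
        try:
            # Short timeout so we can periodically re-check the flag.
            return urllib2.urlopen(url, timeout=5).read()
        except (urllib2.URLError, socket.timeout):
            continue  # timed out (or failed); check the flag and restart
    return None  # aborted before the server responded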
Options 3 and 4 seem to be the only ones supported by Python, but option 3 is a horrible solution and 4 will keep open a connection I don't need. I am hoping to be a good netizen and close this connection when I no longer need it. Is there any way to actually abort the outstanding request, one way or another?
Consider using gevent. Gevent uses non-thread cooperating units of execution called greenlets. Greenlets can "block" on IO, which really means "go to sleep until the IO is ready". You could have a requester greenlet that owns the socket and a main greenlet that decides when to abort. When you want to abort and switch URLs the main greenlet kills the requester greenlet. The requester catches the resulting exception, closes its socket/urllib2 request, and starts over.
Edited to add: Gevent is not compatible with threads, so be careful with that. You'll have to either use gevent all the way or threads all the way. Threads in python are kinda lame anyway because of the GIL.
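A minimal sketch of that pattern, assuming gevent's monkey patching so urllib2's sockets become cooperative; the URLs and the two-second delay are placeholders, not part of the answer above:

from gevent import monkey
monkey.patch_all()
import gevent
import urllib2

def fetch(url):
    response = None
    try:
        response = urllib2.urlopen(url)
        return response.read()
    except gevent.GreenletExit:
        # The main greenlet killed us: close the connection and give up.
        if response is not None:
            response.close()
        raise

requester = gevent.spawn(fetch, 'http://example.com/long-poll')
gevent.sleep(2)               # the main greenlet decides to switch URLs
requester.kill(block=True)    # aborts the outstanding request
requester = gevent.spawn(fetch, 'http://example.com/other-url')
result = requester.get()      # wait for the replacement request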
Similar to Spike Gronim's answer, but even more heavy-handed.
Consider rewriting this in Twisted. You probably would want to subclass twisted.web.http.HTTPClient, in particular implementing handleResponsePart to do your client interaction (or handleResponseEnd if you don't need to see it before the response ends). To close the connection early, you just call the loseConnection method on the client protocol.
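A rough sketch of that approach; the class names, host, path, and the aborted flag are made up for illustration and are not part of the answer above:

from twisted.internet import reactor
from twisted.internet.protocol import ClientFactory
from twisted.web.http import HTTPClient

class LongPollClient(HTTPClient):
    def connectionMade(self):
        self.sendCommand('GET', '/wait-for-event')
        self.sendHeader('Host', 'example.com')
        self.endHeaders()

    def handleResponsePart(self, data):
        # Called as response data arrives; bail out early if we no longer care.
        if self.factory.aborted:
            self.transport.loseConnection()

    def handleResponseEnd(self):
        self.transport.loseConnection()

class LongPollFactory(ClientFactory):
    protocol = LongPollClient
    aborted = False  # flip this from elsewhere to abort the outstanding request

reactor.connectTCP('example.com', 80, LongPollFactory())
reactor.run()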
Maybe this snippet of "killable thread" could be useful to you if you have no other choice. But I would have the same opinion as Spike Gronim and recommend using gevent.
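Recipes like that typically work by asynchronously raising an exception in the target thread via ctypes; the sketch below shows that mechanism (the function name async_raise is mine, not from the snippet). Note that this still won't wake a thread blocked inside a C-level socket call, which is exactly the situation in the question.

import ctypes
import threading

def async_raise(thread, exc_type):
    # exc_type must be an exception class, not an instance.
    tid = ctypes.c_long(thread.ident)
    res = ctypes.pythonapi.PyThreadState_SetAsyncExc(tid, ctypes.py_object(exc_type))
    if res == 0:
        raise ValueError("invalid thread id")
    elif res > 1:
        # Affected more than one thread: undo the damage.
        ctypes.pythonapi.PyThreadState_SetAsyncExc(tid, None)
        raise SystemError("PyThreadState_SetAsyncExc failed")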
I found this question using google and used Spike Gronim's answer to come up with:
from gevent import monkey
monkey.patch_all()
import gevent
import requests

def post(*args, **kwargs):
    # Pull out our extra keyword argument before passing the rest to requests.
    if 'stop_event' in kwargs:
        stop_event = kwargs['stop_event']
        del kwargs['stop_event']
    else:
        stop_event = None
    req = gevent.spawn(requests.post, *args, **kwargs)
    # Poll in short intervals so we notice the stop event promptly;
    # ready() also covers the case where the request raised an exception.
    while not req.ready():
        req.join(timeout=0.1)
        if stop_event and stop_event.is_set():
            req.kill()
            break
    return req.value
I thought it might be useful for other people as well.
It works just like a regular requests.post, but takes an extra keyword argument 'stop_event'. This is a threading.Event. The request will abort if the stop_event gets set.
Use with caution, because if it's not waiting for either the connection or the communication, it can block the GIL (as mentioned). It (gevent) does seem compatible with threading these days (through monkey patching).
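A hypothetical usage sketch (the URL and payload are placeholders): hand the event to whatever code decides the request is no longer needed and have it call set().

import threading

stop = threading.Event()
# ... later, some other code calls stop.set() when it wants to abort ...
result = post('http://example.com/long-poll', data={'key': 'value'}, stop_event=stop)
if result is None:
    print('request was aborted (or failed)')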