I am running my application on my production server and it is giving me this error:
File "/usr/lib/python2.6/urllib2.py", line 391, in open
response = self._open(req, data)
File "/usr/lib/python2.6/urllib2.py", line 409, in _open
'_open', req)
File "/usr/lib/python2.6/urllib2.py", line 369, in _call_chain
result = func(*args)
File "/usr/lib/python2.6/urllib2.py", line 1161, in http_open
return self.do_open(httplib.HTTPConnection, req)
File "/usr/lib/python2.6/urllib2.py", line 1136, in do_open
raise URLError(err)
URLError: <urlopen error timed out>
BTW, I understand the error. But that is not the interesting part. The interesting part is that when I run this on my local machine or on the test server, everything works fine. It is just too annoying.
I am using the same OS everywhere:
Ubuntu 10.04
What could be the possible reason? Any help will be appreciated.
Can you retrieve the URL in question with wget on your production server? This could be a firewall problem rather than a Python bug.
It seems reasonable that the production machine is taking significantly longer to get a response than your test or local machines, perhaps due to under-provisioning or excess load on the production system. So you should check what your timeouts actually are and make sure you aren't accidentally setting the global timeout too aggressively.
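For example, here is a minimal sketch (the URL and the 2-second value are placeholders, not from your code) of how a low global default timeout set elsewhere would surface as exactly this URLError once the production box responds slowly:

import socket
import urllib2

# If some library or earlier code has lowered the global default, e.g.
# socket.setdefaulttimeout(2), every request made without an explicit timeout
# will raise "timed out" as soon as the server takes longer than 2 seconds.
print socket.getdefaulttimeout()   # None means no global timeout is set

try:
    response = urllib2.urlopen('http://example.com/slow-endpoint')
    print response.getcode()
except urllib2.URLError as e:
    print 'request failed:', e.reason   # "timed out" when the timeout fires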
This is probably an issue with socket.setdefaulttimeout / socket.settimeout. The call to urllib2.urlopen accepts a timeout argument in Python 2.6. Try urllib2.urlopen(..., timeout=None) and see if that resolves things.
I'd also confirm the value of socket.getdefaulttimeout() before making your urlopen call.
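A rough sketch of that combination, with a placeholder URL:

import socket
import urllib2

# Check the process-wide default before the call; None means sockets block
# indefinitely unless a timeout is given explicitly.
print socket.getdefaulttimeout()

# Passing timeout=None overrides any global default for this one request
# (the timeout parameter was added to urlopen in Python 2.6).
response = urllib2.urlopen('http://example.com/some/path', timeout=None)
print response.getcode()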