I have a PHP script which makes many consecutive calls to a SOAP-based web service in a short space of time; something like 200 within 0.5 seconds. I've noticed that, rarely, there are connection errors or "fatal protocol errors", in maybe 1 out of every 200 requests. I wondered if this could be because I'm "hammering" the service.
Would there be any benefit in adding something like usleep(100); between each SOAP call to reduce the impact on the service, or does SOAP have some kind of built-in buffering/queueing/retrying system so that services can't be flooded like this from a single client?
Thanks.
There is no specific limit in the SOAP protocol itself, but a limit may be applied by the service provider. 200 requests within 0.5 seconds is indeed quite a lot and may cause some of your requests to fail due to a lack of server resources.
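If you cannot reduce the number of requests straight away, spacing the calls out and retrying on failure will usually smooth out these sporadic errors. Note that usleep() takes microseconds, so usleep(100) pauses for only 0.1 ms; a gap of a few milliseconds is more likely to make a difference. Below is a minimal sketch, assuming an existing SoapClient, an example WSDL URL and a hypothetical getItem operation on the remote service:

```php
<?php
// Minimal throttle-and-retry wrapper around consecutive SOAP calls.
// The WSDL URL and the getItem operation are assumptions for illustration.
$client = new SoapClient('https://example.com/service?wsdl');

function callWithRetry(SoapClient $client, string $method, array $args, int $maxAttempts = 3)
{
    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        try {
            return $client->__soapCall($method, $args);
        } catch (SoapFault $fault) {
            if ($attempt === $maxAttempts) {
                throw $fault;            // give up after the final attempt
            }
            usleep(100000 * $attempt);   // back off: 0.1 s, then 0.2 s, ...
        }
    }
}

$ids = range(1, 200);                    // the 200 items currently fetched one by one
$results = [];
foreach ($ids as $id) {
    $results[$id] = callWithRetry($client, 'getItem', [['id' => $id]]);
    usleep(10000);                       // 10 ms gap between calls
}
```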
If you control the service provider, I suggest optimizing your protocol to make fewer requests by bundling many operations within each request, as in the sketch below.
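For example, if the service currently exposes only a single-item operation, adding a batch operation lets the client send one request instead of 200. The getItems operation and the WSDL URL below are assumptions for illustration:

```php
<?php
// Sketch of bundling: one SOAP request carrying all the ids instead of one request per id.
// The getItems batch operation and the WSDL URL are assumptions for illustration.
$client = new SoapClient('https://example.com/service?wsdl');

$ids = range(1, 200);

// One request for all 200 ids rather than 200 separate requests.
$response = $client->__soapCall('getItems', [['ids' => $ids]]);

foreach ($response->items as $item) {
    // process each returned item here
}
```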
If you cannot change the protocol, a caching mechanism or proxy server on your side can limit the number of requests. A cache could be implemented with memcached, a database, or even files saved in the temp directory on your web server; a sketch of this approach follows. A proxy server such as Squid will happily proxy SOAP requests.
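A simple local cache in front of the SoapClient might look like the sketch below, using the Memcached extension; the key scheme, the 300-second TTL and the getItem operation are assumptions for illustration:

```php
<?php
// Cache SOAP responses locally so repeated identical requests never reach the service.
// Requires the Memcached extension; the key scheme and 300-second TTL are arbitrary choices.
$client = new SoapClient('https://example.com/service?wsdl');

$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

function cachedCall(SoapClient $client, Memcached $cache, string $method, array $args)
{
    $key = 'soap:' . $method . ':' . md5(serialize($args));

    $cached = $cache->get($key);
    if ($cached !== false || $cache->getResultCode() === Memcached::RES_SUCCESS) {
        return $cached;                  // cache hit
    }

    $result = $client->__soapCall($method, $args);
    $cache->set($key, $result, 300);     // keep for 5 minutes
    return $result;
}

$item = cachedCall($client, $cache, 'getItem', [['id' => 42]]);
```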
In a production environment, a page request from a client that depends on 200 consecutive SOAP requests will scale very badly. I suggest you look into improving your application to make fewer requests if possible, using one of the methods outlined above. Once that has been optimised as far as is feasible, then look into improving the service provider.