I have a PHP script that takes a long time to finish, and it fails with an execution timeout (the script runs too long) or a network timeout.
Essentially the script is a for loop that performs two or three steps on each iteration:
- request an external service through curl
- parse the xml
- insert the response into a database
Suppose that each iteration takes 1-2 seconds to complete.
To work around the timeouts I tried setting max_execution_time
and default_socket_timeout
to 0. Please let me know whether this is correct; I know it is only a temporary fix.
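Concretely, this is roughly what I put at the top of the script (the cURL timeout options are what I am considering adding; I am not certain that 0 actually disables default_socket_timeout, so please correct me if that is wrong):

```php
<?php
// Temporary workaround: lift PHP's own execution limit.
// Note: the runtime directive is max_execution_time; 0 means "no limit".
set_time_limit(0);                       // same effect as ini_set('max_execution_time', '0')
ini_set('default_socket_timeout', '0');  // affects socket-based streams, not cURL handles

// For the cURL requests themselves, the per-handle options govern:
// CURLOPT_TIMEOUT = 0 means "never time out", but a finite bound seems
// safer when the gateway is slow.
$ch = curl_init();
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // give up on an unreachable gateway
curl_setopt($ch, CURLOPT_TIMEOUT, 30);        // bound each request instead of using 0
```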
The key point to note is that the for loop may process 70,000 records -- it's basically a bulk SMS system.
I've tried sending partial responses to the browser using flush()
or ob_implicit_flush(true)
and setting output_buffering
to 0, but I only ever receive the final response, never the partial ones.
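Stripped down to a runnable sketch, this is the pattern I'm attempting (the three-element array is a placeholder for the real 70,000 rows, and the cURL/XML/DB work is stubbed out):

```php
<?php
// Emit one status line per processed record and flush it immediately.
function stream_progress(array $records): int
{
    $done = 0;
    foreach ($records as $i => $record) {
        // ... cURL request, XML parsing, DB insert would happen here ...
        echo "processed record $i\n";
        flush();                      // push what we have out to the client
        $done++;
    }
    return $done;
}

// Before the loop, close any output buffers PHP has already opened;
// otherwise every echo sits in a buffer until the script ends.
while (ob_get_level() > 0) {
    ob_end_flush();
}
ob_implicit_flush(true);

stream_progress(['a', 'b', 'c']);     // placeholder for the ~70k real records
```

Even with this, I understand a buffer outside PHP (mod_deflate, FastCGI buffering, or a reverse proxy in front) could still hold the whole response until the end, which would match the symptom of only seeing the final output.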
My plan was to queue the long list of records into small chunks, execute each chunk, and send a response to the browser after each one so the connection stays alive. The part I have been stuck on for days is sending that partial response: whatever I try, the output only arrives once the script finishes. My only goal is to keep the process running to the end. What could be causing this, and how can I fix it?
You should be doing that sort of thing in a cron job. If the remote server lags significantly or there is a network error, the connection with the client browser can easily time out, possibly leaving zombie processes running after the client connection drops.
As said, your script should run as a cron job.
If the client needs a semi-real-time view of what is happening, I would suggest a separate HTML page that queries the server via AJAX at predetermined intervals and displays the progress that the worker has saved to the database.
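For the polling half, the page could hit a small endpoint like this (the file name, table name, and the in-memory SQLite demo connection are all hypothetical stand-ins for your real schema):

```php
<?php
// progress.php - polled by the browser (fetch()/XMLHttpRequest every
// few seconds); returns the worker's saved progress as JSON.
function read_progress(PDO $pdo): array
{
    $row = $pdo->query('SELECT done, total FROM sms_progress WHERE id = 1')
               ->fetch(PDO::FETCH_ASSOC);
    return [
        'done'    => (int) $row['done'],
        'total'   => (int) $row['total'],
        'percent' => $row['total'] > 0
            ? round(100 * $row['done'] / $row['total'], 1)
            : 0.0,
    ];
}

// Demo with an in-memory SQLite table standing in for the real database.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE sms_progress (id INTEGER PRIMARY KEY, done INTEGER, total INTEGER)');
$pdo->exec('INSERT INTO sms_progress VALUES (1, 350, 70000)');

header('Content-Type: application/json');   // a no-op when run under CLI
echo json_encode(read_progress($pdo)), "\n";
```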
You can have the best of both worlds if you separate things.
a) the server processes the long job on its own, whether or not a client is connected
b) the client can, if needed, watch the progress of the server's work
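The worker half might look like this, run from cron. Every name here (script, tables, chunk size) is illustrative, and the cURL/XML step is reduced to a stub; the demo at the bottom uses in-memory SQLite so the sketch is self-contained:

```php
<?php
// send_worker.php - intended to be run by cron, e.g.:
//   * * * * * php /path/to/send_worker.php
// Processes pending recipients in chunks and records progress so a
// web page can poll it.

const CHUNK_SIZE = 500;

function process_chunk(PDO $pdo, array $recipients): void
{
    foreach ($recipients as $r) {
        // Real code: cURL request to the SMS gateway, parse the XML
        // response, insert the result. Stubbed as a status update here.
        $pdo->prepare('UPDATE recipients SET status = ? WHERE id = ?')
            ->execute(['sent', $r['id']]);
    }
}

function run_worker(PDO $pdo): int
{
    $total = (int) $pdo->query('SELECT COUNT(*) FROM recipients')->fetchColumn();
    $sent  = 0;
    while (true) {
        $rows = $pdo->query("SELECT id FROM recipients WHERE status = 'pending' LIMIT " . CHUNK_SIZE)
                    ->fetchAll(PDO::FETCH_ASSOC);
        if (!$rows) {
            break;
        }
        process_chunk($pdo, $rows);
        $sent += count($rows);
        // Keep the progress table current so the web page can poll it.
        $pdo->prepare('UPDATE sms_progress SET done = ?, total = ? WHERE id = 1')
            ->execute([$sent, $total]);
    }
    return $sent;
}

// Demo with an in-memory SQLite DB standing in for the real database.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE recipients (id INTEGER PRIMARY KEY, status TEXT)');
$pdo->exec('CREATE TABLE sms_progress (id INTEGER PRIMARY KEY, done INTEGER, total INTEGER)');
$pdo->exec('INSERT INTO sms_progress VALUES (1, 0, 0)');
for ($i = 1; $i <= 1200; $i++) {
    $pdo->exec("INSERT INTO recipients (status) VALUES ('pending')");
}
echo run_worker($pdo), " messages processed\n";
```

Because each cron run picks up only the remaining `pending` rows, a crash or restart simply resumes where the last run left off.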