
Does running a PHP script from SSH bypass max execution time?

https://www.devze.com 2023-01-05 08:03 (source: web)

Basically I have an issue. I am posting to my users' Facebook statuses using a cron job, but when I run the cron from the browser I get an error after about 30 seconds. I have edited the .ini file to allow a longer max execution time, but it doesn't seem to work.

It updates the statuses of the first 700 or so users, but after that it stops.

Can I run it from the terminal, or is there anything I can check/do to get around this?


When running PHP scripts from the command line, the default max execution time is 0 - that is, unlimited. In an HTTP context there are other settings that can shut down your script, including the Apache Timeout directive. This is definitely a job I'd run through the PHP CLI.
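For example, a crontab entry that invokes the script through the PHP CLI binary rather than over HTTP might look like this (the paths, filename, and schedule are illustrative, not from the question):

```shell
# Run the status-update script hourly via the PHP CLI, which defaults to
# max_execution_time = 0 (unlimited), instead of requesting it over HTTP.
# Adjust the php binary path, script path, and log path to your setup.
0 * * * * /usr/bin/php /path/to/update_statuses.php >> /var/log/update_statuses.log 2>&1
```

Running it this way also means Apache's Timeout directive never applies, since the web server is not involved at all.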

I would enable error logging which would describe what limits your script is running into. There's a lot of possibilities - you may be hitting the memory limit, the execution time may be too low, the Facebook API may be rate-limiting your requests, etc.


Make sure that you'll see errors by doing:

error_reporting(E_ALL);
ini_set('display_errors',1);

at the top of your script.

You could be running into a max_execution_time ceiling, or you could be running out of memory, etc. Error messages will help with determining that.

As Frank Farmer implies in his comment, you can use set_time_limit(0); in your script to allow it to run indefinitely.

If you're having memory limit issues, you can raise the memory limit in your script (ini_set('memory_limit',...);) -- but you should really consider fixing your code so it doesn't keep consuming memory.
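Putting the pieces above together, the top of a CLI cron script might look like the following sketch. The loop body is hypothetical: getUserIds() and postStatusForUser() stand in for whatever your own Facebook API calls are, and the memory limit value is just an example.

```php
<?php
// Intended to be run from the CLI (cron), not over HTTP.

// Surface every error while debugging the run.
error_reporting(E_ALL);
ini_set('display_errors', 1);
ini_set('log_errors', 1);

// No execution-time ceiling (the CLI default is already 0).
set_time_limit(0);

// Example value only -- raise this only if you actually need it.
ini_set('memory_limit', '256M');

// Hypothetical helpers: replace with your own data access and API calls.
foreach (getUserIds() as $userId) {
    postStatusForUser($userId);

    // Release per-user data as you go so memory use stays flat
    // instead of growing with each of the ~700+ users processed.
    gc_collect_cycles();
}
```

If the script still stops partway through, the error log will now tell you whether you hit a memory ceiling, a fatal error, or rate-limiting from the Facebook API.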

