I'm planning to sync 2 folders from 2 different servers using a cron job scheduled to run every hour. The scheduled script will check whether a previous instance is still running and, if not, continue with processing.
So, the first stage is to sync these folders. Daily (or more often), the second server will have 1-2 thousand new files added that need to be moved to the first server and processed.
The processing will be handled by PHP: it will open each file, read each row, and add it to a database.
What I'd like to do is use the same cron job that syncs the folders to call the PHP script that parses the files. But since I don't know how long processing that many files will take, I'd like to impose a limit on the number of files PHP handles on each request.
Without a per-request limit on the number of files, the PHP script could easily hit a timeout.
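One option I've considered is enforcing the cap in the shell wrapper instead of in PHP itself, handing the parser at most N files per invocation. A rough sketch (the directory, the limit, and the stub standing in for the PHP call are all placeholders):

```shell
#!/bin/sh
# Sketch: hand at most LIMIT files per run to the parser. The "parser"
# here is a stub that just renames files to mark them processed.
INBOX=$(mktemp -d)
LIMIT=3
# create 5 dummy incoming files for the demonstration
for i in 1 2 3 4 5; do touch "$INBOX/file$i.csv"; done

count=0
for f in "$INBOX"/*.csv; do
    [ "$count" -ge "$LIMIT" ] && break
    # real version would be something like: php parse.php "$f"
    mv "$f" "$f.done"
    count=$((count + 1))
done
echo "processed $count"
```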
The question is: is there a way to keep calling this PHP script in a loop until it finishes processing? Ideally, once both wget (the folder sync) and PHP have finished, the lock would be removed so a new instance of the cron job can run.
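For reference, the locking I have in mind looks roughly like this, using flock(1); the lock path and the sync/parse commands are placeholders:

```shell
#!/bin/sh
# Hourly cron wrapper sketch: skip this run if the previous one is
# still holding the lock.
LOCKFILE=/tmp/sync_parse.lock
exec 9>"$LOCKFILE"
if ! flock -n 9; then
    echo "previous run still active, skipping"
    exit 0
fi
echo "lock acquired"
# wget ...            # folder sync would go here
# php parse.php ...   # parsing would go here
# the lock is released automatically when fd 9 closes at exit
```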
Any suggestion is welcome, thank you.
You can place a header() call at the end of the script and use GET parameters to control the execution flow. For example:
parse.php?action=1
In the script:
switch ($_GET["action"])
{
    case 1:
        // action 1 goes here
        header("Location: ...");
        exit();
    case 2:
        // action 2 goes here
        header("Location: ...");
        break;
}
exit();
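Alternatively, driven from cron, the script can simply be re-invoked in a loop until it reports completion, rather than chaining redirects. A sketch, with a stub standing in for something like `php parse.php --limit=200` that would print "done" once no files remain (that output convention is an assumption):

```shell
#!/bin/sh
# Re-invoke the parser until it says it is finished. process_batch is a
# stub: it counts down a fake queue of 5 batches via a temp file.
QUEUE=$(mktemp)
echo 5 > "$QUEUE"          # pretend 5 batches are waiting

process_batch() {
    n=$(cat "$QUEUE")
    n=$((n - 1))
    echo "$n" > "$QUEUE"
    if [ "$n" -le 0 ]; then echo "done"; else echo "more"; fi
}

runs=0
out="more"
while [ "$out" != "done" ]; do
    out=$(process_batch)
    runs=$((runs + 1))
done
echo "all batches processed after $runs runs"
```

This also fits the locking scheme from the question: the wrapper holds the lock for the whole loop and releases it only when the last batch is done.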