I want to download a large number of files to my server. I have a list of files to download and the locations to put them. That part is not a problem: I use wget to download each file and execute it with shell_exec:
$command = 'wget -b -O ' . $filenameandpathtoput . ' ' . $submission['url'];
shell_exec($command);
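For completeness, the surrounding loop looks roughly like this ($submissions is a simplified stand-in for my list of URL/path pairs; escapeshellarg added to be safe):

<?php
// Sketch of the download loop; $submissions is an assumed array of
// ['url' => ..., 'path' => ...] entries built from my list.
foreach ($submissions as $submission) {
    $command = 'wget -b -O ' . escapeshellarg($submission['path'])
             . ' ' . escapeshellarg($submission['url']);
    shell_exec($command); // -b backgrounds wget, so this returns immediately
}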
This works great: the server kicks off all the downloads and the files come down in no time.
The problem is that I want to notify the user once the files are downloaded, and that does not work with my current approach. So how would you implement this?
Any suggestions would be helpful!
I guess you can check whether all files are in place with something like
function checkFiles()
{
    // Assumes session_start() has been called and $_SESSION["targetpaths"]
    // holds the full paths the downloaded files should end up at.
    foreach ($_SESSION["targetpaths"] as $p)
    {
        if (!is_file($p)) return false;
    }
    return true;
}
Now all you have to do is call a script on your server that runs this function every second (or so). You can accomplish this either with a Meta Refresh (forcing the browser to reload the page after n seconds) or with AJAX (have a look at jQuery's .getJSON, for example).
If the script is called and the files are not all there yet, print something like "Please wait" and refresh again later. Otherwise, show the success message. That's all.
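A minimal sketch of the Meta Refresh variant (the file name status.php and the include path are assumptions):

<?php
// status.php - minimal sketch; assumes checkFiles() from above is available
// and that the download script stored the target paths in the session.
session_start();
require 'checkfiles.php'; // wherever checkFiles() is defined (assumption)

if (checkFiles()) {
    echo 'All files have been downloaded.';
} else {
    // Reload this page every 2 seconds until everything is in place.
    echo '<meta http-equiv="refresh" content="2">';
    echo 'Please wait, your downloads are still running...';
}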
You can also consider using exec to run the external wget command without the -b flag; your PHP script will then block until the external command completes. Once it completes, you can echo the name of the finished file.
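A rough sketch of that blocking variant (dropping wget's -b flag so the call actually waits; $submissions is the same assumed list of URL/path pairs as above):

<?php
// Blocking variant: without -b, exec() does not return until wget has finished.
foreach ($submissions as $submission) {
    $command = 'wget -O ' . escapeshellarg($submission['path'])
             . ' ' . escapeshellarg($submission['url']);
    exec($command, $output, $exitCode);
    if ($exitCode === 0) {
        echo 'Finished: ' . htmlspecialchars($submission['path']) . "<br>\n";
    }
}

Keep in mind that the downloads then run one after another inside the request, so for a large list this can tie up the script (and hit PHP's max_execution_time) for quite a while.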