I am currently using external SOAP web services that allow chunked downloads/uploads of binary files (so larger files should be supported). I need to allow an end user to download files through the browser with my PHP app. Serving small files works well, but 25MB+ files cause the web server to run out of memory.
I am using the native PHP SoapClient (no MTOM support), and requesting the download by submitting a form. Currently it seems like the web server is trying to download the whole file before outputting anything to the browser (e.g. the "Download" prompt doesn't show until after the whole file has been processed via PHP).
My method looks something like this (sorry if it's messy, I've been hacking away at this problem for a while).
public function download()
{
    $file_info_from_ws ... //Assume setup from $_REQUEST params
    //Don't know if these are needed
    gc_enable();
    set_time_limit(0);
    @apache_setenv('no-gzip', 1);
    @ini_set('zlib.output_compression', 0);
    //File info
    $filesize = $file_info_from_ws->get_filesize();
    $fileid   = $file_info_from_ws->get_id();
    $filename = $file_info_from_ws->get_name();
    $offset    = 0;
    $chunksize = (1024 * 1024);
    //Clear any previous data
    ob_clean();
    ob_start();
    //Output headers
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . $filesize);
    header('Content-Transfer-Encoding: binary');
    header('Content-Disposition: attachment; filename="' . $filename . '"');
    header('Accept-Ranges: bytes');
    while($offset < $filesize)
    {
        $chunk = $this->dl_service()->download_chunked_file($fileid, $offset, $chunksize);
        if($chunk)
        {
            //Immediately echo out the stream
            $chunk->render();
            $offset += $chunksize;
            unset($chunk); //Shouldn't this trigger GC?
            ob_flush();
        }
    }
    ob_end_flush();
}
So my main question is: What is the best way to output large binary chunks from external resources (web services, DB, etc.) through PHP to the end user? Preferably without killing memory/CPU too much.
Also I'm curious about the following:
Why wouldn't the Download prompt pop up after the first output? Why is memory not freed after each loop in the above method?

http://php.net/manual/en/function.fpassthru.php
This may be of some help. It also may change the way you want to do everything.
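For reference, a minimal sketch of that approach, assuming a plain readable file on disk (the function name here is made up): fpassthru() copies the remaining contents of a stream straight to the output, so the file never has to be held in a PHP string.

```php
<?php
// Minimal sketch, assuming the file is a readable local stream.
// stream_file_to_client() is a hypothetical helper name, not from the question.
function stream_file_to_client(string $path): void
{
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');

    $fp = fopen($path, 'rb');
    if ($fp !== false) {
        fpassthru($fp); // copy the remaining stream contents straight to output
        fclose($fp);
    }
}
```

This only helps directly when the data is available as a stream; a chunked SOAP download would still need a loop like the one above.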
Well, I feel silly. This turned out to be just another PHP-ism. Apparently, even though I was flushing the output buffer with ob_flush(), which (I thought) should have been sending the headers and chunks to the browser, nothing actually reached the browser until the script finished.
Even though the output buffer itself was getting flushed, you still have to explicitly call flush() to push
the write buffers of PHP and the web server out to the client. Not doing this led to the memory growth, and to the download prompt not showing until the entire download completed.
Here is a version of the working method:
public function download()
{
    $file_info ... //Assume init'ed from WS or DB
    //Allow for long-running process
    set_time_limit(0);
    //File info
    $filesize = $file_info->get_filesize();
    $fileid   = $file_info->get_id();
    $filename = $file_info->get_name();
    $offset    = 0;
    $chunksize = (1024 * 1024);
    //Clear any previous data
    ob_clean();
    ob_start();
    //Output headers to notify the browser it's a download
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . $filesize);
    header('Content-Disposition: attachment; filename="' . $filename . '"');
    while($offset < $filesize)
    {
        //Retrieve chunk from service
        $chunk = $this->dl_service()->download_chunked_file($fileid, $offset, $chunksize);
        if($chunk)
        {
            //Immediately echo out the stream
            $chunk->render();
            //NOTE: The order of flushing IS IMPORTANT
            //Flush the data to the output buffer
            ob_flush();
            //Flush the write buffer directly to the browser
            flush();
            //Cleanup and prepare next request
            $offset += $chunksize;
            unset($chunk);
        }
    }
    //Exit the script immediately to prevent other output from corrupting the file
    exit(0);
}
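The flush pattern can also be shown in isolation. A minimal sketch (stream_chunks() is a made-up helper, and the chunk strings are placeholders for whatever the service returns):

```php
<?php
// Minimal sketch of the ob_flush()/flush() pairing used above.
// After echoing each chunk, ob_flush() moves the PHP output buffer along,
// then flush() pushes the SAPI/web-server write buffer toward the client,
// so data leaves the server instead of piling up in memory.
function stream_chunks(iterable $chunks): void
{
    foreach ($chunks as $chunk) {
        echo $chunk;
        ob_flush(); // flush PHP's output buffer
        flush();    // flush the web server's write buffer to the client
    }
}
```

Without the flush() call, the same loop accumulates everything server-side, which reproduces both symptoms from the question: memory growth and a download prompt that only appears at the very end.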