Using a similar script found in the comments of http://php.net/manual/en/function.fread.php, I've devised this function:
function readfile_chunked_remote($filename, $seek = 0, $retbytes = true, $timeout = 3) {
    set_time_limit(0);
    $defaultchunksize = 1024 * 1024;
    $chunksize = $defaultchunksize;
    $buffer = '';
    $cnt = 0;
    $remotereadfile = false;

    // Treat anything with a URL scheme (http://, ftp://, ...) as a remote file.
    if (preg_match('/[a-zA-Z]+:\/\//', $filename)) {
        $remotereadfile = true;
    }

    $handle = @fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }

    header("Content-Type: application/octet-stream");
    header("Content-Transfer-Encoding: binary");
    header("Cache-Control: no-cache, must-revalidate");
    header("Content-Length: " . filesize($filename));
    header("Content-Disposition: attachment; filename=\"" . basename($filename) . "\"");

    stream_set_timeout($handle, $timeout);

    // Local files can be seeked directly; remote streams are skipped chunk by chunk below.
    if ($seek != 0 && !$remotereadfile) {
        fseek($handle, $seek);
    }

    while (!feof($handle)) {
        // For a remote file, shrink the chunk that straddles the seek point so
        // nothing before $seek is echoed to the client.
        if ($remotereadfile && $seek != 0 && $cnt + $chunksize > $seek) {
            $chunksize = $seek - $cnt;
        } else {
            $chunksize = $defaultchunksize;
        }
        $buffer = @fread($handle, $chunksize);
        if ($retbytes || ($remotereadfile && $seek != 0)) {
            $cnt += strlen($buffer);
        }
        if (!$remotereadfile || $cnt > $seek) {
            echo $buffer;
        }
        ob_flush();
        flush();
    }

    $info = stream_get_meta_data($handle);
    $status = fclose($handle);

    if ($info['timed_out']) {
        return false;
    }
    if ($retbytes && $status) {
        return $cnt;
    }
    return $status;
}
However, it still seems to time out for files over 100 MB or so. Where might I be going wrong?
Try xmoovstream. It will do it for you, and it's open source as well.
According to the manual, use readfile()
Note: readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level().
http://php.net/readfile
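To illustrate the manual's note, here is a minimal sketch (not the asker's exact code) that drains any active output buffers before calling `readfile()`, so the file streams straight to the client instead of accumulating in a PHP buffer. The function name `send_file` and its parameter are hypothetical:

```php
<?php
// Sketch: send a local file for download via readfile(), with output
// buffering cleared first, as the PHP manual suggests.
function send_file($path) {
    if (!is_file($path)) {
        return false;
    }
    // Drain any nested output buffers; a leftover buffer is what makes
    // large readfile() calls run out of memory.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }
    if (!headers_sent()) {
        header('Content-Type: application/octet-stream');
        header('Content-Length: ' . filesize($path));
        header('Content-Disposition: attachment; filename="' . basename($path) . '"');
    }
    // readfile() returns the number of bytes read, or false on failure.
    return readfile($path);
}
```

Unlike a manual fread() loop, readfile() hands the copy to the underlying stream layer, so PHP never holds the whole file in memory as long as no output buffer is in the way.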