
Downloading big files and writing them locally

Which is the best way to download large files from PHP without consuming all of the server's memory?

I could do this (bad code):

$url = 'http://server/bigfile';
$cont = file_get_contents($url);
file_put_contents('./localfile', $cont);

This example loads the entire remote file into $cont, which could exceed the memory limit.

Is there a safe function (maybe built-in) to do this (maybe stream_*)?

Thanks
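
One built-in route, since the question mentions stream_*: a minimal sketch using fopen() and stream_copy_to_stream(), assuming allow_url_fopen is enabled on the server so HTTP URLs can be opened as read streams:

$src = fopen('http://server/bigfile', 'rb'); // remote stream; needs allow_url_fopen
$dst = fopen('./localfile', 'wb');
if ($src && $dst) {
    // Copies from stream to stream in internal chunks,
    // so the whole file is never held in memory at once.
    stream_copy_to_stream($src, $dst);
}
if ($src) { fclose($src); }
if ($dst) { fclose($dst); }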


You can use curl and the option CURLOPT_FILE to save the downloaded content directly to a file.

set_time_limit(0); // let the script run as long as the download takes

// Open the target file and hand the handle to cURL so the
// response body is written straight to disk, not into memory.
$fp = fopen('file', 'w+b');
$ch = curl_init('http://remote_url/file');
curl_setopt($ch, CURLOPT_TIMEOUT, 75);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);
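
With CURLOPT_FILE, cURL streams the response body straight to the open file handle as it arrives, so memory use stays flat regardless of the file size.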


Here is a function I use when downloading large files. It avoids loading the entire file into memory; instead, it writes to the destination as the bytes arrive.

function download($file_source, $file_target)
{
    // Open the source for reading and the target for writing.
    $rh = fopen($file_source, 'rb');
    $wh = fopen($file_target, 'wb');
    if (!$rh || !$wh) {
        if ($rh) { fclose($rh); }
        if ($wh) { fclose($wh); }
        return false;
    }

    // Copy in 1 KB chunks so only a small buffer is in memory at a time.
    while (!feof($rh)) {
        if (fwrite($wh, fread($rh, 1024)) === false) {
            fclose($rh);
            fclose($wh);
            return false;
        }
    }

    fclose($rh);
    fclose($wh);

    return true;
}
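
Usage, with the URL and local path from the question:

download('http://server/bigfile', './localfile');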
