Curl hangs when downloading large file (500MB+)

Posted on devze.com, 2023-02-19 22:57 (source: the web)
I'm using cURL to download large XML files (between 500MB and 1GB) from a remote server. Although the script works fine for smaller test files, every time I try to download a file larger than a few hundred megabytes, the script seems to hang - it doesn't quit, there's no error message, it just hangs there.

I'm executing the script from the command line (CLI), so PHP itself should not time out. I have also tried cURL's verbose mode, but it shows nothing beyond the initial connection. Every time I download the file, it stops at exactly the same size (463.3MB), and the XML in the file is incomplete at that point.

Any ideas much appreciated.

    // Open the cURL handle and the destination file for writing.
    $ch = curl_init();
    $fh = fopen($filename, 'w');
    if ($fh === false) {
        die("Unable to open $filename for writing\n");
    }

    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_FILE, $fh);   // stream the body straight to disk
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_TIMEOUT, 0);  // 0 = never time out

    if (curl_exec($ch) === false)
    {
        echo 'Curl error: ' . curl_error($ch) . "\n";
    }
    else
    {
        echo 'Operation completed without any errors';
    }

    $response = array(
        'header' => curl_getinfo($ch)
    );

    curl_close($ch);
    fclose($fh);

    if ($response['header']['http_code'] == 200) {
        echo "File downloaded and saved as " . $filename . "\n";
    }
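One thing worth trying while debugging: cURL can be told to abort a transfer that stalls, so the script fails with an error instead of hanging forever. This is a minimal sketch, not the author's code; the function name is hypothetical, and the 1024 bytes/sec and 60-second thresholds are arbitrary assumptions.

```php
<?php
// Sketch: download with a stall guard so a dead transfer errors out
// instead of hanging. Thresholds below are illustrative assumptions.
function downloadWithStallGuard(string $url, string $filename): bool
{
    $ch = curl_init();
    $fh = fopen($filename, 'w');
    if ($fh === false) {
        return false;
    }

    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_FILE, $fh);

    // Abort if the transfer averages under 1024 bytes/sec
    // for 60 consecutive seconds.
    curl_setopt($ch, CURLOPT_LOW_SPEED_LIMIT, 1024);
    curl_setopt($ch, CURLOPT_LOW_SPEED_TIME, 60);

    $ok = curl_exec($ch) !== false;
    if (!$ok) {
        fwrite(STDERR, 'Curl error: ' . curl_error($ch) . "\n");
    }

    curl_close($ch);
    fclose($fh);
    return $ok;
}
```

If the stall guard fires, curl_error() will at least report why the transfer stopped, which the silent hang above never does.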

Again, this script works fine with smaller files, but with the large file it doesn't even get as far as printing an error message.

Could this be something else (Ubuntu 10.04 on Linode) terminating the script? As far as I understand, my webserver shouldn't matter here since I am running it through CLI.

Thanks,

Matt


Do the files appear to be complete when they stop downloading? Adding -m 10800 to the command will time out and end the transfer after the specified number of seconds. This works as long as the timeout is longer than the transfer takes, but it is still a workaround.
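For reference, the -m 10800 mentioned above is curl's command-line --max-time flag; since the question uses the PHP extension, the equivalent there is CURLOPT_TIMEOUT, expressed in seconds. A minimal sketch:

```php
<?php
// CURLOPT_TIMEOUT is the PHP equivalent of curl's -m / --max-time flag:
// the whole transfer is aborted once this many seconds have elapsed.
$ch = curl_init();
curl_setopt($ch, CURLOPT_TIMEOUT, 10800);  // give up after 3 hours
curl_close($ch);
```

Note that the original script sets CURLOPT_TIMEOUT to 0, which disables the timeout entirely; that is why it can hang indefinitely.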


Check this post - maybe you can try downloading the file in parts. Or, if you have access to the remote server, you could compress the file there and download the archive instead. You can also check your php.ini configuration: look at the file size limits, memory limits, and related settings.
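Downloading in parts can be done with HTTP Range requests, assuming the remote server supports them. The helper names below are hypothetical, and the chunk size is an arbitrary assumption:

```php
<?php
// Sketch: split a download into byte ranges (server must support
// HTTP Range requests). Helper names here are illustrative only.

// Build "start-end" byte-range strings covering $totalSize bytes.
function byteRanges(int $totalSize, int $chunkSize): array
{
    $ranges = [];
    for ($start = 0; $start < $totalSize; $start += $chunkSize) {
        $end = min($start + $chunkSize - 1, $totalSize - 1);
        $ranges[] = "$start-$end";
    }
    return $ranges;
}

// Fetch one byte range and append it to an already-open file handle.
function downloadChunk(string $url, string $range, $fh): bool
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RANGE, $range);  // e.g. "0-10485759"
    curl_setopt($ch, CURLOPT_FILE, $fh);
    $ok = curl_exec($ch) !== false;
    curl_close($ch);
    return $ok;
}
```

To reassemble the file, open the destination once (so successive chunks append in order) and call downloadChunk() for each range in sequence; a failed chunk can then be retried without restarting the whole 500MB transfer.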


You might be out of disk space on the partition you are saving to, or running over quota for the user running the script.
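This is easy to rule out from the script itself before starting the transfer. A small sketch, where the function name and the size figure are assumptions based on the file sizes in the question:

```php
<?php
// Sketch: check free space on the target partition up front, so a full
// disk fails loudly instead of silently truncating the download.
function hasEnoughSpace(string $dir, int $neededBytes): bool
{
    $free = disk_free_space($dir);
    return $free !== false && $free >= $neededBytes;
}
```

For example, hasEnoughSpace(dirname($filename), 1024 ** 3) would require roughly 1 GB free before the download begins. Note that disk_free_space() reports the partition's free space, not the per-user quota, so a quota overrun would still need checking separately (e.g. with the quota command).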
