Is it possible to stream a file to and from servers without holding the data in RAM?


PHP question (new to PHP after 10 years of Perl, arrggghhh!).

I have a 100 MB file that I want to send to another server.

I have managed to read the file in and POST it without curl (I cannot use curl for this app). Everything works fine with smaller files.

However, with the larger files, PHP complains about not being able to allocate memory.

Is there a way to read a file line by line and send it as a POST, also line by line?

This way nothing is held in RAM, thus preventing my errors and getting around strict limitations.

Chris

Here's my current code that errors with large files:

<?php
ini_set('display_errors',1);
error_reporting(E_ALL|E_STRICT);

$resp = do_post_request("/local/file.txt","http://www.mysite.com/receivedata.php");
exit;

function do_post_request($file,$url){
   $fileHandle = fopen($file, "rb");
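   // stream_get_contents() buffers the entire file into $fileContents; this is where the allocation fails for large files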
   $fileContents = stream_get_contents($fileHandle);
   fclose($fileHandle);

   $params = array(
      'http' => array
      (
          'method' => 'POST',
          'header'=>"Content-Type: multipart/form-data\r\n",
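          // the wrapper requires the whole body as one in-memory string, so this cannot stream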
          'content' => $fileContents
      )
   );

   $ctx = stream_context_create($params);
   $fp = fopen($url, 'rb', false, $ctx);

   $response = stream_get_contents($fp);
   return $response;
}
?>


You can use fopen and fgets (or alternatively fread) to read the file sequentially, writing each chunk out as you read it instead of accumulating it; see the sketch below.
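
The catch is that the HTTP stream wrapper used in the question takes the whole POST body as a single 'content' string, so sequential reading alone does not help there. Below is a minimal sketch of chunked sending over a raw socket instead (no curl involved); the host and path mirror the question's example, while port 80, the Content-Type, and the chunk size are assumptions to adapt:

<?php
// Minimal sketch: stream a POST body in fixed-size chunks over a raw
// socket, so at most $chunkSize bytes of the file sit in memory at once.
function do_streamed_post($file, $host, $path, $chunkSize = 8192)
{
    $size = filesize($file);
    $in   = fopen($file, 'rb');
    $sock = fsockopen($host, 80, $errno, $errstr, 30);
    if (!$in || !$sock) {
        return false;
    }

    // Headers first; Content-Length tells the server where the body ends.
    fwrite($sock, "POST $path HTTP/1.1\r\n");
    fwrite($sock, "Host: $host\r\n");
    fwrite($sock, "Content-Type: application/octet-stream\r\n");
    fwrite($sock, "Content-Length: $size\r\n");
    fwrite($sock, "Connection: close\r\n\r\n");

    // Relay the file chunk by chunk instead of slurping it all at once.
    while (!feof($in)) {
        fwrite($sock, fread($in, $chunkSize));
    }
    fclose($in);

    // Read the server's raw reply (status line, headers, and body).
    $response = stream_get_contents($sock);
    fclose($sock);
    return $response;
}

$resp = do_streamed_post("/local/file.txt", "www.mysite.com", "/receivedata.php");
?>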

However, if your only purpose is to flush the file to standard output, you can simply use readfile('filename') and it will do exactly what you want; readfile sends the file in chunks, so it does not run into memory limits even for large files.
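
On the receiving end (the receivedata.php from the question), the same idea applies in reverse: copy php://input straight to disk instead of reading the body into a variable. A minimal sketch, with a hypothetical destination path:

<?php
// Minimal sketch of the receiving script: write the raw request body to
// disk without buffering it. The destination path is hypothetical.
$in  = fopen('php://input', 'rb');          // raw POST body as a stream
$out = fopen('/local/uploaded.txt', 'wb');  // hypothetical destination

// stream_copy_to_stream() moves the data in internal chunks, so memory
// use stays flat regardless of the file size.
stream_copy_to_stream($in, $out);

fclose($in);
fclose($out);
?>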
