I have a PHP script that occasionally needs to write large files to disk. Using file_put_contents(), if the file is large enough (around 2 MB in this case), the PHP script runs out of memory (PHP Fatal error: Allowed memory size of ######### bytes exhausted). I know I could just increase the memory limit, but that doesn't seem like a full solution to me; there has to be a better way, right?
What is the best way to write a large file to disk in PHP?
You'll need a temporary file into which you copy the source file in small chunks, followed by the data to be appended:
$sp = fopen('source', 'r');
$op = fopen('tempfile', 'w');

// copy the source in small chunks so only one buffer is in memory at a time
while (!feof($sp)) {
    $buffer = fread($sp, 512); // use a buffer of 512 bytes
    fwrite($op, $buffer);
}

// append the new data
fwrite($op, $new_data);

// close handles
fclose($op);
fclose($sp);

// make the temporary file the new source
rename('tempfile', 'source');
That way, the whole contents of source aren't read into memory. When using cURL, you might omit setting CURLOPT_RETURNTRANSFER and instead add an output buffer callback that writes to a temporary file:
function write_temp($buffer) {
    global $handle;
    fwrite($handle, $buffer);
    return ''; // return an EMPTY string, so nothing is buffered internally
}

$handle = fopen('tempfile', 'w');
ob_start('write_temp');

$curl_handle = curl_init('http://example.com/');
curl_setopt($curl_handle, CURLOPT_BUFFERSIZE, 512);
curl_exec($curl_handle); // cURL echoes the response, which write_temp() streams to disk

ob_end_clean();
curl_close($curl_handle);
fclose($handle);
It seems as though I always miss the obvious. As pointed out by Marc, there's CURLOPT_FILE to write the response directly to disk.
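A minimal sketch of that approach (the URL and file name here are placeholders):

$fp = fopen('tempfile', 'w');
$curl_handle = curl_init('http://example.com/');
curl_setopt($curl_handle, CURLOPT_FILE, $fp); // cURL writes the response straight to this handle
curl_exec($curl_handle);
curl_close($curl_handle);
fclose($fp);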
Write line by line (or chunk by chunk in the case of binary files) using functions like fwrite(); a minimal sketch follows.
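A sketch of that idea, assuming the data comes from a readable stream ('input.txt' and 'output.txt' are placeholder names):

$in  = fopen('input.txt', 'r');
$out = fopen('output.txt', 'w');

// fgets() reads a single line, so only one line is held in memory at a time
while (($line = fgets($in)) !== false) {
    fwrite($out, $line);
}

fclose($in);
fclose($out);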
Try this:

$file = fopen("file.json", "w");

// split the content into 4 KiB pieces and write them one at a time
$pieces = str_split($content, 1024 * 4);
foreach ($pieces as $piece) {
    fwrite($file, $piece, strlen($piece));
}

fclose($file);
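Note that str_split() materializes every piece up front, roughly doubling peak memory while $content is still held. A variant sketch that copies one 4 KiB slice at a time with substr() (same placeholder file name):

$file = fopen("file.json", "w");
$len  = strlen($content);

// write 4 KiB at a time; substr() copies only the current slice
for ($offset = 0; $offset < $len; $offset += 4096) {
    fwrite($file, substr($content, $offset, 4096));
}

fclose($file);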