
Memory exhausted error for json_decode with PHP


I have the following code:

<?php
$FILE="giant-data-barf.txt";

$fp = fopen($FILE,'r');

//read everything into data
$data = fread($fp, filesize($FILE));
fclose($fp);

$data_arr = json_decode($data);
var_dump($data_arr);
?>

The file giant-data-barf.txt is, as its name suggests, a huge file (it's 5.4 MB right now, but it could grow to several GB).

When I execute this script, I get the following error:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 71 bytes) in ........./data.php on line 12

I looked at possible solutions, and saw this:

ini_set('memory_limit','16M');

My question is: is there a limit to how high I should set the memory limit? Or is there a better way of solving this problem?


THIS IS A VERY BAD IDEA. That said, you'll need to set

ini_set('memory_limit',filesize($FILE) + SOME_OVERHEAD_AMOUNT);

because you're reading the entire thing into memory. You may very well have to set the memory limit to two times the size of the file, since you also want to json_decode it.

NOTE THAT ON A WEB SERVER THIS WILL CONSUME MASSIVE AMOUNTS OF MEMORY AND YOU SHOULD NOT DO THIS IF THE FILE WILL BE MANY GIGABYTES AS YOU SAID!!!!
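A minimal sketch of that approach, for what it's worth. The 2x multiplier and the 16M cushion below are assumptions you would have to tune, not measured values:

<?php
// Sketch of the raise-the-limit approach. The 2x multiplier and the
// 16M cushion are guesses, not measured figures.
$FILE = "giant-data-barf.txt";

$size = filesize($FILE);
if ($size === false) {
    die("could not stat $FILE");
}

// the raw string plus the decoded structure can easily need ~2x the file size
ini_set('memory_limit', (string)($size * 2 + 16 * 1024 * 1024));

$data_arr = json_decode(file_get_contents($FILE));
var_dump($data_arr);
?>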

Is it really one giant JSON blob? You should look at converting it to a database, or to another format that gives you random or row access, before parsing it with PHP.
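For example, if whatever produces the file can emit one JSON object per line (newline-delimited JSON) instead of a single blob, PHP can decode it one record at a time in constant memory. A minimal sketch, assuming a hypothetical giant-data-barf.ndjson laid out that way:

<?php
// Sketch: stream newline-delimited JSON instead of decoding one blob.
// Assumes the data was re-exported with one JSON object per line.
$fp = fopen("giant-data-barf.ndjson", 'r');
if ($fp === false) {
    die("could not open file");
}

while (($line = fgets($fp)) !== false) {
    $record = json_decode($line, true);
    if ($record === null) {
        continue; // skip blank or malformed lines
    }
    // handle one record at a time; memory use stays flat
    var_dump($record);
}
fclose($fp);
?>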


I've given all my servers a memory_limit of 100M and haven't run into trouble yet.

I would consider splitting up that file somehow, or getting rid of it and using a database.
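One way to do the database route is sketched below with SQLite, loading line-delimited records once and querying them later with row access. The records.db file and the single-column payload schema are made up for illustration:

<?php
// Sketch: load line-delimited records into SQLite once, then use
// row access later instead of re-decoding the whole file.
// The database name and schema are illustrative assumptions.
$pdo = new PDO('sqlite:records.db');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE IF NOT EXISTS records (id INTEGER PRIMARY KEY, payload TEXT)');

$fp = fopen("giant-data-barf.ndjson", 'r');
$stmt = $pdo->prepare('INSERT INTO records (payload) VALUES (?)');

$pdo->beginTransaction(); // batch the inserts for speed
while (($line = fgets($fp)) !== false) {
    $line = trim($line);
    if ($line !== '') {
        $stmt->execute([$line]);
    }
}
$pdo->commit();
fclose($fp);
?>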
