I'm running the following code over a set of 5,000 results. It's failing due to the memory being exhausted.
foreach ($data as $key => $report) {
$data[$key]['data'] = unserialize($report['serialized_values']);
}
I know I can up the memory limit, but I'd rather fix the underlying problem instead. I'm not going to be able to keep raising the memory limit forever.
EDIT
The $data is in this format:
[1] => Array
(
[0] => 127654619178790249
[report_id] => 127654619178790249
[1] => 1
[user_id] => 1
[2] => 2010-12-31 19:43:24
[sent_on] => 2010-12-31 19:43:24
[3] =>
[fax_trans_id] =>
[4] => 1234567890
[fax_to_nums] => 1234567890
[5] => 'long html string here',
[html_content] => 'long html string here',
[6] => 'serialization_string_here',
[serialized_values] => 'serialization_string_here',
[7] => 70
[id] => 70
)
Beyond the problems of for and foreach, you need to re-architect your solution. You're hitting memory limits because you're legitimately using too much memory. Each time you unserialize the contents of the database column and store the result in $data[$key]['data'], PHP needs to set aside a chunk of memory to hold that data so it can be accessed later. When the array gets too big, you run out of memory. In plain English, you're telling PHP:
Take all 5,000 rows of data and store them in memory; I'm going to do something with them later.
You need to think of a different way to approach the problem. Below are two quick thoughts.
You could avoid storing the items in memory and just take whatever action you need inside the loop, letting PHP discard each item as it goes:
foreach ($data as $key => $report) {
$object = unserialize($report['serialized_values']);
//do stuff with $object here
}
You could also store only the information you need from the unserialized object, rather than the entire object:
foreach ($data as $key => $report) {
    $object = unserialize($report['serialized_values']);
    // keep only the fields you need instead of the whole object
    $extract = array();
    $extract['foo'] = $object->foo;
    $data[$key]['data'] = $extract;
}
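This way each row of $data only carries the small extracted values rather than the full unserialized object graph, which is where the bulk of the memory goes.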
Long story short: you're hitting memory limits because you're actually using too much memory. There's no magic solution here. Storing serialized data and attempting to load it all in a single program is a memory-intensive approach, irrespective of language or platform.
A foreach will load all 5,000 results into memory; see the numerous complaints in the docs. Use a for loop and access each result as you need it.
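For example, a minimal sketch of that approach, assuming $data is a sequential, zero-indexed array and you only need each unserialized value transiently:
// Walk the results with a for loop and free each object before moving on.
$count = count($data);
for ($i = 0; $i < $count; $i++) {
    $object = unserialize($data[$i]['serialized_values']);
    // do whatever you need with $object here
    unset($object); // release the memory before the next iteration
}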
What is $data, and where are you getting it from? If it's a file, can't you fgets() one line at a time to parse it? And if it's a database, can't you process one record at a time (at the expense of MySQL waiting to close the result set)? I think you should reconsider loading the entire $data into memory at once and then looping over it.
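As a rough sketch of the one-record-at-a-time idea (assuming a mysqli connection and a hypothetical reports table; adjust the credentials and query to your schema):
// Use an unbuffered query so rows are fetched from MySQL one at a time
// instead of being buffered in PHP all at once.
$mysqli = new mysqli('localhost', 'user', 'pass', 'dbname');
$result = $mysqli->query('SELECT report_id, serialized_values FROM reports', MYSQLI_USE_RESULT);
while ($row = $result->fetch_assoc()) {
    $object = unserialize($row['serialized_values']);
    // process $object here
}
$result->free();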
I think these bugs are not closed yet:
- https://bugs.php.net/bug.php?id=65814
- https://bugs.php.net/bug.php?id=60937
"When unserializing inside a loop the same serialized object, the total memory consumption increases every couple of iterations"
Try this way:
foreach ($data as $key => &$report) {
    $report['data'] = unserialize($report['serialized_values']);
}
unset($report); // break the reference to the last element
This assigns a reference instead of copying the value.
This is actually the reason why many sites divide results into pages.
Suppose I have 5,000 results (say users, to simplify) and a page that should display all of them. I would divide those 5,000 results into 500 per page, so that page 1 displays 1-500, page 2 displays 501-1000, page 3 displays 1001-1500, and so on. This way, memory is saved.
If you really need to display all 5,000 results on one page, you really do need to increase the memory limit. Or use a for loop instead.
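A minimal sketch of the paging idea, assuming the rows come from MySQL via an existing PDO connection in $pdo and a hypothetical reports table (page size and table name are illustrative):
// Fetch and process only one page of 500 rows at a time instead of all 5,000.
$perPage = 500;
$page = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset = ($page - 1) * $perPage;

$stmt = $pdo->query("SELECT * FROM reports LIMIT $perPage OFFSET $offset");
foreach ($stmt as $report) {
    $object = unserialize($report['serialized_values']);
    // render/process $object for this page
}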
I don't know for sure, but you might:
- gzip the data set to compress it and save memory, decompressing it on the fly.
- limit the data set.
- create a cache-like system: evict least recently used (LRU) data from the cache when it's using too much memory.
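As an illustration of the compression idea (a sketch only; compression trades CPU time for memory and only helps if the stored strings are large and compressible):
// Compress the large string fields, and only decompress a row when you need it.
foreach ($data as $key => $report) {
    $data[$key]['serialized_values'] = gzcompress($report['serialized_values']);
}

// Later, decompress and unserialize a single row on demand ($someKey is illustrative).
$object = unserialize(gzuncompress($data[$someKey]['serialized_values']));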