My end goal is to create invoices as PDFs for a billing project I'm working on. Creating the PDF is something I have already done: I first create an HTML file per invoice, then batch-process the HTML files into PDFs in a separate process.
The problem I'm having is that the data I'm trying to get into the HTML file is very large. It's stored in MySQL (I'm using Symfony 1.4 and Doctrine), and php.ini is currently set up to allow PHP 500M of memory. I need all the invoiceLines for a single invoice in the same HTML/PDF file.
I am currently looping over an invoice table (1 row per invoice) and then getting the actual lines of the invoice from another table (many rows per invoice). The problem I'm having is huge memory usage. Can anyone suggest an alternative way of doing this?
foreach ($invoices as $inv)
{
    $outfile = '/invoice' . $inv[0] . '.html';
    $tempinvfile = fopen($outfile, 'w'); // open the file, overwriting it if it already exists
    echo '1 - Used memory : ' . number_format(memory_get_usage() - $mem) . PHP_EOL;
    $invoiceLines = Doctrine::getTable('InvoiceLine')
        ->createQuery()
        ->where('invoiceid = ?', $inv[0])
        ->setHydrationMode(Doctrine::HYDRATE_NONE)
        ->execute();
    echo "Count = " . count($invoiceLines) . PHP_EOL;
    echo '2 - Used memory : ' . number_format(memory_get_usage() - $mem) . PHP_EOL;
    $bodyhtml = '';
    foreach ($invoiceLines as $invline)
    {
        // $bodyhtml .= '<tr>';
        // $bodyhtml .= '  <td><b>' . $invline[2] . '</b></td>';
        // $bodyhtml .= '  <td>£' . $invline[4] . '</td>';
        // $bodyhtml .= '  <td>£' . $invline[5] . '</td>';
        // $bodyhtml .= '  <td>£' . $invline[8] . '</td>';
        // $bodyhtml .= '  <td>£' . $invline[9] . '</td>';
        // $bodyhtml .= '  <td>£' . $invline[10] . '</td>';
        // $bodyhtml .= '  <td>£' . $invline[12] . '</td>';
        // $bodyhtml .= '</tr>';
    }
    echo '3 - Used memory : ' . number_format(memory_get_usage() - $mem) . PHP_EOL;
    // fwrite($tempinvfile, $bodyhtml);
    // fclose($tempinvfile);
}
echo 'Used memory : ' . number_format(memory_get_usage() - $mem) . PHP_EOL;
echo "Done processing, time = " . (time() - $start) . PHP_EOL;
The output this produces is as follows :
Starting Processing
1 - Used memory : 615,736
Count = 39
2 - Used memory : 1,033,264
3 - Used memory : 1,033,344
after end loop - Used memory : 1,033,344
1 - Used memory : 1,055,448
Count = 11
2 - Used memory : 1,118,200
3 - Used memory : 1,118,200
after end loop - Used memory : 1,118,200
1 - Used memory : 1,140,304
Count = 30340
2 - Used memory : 89,061,552
3 - Used memory : 88,977,472
after end loop - Used memory : 88,977,472
1 - Used memory : 88,999,576
Count = 156
2 - Used memory : 89,482,752
3 - Used memory : 89,482,752
after end loop - Used memory : 89,482,752
1 - Used memory : 89,505,368
Count = 3867
2 - Used memory : 100,737,248
3 - Used memory : 100,737,248
after end loop - Used memory : 100,737,248
Used memory : 100,737,248
Done processing, time = 0
So I can clearly see that the main memory use happens when the invoiceLines are fetched from the table, but I don't know the best way to approach this. How can I reclaim the memory at the end of each loop iteration?
I have tried using Doctrine_Pager and looping one page at a time. It worked, but used more memory than the approach above.
Suggestions are very welcome.
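One alternative worth sketching: instead of hydrating the whole result set into an array, stream the rows one at a time and write each row to the file as soon as it is fetched, so only a single row lives in PHP memory. This is a minimal stand-alone sketch using PDO with an in-memory SQLite table as a stand-in for the MySQL `invoice_line` table (the table name and columns here are made up for illustration; with MySQL you would also need an unbuffered query, e.g. `PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false`, or the driver still buffers the full result set):

```php
<?php
// Stand-in data: an in-memory SQLite table instead of the real MySQL table.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE invoice_line (invoiceid INT, description TEXT, net REAL)');
$ins = $pdo->prepare('INSERT INTO invoice_line VALUES (?, ?, ?)');
foreach ([[1, 'Widget', 10.00], [1, 'Gadget', 25.50], [2, 'Other', 5.00]] as $row) {
    $ins->execute($row);
}

$outfile = sys_get_temp_dir() . '/invoice1.html';
$fh = fopen($outfile, 'w');

$stmt = $pdo->prepare('SELECT description, net FROM invoice_line WHERE invoiceid = ?');
$stmt->execute([1]);

// fetch() pulls one row at a time; each row is written out immediately,
// so no large array or string ever accumulates in PHP memory.
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    fwrite($fh, '<tr><td><b>' . htmlspecialchars($row[0]) . '</b></td><td>£'
        . number_format((float) $row[1], 2) . "</td></tr>\n");
}
fclose($fh);
echo file_get_contents($outfile);
```

The trade-off is bypassing Doctrine's hydration entirely for this one hot loop, which is usually acceptable for a read-only report/export query.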
EDIT:
Memory output with all file operations commented out and unset($invoiceLines) added just after echo 3:
Starting Processing
1 - Used memory : 615,712
Count = 39
2 - Used memory : 1,033,248
3 - Used memory : 1,033,328
after end loop - Used memory : 1,033,328
1 - Used memory : 1,055,432
Count = 11
2 - Used memory : 1,118,176
3 - Used memory : 1,118,176
after end loop - Used memory : 1,118,176
1 - Used memory : 1,140,280
Count = 30340
2 - Used memory : 89,061,760
3 - Used memory : 88,977,680
after end loop - Used memory : 88,977,680
1 - Used memory : 88,999,784
Count = 156
2 - Used memory : 89,482,968
3 - Used memory : 89,482,968
after end loop - Used memory : 89,482,968
1 - Used memory : 89,505,584
Count = 3867
2 - Used memory : 100,737,464
3 - Used memory : 100,737,464
after end loop - Used memory : 100,737,464
Used memory : 100,737,464
Done processing, time = 0
EDIT 2: Following @Crack's suggestion, I added profiler: false to the databases.yml file. Here are the memory results:
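For reference, the change is a one-line addition under the connection's param block in config/databases.yml (the dsn and credentials below are placeholders, not the project's real ones):

```yaml
all:
  doctrine:
    class: sfDoctrineDatabase
    param:
      dsn:      'mysql:host=localhost;dbname=billing'   # placeholder dsn
      username: user
      password: pass
      profiler: false   # stop the connection profiler from retaining every executed query
```

With the profiler on, each executed query is retained for the debug toolbar, which is why memory was never released between invoices.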
Starting Processing
1 - Used memory : 600,392
Count = 39
2 - Used memory : 1,006,768
3 - Used memory : 1,006,848
after end loop - Used memory : 896,352
1 - Used memory : 903,192
Count = 11
2 - Used memory : 954,624
3 - Used memory : 951,824
after end loop - Used memory : 922,576
1 - Used memory : 929,416
Count = 30340
2 - Used memory : 88,840,104
3 - Used memory : 88,751,672
after end loop - Used memory : 863,168
1 - Used memory : 870,008
Count = 156
2 - Used memory : 1,342,816
3 - Used memory : 1,340,016
after end loop - Used memory : 889,392
1 - Used memory : 896,232
Count = 3867
2 - Used memory : 12,117,080
3 - Used memory : 12,114,280
after end loop - Used memory : 915,616
Used memory : 915,616
Done processing, time = 0
Use fwrite() after you read 1000 rows, then reset $bodyhtml ($bodyhtml = '';). That way you won't be storing a large string in memory.
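A minimal self-contained sketch of that flush-every-N-rows idea (the loop body fabricates rows for illustration; in the real code it would be the $invoiceLines loop):

```php
<?php
$chunkSize = 1000;
$outfile   = sys_get_temp_dir() . '/invoice_chunked.html';
$fh        = fopen($outfile, 'w');

$bodyhtml = '';
$pending  = 0;
for ($i = 1; $i <= 2500; $i++) {       // stand-in for the invoice-line loop
    $bodyhtml .= '<tr><td>line ' . $i . "</td></tr>\n";
    if (++$pending === $chunkSize) {
        fwrite($fh, $bodyhtml);        // flush the chunk to disk
        $bodyhtml = '';                // reset so the string stays bounded
        $pending  = 0;
    }
}
fwrite($fh, $bodyhtml);                // flush the final partial chunk
fclose($fh);

echo substr_count(file_get_contents($outfile), '<tr>'), PHP_EOL; // prints 2500
```

The peak string size is now one chunk rather than the whole invoice, at the cost of a few extra fwrite() calls.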
EDIT
Oops, Doctrine must be storing all previously read rows in some cache in $invoiceLines... Try looking at the solutions for "php/symfony/doctrine memory leak?".