How to prevent pulling in all data into memory from a table with PDO?

Let's say I have a table containing 10,000 rows of product data and I want to build an XML feed containing all data in the table.

Obviously I don't want to pull in all products into memory at once. So, how would I prevent eating all my server's memory? With PDO I can iterate through all records like so:

$fp = fopen('feed.xml', 'w');

// write some XML data
// ..

$statement = $db->prepare("SELECT * FROM products");
$statement->execute();

while ($row = $statement->fetch()) {
    fputs($fp, '<product>' . $row->name . '</product>');
}

// write some more XML data
// ..

fclose($fp);

Is this method going to pull all products into memory at once, or does PDO provide some sort of behind-the-scenes pagination functionality? If not, what can I do?

I should note I'm using MySQL.


Apart from the fact that you've missed a ">" off in the while statement, you need to tell it to fetch each row as an object:

while ($row = $statement->fetch(PDO::FETCH_OBJ)) {

I personally prefer to use an associative array, and don't forget to close the statement at the end, so:

while ($row = $statement->fetch(PDO::FETCH_ASSOC)) {
    fputs($fp, '<product>' . $row['name'] . '</product>');
}
$statement->closeCursor();

But I think that is personal preference.
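
On the memory question itself: with the MySQL driver, PDO buffers the whole result set on the client by default, so even a row-by-row fetch() loop ends up holding all 10,000 rows in memory at once. A minimal sketch of switching the connection to unbuffered queries, so rows are streamed from the server as you iterate, is below; the DSN, credentials and the htmlspecialchars() escaping are placeholders for the example, not part of the original question.

// Placeholder DSN and credentials for illustration only.
$db = new PDO('mysql:host=localhost;dbname=shop;charset=utf8mb4', 'user', 'pass');

// The MySQL driver buffers results client-side by default; turning this off
// makes fetch() pull rows from the server one at a time instead.
$db->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$fp = fopen('feed.xml', 'w');

$statement = $db->prepare("SELECT name FROM products");
$statement->execute();

while ($row = $statement->fetch(PDO::FETCH_ASSOC)) {
    // Escape the value so the feed stays well-formed XML.
    fputs($fp, '<product>' . htmlspecialchars($row['name']) . '</product>');
}

$statement->closeCursor();
fclose($fp);

Note that while a statement is open on an unbuffered connection you can't run another query on the same PDO handle until every row has been fetched or closeCursor() has been called, so this works best when the feed export has the connection to itself.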
