I have a large table in a PostgreSQL database (55 GB). I want to scan it in C++ with Qt, summarize the result, and send it back to the DB. As far as I understand, by default QSqlQuery transfers all the data into main memory. Is it possible to set an explicit buffer size for the object? Is it also possible to tell Qt/Postgres that I would like to get the answer incrementally, i.e., not only after all the data has been computed and then sent to the C++ program?
The code I am using is the following:
QSqlDatabase db = QSqlDatabase::addDatabase("QPSQL");
db.setHostName("server");
db.setDatabaseName("db");
db.setUserName("user");
db.setPassword("pass");
bool ok = db.open();
QSqlQuery query(db);
query.setForwardOnly(true); // must be set before exec()
query.exec("SELECT attributes FROM veryLargeTable");
while (query.next()) {
    int i = query.value(0).toInt(); // and work with the data
}
I recently saw an answer to this on the net, but I have spent hours looking for it again without success.
I would split my select (of course in a transaction). First determine iMaxLines, the size of the select, like:
SELECT COUNT(*) FROM veryLargeTable
If I now want to read only 1000 lines at once, I can do something like this:
SELECT attribute FROM veryLargeTable LIMIT 1000 OFFSET X
where X iterates over 0, 1000, 2000, ..., iMaxLines-1000.
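The chunked LIMIT/OFFSET loop above can be sketched as follows. To keep the offset arithmetic testable on its own, this sketch uses a hypothetical fetchChunk stand-in for the real QSqlQuery call (in the actual program you would bind the offset into the SQL instead); it also stops on a short chunk, so the up-front COUNT(*) becomes optional:

```cpp
#include <algorithm>
#include <vector>

// Hypothetical stand-in for the real query
// "SELECT attribute FROM veryLargeTable LIMIT :limit OFFSET :offset";
// returns up to `limit` values starting at `offset`.
std::vector<int> fetchChunk(const std::vector<int>& table, int offset, int limit) {
    if (offset >= static_cast<int>(table.size()))
        return {};
    int end = std::min(offset + limit, static_cast<int>(table.size()));
    return std::vector<int>(table.begin() + offset, table.begin() + end);
}

int sumInChunks(const std::vector<int>& table, int chunk) {
    int total = 0;
    // X = 0, chunk, 2*chunk, ...; a chunk shorter than `chunk` rows
    // signals the end of the table, so the loop terminates there.
    for (int offset = 0;; offset += chunk) {
        std::vector<int> rows = fetchChunk(table, offset, chunk);
        for (int v : rows)
            total += v;
        if (static_cast<int>(rows.size()) < chunk)
            break;
    }
    return total;
}
```

Note that each chunk is only ever `chunk` rows in memory at a time, which is the point of the approach; the trade-off is that OFFSET makes Postgres skip all preceding rows on every iteration, so later chunks get progressively slower on a very large table.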
Regards, Lars