
How to split a resultset in Java into chunks (500 rows each) without looping?

https://www.devze.com 2023-03-05 12:31 Source: web

I have a query which returns millions of records, held in a result set. I have to process (insert/update) these records. Instead of inserting all the records at once, I would like to split the result set into chunks of 500 records each, store each chunk in an ArrayList or Vector, and process those 500 records at a time.

How do I split the result set into chunks and store them in an ArrayList or Vector without looping through the millions of records?

I found the answer: use CachedRowSet instead of ResultSet, and call setPageSize:

CachedRowSet crs = new CachedRowSetImpl(); // com.sun.rowset.CachedRowSetImpl
crs.setPageSize(500);
crs.execute(conHandle);

and then page through it:

while (crs.nextPage()) {
    Collection<?> obj = crs.toCollection();
    // process this page of up to 500 rows
}

This ensures we can process the large result set in smaller chunks.

But I have a doubt here: how would the CachedRowSet populate itself from just a connection object? Where do I specify the query string?
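To answer that doubt: the query string is supplied with setCommand() before execute() is called; execute() then uses either the passed-in Connection or the RowSet's own URL/credentials to run it. A minimal sketch, using the standard RowSetProvider factory instead of the internal CachedRowSetImpl (the JDBC URL and SQL below are placeholder assumptions):

```java
import javax.sql.rowset.CachedRowSet;
import javax.sql.rowset.RowSetProvider;

public class PagedRowSetDemo {

    // Configure a CachedRowSet for paged fetching. Nothing touches the
    // database until execute() is called.
    static CachedRowSet configure(String jdbcUrl, String sql, int pageSize) throws Exception {
        CachedRowSet crs = RowSetProvider.newFactory().createCachedRowSet();
        crs.setUrl(jdbcUrl);      // used only if execute() is called with no Connection
        crs.setCommand(sql);      // <-- the query string goes here, before execute()
        crs.setPageSize(pageSize);
        return crs;
    }

    public static void main(String[] args) throws Exception {
        // Placeholder URL and table name, for illustration only.
        CachedRowSet crs = configure("jdbc:h2:mem:demo", "SELECT * FROM big_table", 500);
        System.out.println(crs.getPageSize());
        System.out.println(crs.getCommand());
        // Against a real database you would then run:
        //   crs.execute();          // or crs.execute(conHandle) to reuse an open Connection
        //   while (crs.nextPage()) { /* process crs.toCollection() */ }
    }
}
```

So `crs.execute(conHandle)` only works if `setCommand(...)` was called first; the Connection supplies the session, not the SQL.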


Depends on your SQL dialect. For example, PostgreSQL supports OFFSET and LIMIT clauses on SELECT:

SELECT * FROM table LIMIT 500 OFFSET 0;

You will still need some kind of loop to generate the queries that fetch all the millions of records.
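That page-generating loop can be sketched as follows; the base query and page numbering are illustrative assumptions, and a stable ORDER BY is needed so consecutive pages do not overlap:

```java
public class PageQuery {

    // Appends PostgreSQL-style LIMIT/OFFSET paging to a base query.
    // Pages are numbered from 0.
    static String pageQuery(String baseQuery, int pageSize, int page) {
        return baseQuery + " LIMIT " + pageSize + " OFFSET " + (page * pageSize);
    }

    public static void main(String[] args) {
        // The loop the answer mentions: one query per chunk of 500.
        for (int page = 0; page < 3; page++) {
            System.out.println(pageQuery("SELECT * FROM table ORDER BY id", 500, page));
        }
    }
}
```

Note that OFFSET gets slower as it grows, since the database still has to skip the earlier rows; keyset pagination (WHERE id > last_seen_id) is the usual alternative for very large tables.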


You have to fetch your data in a single query, because in a multiuser environment you will not get the same results across multiple queries.

If client memory is the issue, bcp the query result out to a file first and split the file with something like the Unix split command.

You can then parse the split data file by file, or bcp it into a working table and load the data into your ArrayList from there.
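However the rows arrive (from a split file, a working table, or a single in-memory result), the batching step itself is the same: cut the row list into fixed-size chunks of 500. A generic sketch of that step:

```java
import java.util.ArrayList;
import java.util.List;

public class Chunker {

    // Splits a list of rows into chunks of at most `size` elements each.
    // Each chunk is copied into its own ArrayList so it can be processed
    // (insert/update) independently of the source list.
    static <T> List<List<T>> chunk(List<T> rows, int size) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += size) {
            chunks.add(new ArrayList<>(rows.subList(i, Math.min(i + size, rows.size()))));
        }
        return chunks;
    }
}
```

With 1,200 rows and a chunk size of 500, this yields three chunks of 500, 500, and 200 rows.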

