
SOLR - how to index a database in parts?

I have a PostgreSQL database. The table I need to index has about 20 million rows. When I try to index them all in one attempt (something like "select * from table_name"), I get a Java OutOfMemoryError, even if I give the JVM more memory.

Is there any option in Solr to index the table part by part (e.g., execute the SQL for the first 1,000,000 rows, index those, then execute the SQL for the second million, and so on)?

Right now I am using an SQL query with LIMIT, but every time Solr finishes indexing one slice, I have to restart it manually.

UPDATE: OK, 1.4 is out now. No OutOfMemory exceptions; it seems Apache has done a lot of work on DIH. Also, we can now pass parameters through the request and use them in our SQL selects. Wow!
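For reference, here is a rough sketch of what that looks like in a DIH data-config; the connection details, table, and parameter names are placeholders, not from the original post. Anything passed on the request URL becomes available as ${dataimporter.request.<name>} inside the entity query:

    <dataConfig>
      <dataSource type="JdbcDataSource" driver="org.postgresql.Driver"
                  url="jdbc:postgresql://localhost:5432/mydb"
                  user="user" password="password"/>
      <document>
        <!-- LIMIT/OFFSET arrive as request parameters -->
        <entity name="item"
                query="SELECT * FROM table_name
                       LIMIT ${dataimporter.request.limit}
                       OFFSET ${dataimporter.request.offset}"/>
      </document>
    </dataConfig>

Each slice is then triggered with a request like /dataimport?command=full-import&clean=false&limit=1000000&offset=1000000 (clean=false keeps later runs from wiping the documents indexed by earlier ones).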


See the bit about "cursors" here; it might well help:

http://jdbc.postgresql.org/documentation/83/query.html
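The short version, as a sketch (connection details and table name are placeholders): the PostgreSQL JDBC driver only fetches through a cursor when autocommit is off and a fetch size is set; otherwise it buffers the whole result set in memory, which is exactly what kills a 20-million-row select.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class CursorRead {
        public static void main(String[] args) throws Exception {
            Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/mydb", "user", "password");
            // Cursor-based fetching only works with autocommit off.
            conn.setAutoCommit(false);
            Statement stmt = conn.createStatement();
            // Fetch 1000 rows per round trip instead of buffering everything.
            stmt.setFetchSize(1000);
            ResultSet rs = stmt.executeQuery("SELECT * FROM table_name");
            while (rs.next()) {
                // hand each row to the indexer here
            }
            rs.close();
            stmt.close();
            conn.close();
        }
    }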


Do you have autoCommit and batchSize configured? If you do, it might be this bug; try updating to trunk.
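For anyone checking their config: as far as I recall, both settings live on DIH's JdbcDataSource, roughly like this (connection details are placeholders). batchSize is handed to the driver as the fetch size, and with PostgreSQL it only streams when autoCommit is off:

    <dataSource type="JdbcDataSource"
                driver="org.postgresql.Driver"
                url="jdbc:postgresql://localhost:5432/mydb"
                user="user" password="password"
                autoCommit="false"
                batchSize="1000"/>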


Have you looked at using SolrJ as a client? While DIH is great, the tight coupling between Solr and your database means it can be hard to manipulate your data and work around issues.

With a SolrJ client, you can iterate over your database in batches that you control, then turn around and send them directly to Solr. Also, using SolrJ's binary (javabin) stream format instead of XML means that indexing your 20 million rows should go fairly quickly.
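A minimal sketch of that pattern (field names, core URL, and batch size are made up for illustration; current SolrJ clients speak javabin by default, while in the Solr 1.4 era you would pair CommonsHttpSolrServer with BinaryRequestWriter):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.common.SolrInputDocument;

    public class BatchIndexer {
        public static void main(String[] args) throws Exception {
            SolrClient solr = new HttpSolrClient.Builder(
                    "http://localhost:8983/solr/mycore").build();
            Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/mydb", "user", "password");
            conn.setAutoCommit(false);   // required for cursor-based fetching
            Statement stmt = conn.createStatement();
            stmt.setFetchSize(1000);     // stream rows instead of loading all 20M

            ResultSet rs = stmt.executeQuery("SELECT id, title FROM table_name");
            List<SolrInputDocument> batch = new ArrayList<>();
            while (rs.next()) {
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("id", rs.getString("id"));
                doc.addField("title", rs.getString("title"));
                batch.add(doc);
                if (batch.size() == 1000) {  // push one controlled batch at a time
                    solr.add(batch);
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                solr.add(batch);
            }
            solr.commit();
            rs.close();
            stmt.close();
            conn.close();
            solr.close();
        }
    }

Because you own the loop, you can also checkpoint the last indexed id and resume after a failure, which is exactly the kind of control DIH makes awkward.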

DIH is great, until you run into issues like this!

