
Too many open files exception while indexing using Solr

I am using Solr for indexing documents in my web application, and solr.war is deployed on the JBoss server. While indexing, I am getting a "too many open files" exception. Below is part of the exception stack trace:

12:31:33,267 ERROR [STDERR] Exception in thread "Lucene Merge Thread #0"
12:31:33,267 ERROR [STDERR] org.apache.lucene.index.MergePolicy$MergeException: java.io.FileNotFoundException: /data/jbossesb/bin/solr/data/index/_2rw.prx (Too many open files)
12:31:33,267 ERROR [STDERR] at org.apache.lucene.index.ConcurrentMergeScheduler.handleMergeException(ConcurrentMergeScheduler.java:351)
12:31:33,267 ERROR [STDERR] at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:315)
12:31:33,267 ERROR [STDERR] Caused by: java.io.FileNotFoundException: /data/jbossesb/bin/solr/data/index/_2rw.prx (Too many open files)
12:31:33,267 ERROR [STDERR] at java.io.RandomAccessFile.open(Native Method)


As explained in this Solr JIRA issue, you can try the following options:

  • Increase your ulimit, e.g.: ulimit -n 1000000
  • Set useCompoundFile to true in solrconfig.xml to use Lucene's compound file format (see the sketch below).
  • Use a lower mergeFactor, which results in fewer segments and hence fewer open files.
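
A minimal sketch of the relevant solrconfig.xml settings follows. The element names match the Solr 1.x/3.x example config, where they sit under <indexDefaults> or <mainIndex>; newer Solr versions group these under <indexConfig>, so adjust to your version:

    <mainIndex>
      <!-- Pack each segment into a single compound file, greatly reducing open file handles -->
      <useCompoundFile>true</useCompoundFile>
      <!-- A lower mergeFactor keeps fewer segments on disk and hence fewer open files (default: 10) -->
      <mergeFactor>4</mergeFactor>
    </mainIndex>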


Running out of file descriptors is the most likely cause.

Check the limit your operating system has set and adjust it accordingly. On Unix, the command to view and set it is ulimit.
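
For example (the value shown is illustrative; pick a limit appropriate for your index size):

    # Show the current per-process open file limit
    ulimit -n
    # Raise it for the current shell; JBoss must be started from this shell to inherit the new limit
    ulimit -n 65536

Note that ulimit -n only affects the current shell session; on Linux, a permanent change usually goes in /etc/security/limits.conf.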


Optimize the index. It probably has too many segments.
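
For instance, an optimize can be triggered by POSTing an XML update message to Solr (the host, port, and path below are assumptions; adjust them to your deployment):

    <!-- POST to http://localhost:8080/solr/update with Content-Type: text/xml -->
    <optimize waitSearcher="true"/>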


Also try reducing the merge factor (see the solrconfig.xml sketch above).

