Lucene 5.0.0: LockObtainFailedException when running with multiple threads

Posted: 2015-04-30 05:50:58

Tags: java lucene apache-spark

When I run with multiple threads, I get a LockObtainFailedException. If I run with only a single thread, the error goes away.

    org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: NativeFSLock@/home/dic_new/indexfolder/write.lock.

The code snippet is pasted below.

    Path indexLoc = Paths.get(this.indexLoc);
    Directory fsDir = FSDirectory.open(indexLoc);
    if (!DirectoryReader.indexExists(fsDir)) {
        return null; // return if the index does not exist
    }

    // append the new text to the existing index (fsDir is reused from above)
    IndexWriterConfig iwConf = new IndexWriterConfig(analyzer);
    iwConf.setOpenMode(IndexWriterConfig.OpenMode.APPEND);
    IndexWriter indexWriter = new IndexWriter(fsDir, iwConf);

    Document d = new Document();
    d.add(new TextField(this.fieldName, text, Store.NO));
    indexWriter.addDocument(d);

    indexWriter.commit();
    indexWriter.close();

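For clarity, the multi-threaded case amounts to something like the sketch below, where several threads each run the snippet above against the same index directory. The ExecutorService setup and the addToIndex wrapper are only placeholders for how my code is invoked, not my actual code:

    // Illustrative only: each task runs the IndexWriter snippet above on the
    // same index directory. addToIndex() is a hypothetical wrapper; the real
    // concurrency in my case comes from Spark, not a hand-built thread pool.
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class ConcurrentIndexingSketch {

        public static void main(String[] args) throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(4);
            for (int i = 0; i < 4; i++) {
                final String text = "document " + i;
                // Every task opens its own IndexWriter, so all but the first
                // block on write.lock and eventually time out with
                // LockObtainFailedException.
                pool.execute(() -> addToIndex(text));
            }
            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.MINUTES);
        }

        private static void addToIndex(String text) {
            // ... the FSDirectory / IndexWriter snippet shown above ...
        }
    }
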
P.S. I also tried FSDirectory.open(indexLoc, NoLockFactory.INSTANCE), but that does not work with multiple threads either: it produces an index-corruption error message.

Here Spark handles the concurrency, with spark.master set to local[K].
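
For reference, the Spark side looks roughly like the sketch below; the local[4] master, the sample RDD contents, and the addToIndex wrapper are placeholders for my real job:

    // Rough sketch of the Spark driver, assuming each record is indexed inside
    // foreach. With spark.master = local[K], the K worker threads run in a
    // single JVM and call the Lucene snippet above concurrently, which is
    // where the write.lock contention comes from.
    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SparkIndexJobSketch {

        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("lucene-append")
                    .setMaster("local[4]"); // K = 4 worker threads, one JVM

            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                sc.parallelize(Arrays.asList("text a", "text b", "text c", "text d"))
                  .foreach(text -> addToIndex(text)); // runs on K threads at once
            }
        }

        private static void addToIndex(String text) {
            // ... the FSDirectory / IndexWriter snippet shown above ...
        }
    }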

0 Answers:

No answers yet.