Using spark-nlp pretrained models on Databricks

Date: 2020-06-20 13:55:23

Tags: pyspark databricks johnsnowlabs-spark-nlp

I am trying to follow the official examples from John Snow Labs, but every time I do I hit the error TypeError: 'JavaPackage' object is not callable. I followed all the steps in the Databricks install documentation, but no matter which walkthrough I try, whether this one or this one, it fails.

The first example (after installation):

import sparknlp
from sparknlp.pretrained import *

pipeline = PretrainedPipeline('recognize_entities_dl', 'en')

recognize_entities_dl download started this may take some time.
TypeError: 'JavaPackage' object is not callable

TypeError                                 Traceback (most recent call last)
<command-937510457011238> in <module>
----> 1 pipeline = PretrainedPipeline('recognize_entities_dl', 'en')
      2 
      3 # ner_bert = NerDLModel.pretrained('ner_dl_bert')
      4 
      5 # pipeline = PretrainedPipeline('recognize_entities_dl', 'en', 'https://s3.amazonaws.com/auxdata.johnsnowlabs.com/public/models/ner_dl_bert_en_2.4.3_2.4_1584624951079.zip')

/databricks/python/lib/python3.7/site-packages/sparknlp/pretrained.py in __init__(self, name, lang, remote_loc, parse_embeddings, disk_location)
     89     def __init__(self, name, lang='en', remote_loc=None, parse_embeddings=False, disk_location=None):
     90         if not disk_location:
---> 91             self.model = ResourceDownloader().downloadPipeline(name, lang, remote_loc)
     92         else:
     93             self.model = PipelineModel.load(disk_location)

/databricks/python/lib/python3.7/site-packages/sparknlp/pretrained.py in downloadPipeline(name, language, remote_loc)
     49     def downloadPipeline(name, language, remote_loc=None):
     50         print(name + " download started this may take some time.")
---> 51         file_size = _internal._GetResourceSize(name, language, remote_loc).apply()
     52         if file_size == "-1":
     53             print("Can not find the model to download please check the name!")

/databricks/python/lib/python3.7/site-packages/sparknlp/internal.py in __init__(self, name, language, remote_loc)
    190     def __init__(self, name, language, remote_loc):
    191         super(_GetResourceSize, self).__init__(
--> 192             "com.johnsnowlabs.nlp.pretrained.PythonResourceDownloader.getDownloadSize", name, language, remote_loc)
    193 
    194 

/databricks/python/lib/python3.7/site-packages/sparknlp/internal.py in __init__(self, java_obj, *args)
    127         super(ExtendedJavaWrapper, self).__init__(java_obj)
    128         self.sc = SparkContext._active_spark_context
--> 129         self._java_obj = self.new_java_obj(java_obj, *args)
    130         self.java_obj = self._java_obj
    131 

/databricks/python/lib/python3.7/site-packages/sparknlp/internal.py in new_java_obj(self, java_class, *args)
    137 
    138     def new_java_obj(self, java_class, *args):
--> 139         return self._new_java_obj(java_class, *args)
    140 
    141     def new_java_array(self, pylist, java_class):

/databricks/spark/python/pyspark/ml/wrapper.py in _new_java_obj(java_class, *args)
     65             java_obj = getattr(java_obj, name)
     66         java_args = [_py2java(sc, arg) for arg in args]
---> 67         return java_obj(*java_args)
     68 
     69     @staticmethod

TypeError: 'JavaPackage' object is not callable

I get a similar, though not identical, error if I try:

pipeline = PretrainedPipeline('recognize_entities_dl', 'en', 'https://s3.amazonaws.com/auxdata.johnsnowlabs.com/public/models/ner_dl_bert_en_2.4.3_2.4_1584624951079.zip')

I get the same error with the second example as well. The Databricks runtime version is 6.5 (includes Apache Spark 2.4.5, Scala 2.11), which is on the list of supported runtimes.

I am not sure what the error message means or how to resolve it.

1 Answer:

Answer 0 (score: 0)

I found that 'JavaPackage' object is not callable is caused by the spark-nlp assembly jar being missing. So I made sure the jar was downloaded and then placed it on both the executors and the driver. For example:

When building the Spark docker image, do something like this:

RUN cd /opt/spark/jars && \
    wget https://s3.amazonaws.com/auxdata.johnsnowlabs.com/public/spark-nlp-assembly-2.6.4.jar

and on the driver image/machine, make sure the jar is in a local directory. Then set:

conf.set("spark.driver.extraClassPath", "/opt/spark/jars/spark-nlp-assembly-2.6.4.jar")
conf.set("spark.executor.extraClassPath", "/opt/spark/jars/spark-nlp-assembly-2.6.4.jar")

The solution on Databricks may differ, so instead of baking the jars into an image you may need to host them on S3 and reference them that way.
