java.io.InvalidClassException: no valid constructor (Scala)

Date: 2018-03-23 10:20:17

Tags: scala

I am getting java.io.InvalidClassException: no valid constructor with Scala. I read up on this exception and found that when an object of a serialized class is deserialized, a no-arg constructor is invoked; if no such constructor exists, this exception is thrown. I have already added a no-argument constructor to my code, but the exception still occurs.
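
To illustrate the rule being described (a minimal standalone sketch, not code from my project): Java deserialization invokes the no-arg constructor of the first non-serializable superclass, and fails if that superclass does not have one.

import java.io._

class Base(val tag: String)                       // non-serializable, no no-arg constructor
class Child(tag: String) extends Base(tag) with Serializable

object Demo extends App {
  val buf = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(buf)
  out.writeObject(new Child("x"))                 // serializing succeeds
  out.close()
  new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
    .readObject()                                 // throws InvalidClassException: no valid constructor
}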

Here is my class:

object KB {

  class KB(var UrlOwlFile: String, rdd: OWLAxiomsRDD, sparkSession: SparkSession) extends Serializable {

    var ontology: OWLOntology = initKB()
    var reasoner: OWLReasoner = _
    var hermit: Reasoner = _
    var manager: OWLOntologyManager = _
    var Concepts: RDD[OWLClass] = _
    var Roles: RDD[OWLObjectProperty] = _
    var dataFactory: OWLDataFactory = _
    var Examples: RDD[OWLIndividual] = _
    var dataPropertiesValue: RDD[RDD[OWLLiteral]] = _
    var Properties: RDD[OWLDataProperty] = _
    var domain: Array[Array[OWLIndividual]] = _
    var classifications: RDD[((OWLClassExpression, OWLIndividual), Int)] = _
    var newEle: ((OWLClassExpression, OWLIndividual), Int) = _

    def KB() {
      var ontology: OWLOntology = null
      var reasoner: OWLReasoner = null
      var hermit: Reasoner = null
      var manager: OWLOntologyManager = null
      var Concepts: RDD[OWLClass] = null
      var Roles: RDD[OWLObjectProperty] = null
      var dataFactory: OWLDataFactory = null
      var Examples: RDD[OWLIndividual] = null
      var dataPropertiesValue: RDD[RDD[OWLLiteral]] = null
      var Properties: RDD[OWLDataProperty] = null
      var domain: Array[Array[OWLIndividual]] = null
      var classifications: RDD[((OWLClassExpression, OWLIndividual), Int)] = null
      var newEle: ((OWLClassExpression, OWLIndividual), Int) = null
      val d: Double = 0.3
      var generator: Random = new Random(2)
    }

    def getClassMembershipResult(testConcepts: Array[OWLClassExpression], negTestConcepts: Array[OWLClassExpression],
                                 examples: RDD[OWLIndividual]): RDD[((OWLClassExpression, OWLIndividual), Int)] = {

      println("\nClassifying all examples \n ------------ ")

      var flag: Boolean = false
      println("Processed concepts (" + testConcepts.size + "): \n")

      for (c <- 0 until testConcepts.size) {
        var p: Int = 0
        var n: Int = 0
        println("\nTest Concept number " + (c + 1) + ": " + testConcepts(c))

        var c1 = examples.map { x => (testConcepts(c), x) }
        var newEle1 = c1.map { x => (x, 0) }
        var c2 = newEle1.mapPartitions { data =>
          data.map { ele =>
            if (getReasoner.isEntailed(getDataFactory.getOWLClassAssertionAxiom(testConcepts(c), ele._1._2))) {
              newEle = (ele._1, +1)
              p = p + 1
            } else {
              if (!flag) {
                if (getReasoner.isEntailed(getDataFactory.getOWLClassAssertionAxiom(negTestConcepts(c), ele._1._2)))
                  newEle = (ele._1, -1)
              } else
                newEle = (ele._1, -1)
              n = n + 1
            }
            println("\n Pos: " + p + "\t Neg: " + n)

            newEle
          }
        }

        classifications = c2
      }
      classifications.take(10).foreach(println(_))
      classifications
    }
  }
}

Here is the output:

18/03/23 11:10:09 ERROR Executor: Exception in task 0.0 in stage 36.0 (TID 44)
java.io.InvalidClassException: net.sansa_stack.ml.spark.classification.KB$KB$$anon$2; no valid constructor
    at java.io.ObjectStreamClass$ExceptionInfo.newInvalidClassException(ObjectStreamClass.java:157)
    at java.io.ObjectStreamClass.checkDeserialize(ObjectStreamClass.java:862)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2034)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:427)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
    at org.apache.spark.scheduler.Task.run(Task.scala:99)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
18/03/23 11:10:09 ERROR TaskSetManager: Task 0 in stage 36.0 failed 1 times; aborting job
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 36.0 failed 1 times, most recent failure: Lost task 0.0 in stage 36.0 (TID 44, localhost, executor driver): java.io.InvalidClassException: net.sansa_stack.ml.spark.classification.KB$KB$$anon$2; no valid constructor
    at java.io.ObjectStreamClass$ExceptionInfo.newInvalidClassException(ObjectStreamClass.java:157)
    at java.io.ObjectStreamClass.checkDeserialize(ObjectStreamClass.java:862)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2034)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:427)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
    at org.apache.spark.scheduler.Task.run(Task.scala:99)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1423)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1422)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1422)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1650)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1605)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1594)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1925)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1938)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1951)
    at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1354)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
    at org.apache.spark.rdd.RDD.take(RDD.scala:1327)
    at net.sansa_stack.ml.spark.classification.KB$KB.getClassMembershipResult(KB.scala:226)
    at net.sansa_stack.ml.spark.classification.ClassMembership$ClassMembership.<init>(ClassMembership.scala:48)
    at net.sansa_stack.ml.spark.classification.TermDecisionTrees$.main(TermDecisionTrees.scala:47)
    at net.sansa_stack.ml.spark.classification.TermDecisionTrees.main(TermDecisionTrees.scala)
Caused by: java.io.InvalidClassException: net.sansa_stack.ml.spark.classification.KB$KB$$anon$2; no valid constructor
    at java.io.ObjectStreamClass$ExceptionInfo.newInvalidClassException(ObjectStreamClass.java:157)
    at java.io.ObjectStreamClass.checkDeserialize(ObjectStreamClass.java:862)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2034)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2278)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2202)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2060)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:427)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
    at org.apache.spark.scheduler.Task.run(Task.scala:99)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

1 Answer:

Answer 0 (score: 2)

There are two main problems with your code.

First, in Scala an auxiliary constructor is defined like this:

class Greeter(message: String, secondaryMessage: String) {
  def this(message: String) = this(message, "")  // auxiliary constructor
  def sayHi() = println(message + secondaryMessage)
}
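
For example, both constructors can then be invoked at the call site:

new Greeter("Hello").sayHi()       // auxiliary constructor; prints "Hello"
new Greeter("Hello", "!").sayHi()  // primary constructor; prints "Hello!"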

So your auxiliary constructor is defined incorrectly. Use def this and have the auxiliary constructor call the primary constructor (i.e., the one in the class definition: KB(var UrlOwlFile: String, rdd: OWLAxiomsRDD, sparkSession: SparkSession)). As written, def KB() is treated as an ordinary method of class KB, not as a constructor. As @Andrey already mentioned in the comments, see this for more information.
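
Applied to your class, a minimal sketch of a valid no-arg auxiliary constructor could look like this (the empty-string and null arguments are placeholders chosen only to illustrate the syntax, not meaningful defaults):

class KB(var UrlOwlFile: String, rdd: OWLAxiomsRDD, sparkSession: SparkSession) extends Serializable {
  // An auxiliary constructor must begin with a call to the primary
  // constructor (or to a previously defined auxiliary constructor).
  def this() = this("", null, null)
}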

Second,

def KB(){
  var ontology: OWLOntology = null
  var reasoner: OWLReasoner = null
  ....

var ontology inside the method KB() declares a new local variable that has nothing to do with the ontology field defined on class KB; it merely shadows it. To re-initialize the ontology field, just write:

ontology = null
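
The difference, in a self-contained sketch (illustrative names):

class Counter {
  var count: Int = 10

  def wrong(): Unit = {
    var count: Int = 0   // declares a new local that shadows the field;
  }                      // the field is still 10 afterwards

  def right(): Unit = {
    count = 0            // assigns to the field itself
  }
}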

Also, using null is poor programming practice; in Scala you can use Option instead.
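
A sketch of what the Option-based version could look like (illustrative, assuming the same field names):

var ontology: Option[OWLOntology] = None   // instead of a nullable var

def resetOntology(): Unit = {
  ontology = None                          // instead of `ontology = null`
}

// Absence is then handled explicitly at each use site:
ontology match {
  case Some(o) => println("ontology loaded: " + o)
  case None    => println("no ontology yet")
}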
