Spark Context Creation Error

Date: 2017-09-18 07:15:10

Tags: scala apache-spark sbt

I get the following set of errors when trying to create a Spark context from my application.

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
        at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:723)
        at org.apache.spark.SparkConf$.<init>(SparkConf.scala:571)
        at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
        at org.apache.spark.SparkConf.set(SparkConf.scala:92)
        at org.apache.spark.SparkConf.set(SparkConf.scala:81)
        at org.apache.spark.SparkConf.setAppName(SparkConf.scala:118)
        at sparkEnvironment$.<init>(Ticket.scala:33)
        at sparkEnvironment$.<clinit>(Ticket.scala)
        at Ticket$.main(Ticket.scala:40)
        at Ticket.main(Ticket.scala)
    Caused by: java.lang.ClassNotFoundException: scala.Product$class
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 10 more

I am using Spark 2.2.0 and Scala version 2.12.3. My build.sbt looks like this:

scalaVersion := "2.12.3"

libraryDependencies += "com.typesafe" % "config" % "1.3.1"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.36"

// Spark dependencies
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "2.2.0"

The snippet where I try to create the context is as follows:

import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}

object sparkEnvironment {

  Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
  Logger.getLogger("org.eclipse.jetty.server").setLevel(Level.OFF)

  val conf : SparkConf = new SparkConf().setAppName("Ticketing").setMaster("local[2]")
  val sc = new SparkContext(conf)
}

object Ticket {
  def main(args: Array[String]): Unit = {
    println(sparkEnvironment.sc)
  }
}

1 Answer:

Answer 0 (score: 2)

Spark 2.2.0 is built and distributed to work with Scala 2.11 by default. To write applications in Scala, you need to use a compatible Scala version (e.g. 2.11.x). Your Scala version is 2.12.x, and that is why it throws the exception.
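
As a minimal sketch of a fixed build.sbt (assuming you stay on Spark 2.2.0; the 2.11.11 release below is just one compatible choice), pin scalaVersion to a 2.11.x release and use sbt's %% operator so every Spark artifact resolves against the same Scala binary version. Note that your original file also mixes _2.10 (mllib) and _2.11 (core) suffixes, which would clash even on Scala 2.11:

scalaVersion := "2.11.11"

libraryDependencies += "com.typesafe" % "config" % "1.3.1"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.36"

// Spark dependencies: %% appends the Scala binary suffix (_2.11)
// automatically, so spark-core and spark-mllib cannot drift apart.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.2.0"

After this change, a clean rebuild (sbt clean compile) should pull matching _2.11 artifacts, and the NoClassDefFoundError: scala/Product$class should disappear.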