Spark error at runtime

Time: 2016-11-28 22:23:11

Tags: apache-spark

I am running into a version problem. Could someone point me to docs on how to check version compatibility? I am trying to set up my environment; here are my build files.

plugins.sbt

addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
build.sbt
name := "wcount"
version := "1.0"
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "1.6.0",
  "com.typesafe" % "config" % "1.3.0"
)
----------------------------------------
scala -version
Scala code runner version 2.10.5 -- Copyright 2002-2013, LAMP/EPFL
----------------------------------------

Spark version 1.6.0

Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.7.0_95)

Run with arguments input.txt output.txt

WordCount.scala

import org.apache.spark.SparkContext, org.apache.spark.SparkConf

object WordCount {
  def main(args: Array[String]) {
    // Spark 1.x entry point: a local-mode context for testing
    val conf = new SparkConf().
      setAppName("Word Count").
      setMaster("local")
    val sc = new SparkContext(conf)
    val inputPath = args(0)
    val outputPath = args(1)

    // classic word count: split lines into words, pair each word with 1,
    // and sum the counts per word
    val wc = sc.textFile(inputPath).
      flatMap(rec => rec.split(" ")).
      map(rec => (rec, 1)).
      reduceByKey((acc, value) => acc + value)

    wc.saveAsTextFile(outputPath)

  }
}
---------------------------------------------
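For context, the job appears to be launched through sbt with the two paths as program arguments (the [info] Running WordCount line below comes from sbt's run task), presumably along the lines of:

sbt "run input.txt output.txt"
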

Here is the error I get:

[info] Loading global plugins from /root/.sbt/0.13/plugins
[info] Set current project to wcount (in build file:/root/spark/wcount/)
[info] Running WordCount
[error] (run-main-0) java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
    at org.apache.spark.util.Utils$.getSystemProperties(Utils.scala:1546)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:59)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:53)
    at WordCount$.main(WordCount.scala:5)
    at WordCount.main(WordCount.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
[trace] Stack trace suppressed: run last compile:run for the full output.
java.lang.RuntimeException: Nonzero exit code: 1
    at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 1 s, completed Nov 28, 2016 10:04:20 PM

1 Answer:

Answer 0 (score: 0)

You have set the Spark core libs to the Scala 2.11 build:

"org.apache.spark" % "spark-core_2.11" % "1.6.0",

but your project is compiled with scalaVersion := "2.10.5". Scala 2.10 and 2.11 are not binary compatible, and that is exactly what the NoSuchMethodError on scala.Predef$.$conforms signals: Spark built for 2.11 calls a method that exists in the 2.11 standard library but not in 2.10. Change the dependency to the 2.10 build and try again.
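
A minimal sketch of a corrected build.sbt, assuming you stay on Scala 2.10.5 and Spark 1.6.0. Using sbt's %% operator instead of hard-coding the _2.11 suffix makes sbt append the Scala binary version taken from scalaVersion automatically, so the Spark artifact cannot drift out of sync again:

name := "wcount"

version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  // %% resolves to spark-core_2.10 here, because scalaVersion is 2.10.5
  "org.apache.spark" %% "spark-core" % "1.6.0",
  "com.typesafe" % "config" % "1.3.0"
)

After changing the dependency, it may help to run sbt clean before sbt run, so that no classes compiled against the 2.11 artifacts are left over in target/.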