java.lang.NoClassDefFoundError when running Spark wordCount

Time: 2019-02-04 12:10:26

Tags: apache-spark

I am getting the following error while running a Spark word count program; please shed some light on this.

Spark version: 2.4.0, running in standalone mode

package com.ahshan.sparklearning

import org.apache.spark._
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object WordCount {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("wordCount")
    val sc = new SparkContext(conf)
    // Load our input data.
    val input = sc.textFile("file:///tmp/README.md")
    // Split it up into words.
    val words = input.flatMap(line => line.split(" "))
    // Transform into pairs and count.
    val counts = words.map(word => (word, 1)).reduceByKey { case (x, y) => x + y }
    // Save the word count back out to a text file, causing evaluation.
    counts.saveAsTextFile("file:///tmp/sparkout.txt")
  }
}
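
For reference, the transformation chain in the program can be sketched with plain Scala collections (no Spark required), with `groupBy` standing in for the shuffle that `reduceByKey` performs; names here are illustrative, not part of the original program:

```scala
// A plain-Scala sketch of what the flatMap/map/reduceByKey pipeline above
// computes: split lines into words, pair each word with 1, sum counts per word.
object WordCountSketch {
  def count(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))                              // lines -> words
      .map(word => (word, 1))                             // word -> (word, 1)
      .groupBy(_._1)                                      // gather pairs per word
      .map { case (w, pairs) => (w, pairs.map(_._2).sum) } // sum the 1s

  def main(args: Array[String]): Unit =
    println(count(Seq("to be or", "not to be")))
}
```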


sbt clean package
[info] Loading project definition from /root/SparkProject/project
[info] Loading settings for project sparkproject from build.sbt ...
[info] Set current project to learning-spark (in build file:/root/SparkProject/)
[success] Total time: 0 s, completed Feb 4, 2019 7:05:37 AM
[info] Updating ...
[info] Done updating.
[warn] There may be incompatibilities among your library dependencies; run 'evicted' to see detailed eviction warnings.
[info] Compiling 1 Scala source to /root/SparkProject/target/scala-2.12/classes ...
[info] Done compiling.
[info] Packaging /root/SparkProject/target/scala-2.12/learning-spark_2.12-0.0.1.jar ...
[info] Done packaging.
[success] Total time: 8 s, completed Feb 4, 2019 7:05:45 AM
  

2019-02-04 07:02:54 INFO BlockManagerInfo:54 - Added broadcast_0_piece0 in memory on sparkserver:39679 (size: 22.9 KB, free: 366.3 MB)
2019-02-04 07:02:54 INFO SparkContext:54 - Created broadcast 0 from textFile at WordCount.scala:14
Exception in thread "main" java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction2$mcIII$sp
    at com.ahshan.sparklearning.WordCount$.main(WordCount.scala:18)
    at com.ahshan.sparklearning.WordCount.main(WordCount.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction2$mcIII$sp
    ... 14 more
Caused by: java.lang.ClassNotFoundException: scala.runtime.java8.JFunction2$mcIII$sp
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 14 more
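
The sbt log shows the jar being built for Scala 2.12 (`learning-spark_2.12-0.0.1.jar`), while the missing class `scala.runtime.java8.JFunction2$mcIII$sp` lives in the Scala 2.12 runtime. The default prebuilt Spark 2.4.0 binaries ship with Scala 2.11, so a 2.12 jar submitted to such a cluster cannot resolve 2.12 lambda runtime classes. A minimal `build.sbt` sketch of the likely fix, assuming the default prebuilt Spark 2.4.0 (Scala 2.11) distribution:

```scala
// build.sbt — a sketch, assuming a prebuilt Spark 2.4.0 (Scala 2.11) cluster.
name := "learning-spark"
version := "0.0.1"

// Match the Scala version the Spark distribution was built with; a 2.12 jar
// against 2.11 Spark fails with NoClassDefFoundError on
// scala/runtime/java8/... classes, as in the stack trace above.
scalaVersion := "2.11.12"

// "provided": spark-submit supplies spark-core on the cluster classpath.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0" % "provided"
```

Alternatively, a Spark 2.4.0 distribution built for Scala 2.12 would accept the existing jar.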

0 Answers:

There are no answers yet.