Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging

Date: 2016-11-10 04:54:23

Tags: scala apache-spark sbt spark-streaming apache-spark-mllib

I am new to Spark MLlib and was just trying to run the sample code from their website, but I get a Logging error. I hit the same error when trying some Twitter analysis. The error is as follows:

**Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging**
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.mllib.recommendation.ALS$.train(ALS.scala:599)
at org.apache.spark.mllib.recommendation.ALS$.train(ALS.scala:616)
at scalamornprac.ML$.main(ML.scala:30)
at scalamornprac.ML.main(ML.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 21 more
16/11/10 10:02:29 INFO SparkContext: Invoking stop() from shutdown hook

I use IntelliJ IDEA. My build.sbt is as follows:

name := "Spark-Packt"
version := "1.0"
scalaVersion := "2.10.6"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "2.0.0"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.0.0"
libraryDependencies += "org.apache.spark" % "spark-mllib-local_2.10" % "2.0.0"

Also note that I have already added import org.apache.log4j.{Level, Logger} to my code. It still doesn't work.

1 Answer:

Answer 0 (score: 0)

Check from the spark-shell whether import org.apache.spark.Logging works, and then make the appropriate changes in build.sbt:
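For reference, the check described above looks roughly like this in a spark-shell session (a sketch; the exact console output below is illustrative and may vary by version):

```scala
// In a Spark 1.6.x spark-shell the class is still on the public API:
scala> import org.apache.spark.Logging
import org.apache.spark.Logging

// In a Spark 2.0.0 spark-shell the same import fails, because the
// class was moved to org.apache.spark.internal and made private:
scala> import org.apache.spark.Logging
<console>: error: object Logging is not a member of package org.apache.spark
```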

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.1"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.6.1"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.6.1"
libraryDependencies += "org.apache.spark" % "spark-mllib-local_2.10" % "1.6.1"

If it does not work, it means the Spark package running in your environment does not contain org.apache.spark.Logging.
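The underlying problem in the original build.sbt is mixed Spark versions: spark-mllib 1.0.0 still extends org.apache.spark.Logging, which no longer exists in the spark-core 2.0.0 on the classpath. A tidier way to keep every module on the same version is to factor it into a single value and let the %% operator pick the _2.10 artifact suffix from scalaVersion (a sketch; note that spark-mllib-local only exists from Spark 2.0.0 onward, so it is omitted here for 1.6.1):

```scala
// build.sbt — pin all Spark modules to one version so their
// internal APIs (such as Logging) stay binary-compatible
val sparkVersion = "1.6.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-mllib"     % sparkVersion
)
```

Alternatively, keep Spark 2.0.0 throughout (including spark-mllib 2.0.0 instead of 1.0.0); either way, the fix is making every Spark artifact share one version.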