sbt-assembly not including dependencies

Date: 2016-02-09 01:57:37

Tags: scala apache-spark sbt sbt-assembly

I'm trying to build a fat jar with sbt-assembly to submit via spark-submit, but I can't seem to get the build process right.

My current build.sbt is as follows:

name := "MyAppName"

version := "1.0"

scalaVersion := "2.10.6"


libraryDependencies  ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.6.0" % "provided",
  "org.scalanlp" %% "breeze" % "0.12",
  "org.scalanlp" %% "breeze-natives" % "0.12"
)

resolvers ++= Seq(
  "Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
)
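
sbt-assembly itself comes from project/plugins.sbt; a typical entry for sbt 0.13.x at the time (the exact plugin version here is an assumption) is:

// project/plugins.sbt -- 0.14.1 is an assumed version; any sbt-assembly
// 0.14.x release is wired up the same way
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")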

Running sbt assembly produces a jar. However, when I submit that jar with spark-submit MyAppName-assembly-1.0.jar (the jar's manifest already specifies a main class, so I assume I can omit --class), the following exception is thrown:

java.lang.NoSuchMethodError: breeze.linalg.DenseVector.noOffsetOrStride()Z
at breeze.linalg.DenseVector$canDotD$.apply(DenseVector.scala:629)
at breeze.linalg.DenseVector$canDotD$.apply(DenseVector.scala:626)
at breeze.linalg.ImmutableNumericOps$class.dot(NumericOps.scala:98)
at breeze.linalg.DenseVector.dot(DenseVector.scala:50)
at RunMe$.cosSimilarity(RunMe.scala:103)
at RunMe$$anonfun$4.apply(RunMe.scala:35)
at RunMe$$anonfun$4.apply(RunMe.scala:33)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.convert.Wrappers$IteratorWrapper.next(Wrappers.scala:30)
at org.spark-project.guava.collect.Ordering.leastOf(Ordering.java:658)
at org.apache.spark.util.collection.Utils$.takeOrdered(Utils.scala:37)
at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$29.apply(RDD.scala:1377)
at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$29.apply(RDD.scala:1374)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

I'm relatively new to the world of Scala and sbt, so any help would be greatly appreciated!

2 Answers:

Answer 0 (score: 2):

It turns out the problem was that Breeze is already bundled with Spark. Spark ships an older version of Breeze than the 0.12 I compiled against, and that older copy wins on the executor classpath but lacks the method my code needs, hence the NoSuchMethodError.

My reference: Apache Spark - java.lang.NoSuchMethodError: breeze.linalg.DenseVector
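
A minimal sketch of the resulting fix, assuming Spark 1.6.x pulls in Breeze 0.11.2 through spark-mllib (worth confirming against spark-mllib's own dependency list), is to compile against that same version and mark it provided:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.6.0" % "provided",
  // Match the Breeze version Spark already puts on the executor classpath,
  // so the binary compiled against is the binary loaded at runtime.
  // 0.11.2 is an assumption for Spark 1.6.x -- verify before relying on it.
  "org.scalanlp" %% "breeze" % "0.11.2" % "provided",
  "org.scalanlp" %% "breeze-natives" % "0.11.2" % "provided"
)

A side benefit: any code that relies on 0.12-only APIs then fails at compile time instead of with a NoSuchMethodError on the cluster.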

Answer 1 (score: 0):

I had a similar problem. I ended up putting the jar in the lib directory and then adding the following to assembly.sbt:

unmanagedJars in Compile += file("lib/my.jar")
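
Two caveats worth noting: sbt already treats jars dropped into lib/ as unmanaged dependencies (unmanagedBase defaults to lib), so the explicit += mainly matters for jars outside that directory, and sbt-assembly bundles unmanaged jars into the fat jar by default. Also, a library that Spark itself ships (like Breeze here) can still shadow the bundled copy at runtime; the experimental spark.driver.userClassPathFirst and spark.executor.userClassPathFirst settings exist to flip that ordering, but the safer route is usually matching Spark's version, as in the accepted answer.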