How to build a multi-project fat jar with sbt-assembly

Date: 2016-06-02 09:52:18

Tags: scala apache-spark jar sbt sbt-assembly

I have a Scala multi-project build that uses Spark, and I am trying to produce a fat jar with the sbt plugin sbt-assembly 0.14.3. My build looks like this:

lazy val commonSettings = Seq(
  organization := "blabla",
  version := "0.1.0",
  scalaVersion := "2.11.8"
)


lazy val core = (project in file("."))
  .settings(commonSettings: _*)
  .settings(libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",
    "org.apache.spark" %% "spark-sql" % "1.6.1" % "provided",
    "org.apache.spark" %% "spark-mllib" % "1.6.1" % "provided",
    ...))


lazy val sub_project = project
  .settings(commonSettings: _*)
  .aggregate(core)
  .dependsOn(core)

I want to create a fat jar of sub_project that contains all the libraries and the code of the core project. I tried the following:

sbt
project sub_project
assembly
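The interactive steps above can equivalently be run as a single batch invocation from the shell (a sketch assuming the standard sbt launcher is on the PATH):

```shell
# Run the assembly task of sub_project without entering the sbt shell
sbt "project sub_project" assembly

# or, using sbt's project/task syntax
sbt sub_project/assembly
```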

and I get the following error:

[error] missing or invalid dependency detected while loading class file 'blabla.class'.
[error] Could not access term spark in package org.apache,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'blabla.class' was compiled against an incompatible version of org.apache.
[error] one error found

However, when I run "assembly" on the core project, I get my fat jar just fine.

1 Answer:

Answer 0 (score: 2)

Your build shows that the Spark library dependencies are not present on the classpath of sub_project (regardless of the "provided" scope), and the error message you get matches this: sub_project compiles against classes from core that reference Spark, but Spark is not on sub_project's own compile classpath. You probably want to move these dependencies into the common settings.
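The suggested fix can be sketched as follows: the Spark dependencies move from core's settings into commonSettings, so that both core and sub_project see them on their compile classpath (a minimal sketch; versions and the dependency list are taken from the question, and the factored-out sparkVersion value is an assumption for readability):

```scala
// build.sbt sketch: Spark dependencies shared via commonSettings
lazy val sparkVersion = "1.6.1"  // assumed helper, not in the original build

lazy val commonSettings = Seq(
  organization := "blabla",
  version := "0.1.0",
  scalaVersion := "2.11.8",
  // "provided" keeps Spark out of the fat jar but on the compile classpath
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core"  % sparkVersion % "provided",
    "org.apache.spark" %% "spark-sql"   % sparkVersion % "provided",
    "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided"
  )
)

lazy val core = (project in file("."))
  .settings(commonSettings: _*)

lazy val sub_project = project
  .settings(commonSettings: _*)
  .aggregate(core)
  .dependsOn(core)
```

With this layout, `sbt "project sub_project" assembly` can resolve the Spark classes that core's compiled classes refer to, while the "provided" scope still excludes Spark itself from the resulting fat jar.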