sbt.ResolveException: unresolved dependency

Asked: 2017-06-20 03:14:06

Tags: scala, sbt

I am working through a Twitter streaming example project and am running into a problem with my sbt build definition.

My build.sbt:

name := "Tutorial"
version := "0.1.0"
scalaVersion := "2.11.8"
retrieveManaged := true
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core" % "2.11.0",
  "org.apache.spark" % "spark-streaming" % "1.1.0",
  "org.apache.spark" % "spark-streaming-twitter" % "1.1.0"
)

Error log:

[warn]  Note: Some unresolved dependencies have extra attributes.  Check that these dependencies exist with the requested attributes.
[warn]      com.eed3si9n:sbt-assembly:0.9.2 (sbtVersion=0.11.3, scalaVersion=2.11.8)
[warn]      com.typesafe.sbteclipse:sbteclipse-plugin:2.2.0 (sbtVersion=0.11.3, scalaVersion=2.11.8)
[warn]      com.github.mpeltonen:sbt-idea:1.5.1 (sbtVersion=0.11.3, scalaVersion=2.11.8)
[warn] 
[error] {file:/home/muralee1857/scala/workspace/Tutorial/}default-109f4d/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11.8;1.5.1: not found
[error] unresolved dependency: com.eed3si9n#sbt-assembly;0.9.2: not found
[error] unresolved dependency: com.typesafe.sbteclipse#sbteclipse-plugin;2.2.0: not found
[error] unresolved dependency: com.github.mpeltonen#sbt-idea;1.5.1: not found

2 Answers:

Answer 0 (score: 2)

I suggest you explicitly pin the Scala binary version in the dependency artifact names, like this:

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.1.0",
  "org.apache.spark" % "spark-streaming_2.10" % "1.1.0" % "provided",
  "org.apache.spark" % "spark-streaming-twitter_2.10" % "1.1.0"
)

Instead of hard-coding the suffix, you can use %%, in which case sbt will try to download the artifact built for your project's Scala version. Sometimes, though, no artifact has been published for that Scala version, and sbt will fail to resolve the dependency, which causes exactly this kind of problem.
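To illustrate the difference, a minimal sketch using the spark-core coordinates from above (note that with the asker's scalaVersion of 2.11.8, the %% form would instead look for a _2.11 artifact):

// pinned suffix: you choose the Scala binary version yourself
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"

// %%: sbt appends your build's scalaVersion suffix automatically;
// this is equivalent to the line above only when scalaVersion is 2.10.x
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"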

Answer 1 (score: 0)

This should work. Note that I am using the %% method here rather than %, so that the build of the Spark libraries matching your Scala version (2.11) is selected. Make sure the same approach is applied to the other plugins, such as sbt-assembly and sbt-idea.

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.11.0",
  "org.apache.spark" %% "spark-streaming" % "1.1.0",
  "org.apache.spark" %% "spark-streaming-twitter" % "1.1.0"
)
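Separately, the sbt-assembly, sbteclipse-plugin, and sbt-idea failures in the log are plugin resolution errors, not library ones. Plugins are declared in project/plugins.sbt with addSbtPlugin, and sbt appends the matching sbt and Scala versions automatically. A minimal sketch reusing the coordinates from the warning log (whether these exact versions were published for your sbt release is an assumption to verify):

// project/plugins.sbt: coordinates taken from the warning log above
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.9.2")
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.2.0")
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.5.1")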