sbt build error: [error] (*:update) sbt.ResolveException: unresolved dependency

Asked: 2018-01-25 10:32:38

Tags: apache-spark sbt

I am trying to build and run a Spark program with sbt, but I am getting the following error.

[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-mapreduce-client-app;2.2.0: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-yarn-api;2.2.0: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-mapreduce-client-core;2.2.0: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.2.0: not found
[error] unresolved dependency: asm#asm;3.1: not found
[error] unresolved dependency: org.apache.spark#hadoop-core_2.10;2.2.0: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-client_2.11;2.2.0: not found
[error] download failed: org.apache.avro#avro;1.7.7!avro.jar
[error] download failed: commons-codec#commons-codec;1.4!commons-codec.jar

The code and the sbt build file are attached below. I have set up the folder structure correctly.
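For reference, the layout assumed here is the standard sbt one (file names matching the sources shown below; data.txt is read relative to the working directory when running with a local master):

./simple.sbt
./src/main/scala/helloSpark.scala
./data.txt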

helloSpark.scala:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object HelloSpark {
  def main(args: Array[String]){
    val conf = new SparkConf().setMaster("local").setAppName("Hello Spark")
    val sc = new SparkContext(conf)
    val rddFile = sc.textFile("data.txt").filter(line => line.contains("spark")).count()
    println("lines with spark: %s".format(rddFile))

  }

}

simple.sbt:

name := "Hello Spark"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2"

libraryDependencies += "org.apache.spark" %% "hadoop-core" % "2.2.0"

libraryDependencies += "org.apache.hadoop" % "hadoop-client_2.11" % "2.2.0"

0 Answers:

There are no answers yet.