Getting an error (File does not exist) while submitting a Spark job via spark-submit on a YARN cluster

Asked: 2017-03-16 14:09:23

Tags: java apache-spark yarn

I have set up a YARN cluster with the following configuration in core-site.xml:

<configuration>
    <property>
            <name>fs.defaultFS</name>
            <value>hdfs://localhost:54310</value>
    </property>
</configuration>
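
As a side note on how this setting can be checked: the sketch below prints which filesystem fs.defaultFS actually resolves to on the node where it runs. It is a minimal, illustrative check (the class name HdfsCheck is made up), and it assumes the cluster's core-site.xml and the Hadoop client jars are on the classpath:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsCheck {
    public static void main(String[] args) throws Exception {
        // Picks up core-site.xml / hdfs-site.xml from the classpath
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Which filesystem does fs.defaultFS resolve to on this node?
        System.out.println("default FS: " + fs.getUri());

        // The staging directory named in the exception (path taken from the stack trace)
        Path staging = new Path("/user/root/.sparkStaging");
        System.out.println("staging dir exists: " + fs.exists(staging));
    }
}

If a node other than the NameNode host prints hdfs://localhost:54310 here, it would be looking for the staging files on its own local address, which is one possible way to end up with a "File does not exist" error like the one below.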

I submit the Spark job with the following command:

./spark-submit --class main.MainClass --deploy-mode cluster --master yarn sparkJob.jar [options]

The job fails with the following exception:

Exception in thread "main" java.io.FileNotFoundException: File does not exist: hdfs://localhost:54310/user/root/.sparkStaging/application_1489672189113_0002/__spark_conf__.zip
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$6.apply(ApplicationMaster.scala:158)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$6.apply(ApplicationMaster.scala:155)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.deploy.yarn.ApplicationMaster.<init>(ApplicationMaster.scala:155)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:748)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:71)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:70)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:70)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:747)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)

I am new to YARN. Can you give me any hints about this exception? Please let me know if more details are needed.

I initialize the SparkSession in my Java code as follows:

import org.apache.spark.sql.SparkSession;

// Master URL and deploy mode are supplied by spark-submit, not set here
SparkSession ss = SparkSession.builder()
        .appName("appname")
        .getOrCreate();

Can someone tell me what I am doing wrong?

0 Answers
