YARN error after the FINISHED stage

Time: 2016-12-01 15:32:06

Tags: apache-spark yarn hpc

I am very new to YARN. I am trying to run my application code on a YARN cluster. When I launch the job, everything seems fine: I can see the ACCEPTED and RUNNING states, and the final state is FINISHED. After that, however, it throws the error shown below.

16/12/01 15:04:30 INFO Client: Application report for application_1480601004675_0001 (state: ACCEPTED)
...

16/12/01 15:15:38 INFO Client: Application report for application_1480601004675_0001 (state: RUNNING)
16/12/01 15:15:39 INFO Client: Application report for application_1480601004675_0001 (state: FINISHED)
16/12/01 15:15:39 INFO Client: Deleting staging directory .sparkStaging/application_1480601004675_0001
Exception in thread "main" org.apache.spark.SparkException: Application application_1480601004675_0001 finished with failed status
            at org.apache.spark.deploy.yarn.Client.run(Client.scala:855)
            at org.apache.spark.deploy.yarn.Client$.main(Client.scala:881)
            at org.apache.spark.deploy.yarn.Client.main(Client.scala)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
            at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
            at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
            at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
            at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

我的火花配置为.setMaster("yarn-cluster")。另外,如果我将其设置为.setMaster("yarn-client"),则会抛出有关hadoop配置的错误,甚至不会传递到RUNNING状态。

I know this may not be much to go on, but can anyone suggest where this error originates? Note that the same jar file works fine when I run it on a single node with the local[*] configuration.
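For comparison, the working local run differs only in the master setting (again a sketch with the same placeholder names):

    import org.apache.spark.SparkConf

    // Same placeholder app; only the master string changes.
    val conf = new SparkConf()
      .setAppName("MyApp")
      .setMaster("local[*]") // the same jar runs fine on a single node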

0 answers:

No answers