Spark 2

Date: 2018-03-09 18:04:30

Tags: apache-spark oozie hortonworks-data-platform oozie-workflow

An Oozie Spark 2 action fails, but the same job works when I run it with spark-submit.

Error - '/./assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.

Stack trace below:

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, Library directory '/mnt/resource/hadoop/yarn/local/usercache/admin/appcache/application_1518822111928_3264/container_e06_1518822111928_3264_01_000002/./assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.
java.lang.IllegalStateException: Library directory '/mnt/resource/hadoop/yarn/local/usercache/admin/appcache/application_1518822111928_3264/container_e06_1518822111928_3264_01_000002/./assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.
at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:260)
at org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:380)
at org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:587)
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:912)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:172)
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1248)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1307)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:751)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:312)
at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:233)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:58)
at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:62)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:239)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
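For comparison, here is a minimal sketch of the spark-submit invocation that reportedly works, assembled from the options listed in the Oozie action configuration below. The application jar name (wordcount.jar) is an assumption for illustration; it does not appear in the original post.

```shell
# Hypothetical equivalent of the working spark-submit run.
# The jar path "wordcount.jar" is an assumption, not from the post.
spark-submit \
  --master yarn-cluster \
  --name WordCount \
  --class rdd.WordCount \
  --conf spark.yarn.security.tokens.hive.enabled=false \
  --conf spark.yarn.security.tokens.hbase.enabled=false \
  --conf spark.executor.extraJavaOptions=-Dlog4j.configuration=spark-log4j.properties \
  --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=spark-log4j.properties \
  wordcount.jar
```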

More information from the YARN logs: Spark Action Main class: org.apache.spark.deploy.SparkSubmit

Oozie Spark action configuration:

                --master yarn-cluster
                --name WordCount
                --class rdd.WordCount
                --conf spark.executor.extraClassPath=$PWD/*
                --conf spark.driver.extraClassPath=$PWD/*
                --conf spark.yarn.security.tokens.hive.enabled=false
                --conf spark.yarn.security.tokens.hbase.enabled=false
                --conf spark.executor.extraJavaOptions=-Dlog4j.configuration=spark-log4j.properties
                --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=spark-log4j.properties
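The options above map onto an Oozie `<spark>` action. Below is a minimal sketch of what the workflow.xml might look like; the workflow name, jar path, and the `${jobTracker}`/`${nameNode}` properties are assumptions for illustration, as none of them appear in the original post.

```xml
<workflow-app name="wordcount-wf" xmlns="uri:oozie:workflow:0.5">
  <start to="spark-node"/>
  <action name="spark-node">
    <spark xmlns="uri:oozie:spark-action:0.1">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <master>yarn-cluster</master>
      <name>WordCount</name>
      <class>rdd.WordCount</class>
      <!-- Jar path is hypothetical -->
      <jar>${nameNode}/apps/wordcount/lib/wordcount.jar</jar>
      <spark-opts>
        --conf spark.executor.extraClassPath=$PWD/*
        --conf spark.driver.extraClassPath=$PWD/*
        --conf spark.yarn.security.tokens.hive.enabled=false
        --conf spark.yarn.security.tokens.hbase.enabled=false
        --conf spark.executor.extraJavaOptions=-Dlog4j.configuration=spark-log4j.properties
        --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=spark-log4j.properties
      </spark-opts>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <fail name="fail">
    <message>Spark action failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
  </fail>
  <end name="end"/>
</workflow-app>
```

One thing worth checking in this situation: Oozie selects which Spark libraries the launcher uses via its sharelib, controlled by `oozie.action.sharelib.for.spark` in job.properties. If that property is not set to a Spark 2 sharelib (e.g. `spark2` on HDP), the launcher may run with Spark 1 classes, which can produce exactly this "Library directory ... does not exist; make sure Spark is built" failure even though spark-submit works from the shell.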

...

0 answers:

No answers yet.