hive -e in a <shell> action throws NoSuchMethodError when called from an Oozie sub-workflow, but runs fine from the main workflow

Date: 2019-05-02 05:40:00

Tags: hive hadoop2 oozie oozie-workflow

I am refactoring my Oozie workflows, which were all written in a single file, and trying to break them out into sub-workflows. After the refactoring, however, the shell action started throwing:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.io.retry.RetryPolicies.retryForeverWithFixedSleep(JLjava/util/concurrent/TimeUnit;)Lorg/apache/hadoop/io/retry/RetryPolicy;

Original workflow:

<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.5" name="Main">
  <start to="loadToHive"/>
  <action name="loadToHive">
    <shell xmlns="uri:oozie:shell-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>yarn.nodemanager.container-executor.class</name>
          <value>LinuxContainerExecutor</value>
        </property>
        <property>
          <name>yarn.nodemanager.linux-container-executor.nonsecure-mode.limit-user</name>
          <value>true</value>
        </property>
      </configuration>
      <exec>${loadToHiveActionScript}</exec>
      <argument>${outPutPath}</argument>
      <argument>${dataSetPath}</argument>
      <argument>${hiveDB}</argument>
      <env-var>HADOOP_USER_NAME=${wf:user()}</env-var>
      <file>${loadToHiveActionScriptPath}#${loadToHiveActionScript}</file>
    </shell>
    <ok to="uplaodToMysql"/>
    <error to="handleFailure"/>
  </action>

Refactored main workflow file:

<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.5" name="Main">
  <start to="loadToHive"/>
  <action name="loadToHive">
    <sub-workflow>
      <app-path>${oozieProjectRoot}/commonWorkflows/mongoTransform.xml</app-path>
      <propagate-configuration/>
    </sub-workflow>
    <ok to="uplaodToMysql"/>
    <error to="handleFailure"/>
  </action>

Sub-workflow file:

<?xml version="1.0" encoding="UTF-8"?>
<workflow-app name="mongoTransform-${module}" xmlns="uri:oozie:workflow:0.5">
  <start to="loadToHiveSub"/>
  <action name="loadToHive">
    <shell xmlns="uri:oozie:shell-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>yarn.nodemanager.container-executor.class</name>
          <value>LinuxContainerExecutor</value>
        </property>
        <property>
          <name>yarn.nodemanager.linux-container-executor.nonsecure-mode.limit-user</name>
          <value>true</value>
        </property>
      </configuration>
      <exec>${loadToHiveActionScript}</exec>
      <argument>${outPutPath}</argument>
      <argument>${dataSetPath}</argument>
      <argument>${hiveDB}</argument>
      <env-var>HADOOP_USER_NAME=${wf:user()}</env-var>
      <file>${loadToHiveActionScriptPath}#${loadToHiveActionScript}</file>
    </shell>
    <ok to="end"/>
    <error to="handleFailure"/>
  </action>

loadToHiveActionScript.sh

hive -e "Drop table if exists ${3}.${i}_intermediate";
...
hive -e " Alter table ${3}.${i}_intermediate RENAME TO ${3}.$i";

When this is run from the main workflow file, it executes fine. Could the problem be the env-var `HADOOP_USER_NAME=${wf:user()}`?
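(Possibly relevant context: a `NoSuchMethodError` on a Hadoop class usually means the action's launcher loaded an older `hadoop-common` jar than the one the caller was compiled against, and a sub-workflow's launcher can resolve a different classpath than the parent's. One hedged sketch is to pin the shell action's sharelib explicitly via Oozie's `oozie.action.sharelib.for.<actiontype>` mechanism; the value list below is an assumption that must match the sharelibs actually installed:)

```xml
<!-- Sketch: set in the sub-workflow shell action's <configuration> (or in
     job.properties) to pin which sharelib jars the launcher puts on its
     classpath. The value list is an example to adapt. -->
<property>
  <name>oozie.action.sharelib.for.shell</name>
  <value>shell,hive</value>
</property>
```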
