MapReduce job started with Oozie gets killed. Why?

Asked: 2016-02-05 18:45:32

Tags: java hadoop mapreduce hdfs oozie

I have been trying for a few days to start a wordcount (MapReduce) job with Oozie. When started normally (cmd: "hadoop jar *.jar mainClass input output") everything goes well. The current Oozie setup is:

  • /ApplicationDIR/lib/WordCount.jar
  • /ApplicationDIR/workflow.xml
  • /Text-IN
  • /Text-OUT

    workflow.xml

        

    <action name='wordcount'>
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="${outputDir}" />
            </prepare>
            <configuration>
    
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
                <property>
                    <name>mapred.mapper.class</name>
                    <value>HadoopJobs.wordCound.WordCountMR.Map</value>
                </property>
                <property>
                    <name>mapred.reducer.class</name>
                    <value>HadoopJobs.wordCound.WordCountMR.Reduce</value>
                </property>
                <property>
                    <name>mapreduce.input.fileinputformat.inputdir</name>
                    <value>${inputDir}</value>
                </property>
                <property>
                    <name>mapreduce.output.fileoutputformat.outputdir</name>
                    <value>${outputDir}</value>
                </property>
            </configuration>
        </map-reduce>
        <ok to='end'/>
        <error to='kill'/>
    </action>
    
    <kill name='kill'>
        <message>ERROR: [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    
    <end name='end'/>
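Since a malformed workflow.xml is a common cause of instantly-killed Oozie jobs, it can help to check that the file at least parses before uploading it to HDFS. A minimal sketch using only the Python standard library, run here against a trimmed copy of the action above (this catches XML syntax errors only, not Oozie schema violations):

```python
# Check that the workflow XML is well-formed; ET.fromstring raises
# ParseError on any unclosed or mismatched tag.
import xml.etree.ElementTree as ET

workflow = """\
<action name='wordcount'>
    <map-reduce>
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
    </map-reduce>
    <ok to='end'/>
    <error to='kill'/>
</action>"""

root = ET.fromstring(workflow)
print(root.tag, root.get("name"))  # prints: action wordcount
```

In practice you would run this (or any XML linter) against the full workflow.xml file before the `hdfs dfs -put`.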
    

job.properties

nameNode=hdfs://192.168.1.110:8020    
jobTracker=192.168.1.110:8050
queueName=default

oozie.wf.application.path=${nameNode}/tmp/testDIR/wordcount-example/ApplicationDIR
inputDir=hdfs://192.168.1.110:8020/tmp/testDIR/wordcount-example/Text-IN
outputDir=hdfs://192.168.1.110:8020/tmp/testDIR/wordcount-example/Text-OUT
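For reference, Oozie expands the ${var} placeholders in job.properties and workflow.xml from the submitted properties. A rough illustrative sketch of that substitution (this is not Oozie's actual EL engine, just the idea):

```python
import re

# Properties as submitted in job.properties above
props = {
    "nameNode": "hdfs://192.168.1.110:8020",
    "jobTracker": "192.168.1.110:8050",
    "queueName": "default",
}

def resolve(value, props):
    """Expand ${var} references against the submitted properties."""
    return re.sub(r"\$\{(\w+)\}", lambda m: props[m.group(1)], value)

app_path = resolve("${nameNode}/tmp/testDIR/wordcount-example/ApplicationDIR", props)
print(app_path)
# prints: hdfs://192.168.1.110:8020/tmp/testDIR/wordcount-example/ApplicationDIR
```

A missing or misspelled property is a frequent cause of submission failures, so checking each ${...} in the workflow against job.properties is a cheap first step.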

Command:

oozie job -oozie http://192.168.1.110:11000/oozie/ -config job.properties -run

Result:

Job gets killed

- UPDATE -

  

Oozie log: https://docs.google.com/document/d/1BKnv4dSEscRqpzKLhOjUaryveSP3q0454uL_5_xVPdk/edit?usp=sharing

1 answer:

Answer 0 (score: 0)

I solved this by downloading Cloudera CDH. It ships with HUE, which has a very nice UI where I could see the details of my error. In the end, I fixed the error by removing the following section from the workflow XML:

            <property>
                <name>mapred.mapper.class</name>
                <value>HadoopJobs.wordCound.WordCountMR.Map</value>
            </property>
            <property>
                <name>mapred.reducer.class</name>
                <value>HadoopJobs.wordCound.WordCountMR.Reduce</value>
            </property>
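A likely explanation for why removing those properties helped: mapred.mapper.class and mapred.reducer.class configure the old org.apache.hadoop.mapred API, so if Map and Reduce are written against the new org.apache.hadoop.mapreduce API the job fails at launch. Under that assumption (it depends on how WordCountMR is written), the Oozie map-reduce action would instead need new-API properties along these lines, with $ as the inner-class separator:

```xml
<!-- Sketch, assuming Map/Reduce extend the new org.apache.hadoop.mapreduce API -->
<property>
    <name>mapred.mapper.new-api</name>
    <value>true</value>
</property>
<property>
    <name>mapred.reducer.new-api</name>
    <value>true</value>
</property>
<property>
    <name>mapreduce.map.class</name>
    <value>HadoopJobs.wordCound.WordCountMR$Map</value>
</property>
<property>
    <name>mapreduce.reduce.class</name>
    <value>HadoopJobs.wordCound.WordCountMR$Reduce</value>
</property>
```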