Error building a MapReduce program on Cloudera Hadoop

Posted: 2014-02-03 21:58:18

Tags: hadoop mapreduce cloudera

I get the following error when compiling a MapReduce program on Hadoop. I am using the Cloudera Hadoop distribution. testmr_classes is a folder, and TestMR.java is the MapReduce source file.

[cloudera@localhost ~]$ echo `hadoop classpath`
/etc/hadoop/conf:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/.//*:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/.//*:/usr/lib/hadoop-yarn/lib/*:/usr/lib/hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/./:/usr/lib/hadoop-0.20-mapreduce/lib/*:/usr/lib/hadoop-0.20-mapreduce/.//*
[cloudera@localhost ~]$

[cloudera@localhost ~]$ javac -classpath `hadoop classpath`:. -d testmr_classes TestMR.java
TestMR.java:32: TestMR.Reduce is not abstract and does not override abstract method reduce(org.apache.hadoop.io.IntWritable,java.util.Iterator<org.apache.hadoop.io.Text>,org.apache.hadoop.mapred.OutputCollector<org.apache.hadoop.io.IntWritable,org.apache.hadoop.io.DoubleWritable>,org.apache.hadoop.mapred.Reporter) in org.apache.hadoop.mapred.Reducer
    public static class Reduce extends MapReduceBase implements Reducer<IntWritable,Text,IntWritable,DoubleWritable>
                  ^
1 error
[cloudera@localhost ~]$

Here are the contents of TestMR.java:

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.util.*;

public class TestMR 
{
    public static class Map extends MapReduceBase implements Mapper<IntWritable,Text,IntWritable,Text>
    {
        private IntWritable key = new IntWritable();
        private Text value = new Text();

        public void map(IntWritable key, Text line, OutputCollector<IntWritable, Text> output, Reporter reporter) throws IOException
        {
            String [] split = line.toString().split(",");
            key.set(Integer.parseInt(split[0]));

            if(split[2] == "Test")
            {
                value.set(split[4] + "," + split[7]);
                output.collect(key, value);
            }
        }
    }

    public static class Reduce extends MapReduceBase implements Reducer<IntWritable,Text,IntWritable,DoubleWritable>
    {
        public void reduce(IntWritable key, Iterable<Text> v, OutputCollector<IntWritable, DoubleWritable> output, Reporter reporter) throws IOException
        {
            Iterator values = v.iterator();
            while(values.hasNext())
            {
                String [] tmp_buf_1 = values.next().toString().split(",");
                String V1 = tmp_buf_1[0];
                String T1 = tmp_buf_1[1];

                if(!values.hasNext())   
                    break;  

                String [] tmp_buf_2 = values.next().toString().split(",");       
                String V2 = tmp_buf_2[0];
                String T2 = tmp_buf_2[1];           

                double dResult = (Double.parseDouble(V2) - Double.parseDouble(V1)) / (Double.parseDouble(T2) - Double.parseDouble(T1));

                output.collect(key, new DoubleWritable(dResult));
            }
        }
    }

    public static void main(String[] args) throws Exception
    {
        JobConf conf = new JobConf(TestMR.class);
        conf.setJobName("TestMapReduce");

        conf.setOutputKeyClass(IntWritable.class);
        conf.setOutputValueClass(DoubleWritable.class);

        conf.setMapperClass(Map.class);
        conf.setCombinerClass(Reduce.class);
        conf.setReducerClass(Reduce.class);

        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);

        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
    }
}

This is my first attempt at MapReduce, so I would be glad to know what I am missing here.

2 Answers:

Answer 0 (score: 0)

You need to specify the classpath explicitly in place of `hadoop classpath`, for example:

javac -classpath /opt/hadoop-1.1.2/hadoop-core-1.1.2.jar:/opt/hadoop-1.1.2/lib/commons-cli-1.2.jar -d testmr_classes TestMR.java 

The exact paths depend on where you installed Hadoop.

Answer 1 (score: 0)

Look closely at the second argument of reduce() and at the error message: you wrote Iterable, but the interface expects an Iterator. Because the parameter type does not match, your method does not override the abstract reduce() declared in org.apache.hadoop.mapred.Reducer, so the class is left with an unimplemented abstract method.
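In the old `org.apache.hadoop.mapred` API, an implementing class must match the interface signature exactly, i.e. take `Iterator<Text>` rather than `Iterable<Text>`. A minimal sketch of the corrected reducer, with the asker's pairwise body abbreviated (this is an illustrative fragment, not a complete verified program; it still requires the Hadoop jars on the classpath to compile):

```java
public static class Reduce extends MapReduceBase
        implements Reducer<IntWritable, Text, IntWritable, DoubleWritable>
{
    @Override  // with Iterator<Text> this now matches the interface method
    public void reduce(IntWritable key, Iterator<Text> values,
                       OutputCollector<IntWritable, DoubleWritable> output,
                       Reporter reporter) throws IOException
    {
        while (values.hasNext()) {
            // ... same pairwise (V2 - V1) / (T2 - T1) logic as in the question ...
        }
    }
}
```

One further observation, beyond what the answers above cover: `conf.setCombinerClass(Reduce.class)` is likely to cause trouble at runtime even after the compile error is fixed, because a combiner's output types must match the reducer's input types, and here the reducer consumes `Text` values but emits `DoubleWritable`. Dropping the `setCombinerClass` call is the simplest fix.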