ClassCastException thrown in my MapReduce code

Asked: 2014-06-20 18:57:25

Tags: java hadoop mapreduce

Attached below is my code, which compacts all of my Avro files into one large file. When I run it, a ClassCastException is thrown. Can anyone help me?

import org.apache.avro.mapred.AvroKey;
import org.apache.avro.mapreduce.AvroKeyInputFormat;
import org.apache.avro.mapreduce.AvroKeyOutputFormat;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class FileCompactionDriver extends Configured implements Tool {

    public static void main(String[] args) throws Exception {

        int returnCode = ToolRunner.run(new FileCompactionDriver(), args);
        /* Terminate job */
        System.exit(returnCode);
    }

    public int run(String[] arg) throws Exception {

        /* Setup configuration */
        Configuration conf = this.getConf();

        /* Test if it is a valid running command */
        if (arg.length != 2) {
            System.err.println("Usage: FileCompaction <input path> <output path>");
            System.exit(2);
        }

        // Creating the job object from configuration
        Job job = Job.getInstance(conf);

        job.setJobName("FileCompactor");
        job.setJarByClass(FileCompactionMapper.class);

        /* Setup the Mapper and Reducer classes */
        job.setMapperClass(FileCompactionMapper.class);
        job.setReducerClass(FileCompactionReducer.class);
        job.setNumReduceTasks(1);

        /* Setup the output types */
        job.setOutputKeyClass(AvroKey.class);
        job.setOutputValueClass(NullWritable.class);

        /* Setup the input and output format classes */
        /* These handle reading and writing Avro container files */
        job.setInputFormatClass(AvroKeyInputFormat.class);
        job.setOutputFormatClass(AvroKeyOutputFormat.class);

        /* Specify input and output directories */
        FileInputFormat.setInputPaths(job, new Path(arg[0]));
        FileOutputFormat.setOutputPath(job, new Path(arg[1]));

        return (job.waitForCompletion(true) ? 0 : 1);
    }
}

1 Answer:

Answer 0 (score: 0)

I think you need to specify job.setMapOutputKeyClass and job.setMapOutputValueClass to get rid of the ClassCastException.
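A minimal sketch of the suggested change, added inside the driver's run() method before the job is submitted. Note that with Avro-typed keys the schema also has to be attached to the job; the AvroJob helper from org.apache.avro.mapreduce does both the class and schema wiring. Here recordSchema is a placeholder for the Avro schema of your records, which the question does not show:

```java
/* Declare the intermediate (map output) types explicitly, so the
 * framework does not fall back on the final output types and
 * mis-cast the keys/values emitted by the mapper. */
job.setMapOutputKeyClass(AvroKey.class);
job.setMapOutputValueClass(NullWritable.class);

/* With Avro keys, the schema must be registered on the job as well.
 * 'recordSchema' is a placeholder for your record schema. */
AvroJob.setMapOutputKeySchema(job, recordSchema);
AvroJob.setOutputKeySchema(job, recordSchema);
```

AvroJob.setOutputKeySchema also sets the output key class for you, so the explicit job.setOutputKeyClass(AvroKey.class) call in the driver becomes redundant once the schema is registered this way.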