CSV class-not-found exception

Date: 2017-02-25 20:04:00

Tags: java hadoop mapreduce hbase opencsv

I have a CSV file uploaded to HDFS, and I am using the opencsv parser to read the data. The jar is on the Hadoop classpath, and I have also uploaded it to HDFS at /jars/opencsv-3.9.jar. I still get the error shown below.

Here is my code snippet:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

import com.opencsv.CSVParser;

public class TermLabelledPapers {

   public static class InputMapper extends Mapper<LongWritable, Text, Text, Text> {

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {

        CSVParser parser = new CSVParser();
        String[] fields = parser.parseLine(value.toString());
        //readEntry.readHeaders();
        String doi = fields[0];
        String keyphrases = fields[3];

        // Note: this Get is constructed but never used in the mapper.
        Get g = new Get(Bytes.toBytes(doi));
        context.write(new Text(doi), new Text(keyphrases));

    }
}

public static class PaperEntryReducer extends TableReducer<Text, Text, ImmutableBytesWritable> {

    @Override
    protected void reduce(Text doi, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {

    }
}


public static void main(String[] args) throws Exception {

    Configuration conf = HBaseConfiguration.create();
    conf.set("hbase.zookeeper.quorum", "172.17.25.18");
    conf.set("hbase.zookeeper.property.clientPort", "2183");
    //add the external jar to hadoop distributed cache 
    //addJarToDistributedCache(CsvReader.class, conf);

    Job job = new Job(conf, "TermLabelledPapers");
    job.setJarByClass(TermLabelledPapers.class);
    job.setMapperClass(InputMapper.class);
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(Text.class);
    job.addFileToClassPath(new Path("/jars/opencsv-3.9.jar"));
    FileInputFormat.setInputPaths(job, new Path(args[0]));  // "metadata.csv"

    TableMapReduceUtil.initTableReducerJob("PaperBagofWords", PaperEntryReducer.class, job);
    job.setReducerClass(PaperEntryReducer.class);
    job.waitForCompletion(true);
 }

}
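
A quick way to see what the task JVM can actually load is to probe the classpath with `Class.forName`. The sketch below is a minimal, self-contained check (the class names are assumptions based on the question: `com.opencsv.CSVParser` is the opencsv 3.x parser, while `com.csvreader.CsvReader` is the javacsv class the stack trace complains about):

```java
// Minimal classpath probe: reports whether a named class is loadable in this JVM.
public class ClasspathCheck {

    // Returns true if the class can be loaded by the current class loader.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // A JDK class is always present; the parser classes depend on the job classpath.
        System.out.println("java.util.ArrayList:      " + isOnClasspath("java.util.ArrayList"));
        System.out.println("com.opencsv.CSVParser:    " + isOnClasspath("com.opencsv.CSVParser"));
        System.out.println("com.csvreader.CsvReader:  " + isOnClasspath("com.csvreader.CsvReader"));
    }
}
```

Calling a check like this from the mapper's `setup()` and reading the task logs shows whether `addFileToClassPath` actually delivered the jar to the task JVM, as opposed to only being visible on the client.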

The error shown after running the job is:

Error: java.lang.ClassNotFoundException: com.csvreader.CsvReader
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at mcad.TermLabelledPapers$InputMapper.map(TermLabelledPapers.java:69)
at mcad.TermLabelledPapers$InputMapper.map(TermLabelledPapers.java:1)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

1 answer:

Answer 0 (score: 0)

Ideally, this error should not occur if the jar is on the Hadoop classpath. If your project uses Maven, you can try building a jar-with-dependencies, which bundles your own classes together with all dependent jars. This helps narrow down the problem.
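
For reference, the jar-with-dependencies build mentioned above is typically configured with the maven-assembly-plugin. A sketch of the pom fragment (the plugin version and main class are assumptions; adjust them to your project):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.6</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <!-- assumed driver class; use your own -->
        <mainClass>mcad.TermLabelledPapers</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

Running `mvn package` then produces a `*-jar-with-dependencies.jar` under `target/`, which can be submitted with `hadoop jar` so that opencsv travels with the job instead of relying on the cluster classpath.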
