Exception when accessing the HDFS file system with Java

Date: 2016-06-06 03:28:10

Tags: java hadoop hdfs

I am trying to access a file on HDFS using the Java API, with the following code:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsAccess {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

        try {
            Path path = new Path("hdfs://mycluster/user/mock/test.txt");
            FileSystem fs = FileSystem.get(path.toUri(), conf);
            if (fs.exists(path)) {
                FSDataInputStream inputStream = fs.open(path);
                // Process input stream ...
            } else {
                System.out.println("File does not exist");
            }
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}

An exception is thrown at FileSystem.get(path.toUri(), conf), saying Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider, caused by java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.Credentials

I could not find any more information about this error. Could the problem be that I am using the wrong API (org.apache.hadoop.hdfs instead of org.apache.hadoop.fs)?
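Since the NoClassDefFoundError points at org.apache.hadoop.security.Credentials (which lives in hadoop-common, not hadoop-hdfs), one way to narrow the problem down is to check whether that class can be loaded at all before touching the HDFS API. Below is a small diagnostic sketch (the class name ClasspathCheck is hypothetical, not from the original post):

```java
// Diagnostic sketch: print the runtime classpath and try to load the class
// named in the NoClassDefFoundError, to see whether the jar is missing or
// whether the class is present but fails during static initialization.
public class ClasspathCheck {
    public static void main(String[] args) {
        // The classpath the JVM actually sees at runtime
        System.out.println("classpath: " + System.getProperty("java.class.path"));

        String cls = "org.apache.hadoop.security.Credentials";
        try {
            Class.forName(cls);
            System.out.println(cls + ": loadable");
        } catch (Throwable t) {
            // ClassNotFoundException -> the jar is not on the classpath;
            // ExceptionInInitializerError / NoClassDefFoundError -> the jar is
            // present but static initialization failed (often a version mismatch)
            System.out.println(cls + ": " + t.getClass().getSimpleName());
        }
    }
}
```

A ClassNotFoundException here suggests a missing hadoop-common jar, while an initialization failure suggests mismatched Hadoop jar versions on the classpath.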

1 Answer:

Answer 0 (score: 0)

1) Do you have hadoop-hdfs-.jar on your classpath?

2) How did you download the dependencies? Maven / manually / something else?

3) Can you provide the full stack trace?
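On the dependency question: if the jars were collected by hand, it is easy to end up with hadoop-hdfs present but hadoop-common (which contains org.apache.hadoop.security.Credentials) missing or at a different version. With Maven, a sketch of the dependency that pulls in a consistent set (the version shown is only an example and should match the cluster):

```xml
<!-- hadoop-client brings in hadoop-common and hadoop-hdfs transitively,
     keeping their versions consistent -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <!-- example version; use the version your cluster actually runs -->
    <version>2.7.2</version>
</dependency>
```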