How to programmatically access a Kerberos-enabled Hadoop cluster?

Asked: 2017-04-09 13:59:46

Tags: java hadoop kerberos

I have this code that fetches a file from a Hadoop file system. I set up Hadoop on a single node and ran this code from my local machine to see whether it could fetch a file from the HDFS instance on that node. It worked.

    package com.hdfs.test.hdfs_util;

    /* Copy a file from HDFS to the local disk without a Hadoop installation.
     *
     * Params are something like:
     * hdfs://node01.sindice.net:8020 /user/bob/file.zip file.zip
     */

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HDFSdownloader {

        public static void main(String[] args) throws Exception {
            if (args.length != 3) {
                System.out.println("use: HDFSdownloader hdfs src dst");
                System.exit(1);
            }
            System.out.println(HDFSdownloader.class.getName());
            HDFSdownloader dw = new HDFSdownloader();
            dw.copy2local(args[0], args[1], args[2]);
        }

        private void copy2local(String hdfs, String src, String dst) throws IOException {
            System.out.println("!! Entering function !!");
            Configuration conf = new Configuration();
            conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
            conf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
            conf.set("fs.default.name", hdfs); // deprecated alias of fs.defaultFS
            FileSystem.get(conf).copyToLocalFile(new Path(src), new Path(dst));
            System.out.println("!! copyToLocalFile reached !!");
        }
    }

Now I took the same code, bundled it into a jar, and tried to run it on another node (say B). This time the code had to fetch a file from a proper distributed Hadoop cluster. Kerberos is enabled on that cluster.

The code ran, but threw an exception:

Exception in thread "main" org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2115)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2030)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1999)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1975)
at com.hdfs.test.hdfs_util.HDFSdownloader.copy2local(HDFSdownloader.java:49)
at com.hdfs.test.hdfs_util.HDFSdownloader.main(HDFSdownloader.java:35)

Is there a way to authenticate programmatically from this code? For some reason, I cannot use kinit on the source node.

1 answer:

Answer 0 (score: 3)

Here is a code snippet that works in the scenario described above, i.e. programmatically accessing a Kerberos-enabled cluster. The key points to note are:

  • Provide the keytab file location through UserGroupInformation
  • Provide the Kerberos realm details (the krb5.conf file) in the JVM arguments
  • Set the Hadoop security authentication mode to kerberos

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KerberosHDFSIO {

        public static void main(String[] args) throws IOException {

            Configuration conf = new Configuration();
            // The following property is enough for a non-kerberized setup
            // conf.set("fs.defaultFS", "localhost:9000");

            // The following set of properties is needed to access a kerberized cluster
            conf.set("fs.defaultFS", "hdfs://devha:8020");
            conf.set("hadoop.security.authentication", "kerberos");

            // The location of the krb5.conf file needs to be provided in the VM arguments of the JVM:
            // -Djava.security.krb5.conf=/Users/user/Desktop/utils/cluster/dev/krb5.conf

            UserGroupInformation.setConfiguration(conf);
            UserGroupInformation.loginUserFromKeytab("user@HADOOP_DEV.ABC.COM",
                    "/Users/user/Desktop/utils/cluster/dev/.user.keytab");

            try (FileSystem fs = FileSystem.get(conf)) {
                FileStatus[] fileStatuses = fs.listStatus(new Path("/user/username/dropoff"));
                for (FileStatus fileStatus : fileStatuses) {
                    System.out.println(fileStatus.getPath().getName());
                }
            }
        }
    }
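Since the question asks for a fully programmatic setup, it is worth noting that the krb5.conf location does not have to come from VM arguments: setting the standard `java.security.krb5.conf` system property before any Kerberos class is first loaded has the same effect. A minimal sketch, with a placeholder path as an assumption:

```java
public class Krb5ConfSetup {
    public static void main(String[] args) {
        // Equivalent to passing -Djava.security.krb5.conf=... on the command line,
        // provided this runs before the JVM's Kerberos machinery is initialized
        // (i.e. before UserGroupInformation.loginUserFromKeytab is called).
        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
        System.out.println(System.getProperty("java.security.krb5.conf"));
    }
}
```

With this in place, no per-invocation JVM flags are needed, which fits the constraint of not being able to run kinit on the source node.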