Getting HBase exception: No regions passed

Asked: 2014-12-01 10:33:41

Tags: hadoop mapreduce hbase

Hi, I'm new to HBase and I'm trying to learn how to bulk load data into an HBase table using MapReduce.

But I'm getting the exception below:

Exception in thread "main" java.lang.IllegalArgumentException: No regions passed
        at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.writePartitions(HFileOutputFormat2.java:307)
        at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configurePartitioner(HFileOutputFormat2.java:527)
        at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:391)
        at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:356)
        at JobDriver.run(JobDriver.java:108)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at JobDriver.main(JobDriver.java:34)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

Here is my Mapper code:

public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
    System.out.println("Value in Mapper" + value.toString());
    // Split the CSV line; the first field becomes the row key.
    String[] values = value.toString().split(",");
    byte[] row = Bytes.toBytes(values[0]);
    ImmutableBytesWritable k = new ImmutableBytesWritable(row);
    // Emit a KeyValue for the PROTOCOLID family/qualifier with the second field as the cell value.
    KeyValue kvProtocol = new KeyValue(row, "PROTOCOLID".getBytes(),
            "PROTOCOLID".getBytes(), values[1].getBytes());
    context.write(k, kvProtocol);
}

Here is my job configuration:

public class JobDriver extends Configured implements Tool {

    public static void main(String[] args) throws Exception {
        ToolRunner.run(new JobDriver(), args);
        System.exit(0);
    }

    @Override
    public int run(String[] arg0) throws Exception {
        // HBase configuration
        System.out.println("**********Starting Hbase*************");
        Configuration conf = HBaseConfiguration.create();
        Job job = new Job(conf, "TestHFileToHBase");
        job.setJarByClass(JobDriver.class);
        job.setOutputKeyClass(ImmutableBytesWritable.class);
        job.setOutputValueClass(KeyValue.class);
        job.setMapperClass(LoadMapper.class);
        job.setOutputFormatClass(HFileOutputFormat2.class);
        HTable table = new HTable(conf, "kiran");
        FileInputFormat.addInputPath(job, new Path("hdfs://192.168.61.62:9001/sampledata.csv"));
        FileOutputFormat.setOutputPath(job, new Path("hdfs://192.168.61.62:9001/deletions_6.csv"));
        HFileOutputFormat2.configureIncrementalLoad(job, table);
        return job.waitForCompletion(true) ? 0 : 1;
    }
}
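(One thing worth knowing about this setup: configureIncrementalLoad only prepares the job to write HFiles; after the job finishes, the generated files still have to be moved into the table. A minimal sketch of that step, my addition rather than part of the original question, reusing the conf, output path, and table handle from above:)

// Sketch (not part of the original question): after the job completes,
// LoadIncrementalHFiles moves the generated HFiles into the live table.
LoadIncrementalHFiles loader = new LoadIncrementalHFiles(conf);
loader.doBulkLoad(new Path("hdfs://192.168.61.62:9001/deletions_6.csv"), table);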

Can anyone help me resolve this exception?

1 Answer:

Answer 0 (score: 4)

You have to create the table first. You can do it with the following code:
// Create table and do pre-split
HTableDescriptor descriptor = new HTableDescriptor(Bytes.toBytes(tableName));
descriptor.addFamily(new HColumnDescriptor(Constants.COLUMN_FAMILY_NAME));

HBaseAdmin admin = new HBaseAdmin(config);

byte[] startKey = new byte[16];
Arrays.fill(startKey, (byte) 0);

byte[] endKey = new byte[16];
Arrays.fill(endKey, (byte) 255);

admin.createTable(descriptor, startKey, endKey, REGIONS_COUNT);
admin.close();

Or use the following command directly from the hbase shell:

create 'kiran', 'colfam1'
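(A single-region table is enough to get past this particular exception. If you also want the table pre-split from the shell, the create command accepts a SPLITS option as well, e.g. create 'kiran', 'colfam1', SPLITS => ['10', '20', '30'], where the split points here are just placeholders.)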

The exception is thrown because the startKeys list is empty: line 306
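In other words, configureIncrementalLoad asks the table for its region boundaries, and a table with no regions yields an empty list. A rough illustration of the shape of that check (my sketch, not the exact HBase source):

// The table's region start keys drive the partitioner, so an empty
// list means there is nothing to partition against.
byte[][] startKeys = table.getStartKeys();
if (startKeys.length == 0) {
    throw new IllegalArgumentException("No regions passed");
}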

More information can be found here.

Please note that the table name has to be the same as the one you use in your code (kiran).
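If you want that mismatch to fail fast with a clearer message, you could check for the table before configuring the job. A minimal sketch (my addition, not part of the original answer) using the same old-style HBaseAdmin API as above:

// Verify the target table exists before wiring up the incremental load,
// so a missing or misnamed table fails with an explicit message.
HBaseAdmin admin = new HBaseAdmin(conf);
if (!admin.tableExists("kiran")) {
    admin.close();
    throw new IllegalStateException("Table 'kiran' does not exist - create it first");
}
admin.close();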
