Getting HBase exception: no regions passed

frebpwbc · posted 2021-05-30 in Hadoop

Hi, I'm new to HBase and I'm trying to learn how to bulk-load data into an HBase table with MapReduce, but I'm getting this exception:

    Exception in thread "main" java.lang.IllegalArgumentException: No regions passed
        at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.writePartitions(HFileOutputFormat2.java:307)
        at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configurePartitioner(HFileOutputFormat2.java:527)
        at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:391)
        at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:356)
        at JobDriver.run(JobDriver.java:108)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at JobDriver.main(JobDriver.java:34)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Here is my mapper code:

    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        System.out.println("Value in Mapper" + value.toString());
        String[] values = value.toString().split(",");
        byte[] row = Bytes.toBytes(values[0]);
        ImmutableBytesWritable k = new ImmutableBytesWritable(row);
        KeyValue kvProtocol = new KeyValue(row, "PROTOCOLID".getBytes(),
                "PROTOCOLID".getBytes(), values[1].getBytes());
        context.write(k, kvProtocol);
    }
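The mapper turns each CSV line into a row key (field 0) and one `PROTOCOLID` cell value (field 1). That parsing step can be checked in isolation with plain Java, without any Hadoop classes; the sample record below is made up for illustration:

```java
import java.nio.charset.StandardCharsets;

public class MapperParseSketch {
    // Mirrors the mapper's split: field 0 becomes the row key,
    // field 1 the PROTOCOLID cell value.
    static byte[][] parseLine(String line) {
        String[] values = line.split(",");
        byte[] row = values[0].getBytes(StandardCharsets.UTF_8);
        byte[] cell = values[1].getBytes(StandardCharsets.UTF_8);
        return new byte[][] { row, cell };
    }

    public static void main(String[] args) {
        // Hypothetical sample record
        byte[][] parsed = parseLine("row1,HTTP");
        System.out.println(new String(parsed[0], StandardCharsets.UTF_8)); // row1
        System.out.println(new String(parsed[1], StandardCharsets.UTF_8)); // HTTP
    }
}
```

Note that a line with fewer than two comma-separated fields would make this (and the mapper above) throw `ArrayIndexOutOfBoundsException`, so the input CSV needs to be well-formed.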

Here is my job configuration:

    public class JobDriver extends Configured implements Tool {

        public static void main(String[] args) throws Exception {
            ToolRunner.run(new JobDriver(), args);
            System.exit(0);
        }

        @Override
        public int run(String[] arg0) throws Exception {
            // HBase configuration
            System.out.println("**********Starting Hbase*************");
            Configuration conf = HBaseConfiguration.create();
            Job job = new Job(conf, "TestHFileToHBase");
            job.setJarByClass(JobDriver.class);
            job.setOutputKeyClass(ImmutableBytesWritable.class);
            job.setOutputValueClass(KeyValue.class);
            job.setMapperClass(LoadMapper.class);
            job.setOutputFormatClass(HFileOutputFormat2.class);
            HTable table = new HTable(conf, "kiran");
            FileInputFormat.addInputPath(job, new Path("hdfs://192.168.61.62:9001/sampledata.csv"));
            FileOutputFormat.setOutputPath(job, new Path("hdfs://192.168.61.62:9001/deletions_6.csv"));
            HFileOutputFormat2.configureIncrementalLoad(job, table);
            return job.waitForCompletion(true) ? 0 : 1;
        }
    }

Can someone help me resolve this exception?

xzlaal3s


You have to create the table first. You can do it with the following code:

    // Create the table and pre-split it
    HTableDescriptor descriptor = new HTableDescriptor(Bytes.toBytes(tableName));
    descriptor.addFamily(new HColumnDescriptor(Constants.COLUMN_FAMILY_NAME));
    HBaseAdmin admin = new HBaseAdmin(config);
    byte[] startKey = new byte[16];
    Arrays.fill(startKey, (byte) 0);
    byte[] endKey = new byte[16];
    Arrays.fill(endKey, (byte) 255);
    admin.createTable(descriptor, startKey, endKey, REGIONS_COUNT);
    admin.close();
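Here `createTable(descriptor, startKey, endKey, REGIONS_COUNT)` asks HBase to divide the key range into `REGIONS_COUNT` regions. As a rough illustration of what "pre-splitting" means (this is not HBase's actual `Bytes.split` implementation, just an evenly spaced approximation in plain Java):

```java
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SplitSketch {
    // Compute (numRegions - 1) evenly spaced split keys strictly between
    // start and end, treating fixed-width keys as unsigned big-endian ints.
    // Illustrative only; HBase's own split logic differs in details.
    static List<byte[]> splitKeys(byte[] start, byte[] end, int numRegions) {
        BigInteger lo = new BigInteger(1, start);
        BigInteger hi = new BigInteger(1, end);
        BigInteger range = hi.subtract(lo);
        List<byte[]> splits = new ArrayList<>();
        for (int i = 1; i < numRegions; i++) {
            BigInteger point = lo.add(range.multiply(BigInteger.valueOf(i))
                                          .divide(BigInteger.valueOf(numRegions)));
            byte[] raw = point.toByteArray();
            // Left-pad to the key width (and drop a possible leading sign byte).
            byte[] key = new byte[start.length];
            int copy = Math.min(raw.length, key.length);
            System.arraycopy(raw, raw.length - copy, key, key.length - copy, copy);
            splits.add(key);
        }
        return splits;
    }

    public static void main(String[] args) {
        byte[] start = new byte[16];       // all 0x00, as in the answer's code
        byte[] end = new byte[16];
        Arrays.fill(end, (byte) 0xFF);     // all 0xFF
        for (byte[] k : splitKeys(start, end, 4)) {
            System.out.println(new BigInteger(1, k).toString(16));
        }
    }
}
```

With 4 regions over the full 16-byte range, the three split points land at roughly 1/4, 1/2, and 3/4 of the keyspace (first bytes 0x3F, 0x7F, 0xBF). Pre-splitting only balances load if your real row keys are spread over that range; keys that all share a prefix would still land in one region.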

Or do it directly from the hbase shell with:

    create 'kiran', 'colfam1'

The exception is thrown because the list of region start keys is empty: line 306.
More information can be found here.
Note that the table name must be the same as the one used in your code (kiran).
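One more thing to be aware of: `configureIncrementalLoad` only sets the job up to write HFiles to the output directory; after the job finishes, those files still have to be loaded into the table. A common way to do that is the bulk-load tool from the HBase MapReduce package, roughly like this (the output path is the one from the job driver above; adjust paths and run it on a node with HBase configured):

```shell
# Load the HFiles produced by the MapReduce job into the 'kiran' table.
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles \
    hdfs://192.168.61.62:9001/deletions_6.csv kiran
```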
