Cannot compile HBase MapReduce code to store output into HBase

Asked by xdnvmnnf on 2021-05-27, in Hadoop

My code reads suits and ranks from an input file and stores the cards missing from the deck into HBase. The code works fine when both input and output are text files, and it correctly stores the results in HDFS.
I followed an HBase tutorial and changed the code so that the output goes to HBase instead of an output file. As soon as I add the imports for TableMapper, TableReducer, and TableMapReduceUtil, I get the following errors:

ubuntu@ip-172-31-42-214:~/server/hadoop-2.10.0/repo/hw3$ javac cards.java -cp $(hbase classpath):$(hadoop classpath)
cards.java:21: error: cannot find symbol
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
                                        ^
  symbol:   class TableMapReduceUtil
  location: package org.apache.hadoop.hbase.mapreduce
cards.java:22: error: cannot find symbol
import org.apache.hadoop.hbase.mapreduce.TableMapper;
                                        ^
  symbol:   class TableMapper
  location: package org.apache.hadoop.hbase.mapreduce
cards.java:23: error: cannot find symbol
import org.apache.hadoop.hbase.mapreduce.TableReducer;
                                        ^
  symbol:   class TableReducer
  location: package org.apache.hadoop.hbase.mapreduce
cards.java:82: error: cannot find symbol
        public static class reduce extends TableReducer<Text, IntWritable, Text, IntWritable> {
                                           ^
  symbol:   class TableReducer
  location: class cards
cards.java:89: error: cannot find symbol
                public void reduce(Text key, Iterable<IntWritable> value, Context context)
                                                                          ^
  symbol:   class Context
  location: class reduce
cards.java:39: error: cannot find symbol
                Configuration conf = HbaseConfiguration.create();
                                     ^
  symbol:   variable HbaseConfiguration
  location: class cards
cards.java:41: error: cannot find symbol
                Admin admin = conn.getAdmin();
                ^
  symbol:   class Admin
  location: class cards
cards.java:42: error: cannot find symbol
                HTableDescriptor htdesc = new HTableDescriptor(TableName.valueOf("MissingCards"));
                ^
  symbol:   class HTableDescriptor
  location: class cards
cards.java:42: error: cannot find symbol
                HTableDescriptor htdesc = new HTableDescriptor(TableName.valueOf("MissingCards"));
                                              ^
  symbol:   class HTableDescriptor
  location: class cards
cards.java:42: error: cannot find symbol
                HTableDescriptor htdesc = new HTableDescriptor(TableName.valueOf("MissingCards"));
                                                               ^
  symbol:   variable TableName
  location: class cards
cards.java:43: error: cannot find symbol
                HColumnDescriptor hcdesc = new HColumnDescriptor(Bytes.toBytes("cf"));
                ^
  symbol:   class HColumnDescriptor
  location: class cards
cards.java:43: error: cannot find symbol
                HColumnDescriptor hcdesc = new HColumnDescriptor(Bytes.toBytes("cf"));
                                               ^
  symbol:   class HColumnDescriptor
  location: class cards
cards.java:54: error: cannot find symbol
                TableMapReduceUtil.initTableReducerJob("cards", reduce.class, job);
                ^
  symbol:   variable TableMapReduceUtil
  location: class cards
cards.java:121: error: cannot find symbol
                        put.addColumn(CF, SUIT, Bytes.toBytes(key.toString()));
                                          ^
  symbol:   variable SUIT
  location: class reduce
14 errors
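For context, the reducer pattern I'm trying to follow looks roughly like this (a simplified sketch, not my exact code; the class name and column qualifier are placeholders). One thing I noticed while writing it up: the TableReducer javadoc says it takes only three type parameters, since the output value type is fixed to Mutation, whereas my code currently declares four.

```java
import java.io.IOException;

import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;

// Sketch of a reducer that writes Puts into an HBase table.
// Note: TableReducer<KEYIN, VALUEIN, KEYOUT> takes THREE type parameters
// (the output value is always a Mutation), unlike a plain Reducer.
public class MissingCardsReducer
        extends TableReducer<Text, IntWritable, Text> {

    private static final byte[] CF = Bytes.toBytes("cf");
    private static final byte[] COL = Bytes.toBytes("card"); // placeholder qualifier

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // Row key and cell value are both derived from the reduce key here,
        // just to illustrate the Put API.
        Put put = new Put(Bytes.toBytes(key.toString()));
        put.addColumn(CF, COL, Bytes.toBytes(key.toString()));
        context.write(key, put);
    }
}
```

This sketch only compiles against the hbase-mapreduce and hbase-client jars, which is exactly what the errors above suggest are not on my classpath.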

I'm not sure whether my Hadoop and HBase classpaths are missing anything, but as far as I can tell they contain all the required jars:

ubuntu@ip-172-31-42-214:~/server/hadoop-2.10.0/repo/hw3$ hbase classpath
/usr/local/Hbase/conf:/usr/lib/jvm/java-1.8.0-openjdk-amd64/jre/lib/tools.jar:/usr/local/Hbase:/usr/local/Hbase/lib/shaded-clients/hbase-shaded-client-byo-hadoop-2.2.3.jar:/usr/local/Hbase/lib/client-facing-thirdparty/audience-annotations-0.5.0.jar:/usr/local/Hbase/lib/client-facing-thirdparty/commons-logging-1.2.jar:/usr/local/Hbase/lib/client-facing-thirdparty/findbugs-annotations-1.3.9-1.jar:/usr/local/Hbase/lib/client-facing-thirdparty/htrace-core4-4.2.0-incubating.jar:/usr/local/Hbase/lib/client-facing-thirdparty/log4j-1.2.17.jar:/usr/local/Hbase/lib/client-facing-thirdparty/slf4j-api-1.7.25.jar:/home/ubuntu/server/hadoop-2.10.0/etc/hadoop:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/common/lib/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/common/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/hdfs:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/hdfs/lib/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/hdfs/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/yarn:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/yarn/lib/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/yarn/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/mapreduce/lib/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/mapreduce/*
ubuntu@ip-172-31-42-214:~/server/hadoop-2.10.0/repo/hw3$ hadoop classpath
/home/ubuntu/server/hadoop-2.10.0/etc/hadoop:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/common/lib/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/common/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/hdfs:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/hdfs/lib/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/hdfs/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/yarn:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/yarn/lib/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/yarn/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/mapreduce/lib/*:/home/ubuntu/server/hadoop-2.10.0/share/hadoop/mapreduce/*
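One thing I plan to try, based on my reading of the HBase reference guide (so this is untested on my setup): `hbase mapredcp` is supposed to print the minimal classpath that MapReduce jobs need, including the hbase-mapreduce jar, which the `hbase classpath` output above does not appear to include.

```shell
# Compile against the MapReduce-specific HBase classpath instead of the
# general client classpath (assumes this HBase version supports `hbase mapredcp`).
javac -cp "$(hbase mapredcp):$(hadoop classpath)" cards.java
```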

Is there another jar missing from my classpath, or did the import paths change in a newer version of HBase in a way the tutorial I followed doesn't mention? I haven't been able to figure it out.
I can provide any additional information as needed. Thanks in advance.
