Why am I getting this exception java.lang.NoClassDefFoundError?

9udxz4iz · posted 2021-06-02 · in Hadoop
Follow (0) | Answers (1) | Views (382)

I'm trying to use HBase and Hadoop together. When I run the jar file I get this error. Here is my source code:

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.mapreduce.Job;

public class TwitterTable {

    final static Charset ENCODING = StandardCharsets.UTF_8;
    final static String FILE_NAME = "/home/hduser/project04/sample.txt";

    public static class Mapper1 extends TableMapper<ImmutableBytesWritable, IntWritable> 
    {
        byte[] value;

        @Override
        public void map(ImmutableBytesWritable row, Result values, Context context) throws IOException 
        {
            value = values.getValue(Bytes.toBytes("text"), Bytes.toBytes(""));
            String valueStr = Bytes.toString(value);
            System.out.println("GET: " + valueStr);
        }
    }

    public static class Reducer1 extends TableReducer<ImmutableBytesWritable, IntWritable, ImmutableBytesWritable> {

        public void reduce(ImmutableBytesWritable key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {

        }
    }

    public static void main( String args[] ) throws IOException, ClassNotFoundException, InterruptedException 
    {
        Configuration conf = new Configuration();

        @SuppressWarnings("deprecation")
        Job job = new Job(conf, "TwitterTable");
        job.setJarByClass(TwitterTable.class);

        HTableDescriptor ht = new HTableDescriptor( "twitter" );
        ht.addFamily( new HColumnDescriptor("text"));
        HBaseAdmin hba = new HBaseAdmin( conf );

        if(!hba.tableExists("twitter"))
        {
            hba.createTable( ht );
            System.out.println( "Table Created!" );
        }

        //Read the file and add to the database
        TwitterTable getText = new TwitterTable();

        Scan scan = new Scan();
        String columns = "text"; 
        scan.addColumn(Bytes.toBytes(columns), Bytes.toBytes(""));

        TableMapReduceUtil.initTableMapperJob("twitter", scan, Mapper1.class, ImmutableBytesWritable.class,
                IntWritable.class, job);

        job.waitForCompletion(true);

        //getText.readTextFile(FILE_NAME);
    }

    void readTextFile(String aFileName) throws IOException 
     {
            Path path = Paths.get(aFileName);
            try (BufferedReader reader = Files.newBufferedReader(path, ENCODING)){
              String line = null;
              while ((line = reader.readLine()) != null) {
                //process each line in some way
                  addToTable(line);
              }      
            }
          System.out.println("all done!");

     }

    void addToTable(String line) throws IOException
    {
        Configuration conf = new Configuration();
        HTable table = new HTable(conf, "twitter");

        String LineText[] = line.split(","); 

        String row = "";
        String text = "";

        row = LineText[0].toString();
        row = row.replace("\"", "");
        text = LineText[1].toString();
        text = text.replace("\"", "");

        Put put = new Put(Bytes.toBytes(row));
        put.addColumn(Bytes.toBytes("text"), Bytes.toBytes(""), Bytes.toBytes(text));
        table.put(put);
        table.flushCommits();
        table.close();   
    }   
}

I added the classpath to hadoop-env.sh, but still no luck. I can't figure out what's wrong. Here is the classpath in my hadoop-env.sh:

export HADOOP_CLASSPATH=\
/usr/lib/hbase/hbase-1.0.0/lib/hbase-common-1.0.0.jar:\
/usr/lib/hbase/hbase-1.0.0/lib/hbase-client.jar:\
/usr/lib/hbase/hbase-1.0.0/lib/log4j-1.2.17.jar:\
/usr/lib/hbase/hbase-1.0.0/lib/hbase-it-1.0.0.jar:\
/usr/lib/hbase/hbase-1.0.0/lib/hbase-common-1.0.0-tests.jar:\
/usr/lib/hbase/hbase-1.0.0/conf:\
/usr/lib/hbase/hbase-1.0.0/lib/zookeeper-3.4.6.jar:\
/usr/lib/hbase/hbase-1.0.0/lib/protobuf-java-2.5.0.jar:\
/usr/lib/hbase/hbase-1.0.0/lib/guava-12.0.1.jar
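Note that shell ends an `export` at the first unescaped newline, so a list like the one above has to form one logical line. A sketch of an equivalent assignment that globs every jar under the HBase lib directory instead of naming each one (paths assumed from the question):

```shell
# Join every jar under the HBase lib directory into one colon-separated
# classpath string, then append the conf directory.
HBASE_LIB=/usr/lib/hbase/hbase-1.0.0/lib
CP=$(printf '%s:' "$HBASE_LIB"/*.jar)   # yields "a.jar:b.jar:...:" (trailing colon)
export HADOOP_CLASSPATH="${CP%:}:/usr/lib/hbase/hbase-1.0.0/conf"
```

This also picks up transitive dependencies you may have missed in a hand-written list, which is a common cause of NoClassDefFoundError.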

lkaoscv7 · Answer 1

OK, I found it. You may not be able to add everything to the classpath this way. In that case, copy all the libraries from HBase into Hadoop (see hadoop-env.sh):
hadoop directory/contrib/capacity-scheduler
It worked for me.
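A sketch of that workaround. The stock hadoop-env.sh template appends every jar under $HADOOP_HOME/contrib/capacity-scheduler to HADOOP_CLASSPATH, which is why that directory works as a drop point; both paths below are assumptions to adapt to your install:

```shell
# Copy the HBase jars into a directory Hadoop already scans at startup.
HBASE_HOME=/usr/lib/hbase/hbase-1.0.0    # adjust to your HBase install
HADOOP_HOME=/usr/local/hadoop            # adjust to your Hadoop install
mkdir -p "$HADOOP_HOME/contrib/capacity-scheduler"
cp "$HBASE_HOME"/lib/*.jar "$HADOOP_HOME/contrib/capacity-scheduler/"
```

Copying the whole lib directory avoids missing a transitive dependency, at the cost of possible version clashes with jars Hadoop already ships.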
