Compiling a Hadoop Java program with additional dependencies

iugsix8n  posted on 2021-06-02 in Hadoop

I'm trying to build a Hadoop program whose purpose is to cat a file I previously uploaded to HDFS. It is based largely on this tutorial, and the program looks like this:

import java.io.*;
import java.net.URI;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;

public class ReadHDFS {
    public static void main(String[] args) throws IOException {

        String uri = args[0];

        // Look up the FileSystem backing the given URI (HDFS here).
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        FSDataInputStream in = null;

        try
        {
            // Stream the file to stdout in 4 KB chunks; 'false' keeps stdout open.
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
        }
        finally
        {
            IOUtils.closeStream(in);
        }
    }
}

To my mind the tutorial is flawed, because as I understand it IOUtils belongs to the Apache Commons library. However, even though I added the following line to the program I've been trying to deploy:

import org.apache.commons.compress.utils.IOUtils;

I still get the following errors:

FileSystemCat.java:37: error: cannot find symbol
        IOUtils.copyBytes(in, System.out, 4096, false);
               ^
  symbol:   method copyBytes(InputStream,PrintStream,int,boolean)
  location: class IOUtils
FileSystemCat.java:40: error: cannot find symbol
        IOUtils.closeStream(in);
                            ^
  symbol:   variable in
  location: class FileSystemCat
2 errors

I'm executing this command on the namenode:

javac -cp /usr/local/hadoop/share/hadoop/common/hadoop-common-2.8.1.jar:/home/ubuntu/job_program/commons-io-2.5/commons-io-2.5.jar FileSystemCat.java

tzdcorbm #1:
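The compile errors in the question come from the import: copyBytes(InputStream, OutputStream, int, boolean) and closeStream(...) are declared on Hadoop's own org.apache.hadoop.io.IOUtils (shipped in hadoop-common), not on the commons-compress class, which is why the compiler cannot find those symbols. A minimal standalone check, assuming only that hadoop-common is on the classpath (the class name IOUtilsCheck is made up for illustration):

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import org.apache.hadoop.io.IOUtils;   // this class declares both methods

public class IOUtilsCheck {
    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("hello\n".getBytes());
        IOUtils.copyBytes(in, System.out, 4096, false);  // resolves against hadoop-common
        IOUtils.closeStream(in);
    }
}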

The necessary addition to ~/.bashrc:


# Classpath for Java
export HADOOP_CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath)
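After adding this line, reload the configuration (source ~/.bashrc, or open a fresh shell) so that HADOOP_CLASSPATH is actually set in the session where javac runs.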

How to compile the program at the bottom:

javac -cp ${HADOOP_CLASSPATH}:commons-io-2.5.jar ReaderHDFS.java

How to generate the jar file for that program:

jar cf rhdfs.jar ReaderHDFS*.class

The command to run it:

$HADOOP_HOME/bin/hadoop jar rhdfs.jar ReaderHDFS hdfs://master:9000/input_1/codes.txt
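If everything is wired up correctly, that command should print the contents of codes.txt from HDFS straight to stdout.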

And the program itself:

import org.apache.hadoop.io.IOUtils;   // Hadoop's IOUtils, not the Apache Commons one
//import org.apache.commons.io.IOUtils;
import java.io.*;
import java.net.URI;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;

public class ReaderHDFS {
    public static void main(String[] args) throws IOException {

        String uri = args[0];

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        FSDataInputStream in = null;

        try
        {
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
        }
        finally
        {
            IOUtils.closeStream(in);
        }
    }
}
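
As a side note, the same logic reads a little more cleanly with try-with-resources, which closes the stream automatically even if copyBytes throws. A minimal sketch under the same assumptions (hadoop-common on the classpath, HDFS URI passed as the first argument; the class name CatHDFS is made up for illustration):

import java.io.IOException;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class CatHDFS {
    public static void main(String[] args) throws IOException {
        String uri = args[0];
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);

        // FSDataInputStream is Closeable, so the stream is closed
        // automatically, even if copyBytes throws.
        try (FSDataInputStream in = fs.open(new Path(uri))) {
            IOUtils.copyBytes(in, System.out, 4096, false);
        }
    }
}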
