I am trying to access an HDFS file using the Java API; the code is as follows:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsReader {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        try {
            Path path = new Path("hdfs://mycluster/user/mock/test.txt");
            FileSystem fs = FileSystem.get(path.toUri(), conf);
            if (fs.exists(path)) {
                FSDataInputStream inputStream = fs.open(path);
                // Process input stream ...
            } else {
                System.out.println("File does not exist");
            }
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
An exception is thrown at FileSystem.get(path.toUri(), conf), which says:

Couldn't create proxy provider class org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider

caused by:

java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.Credentials
I haven't found much information about this error. Could the problem be caused by using the wrong API (org.apache.hadoop.hdfs instead of org.apache.hadoop.fs)?
1 Answer
1) Do you have the hadoop-hdfs jar on your classpath?
2) How do you manage your dependencies: Maven, manually, or something else?
3) Can you provide the full stack trace?
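A NoClassDefFoundError for org.apache.hadoop.security.Credentials typically means hadoop-common is missing from the classpath or its version does not match the other Hadoop jars. If you use Maven, one sketch of a fix is to depend on the hadoop-client aggregator artifact, which pulls in matching versions of hadoop-common and hadoop-hdfs (the version number below is an assumption; substitute the version your cluster runs):

```
<!-- Hypothetical Maven dependency sketch: hadoop-client bundles hadoop-common
     and hadoop-hdfs at consistent versions. The version shown is an assumption;
     match it to your cluster's Hadoop release. -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.3</version>
</dependency>
```

Mixing jars from different Hadoop releases on the same classpath is a common cause of this class of initialization error, which is why question 2 above matters.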