When I try to list the files in an HDFS directory, I receive the following error:
[main] ERROR com.pr.hdfs.common.hadooputils.HdfsUtil - Failed to connect to hdfs directory /HDFS/Datastore/DB
org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
The code I use (although it is called from Scala) is the following Java:
public static List<Path> listHdfsFilePaths(final FileSystem hdfs, final String directory) throws IOException
{
    final List<Path> result = new ArrayList<>();
    final FileStatus[] fileStatuses;
    try
    {
        fileStatuses = hdfs.listStatus(new Path(directory));
        for (final FileStatus fileStatus : fileStatuses)
        {
            result.add(fileStatus.getPath());
        }
    }
    catch (final IOException e)
    {
        LOGGER.error("Failed to connect to hdfs directory " + directory, e);
        throw e;
    }
    return result;
}
The Scala code wraps the UserGroupInformation in order to access the folder:
val auctions = ugiDoAs(ugi) {
  val hdfs = FileSystem.get(hadoopConf)
  HdfsUtils.listHdfsFilePaths(hdfs, "/HDFS/Datastore/DB/")
}
The ugiDoAs method is defined as:
def ugiDoAs[T](ugi: Option[UserGroupInformation])(code: => T) = ugi match {
  case None => code
  case Some(u) => u.doAs(new PrivilegedExceptionAction[T] {
    override def run(): T = code
  })
}
The ugiDoAs wrapper works for reading from ZooKeeper and from HBase, but in this case it fails to get access to HDFS.
Additionally, my ugi variable contains the keytab and the required principal.
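For reference, my understanding is that this exception appears when the client falls back to SIMPLE authentication against a kerberized cluster, i.e. the Configuration handed to FileSystem.get does not carry hadoop.security.authentication=kerberos (for instance because core-site.xml is not on the classpath). A core-site.xml fragment that would enable Kerberos on the client side, as a guess at the likely cause:

```xml
<!-- core-site.xml: tell the client the cluster uses Kerberos, not SIMPLE auth -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
```

Equivalently, one can set this in code with `hadoopConf.set("hadoop.security.authentication", "kerberos")` and then call `UserGroupInformation.setConfiguration(hadoopConf)` before constructing the UGI, so the static security setup sees the kerberized configuration.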