We are trying to connect to HDFS with Kerberos from an OSGi bundle running in a Karaf container. We have installed the Hadoop client in Karaf using the Apache ServiceMix bundle:
<groupId>org.apache.servicemix.bundles</groupId>
<artifactId>org.apache.servicemix.bundles.hadoop-client</artifactId>
<version>2.4.1_1</version>
The pom file is attached below:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.felix</groupId>
      <artifactId>maven-bundle-plugin</artifactId>
      <version>2.3.7</version>
      <extensions>true</extensions>
      <configuration>
        <instructions>
          <Bundle-Activator>com.bdbizviz.hadoop.activator.PaHdfsActivator</Bundle-Activator>
          <Bundle-SymbolicName>${project.artifactId}</Bundle-SymbolicName>
          <Bundle-Version>${project.version}</Bundle-Version>
          <Export-Package>
            <!-- com.google.*, !org.apache.camel.model.dataformat, !org.apache.poi.ddf,
              !org.apache.xmlbeans, org.apache.commons.collections.*, org.apache.commons.configuration.*,
              org.apache.hadoop.hdfs*, org.apache.hadoop.hdfs.client*, org.apache.hadoop.hdfs.net*,
              org.apache.hadoop.hdfs.protocol.datatransfer*, org.apache.hadoop.hdfs.protocol.proto*,
              org.apache.hadoop.hdfs.protocolPB*, org.apache.hadoop.conf.*, org.apache.hadoop.io.*,
              org.apache.hadoop.fs.*, org.apache.hadoop.security.*, org.apache.hadoop.metrics2.*,
              org.apache.hadoop.util.*, org.apache.hadoop*; -->
            <!-- org.apache.*; -->
          </Export-Package>
          <Import-Package>
            org.apache.hadoop*,org.osgi.framework,*;resolution:=optional
          </Import-Package>
          <Include-Resource>
            {maven-resources},
            @org.apache.servicemix.bundles.hadoop-client-2.4.1_1.jar!/core-default.xml,
            @org.apache.servicemix.bundles.hadoop-client-2.4.1_1.jar!/hdfs-default.xml,
            @org.apache.servicemix.bundles.hadoop-client-2.4.1_1.jar!/mapred-default.xml,
            @org.apache.servicemix.bundles.hadoop-client-2.4.1_1.jar!/hadoop-metrics.properties
          </Include-Resource>
          <DynamicImport-Package>*</DynamicImport-Package>
        </instructions>
      </configuration>
    </plugin>
  </plugins>
</build>

<dependencies>
  <dependency>
    <groupId>org.apache.servicemix.bundles</groupId>
    <artifactId>org.apache.servicemix.bundles.hadoop-client</artifactId>
    <version>2.4.1_1</version>
    <exclusions>
      <exclusion>
        <groupId>jdk.tools</groupId>
        <artifactId>jdk.tools</artifactId>
        <!-- <version>1.7</version> -->
      </exclusion>
    </exclusions>
  </dependency>
</dependencies>
Code snippet:
public class TestHdfs implements ITestHdfs {

    public void printName() throws IOException {
        /*
        Configuration config = new Configuration();
        config.set("fs.default.name", "hdfs://192.168.1.17:8020");
        config.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
        config.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
        try {
            fs = FileSystem.get(config);
            getHostnames(fs);
        } catch (IOException e) {
            e.printStackTrace();
        }
        */
        Thread.currentThread().setContextClassLoader(getClass().getClassLoader());

        final Configuration config = new Configuration();
        config.set("fs.default.name", "hdfs://192.168.1.124:8020");
        config.set("fs.file.impl", LocalFileSystem.class.getName());
        config.set("fs.hdfs.impl", DistributedFileSystem.class.getName());
        config.set("hadoop.security.authentication", "KERBEROS");
        config.set("dfs.namenode.kerberos.principal.pattern",
                "hdfs/*@********.COM");

        System.setProperty("HADOOP_JAAS_DEBUG", "true");
        System.setProperty("sun.security.krb5.debug", "true");
        System.setProperty("java.net.preferIPv4Stack", "true");

        System.out.println("--------------status---:"
                + UserGroupInformation.isSecurityEnabled());
        UserGroupInformation.setConfiguration(config);
        // UserGroupInformation.loginUserFromKeytab(
        //         "hdfs/hadoop1.********.com@********.COM",
        //         "file:/home/kaushal/hdfs-hadoop1.keytab");

        UserGroupInformation app_ugi = UserGroupInformation
                .loginUserFromKeytabAndReturnUGI("hdfs/hadoop1.********.com@********.COM",
                        "C:\\Users\\desanth.pv\\Desktop\\hdfs-hadoop1.keytab");
        UserGroupInformation proxy_ugi = UserGroupInformation.createProxyUser(
                "ssdfsdfsdfsdfag", app_ugi);
        System.out.println("--------------status---:"
                + UserGroupInformation.isSecurityEnabled());

        /*ClassLoader tccl = Thread.currentThread()
                .getContextClassLoader();*/
        try {
            /*Thread.currentThread().setContextClassLoader(
                    getClass().getClassLoader());*/
            proxy_ugi.doAs(new PrivilegedExceptionAction() {
                @Override
                public Object run() throws Exception {
                    /*ClassLoader tccl = Thread.currentThread()
                            .getContextClassLoader();*/
                    try {
                        /*Thread.currentThread().setContextClassLoader(
                                getClass().getClassLoader());*/
                        System.out.println("desanth");
                        FileSystem fs = FileSystem.get(config);
                        DistributedFileSystem hdfs = (DistributedFileSystem) fs;
                        DatanodeInfo[] dataNodeStats = hdfs.getDataNodeStats();
                        String[] names = new String[dataNodeStats.length];
                        for (int i = 0; i < dataNodeStats.length; i++) {
                            names[i] = dataNodeStats[i].getHostName();
                            System.out.println(dataNodeStats[i].getHostName());
                        }
                    } catch (IOException e) {
                        e.printStackTrace();
                    } finally {
                        //Thread.currentThread().setContextClassLoader(tccl);
                    }
                    return null;
                }
            });
        } catch (InterruptedException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } finally {
            /*Thread.currentThread().setContextClassLoader(tccl);*/
        }
    }

    public void getHostnames(FileSystem fs) throws IOException {
        DistributedFileSystem hdfs = (DistributedFileSystem) fs;
        DatanodeInfo[] dataNodeStats = hdfs.getDataNodeStats();
        String[] names = new String[dataNodeStats.length];
        for (int i = 0; i < dataNodeStats.length; i++) {
            names[i] = dataNodeStats[i].getHostName();
            System.out.println(dataNodeStats[i].getHostName());
        }
    }
}
Error:
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "jayendra-dynabook-T451-34EW/127.0.1.1"; destination host is: "hadoop2.********.com":8020;
2 Answers
k7fdbhmy1#
Following the Background section of Vladimir's answer below, I tried a lot of things, but the simplest one was adding ... before UserGroupInformation.loginUserFromKeytab, and that solved the issue for me.
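The exact line being referred to is not shown above. A minimal sketch of what such a call could look like, assuming (based on the diagnosis in the second answer) that the intent is to register Hadoop's default SecurityInfo provider explicitly before the keytab login; the helper name and keytab path are placeholders, the principal is the one from the question:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.AnnotatedSecurityInfo;
import org.apache.hadoop.security.SecurityUtil;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLogin {
    // Hypothetical helper: call this once before any HDFS access.
    public static void login(Configuration config) throws IOException {
        config.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(config);
        // Register the default SecurityInfo provider explicitly, in case the
        // META-INF/services entry is not visible to the OSGi classloader.
        SecurityUtil.setSecurityInfoProviders(new AnnotatedSecurityInfo());
        UserGroupInformation.loginUserFromKeytab(
                "hdfs/hadoop1.********.com@********.COM",   // principal from the question
                "/path/to/hdfs-hadoop1.keytab");            // placeholder keytab path
    }
}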
qmb5sa222#
I haven't tried to reproduce the problem in an OSGi environment, but I think you could be facing the same problem people hit in a kerberised environment when running with an uber-jar that bundles the hadoop/hdfs dependencies, namely the

org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]

error.

Background

After turning on DEBUG logging, there is an interesting line right after the SASL negotiation:

Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:null

Note the null - a successful run has a class reference there instead. Tracing this back, SaslRpcClient calls SecurityUtil.getTokenInfo, which starts a search over all org.apache.hadoop.security.SecurityInfo providers. org.apache.hadoop.security.SecurityUtil uses java.util.ServiceLoader to look up the SecurityInfo instances. ServiceLoader by default uses the current thread's ContextClassLoader to look for files in the META-INF/services/ directories on the classpath. The file name corresponds to the service name, so it is looking for META-INF/services/org.apache.hadoop.security.SecurityInfo.
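As a quick way to see this lookup in practice (not part of the original answer), the following sketch can be run from inside the bundle to check whether the SecurityInfo service entries are visible to the classloader that ServiceLoader will use:

import java.util.ServiceLoader;
import org.apache.hadoop.security.SecurityInfo;

public class SecurityInfoCheck {
    public static void main(String[] args) {
        // ServiceLoader consults the thread context classloader by default,
        // the same way Hadoop does when resolving its SecurityInfo providers.
        ServiceLoader<SecurityInfo> providers = ServiceLoader.load(SecurityInfo.class);
        int count = 0;
        for (SecurityInfo info : providers) {
            System.out.println("Found SecurityInfo provider: " + info.getClass().getName());
            count++;
        }
        // If nothing is printed, META-INF/services/org.apache.hadoop.security.SecurityInfo
        // is not reachable, and the Kerberos/token negotiation ends up with info:null.
        if (count == 0) {
            System.out.println("No SecurityInfo providers found on this classloader");
        }
    }
}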
When the jar is an uber-jar (or, I guess, when you load things inside an OSGi bundle) and there is only one such file left on the classpath, you have to make sure all the entries are appended. In Maven, for example, you can use the ServicesResourceTransformer of the shade plugin to append the entries; sbt-assembly has a similar merge option that is easier to configure.
Solution

As described in the Background, make sure the classloader that java.util.ServiceLoader is using can find the META-INF/services/org.apache.hadoop.security.SecurityInfo entries from all of the hadoop jars. In the OSGi case you still need to merge the entries somehow; try including them in the <Include-Resource> part of your bundle pom as well?

Log output

This is the output I get when it is not working: