Connecting to Kerberized HDFS, java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name

k75qkfdt · asked 2021-05-29 · Hadoop

I am trying to connect to a Kerberized HDFS cluster using the code below; with the same code I can, of course, access HBase via HBaseConfiguration.

Configuration config = new Configuration();
config.set("hadoop.security.authentication", "Kerberos");

UserGroupInformation.setConfiguration(config);
UserGroupInformation ugi =
        UserGroupInformation.loginUserFromKeytabAndReturnUGI("me@EXAMPLE.COM", "me.keytab");
model = ugi.doAs((PrivilegedExceptionAction<Map<String, Object>>) () -> {
    testHadoop(hcb.gethDFSConfigBean());
    return null;
});

I have been able to access Solr and Impala successfully with the same keytab and principal, yet for HDFS I get this strange message that the service name could not be found.
Please see the stack trace below:

java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name; Host Details : local host is: "Securonix-int3.local/10.0.4.36"; destination host is: "sobd189.securonix.com":8020; 
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772)
    at org.apache.hadoop.ipc.Client.call(Client.java:1472)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1988)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1118)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1400)
    at com.securonix.application.ui.uiUtil.SnyperUIUtil.lambda$main$4(SnyperUIUtil.java:1226)
    at com.securonix.application.ui.uiUtil.SnyperUIUtil$$Lambda$6/1620890840.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at com.securonix.application.ui.uiUtil.SnyperUIUtil.main(SnyperUIUtil.java:1216)
Caused by: java.io.IOException: java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name
    at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:680)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:643)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:730)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:368)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1521)
    at org.apache.hadoop.ipc.Client.call(Client.java:1438)
    ... 23 more
Caused by: java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name
    at org.apache.hadoop.security.SaslRpcClient.getServerPrincipal(SaslRpcClient.java:322)
    at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:231)
    at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:159)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:396)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:553)
    at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:368)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:722)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:718)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:717)

After I enabled Kerberos debug output, I got the debug log below when calling FileSystem.get(). Kerberos debug log:
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
16/02/22 15:53:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
>>> KdcAccessibility: reset
>>> KdcAccessibility: reset
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): securonix
>>> KeyTab: load() entry length: 55; type: 23
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): securonix
>>> KeyTab: load() entry length: 71; type: 18
Looking for keys for: securonix@EXAMPLE.COM
Added key: 18 version: 1
Added key: 23 version: 1
Looking for keys for: securonix@EXAMPLE.COM
Added key: 18 version: 1
Added key: 23 version: 1
default etypes for default_tkt_enctypes: 18 16.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=sobd189.securonix.com TCP:88, timeout=30000, number of retries=3, #bytes=139
>>> KDCCommunication: kdc=sobd189.securonix.com TCP:88, timeout=30000, Attempt=1, #bytes=139
>>> DEBUG: TCPClient reading 639 bytes
>>> KrbKdcReq send: #bytes read=639
>>> KdcAccessibility: remove sobd189.securonix.com
Looking for keys for: securonix@EXAMPLE.COM
Added key: 18 version: 1
Added key: 23 version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply securonix
Interestingly, when I use a FileSystem API call such as hdfs.exists(), I get:

>>>KinitOptions cache name is /tmp/krb5cc_501
 >> Acquire default native Credentials
 default etypes for default_tkt_enctypes: 18 18 16.
 >>> Found no TGT's in LSA
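For reference, here is a minimal, self-contained sketch of the call pattern described above. The class name and the /tmp path are invented for illustration; the principal and keytab are the placeholders from my snippet, and core-site.xml/hdfs-site.xml are assumed to be on the classpath:

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberizedHdfsCheck {
    public static void main(String[] args) throws Exception {
        Configuration config = new Configuration();
        config.set("hadoop.security.authentication", "Kerberos");

        // Log in from the keytab, then run the HDFS call as that user.
        UserGroupInformation.setConfiguration(config);
        UserGroupInformation ugi =
                UserGroupInformation.loginUserFromKeytabAndReturnUGI("me@EXAMPLE.COM", "me.keytab");

        boolean exists = ugi.doAs((PrivilegedExceptionAction<Boolean>) () -> {
            FileSystem fs = FileSystem.get(config); // this is where the SASL handshake fails
            return fs.exists(new Path("/tmp"));
        });
        System.out.println("/tmp exists: " + exists);
    }
}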
Answer 1 (vuktfyat):

I had the same problem with Spark2 and HDP 3.1, using Isilon/OneFS as storage instead of HDFS.
The OneFS service management pack does not provide configuration for some of the HDFS parameters that Spark2 expects (they are not available in Ambari at all), e.g. dfs.datanode.kerberos.principal. Without these parameters, the Spark2 History Server may fail to start and report errors such as "Failed to specify server's principal name".
I added the following properties to OneFS under Custom hdfs-site:

dfs.datanode.kerberos.principal=hdfs/_HOST@<MY REALM>
dfs.datanode.keytab.file=/etc/security/keytabs/hdfs.service.keytab
dfs.namenode.kerberos.principal=hdfs/_HOST@<MY REALM>
dfs.namenode.keytab.file=/etc/security/keytabs/hdfs.service.keytab

This resolved the initial error. After that, I got errors of the following form:

Server has invalid Kerberos principal: hdfs/<isilon>.my.realm.com@my.realm.com, expecting: hdfs/somewhere.else.entirely@my.realm.com

This is related to cross-realm authentication. It was resolved by adding the following setting to Custom hdfs-site:

dfs.namenode.kerberos.principal.pattern=*
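For a plain Java client outside Ambari, a rough equivalent is to set the same keys on the client-side Configuration before connecting. This is only a sketch, not OneFS-specific code; MY.REALM is a placeholder for the real Kerberos realm:

import org.apache.hadoop.conf.Configuration;

public class OneFsClientConf {
    // Mirrors the Custom hdfs-site entries above on a client-side Configuration.
    public static Configuration build() {
        Configuration config = new Configuration();
        config.set("hadoop.security.authentication", "kerberos");
        config.set("dfs.datanode.kerberos.principal", "hdfs/_HOST@MY.REALM");
        config.set("dfs.namenode.kerberos.principal", "hdfs/_HOST@MY.REALM");
        // "*" turns off server-principal pattern checking entirely;
        // use it only when cross-realm auth genuinely requires it.
        config.set("dfs.namenode.kerberos.principal.pattern", "*");
        return config;
    }
}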
Answer 2 (o3imoua4):

I think the problem is that HDFS expects the configuration to contain a value for dfs.datanode.kerberos.principal, the principal of the DataNodes, which was missing in this case.
I ran into the same problem when I created a Configuration instance from core-site.xml only and forgot to add hdfs-site.xml. As soon as I added hdfs-site.xml it started working, and hdfs-site.xml contains:

<property>
      <name>dfs.datanode.kerberos.principal</name>
      <value>....</value>
 </property>
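In other words, when building the Configuration by hand, make sure both files are added. A minimal sketch, assuming the usual /etc/hadoop/conf locations (the helper class is invented; adjust paths for your installation):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public class HdfsConfLoader {
    // hdfs-site.xml is what carries dfs.datanode.kerberos.principal and the
    // other dfs.* security keys; core-site.xml alone is not enough.
    public static Configuration load() {
        Configuration config = new Configuration();
        config.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        config.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        return config;
    }
}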

Hope this helps.
