Kafka console consumer on a Kerberized cluster: KRBError: Additional pre-authentication required / Server not found in Kerberos database

yvt65v4c, posted 2021-06-07 in Kafka

Please help me fix an exception that occurs while connecting to the Kafka brokers in a Kerberized cluster.
I am running Kafka version 3.0.0-1 on a Cloudera cluster; Kafka is installed as a service through Cloudera Manager (CM). The brokers start fine, and I can create and list topics.
However, my console producer cannot connect to the Kafka broker's topic. My Kafka client and producer properties are given below.
Command used, and the resulting error:

```
[root@local-dn-1.HADOOP.COM ~]$ /opt/cloudera/parcels/KAFKA/lib/kafka/bin/kafka-console-producer.sh --broker-list local-dn-1.HADOOP.COM:9092 --topic "Kafka-Sucker" --producer.config /etc/kafka/conf/producer-conf/producer.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/KAFKA-3.0.0-1.3.0.0.p0.40/lib/kafka/libs/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/KAFKA-3.0.0-1.3.0.0.p0.40/lib/kafka/libs/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/03/28 07:38:45 INFO producer.ProducerConfig: ProducerConfig values:
acks = 1
batch.size = 16384
bootstrap.servers = [local-dn-1.HADOOP.COM:9092]
buffer.memory = 33554432
client.id = console-producer
compression.type = none
connections.max.idle.ms = 540000
enable.idempotence = false
interceptor.classes = null
key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
linger.ms = 1000
max.block.ms = 60000
max.in.flight.requests.per.connection = 5
max.request.size = 1048576
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 1500
retries = 3
retry.backoff.ms = 100
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = "kafka"
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.mechanism = GSSAPI
security.protocol = SASL_PLAINTEXT
send.buffer.bytes = 102400
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name bufferpool-wait-time
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name buffer-exhausted-records
18/03/28 07:38:45 DEBUG clients.Metadata: Updated cluster metadata version 1 to Cluster(id = null, nodes = [local-dn-1.HADOOP.COM:9092 (id: -1 rack: null)], partitions = [])
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
>>> KeyTabInputStream, readName(): HADOOP.COM
>>> KeyTabInputStream, readName(): kafka-client
>>> KeyTab: load() entry length: 93; type: 18
>>> KeyTabInputStream, readName(): HADOOP.COM
>>> KeyTabInputStream, readName(): kafka-client
>>> KeyTab: load() entry length: 77; type: 17
>>> KeyTabInputStream, readName(): HADOOP.COM
>>> KeyTabInputStream, readName(): kafka-client
>>> KeyTab: load() entry length: 77; type: 23
Looking for keys for: kafka-client@HADOOP.COM
Added key: 23version: 1
Added key: 17version: 1
Added key: 18version: 1
>>> KdcAccessibility: reset
Looking for keys for: kafka-client@HADOOP.COM
Added key: 23version: 1
Added key: 17version: 1
Added key: 18version: 1
default etypes for default_tkt_enctypes: 23 17 18.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=ForestAD.HADOOP.COM TCP:88, timeout=3000, number of retries =3, #bytes=180
>>> KDCCommunication: kdc=ForestAD.HADOOP.COM TCP:88, timeout=3000,Attempt =1, #bytes=180
>>>DEBUG: TCPClient reading 240 bytes
>>> KrbKdcReq send: #bytes read=240
>>>Pre-Authentication Data:
PA-DATA type = 19
PA-ETYPE-INFO2 etype = 18, salt = HADOOP.COMkafka-client, s2kparams = null
PA-ETYPE-INFO2 etype = 23, salt = null, s2kparams = null
>>>Pre-Authentication Data:
PA-DATA type = 2
PA-ENC-TIMESTAMP
>>>Pre-Authentication Data:
PA-DATA type = 16
>>>Pre-Authentication Data:
PA-DATA type = 15
>>> KdcAccessibility: remove hadoop.com
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
sTime is Wed Mar 28 07:37:50 EDT 2018 1522237070000
suSec is 110488
error code is 25
error Message is Additional pre-authentication required
sname is krbtgt/HADOOP.COM@HADOOP.COM
eData provided.
msgType is 30
>>>Pre-Authentication Data:
PA-DATA type = 19
PA-ETYPE-INFO2 etype = 18, salt = HADOOP.COMkafka-client, s2kparams = null
PA-ETYPE-INFO2 etype = 23, salt = null, s2kparams = null
>>>Pre-Authentication Data:
PA-DATA type = 2
PA-ENC-TIMESTAMP
>>>Pre-Authentication Data:
PA-DATA type = 16
>>>Pre-Authentication Data:
PA-DATA type = 15
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 23 17 18.
Looking for keys for: kafka-client@HADOOP.COM
Added key: 23version: 1
Added key: 17version: 1
Added key: 18version: 1
Looking for keys for: kafka-client@HADOOP.COM
Added key: 23version: 1
Added key: 17version: 1
Added key: 18version: 1
default etypes for default_tkt_enctypes: 23 17 18.
>>> EType: sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=ForestAD.HADOOP.COM TCP:88, timeout=3000, number of retries =3, #bytes=269
>>> KDCCommunication: kdc=ForestAD.HADOOP.COM TCP:88, timeout=3000,Attempt =1, #bytes=269
>>>DEBUG: TCPClient reading 1678 bytes
>>> KrbKdcReq send: #bytes read=1678
>>> KdcAccessibility: remove hadoop.com
Looking for keys for: kafka-client@HADOOP.COM
Added key: 23version: 1
Added key: 17version: 1
Added key: 18version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply kafka-client
18/03/28 07:38:45 INFO authenticator.AbstractLogin: Successfully logged in.
18/03/28 07:38:45 DEBUG kerberos.KerberosLogin: [Principal=kafka-client@HADOOP.COM]: It is a Kerberos ticket
18/03/28 07:38:45 INFO kerberos.KerberosLogin: [Principal=kafka-client@HADOOP.COM]: TGT refresh thread started.
18/03/28 07:38:45 DEBUG kerberos.KerberosLogin: Found TGT with client principal 'kafka-client@HADOOP.COM' and server principal 'krbtgt/HADOOP.COM@HADOOP.COM'.
18/03/28 07:38:45 INFO kerberos.KerberosLogin: [Principal=kafka-client@HADOOP.COM]: TGT valid starting at: Wed Mar 28 07:37:50 EDT 2018
18/03/28 07:38:45 INFO kerberos.KerberosLogin: [Principal=kafka-client@HADOOP.COM]: TGT expires: Wed Mar 28 17:37:50 EDT 2018
18/03/28 07:38:45 INFO kerberos.KerberosLogin: [Principal=kafka-client@HADOOP.COM]: TGT refresh sleeping until: Wed Mar 28 15:42:00 EDT 2018
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name produce-throttle-time
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name connections-closed:
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name connections-created:
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name bytes-sent-received:
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name bytes-sent:
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name bytes-received:
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name select-time:
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name io-time:
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name batch-size
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name compression-rate
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name queue-time
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name request-time
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name records-per-request
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name record-retries
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name errors
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name record-size-max
18/03/28 07:38:45 DEBUG metrics.Metrics: Added sensor with name batch-split-rate
18/03/28 07:38:45 DEBUG internals.Sender: Starting Kafka producer I/O thread.
18/03/28 07:38:45 INFO utils.AppInfoParser: Kafka version : 0.11.0-kafka-3.0.0
18/03/28 07:38:45 INFO utils.AppInfoParser: Kafka commitId : unknown
18/03/28 07:38:45 DEBUG producer.KafkaProducer: Kafka producer with client id console-producer created
>Hello World
18/03/28 07:38:53 DEBUG clients.NetworkClient: Initialize connection to node local-dn-1.HADOOP.COM:9092 (id: -1 rack: null) for sending metadata request
18/03/28 07:38:53 DEBUG clients.NetworkClient: Initiating connection to node local-dn-1.HADOOP.COM:9092 (id: -1 rack: null)
18/03/28 07:38:53 DEBUG authenticator.SaslClientAuthenticator: Set SASL client state to SEND_HANDSHAKE_REQUEST
18/03/28 07:38:53 DEBUG authenticator.SaslClientAuthenticator: Creating SaslClient: client=kafka-client@HADOOP.COM;service="kafka";serviceHostname=local-dn-1.HADOOP.COM;mechs=[GSSAPI]
18/03/28 07:38:53 DEBUG metrics.Metrics: Added sensor with name node--1.bytes-sent
18/03/28 07:38:53 DEBUG metrics.Metrics: Added sensor with name node--1.bytes-received
18/03/28 07:38:53 DEBUG metrics.Metrics: Added sensor with name node--1.latency
18/03/28 07:38:53 DEBUG network.Selector: Created socket with SO_RCVBUF = 32768, SO_SNDBUF = 102400, SO_TIMEOUT = 0 to node -1
18/03/28 07:38:53 DEBUG authenticator.SaslClientAuthenticator: Set SASL client state to RECEIVE_HANDSHAKE_RESPONSE
18/03/28 07:38:53 DEBUG clients.NetworkClient: Completed connection to node -1. Fetching API versions.
18/03/28 07:38:53 DEBUG authenticator.SaslClientAuthenticator: Set SASL client state to INITIAL
Found ticket for kafka-client@HADOOP.COM to go to krbtgt/HADOOP.COM@HADOOP.COM expiring on Wed Mar 28 17:37:50 EDT 2018
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for kafka-client@HADOOP.COM to go to krbtgt/HADOOP.COM@HADOOP.COM expiring on Wed Mar 28 17:37:50 EDT 2018
Service ticket not found in the subject
>>> Credentials acquireServiceCreds: same realm
default etypes for default_tgs_enctypes: 23 17 18.
>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
>>> KrbKdcReq send: kdc=ForestAD.HADOOP.COM TCP:88, timeout=3000, number of retries =3, #bytes=1631
>>> KDCCommunication: kdc=ForestAD.HADOOP.COM TCP:88, timeout=3000,Attempt =1, #bytes=1631
>>>DEBUG: TCPClient reading 151 bytes
>>> KrbKdcReq send: #bytes read=151
>>> KdcAccessibility: remove hadoop.com
>>> KDCRep: init() encoding tag is 126 req type is 13
>>>KRBError:
sTime is Wed Mar 28 07:37:59 EDT 2018 1522237079000
suSec is 467340
error code is 7
error Message is Server not found in Kerberos database
sname is "kafka"/local-dn-1.HADOOP.COM@HADOOP.COM
msgType is 30
KrbException: Server not found in Kerberos database (7)
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:70)
at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251)
at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262)
at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308)
at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126)
at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458)
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator$2.run(SaslClientAuthenticator.java:280)
at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator$2.run(SaslClientAuthenticator.java:278)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.createSaslToken(SaslClientAuthenticator.java:278)
at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.sendSaslToken(SaslClientAuthenticator.java:215)
at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.authenticate(SaslClientAuthenticator.java:183)
at org.apache.kafka.common.network.KafkaChannel.prepare(KafkaChannel.java:76)
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:376)
at org.apache.kafka.common.network.Selector.poll(Selector.java:326)
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:454)
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:224)
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:162)
at java.lang.Thread.run(Thread.java:748)
Caused by: KrbException: Identifier doesn't match expected value (906)
at sun.security.krb5.internal.KDCRep.init(KDCRep.java:140)
at sun.security.krb5.internal.TGSRep.init(TGSRep.java:65)
at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:60)
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55)
... 23 more
18/03/28 07:38:53 DEBUG network.Selector: Connection with local-dn-1.HADOOP.COM/10.133.144.108 disconnected
javax.security.sasl.SaslException: An error: (java.security.PrivilegedActionException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7))]) occurred when evaluating SASL token received from the Kafka Broker. Kafka Client will go to AUTH_FAILED state. [Caused by javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7))]]
at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.createSaslToken(SaslClientAuthenticator.java:298)
at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.sendSaslToken(SaslClientAuthenticator.java:215)
at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.authenticate(SaslClientAuthenticator.java:183)
at org.apache.kafka.common.network.KafkaChannel.prepare(KafkaChannel.java:76)
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:376)
at org.apache.kafka.common.network.Selector.poll(Selector.java:326)
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:454)
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:224)
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:162)
at java.lang.Thread.run(Thread.java:748)
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7))]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator$2.run(SaslClientAuthenticator.java:280)
at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator$2.run(SaslClientAuthenticator.java:278)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.kafka.common.security.authenticator.SaslClientAuthenticator.createSaslToken(SaslClientAuthenticator.java:278)
... 9 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7))
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:770)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
... 14 more
Caused by: KrbException: Server not found in Kerberos database (7)
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:70)
at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251)
at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262)
at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308)
at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126)
at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458)
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693)
... 17 more
Caused by: KrbException: Identifier doesn't match expected value (906)
at sun.security.krb5.internal.KDCRep.init(KDCRep.java:140)
at sun.security.krb5.internal.TGSRep.init(TGSRep.java:65)
at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:60)
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55)
... 23 more
18/03/28 07:38:53 DEBUG clients.NetworkClient: Node -1 disconnected.
18/03/28 07:38:53 WARN clients.NetworkClient: Connection to node -1 terminated during authentication. This may indicate that authentication failed due to invalid credentials.
18/03/28 07:38:53 DEBUG clients.NetworkClient: Give up sending metadata request since no node is available
18/03/28 07:38:53 DEBUG clients.NetworkClient: Give up sending metadata request since no node is available
18/03/28 07:38:53 DEBUG clients.NetworkClient: Initialize connection to node local-dn-1.HADOOP.COM:9092 (id: -1 rack: null) for sending metadata request
18/03/28 07:38:53 DEBUG clients.NetworkClient: Initiating connection to node local-dn-1.HADOOP.COM:9092 (id: -1 rack: null)
18/03/28 07:38:53 DEBUG authenticator.SaslClientAuthenticator: Set SASL client state to SEND_HANDSHAKE_REQUEST
18/03/28 07:38:53 DEBUG authenticator.SaslClientAuthenticator: Creating SaslClient: client=kafka-client@HADOOP.COM;service="kafka";serviceHostname=local-dn-1.HADOOP.COM;mechs=[GSSAPI]
18/03/28 07:38:53 DEBUG network.Selector: Created socket with SO_RCVBUF = 32768, SO_SNDBUF = 102400, SO_TIMEOUT = 0 to node -1
18/03/28 07:38:53 DEBUG authenticator.SaslClientAuthenticator: Set SASL client state to RECEIVE_HANDSHAKE_RESPONSE
18/03/28 07:38:53 DEBUG clients.NetworkClient: Completed connection to node -1. Fetching API versions.
18/03/28 07:38:53 DEBUG authenticator.SaslClientAuthenticator: Set SASL client state to INITIAL
^C18/03/28 07:38:54 INFO producer.KafkaProducer: Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
18/03/28 07:38:54 DEBUG internals.Sender: Beginning shutdown of Kafka producer I/O thread, sending remaining records.
18/03/28 07:38:54 DEBUG metrics.Metrics: Removed sensor with name connections-closed:
18/03/28 07:38:54 DEBUG metrics.Metrics: Removed sensor with name connections-created:
18/03/28 07:38:54 DEBUG metrics.Metrics: Removed sensor with name bytes-sent-received:
18/03/28 07:38:54 DEBUG metrics.Metrics: Removed sensor with name bytes-sent:
18/03/28 07:38:54 DEBUG metrics.Metrics: Removed sensor with name bytes-received:
18/03/28 07:38:54 DEBUG metrics.Metrics: Removed sensor with name select-time:
18/03/28 07:38:54 DEBUG metrics.Metrics: Removed sensor with name io-time:
18/03/28 07:38:54 DEBUG metrics.Metrics: Removed sensor with name node--1.bytes-sent
18/03/28 07:38:54 DEBUG metrics.Metrics: Removed sensor with name node--1.bytes-received
18/03/28 07:38:54 DEBUG metrics.Metrics: Removed sensor with name node--1.latency
18/03/28 07:38:54 WARN kerberos.KerberosLogin: [Principal=kafka-client@HADOOP.COM]: TGT renewal thread has been interrupted and will exit.
18/03/28 07:38:54 DEBUG internals.Sender: Shutdown of Kafka producer I/O thread has completed.
18/03/28 07:38:54 DEBUG producer.KafkaProducer: Kafka producer with client id console-producer has been closed
[root@local-dn-1.HADOOP.COM ~]$
```

Configuration and environment variables

```
export KAFKA_HOME=/opt/cloudera/parcels/KAFKA-3.0.0-1.3.0.0.p0.40/lib/kafka
export JAVA_HOME=/usr/java/jdk1.8.0_131
export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/conf/producer-conf/kafka-client-jaas.conf -Dsun.security.krb5.debug=true"
export JVM_ARGS="-Djava.security.krb5.conf=/etc/krb5.conf -Djava.security.auth.login.config=/etc/kafka/conf/producer-conf/kafka-client-jaas.conf"
export BROKER_JAVA_OPTS="-Djava.security.krb5.conf=/etc/krb5.conf"
```

`/etc/kafka/conf/producer-conf/kafka-client-jaas.conf`:

```
KafkaServer {
  com.sun.security.auth.module.Krb5LoginModule required
  doNotPrompt=true
  useKeyTab=true
  storeKey=true
  keyTab="/etc/kafka/conf/kafka.keytab"
  principal="kafka/local-dn-1.hadoop.com@HADOOP.COM";
};
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  useTicketCache=false
  keyTab="/etc/kafka/conf/producer-conf/kafka-client.keytab"
  principal="kafka-client@HADOOP.COM";
};
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  useTicketCache=false
  keyTab="/etc/kafka/conf/kafka.keytab"
  principal="kafka/local-dn-1.hadoop.com.com@HADOOP.COM";
};
```

`producer.properties`:

```
bootstrap.servers=local-dn-1.hadoop.com:9092
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name="kafka"
sasl.mechanism = GSSAPI
```

The command I use to start the producer is:

```
/opt/cloudera/parcels/KAFKA/bin/kafka-console-producer --broker-list local-dn-1.hadoop.com:9092 --topic "Kafka-Test" --producer.config /etc/kafka/conf/producer-conf/producer.properties
```

k2arahey (answer 1):

The most important information I can see in the log you provided is:

```
>>>KRBError:
sTime is Wed Mar 28 07:37:59 EDT 2018 1522237079000
suSec is 467340
error code is 7
error Message is Server not found in Kerberos database
sname is "kafka"/local-dn-1.HADOOP.COM@HADOOP.COM
msgType is 30
KrbException: Server not found in Kerberos database (7)
Caused by: KrbException: Identifier doesn't match expected value (906)
18/03/28 07:38:53 DEBUG network.Selector: Connection with local-dn-1.HADOOP.COM/10.133.144.108 disconnected
javax.security.sasl.SaslException: An error: (java.security.PrivilegedActionException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7))]) occurred when evaluating SASL token received from the Kafka Broker. Kafka Client will go to AUTH_FAILED state. [Caused by javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7))]]
Caused by: GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7))
Caused by: KrbException: Server not found in Kerberos database (7)
Caused by: KrbException: Identifier doesn't match expected value (906)
```
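Note also that the requested sname contains literal double quotes: `"kafka"/local-dn-1.HADOOP.COM@HADOOP.COM`. Java `.properties` values are read verbatim, so writing `sasl.kerberos.service.name="kafka"` makes the quotes part of the service principal the client requests from the KDC. A minimal shell sketch of how the quoted value propagates (the hostname is taken from the log above):

```shell
# Properties values are taken verbatim; quotes are NOT stripped, so the
# client ends up requesting a ticket for `"kafka"/<host>` instead of
# `kafka/<host>`.
line='sasl.kerberos.service.name="kafka"'
service=${line#*=}                                   # value including the quotes
sname="${service}/local-dn-1.HADOOP.COM@HADOOP.COM"  # principal sent to the KDC
echo "$sname"
```

This prints `"kafka"/local-dn-1.HADOOP.COM@HADOOP.COM`, which matches the rejected sname in the KRBError.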

In addition, local-dn-1.HADOOP.COM, as well as all other nodes, must be resolvable (via DNS).
Some entries in your /etc/kafka/conf/producer-conf/kafka-client-jaas.conf do not seem to match:

```
KafkaServer {
  ...
  keyTab="/etc/kafka/conf/kafka.keytab"
  principal="kafka/local-dn-1.hadoop.com@HADOOP.COM";
};
...
Client {
  ...
  keyTab="/etc/kafka/conf/kafka.keytab"
  principal="kafka/local-dn-1.hadoop.com.com@HADOOP.COM";
};
```
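A quick way to spot this kind of mismatch is to compare the host component of each `kafka/<host>@REALM` principal against the node's expected FQDN. A sketch (the principal list is copied from the JAAS excerpt above; the duplicated `.com.com` entry gets flagged):

```shell
# Extract the host part of each service principal and compare it to the
# expected FQDN of the node.
expected_fqdn="local-dn-1.hadoop.com"
principals='kafka/local-dn-1.hadoop.com@HADOOP.COM
kafka/local-dn-1.hadoop.com.com@HADOOP.COM'
report=""
for p in $principals; do
  host=${p#kafka/}   # strip the service component
  host=${host%@*}    # strip the realm
  if [ "$host" = "$expected_fqdn" ]; then
    report="$report OK:$host"
  else
    report="$report MISMATCH:$host"
  fi
done
echo "$report"
```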

Therefore, I suggest you review your Kerberos authentication configuration. It looks as if Kerberos authentication for node local-dn-1 has not been set up correctly.


os8fio9y (answer 2):

I ran into the error above in Kafka because of an SSL certificate issue. After fixing the SSL certificate, the Kerberos error went away.
