How can a Spark job with an overridden JDBC source connect to a Kerberized Hive?

bkkx9g8r · posted 2023-10-18 in Hive

Environment: vanilla Hadoop; Kerberized Hive; deploy-mode: yarn-client; the Kerberos credential files (krb5.conf and keytab) are placed on every Hadoop node. Process: I overrode Spark's JDBC source and use it to connect to Hive. Authentication runs before connecting and succeeds, but when an executor opens the connection it throws:

    SQLException: Could not open client for any of Server URI's in ZooKeeper: null

Question: how do I resolve this error? I have already set up UserGroupInformation (its authentication succeeds) and set the Spark extraJavaOptions, but it has no effect.
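In yarn-client mode, a Kerberos login performed on the driver is not inherited by the executor JVMs, so the credential files also have to reach each executor and the executor JVMs have to be pointed at them. A minimal spark-submit sketch of that idea follows; the paths, principal, class, and jar names are placeholders, not taken from the question:

```shell
# Sketch, assuming hypothetical paths and principal names.
# --files ships krb5.conf and the keytab into each executor's working
# directory, so the relative name "krb5.conf" resolves there.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --files /opt/hbaseConfig/tx/krb5.conf,/opt/hbaseConfig/tx/user.keytab \
  --principal hive@EXAMPLE.COM \
  --keytab /opt/hbaseConfig/tx/user.keytab \
  --conf "spark.executor.extraJavaOptions=-Djava.security.krb5.conf=krb5.conf" \
  --class com.example.MyJob my-job.jar
```

Note that `spark.executor.extraJavaOptions` only sets JVM system properties; it does not by itself perform a Kerberos login in the executor.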


56lgkhnf (answer 1)

This is the authentication code:

    public static void initKerberos() {
        try {
            // krbConfig / krbKeytab hold the krb5.conf and keytab file names
            String configPath = "/opt/hbaseConfig/tx/" + krbConfig;
            String keytabPath = "/opt/hbaseConfig/tx/" + krbKeytab;
            System.setProperty("java.security.krb5.conf", configPath);
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);
            // Logs in from the keytab; the login is held statically in
            // UserGroupInformation for this JVM only
            UserGroupInformation.loginUserFromKeytab(krbUser, keytabPath);
        } catch (Exception e) {
            logger.error("Kerberos authentication failed", e);
        }
    }
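The method above logs in whichever JVM calls it; if it only runs on the driver, executors still open the JDBC connection without a ticket, which matches the ZooKeeper error in the question. One way to address that is to log in on the executor itself and open the connection inside `doAs`. A sketch of that pattern follows, assuming it is invoked from the overridden JDBC source's connection factory on the executor; the file names, principal, and class name are hypothetical, and the krb5.conf and keytab must already be present on the executor's local filesystem (for example, shipped via `--files`):

```java
import java.security.PrivilegedExceptionAction;
import java.sql.Connection;
import java.sql.DriverManager;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class ExecutorSideHiveConnector {
    // Sketch: executor-side Kerberos login before the JDBC handshake.
    public static Connection openConnection(String jdbcUrl) throws Exception {
        // Point this JVM at the krb5.conf shipped to the working directory
        System.setProperty("java.security.krb5.conf", "krb5.conf");
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        // Hypothetical principal and keytab name
        UserGroupInformation ugi = UserGroupInformation
                .loginUserFromKeytabAndReturnUGI("hive@EXAMPLE.COM", "user.keytab");
        // Run the connection setup as the logged-in user so the Hive JDBC
        // driver sees the fresh Kerberos credentials
        return ugi.doAs((PrivilegedExceptionAction<Connection>) () ->
                DriverManager.getConnection(jdbcUrl));
    }
}
```

This is an integration sketch, not runnable standalone (it requires hadoop-common and the Hive JDBC driver on the executor classpath).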
