Accessing Hive via JDBC and Spark Hive with Kerberos

mlmc2os5 · published 2021-06-27 · Hive
Follow (0) | Answers (2) | Views (484)

I am trying to connect to Hive through two interfaces: Hive JDBC and Spark (local mode). The JDBC interface starts failing after a day or two:
Kerberos authentication succeeds at first, and I can query Hive data successfully through both interfaces.
After a while, the JDBC interface fails (error below).
Immediately after the Spark interface is run, the JDBC interface recovers.
All of this happens inside the same Spring Boot JVM. It looks as if the Kerberos ticket has expired.
jars版本:

<spark.version>2.3.0.cloudera4</spark.version>
<hive.version>1.1.0-cdh5.15.1</hive.version>
<hadoop.version>2.6.0-cdh5.15.1</hadoop.version>

Spring Boot application:

import java.io.IOException;

import javax.annotation.PostConstruct;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableScheduling;

@SpringBootApplication
@EnableScheduling
public class Application {
    @PostConstruct
    void started() {
        Logger LOGGER = LoggerFactory.getLogger(Application.class);
        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
        Configuration configuration = new Configuration();
        configuration.set("hadoop.security.authentication" , "Kerberos" );
        UserGroupInformation.setConfiguration(configuration);
        try {
            UserGroupInformation.loginUserFromKeytab("xxx@USER", "/etc/datasource.keytab");
            LOGGER.info("authentication with kerberos successful");
        } catch (IOException e) {
            LOGGER.error("Auth failed. ", e);
        }
    }
}
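Side note for readers: whether the keytab login actually took effect can be checked from the UGI state after `loginUserFromKeytab()` returns. A minimal diagnostic sketch (assuming hadoop-common on the classpath; the class name is illustrative):

```java
// Diagnostic sketch (hypothetical class, assumes hadoop-common on the classpath):
// inspect the static login user after loginUserFromKeytab(). If isFromKeytab()
// is false, UGI has no keytab to re-login from once the TGT expires.
import java.io.IOException;

import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLoginCheck {
    public static void main(String[] args) throws IOException {
        UserGroupInformation ugi = UserGroupInformation.getLoginUser();
        System.out.println("login user:  " + ugi.getUserName());
        System.out.println("from keytab: " + ugi.isFromKeytab());
        System.out.println("has kerberos credentials: " + ugi.hasKerberosCredentials());
    }
}
```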

Hive JDBC interface:

Connection conn = null;
ResultSet rs = null;
PreparedStatement ps = null;
try {
    conn = DriverManager.getConnection(CONNECTION_URL);
    ps = conn.prepareStatement("show databases");
    rs = ps.executeQuery();
    while (rs.next()) {
        System.out.println(rs.getString(1));
    }
} catch (Exception e) {
    e.printStackTrace();
} finally {
    conn.close();
}
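For reference (not shown in the question, where `CONNECTION_URL` is elided): with Kerberos, a HiveServer2 JDBC URL must carry the server's service principal in a `principal=` part, since a missing or wrong principal is another common cause of `GSS initiate failed`. A minimal sketch of such a URL; the host, port, database, and principal below are illustrative placeholders, not values from the question:

```java
// Sketch: assemble a Kerberos-enabled HiveServer2 JDBC URL. All argument
// values used in main() are hypothetical examples.
public class HiveJdbcUrl {
    static String kerberosUrl(String host, int port, String db, String serverPrincipal) {
        // "_HOST" inside the principal is substituted by the Hive JDBC driver
        // with the server's host name at connect time.
        return "jdbc:hive2://" + host + ":" + port + "/" + db
                + ";principal=" + serverPrincipal;
    }

    public static void main(String[] args) {
        System.out.println(kerberosUrl("bigdata-01", 36003, "default", "hive/_HOST@BIGDATA-02"));
        // → jdbc:hive2://bigdata-01:36003/default;principal=hive/_HOST@BIGDATA-02
    }
}
```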

Spark SQL:

SparkSession spark = SparkSession
        .builder()
        .master("local")
        .appName("Java Spark Hive Example")
        .enableHiveSupport()
        .getOrCreate();
spark.sql("show databases").show();
spark.sql("select * from balldb.ods_addcrest limit 10").show();

Error:

2019-03-01 02:08:39.450  INFO 1 --- [pool-3-thread-5] org.apache.hive.jdbc.Utils               : Supplied authorities: bigdata-01:36003
2019-03-01 02:08:39.451  INFO 1 --- [pool-3-thread-5] org.apache.hive.jdbc.Utils               : Resolved authority: bigdata-01:36003
2019-03-01 02:08:39.456 ERROR 1 --- [pool-3-thread-5] o.a.thrift.transport.TSaslTransport      : SASL negotiation failure

javax.security.sasl.SaslException: GSS initiate failed
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[na:1.8.0_191]
    at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) [hive-exec-1.1.0-cdh5.15.1.jar!/:1.1.0-cdh5.15.1]
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) ~[hive-exec-1.1.0-cdh5.15.1.jar!/:1.1.0-cdh5.15.1]
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [hive-exec-1.1.0-cdh5.15.1.jar!/:1.1.0-cdh5.15.1]
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [hive-shims-common-1.1.0-cdh5.15.1.jar!/:1.1.0-cdh5.15.1]
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) [hive-shims-common-1.1.0-cdh5.15.1.jar!/:1.1.0-cdh5.15.1]
    at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_191]
    at javax.security.auth.Subject.doAs(Subject.java:422) [na:1.8.0_191]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924) [hadoop-common-2.6.0-cdh5.15.1.jar!/:na]
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [hive-shims-common-1.1.0-cdh5.15.1.jar!/:1.1.0-cdh5.15.1]
    at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:203) [hive-jdbc-1.1.0-cdh5.15.1.jar!/:1.1.0-cdh5.15.1]
    at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:168) [hive-jdbc-1.1.0-cdh5.15.1.jar!/:1.1.0-cdh5.15.1]
    at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) [hive-jdbc-1.1.0-cdh5.15.1.jar!/:1.1.0-cdh5.15.1]
    at java.sql.DriverManager.getConnection(DriverManager.java:664) [na:1.8.0_191]
    at java.sql.DriverManager.getConnection(DriverManager.java:247) [na:1.8.0_191]
    at com.task.HiveDSTask.init(HiveDSTask.java:83) [classes!/:na]
    at com.task.HiveDSTask.call(HiveDSTask.java:95) [classes!/:na]
    at com.task.HiveDSTask.call(HiveDSTask.java:22) [classes!/:na]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_191]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_191]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_191]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_191]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[na:1.8.0_191]
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) ~[na:1.8.0_191]
    at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[na:1.8.0_191]
    at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) ~[na:1.8.0_191]
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[na:1.8.0_191]
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[na:1.8.0_191]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[na:1.8.0_191]
    ... 21 common frames omitted

/etc/krb5.conf:

[libdefaults]
  default_realm = bigdata-02
  dns_lookup_kdc = false
  dns_lookup_realm = false
  ticket_lifetime = 86400
  renew_lifetime = 604800
  forwardable = true
  default_tgs_enctypes = aes256-cts arcfour-hmac des3-hmac-sha1 des-cbc-crc des
  default_tkt_enctypes = aes256-cts arcfour-hmac des3-hmac-sha1 des-cbc-crc des
  permitted_enctypes = aes256-cts arcfour-hmac des3-hmac-sha1 des-cbc-crc des
  udp_preference_limit = 1
  kdc_timeout = 3000
[realms]
  BIGDATA-02 = {
    kdc = bigdata-02
    admin_server = bigdata-02
    default_domain = bigdata-02
    kdc = bigdata-03
  }
[domain_realm]
  bigdata-02 = BIGDATA-02

I have been stuck on this for a long time. Please help me fix this error. Thanks.

bz4sfanl — answer #1

In the Spring Boot application,

configuration.set("hadoop.security.authentication" , "Kerberos" );

should be

configuration.set("hadoop.security.authentication" , "kerberos" );

with a lowercase "k".

vaqhlq81 — answer #2

"Kerberos authentication succeeds at first, and I can query Hive data successfully through both interfaces." This means your configuration is correct. On top of that configuration, I added one more class:

package com.netease.athena.job;
import com.netease.athena.util.KrbUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class KrbAuthScheduled {
    private Logger logger = LoggerFactory.getLogger(this.getClass());

    @Scheduled(cron = "0 0/5 * * * ?")
    public void run(){
        logger.info("KrbAuth run ...");
        KrbUtil.authKrb5();
    }
}


As you can see, it re-authenticates every 5 minutes. That works for me.
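The `KrbUtil.authKrb5()` helper itself is not shown in the answer. A hypothetical sketch of what such a helper commonly does (assuming hadoop-common on the classpath; `KrbRelogin` is my name for it, not the answerer's):

```java
// Hypothetical relogin helper (NOT the answerer's KrbUtil): checks the cached
// TGT and performs a keytab re-login when the ticket is close to expiring.
import java.io.IOException;

import org.apache.hadoop.security.UserGroupInformation;

public class KrbRelogin {
    public static void authKrb5() {
        try {
            // No-op while the ticket is still fresh; otherwise re-logs in from
            // the keytab passed to the original loginUserFromKeytab() call.
            UserGroupInformation.getLoginUser().checkTGTAndReloginFromKeytab();
        } catch (IOException e) {
            throw new RuntimeException("Kerberos re-login failed", e);
        }
    }
}
```

Calling a method like this from the scheduled job avoids a full re-login on every tick; it only touches the KDC when the ticket actually needs refreshing.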
