Looking up an HBase table from a UDF (Beeline, HBase, delegation token)

xpcnnkqh · posted 2021-06-09 in HBase

I need to write a custom UDF that performs lookups against data in an HBase table.
Note: I have unit-tested it with the Hive CLI, and it appears to work.
But when I run the same UDF through Beeline, it fails. By default, Cloudera restricts impersonation and only allows the hive user to run queries via Beeline. At job startup, YarnChild sets up the following tokens.
I want to add a token (Kind: HBASE_AUTH_TOKEN) so the job can talk to HBase.

Kind: mapreduce.job
Kind: HDFS_DELEGATION_TOKEN 
Kind: kms-dt
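
For reference, a hypothetical snippet (not from my UDF; names are placeholders) that dumps the tokens visible to the running task, which is one way to confirm whether HBASE_AUTH_TOKEN actually made it into the YarnChild credentials:

```java
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hadoop.security.token.Token;

public class DumpTokens {
    public static void main(String[] args) throws Exception {
        // Prints the kind and service of every token attached to the current
        // user's credentials; HBASE_AUTH_TOKEN should appear in this list if
        // the delegation token was propagated to the task.
        UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
        for (Token<?> t : ugi.getTokens()) {
            System.out.println("Kind: " + t.getKind() + ", Service: " + t.getService());
        }
    }
}
```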

I researched how HBaseStorageHandler uses the HBase delegation token (i.e. HBASE_AUTH_TOKEN) and used the same set of functions, but that does not work either.
Functions from HBaseStorageHandler (for obtaining tokens for the job):

private void addHBaseDelegationToken(Configuration conf, JobConf jconf) throws IOException {
    if (User.isHBaseSecurityEnabled(conf)) {
        try {
            logger.info("isHBaseSecurityEnabled: true");
            User user = User.getCurrent();
            logger.info("isHBaseSecurityEnabled: user ==> " + user);
            Token authToken = getAuthToken(conf, user);
            Job job = new Job(conf);
            if (authToken == null) {
                // No existing HBASE_AUTH_TOKEN in the UGI: obtain one for the job.
                UserGroupInformation ugi = UserGroupInformation.getLoginUser();
                ugi.setAuthenticationMethod(UserGroupInformation.AuthenticationMethod.KERBEROS);
                user.obtainAuthTokenForJob(jconf);
            } else {
                logger.info("authToken is not null: " + authToken);
                job.getCredentials().addToken(authToken.getService(), authToken);
            }
            logger.info("obtained token");
        } catch (InterruptedException e) {
            throw new IOException("Error while obtaining hbase delegation token", e);
        }
    }
}

private static Token<AuthenticationTokenIdentifier> getAuthToken(Configuration conf, User user)
        throws IOException, InterruptedException {
    ZooKeeperWatcher zkw = new ZooKeeperWatcher(conf, "mr-init-credentials", (Abortable) null);
    try {
        String clusterId = ZKClusterId.readClusterIdZNode(zkw);
        logger.info("====== clusterId: " + clusterId);
        Token<AuthenticationTokenIdentifier> token = new AuthenticationTokenSelector()
                .selectToken(new Text(clusterId), user.getUGI().getTokens());
        if (token == null) {
            logger.info("no HBase auth token found in the user's credentials");
        } else {
            logger.info("====== HBase token: " + token);
        }
        return token;
    } catch (KeeperException e) {
        throw new IOException(e);
    } catch (NullPointerException e) {
        // readClusterIdZNode can return null if the znode is missing.
        return null;
    } finally {
        zkw.close();
    }
}

After calling addHBaseDelegationToken() in the UDF's configure(), I get the exception below. I do not know how to let the hive user talk to HBase, since hive.keytab is managed by Cloudera and secured.
Any input would be helpful. Thanks!
Exception stack trace:
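
A minimal sketch of the direction I am exploring (hypothetical and untested; the principal and keytab path are placeholders, not real values from my cluster): obtain the token where a Kerberos TGT actually exists, i.e. in the submitting process rather than inside YarnChild, and attach it to the job's credentials:

```java
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.security.token.AuthenticationTokenIdentifier;
import org.apache.hadoop.hbase.security.token.TokenUtil;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hadoop.security.token.Token;

public class SubmitWithHBaseToken {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Placeholder principal/keytab: on a Cloudera cluster these are
        // managed by CM, so this only works where the keytab is readable.
        UserGroupInformation ugi = UserGroupInformation
                .loginUserFromKeytabAndReturnUGI("hive@EXAMPLE.COM", "/path/to/hive.keytab");
        Job job = Job.getInstance(conf);
        ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
            // obtainToken authenticates to HBase over Kerberos (needs a TGT)
            // and returns a token of kind HBASE_AUTH_TOKEN.
            Token<AuthenticationTokenIdentifier> token = TokenUtil.obtainToken(conf);
            job.getCredentials().addToken(token.getService(), token);
            return null;
        });
        // The token now ships with the job, so map tasks can authenticate
        // to HBase without a TGT of their own.
    }
}
```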
2018-10-11 04:48:07,625 WARN [main] org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:hive (auth:SIMPLE) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2018-10-11 04:48:07,627 WARN [main] org.apache.hadoop.hbase.ipc.RpcClientImpl: Exception encountered while connecting to the server: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2018-10-11 04:48:07,628 FATAL [main] org.apache.hadoop.hbase.ipc.RpcClientImpl: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
    at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:181)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:618)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:163)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:744)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:741)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:741)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:907)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:874)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1246)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:34118)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1633)
    at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:104)
    at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:94)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
    at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:107)
    at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callBlockingMethod(CoprocessorRpcChannel.java:73)
    at org.apache.hadoop.hbase.protobuf.generated.AuthenticationProtos$AuthenticationService$BlockingStub.getAuthenticationToken(AuthenticationProtos.java:4512)
    at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:86)
    at org.apache.hadoop.hbase.security.token.TokenUtil$1.run(TokenUtil.java:111)
    at org.apache.hadoop.hbase.security.token.TokenUtil$1.run(TokenUtil.java:108)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:340)
    at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:108)
    at com.barclaycardus.hadoop.utils.udfs.HBaseTblLookupUDF.configure(HBaseTblLookupUDF.java:131)
    at org.apache.hadoop.hive.ql.exec.MapredContext.setup(MapredContext.java:120)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:143)
    at org.apache.hadoop.hive.ql.exec.Operator.initEvaluators(Operator.java:954)
    at org.apache.hadoop.hive.ql.exec.Operator.initEvaluatorsAndReturnStruct(Operator.java:980)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:63)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:385)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:469)
    at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:425)
    at org.apache.hadoop.hive.ql.exec.TableScanOperator.initializeOp(TableScanOperator.java:196)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:385)
    at org.apache.hadoop.hive.ql.exec.MapOperator.initializeOp(MapOperator.java:431)
    at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:385)
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:126)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:455)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
    at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
    at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
    ... 66 more
I have already tried the approaches in the following:
https://github.com/apache/oozie/blob/master/core/src/main/java/org/apache/oozie/action/hadoop/hbasecredentials.java
https://github.com/ibm-research-ireland/sparkoscope/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/security/hbasecredentialprovider.scala
