AuthorizationException: User is not allowed to impersonate user

Asked by q5iwbnjs on 2021-06-26, tagged Hive

I wrote a Spark job that registers a temp table and exposes it over JDBC so it can be queried from beeline (a JDBC client).
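
Roughly, the job looks like the sketch below (a minimal reconstruction, not the original code; the Spark 2.x API, the sample data, and the object name are assumptions, while the table name f238 and port 10003 come from the beeline session):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

object TempTableServer {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("temp-table-server")
      .enableHiveSupport()          // needs the spark-hive / thriftserver modules
      .getOrCreate()

    // Register an in-memory DataFrame as the temp table seen from beeline.
    spark.range(0, 10).toDF("id").createOrReplaceTempView("f238")

    // Run the HiveServer2-compatible Thrift server inside this application so
    // JDBC clients can see the session's temporary views. The listening port
    // (10003 in the question) comes from hive.server2.thrift.port.
    HiveThriftServer2.startWithContext(spark.sqlContext)

    // Keep the driver alive so beeline clients can connect.
    Thread.sleep(Long.MaxValue)
  }
}

The beeline session: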

$ ./bin/beeline
beeline> !connect jdbc:hive2://IP:10003 -n ram -p xxxx
0: jdbc:hive2://IP> show tables;
+------------+--------------+
| tableName  | isTemporary  |
+------------+--------------+
| f238       | true         |
+------------+--------------+
2 rows selected (0.309 seconds)
0: jdbc:hive2://IP>

So I can see the table. But when I query it, I get this error:

0: jdbc:hive2://IP> select * from f238;
Error: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: ram is not allowed to impersonate ram (state=,code=0)
0: jdbc:hive2://IP>

I have this in hive-site.xml:

<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>false</value>
  <description>If true, the metastore Thrift interface will be secured with SASL. Clients must authenticate with Kerberos.</description>
</property>

<property>
  <name>hive.server2.enable.doAs</name>
  <value>false</value>
</property>

<property>
  <name>hive.server2.authentication</name>
  <value>NONE</value>
</property>

I have this in core-site.xml:

<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>

<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
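
These proxyuser properties are what Hadoop consults whenever a service user tries to act on behalf of another user. Roughly, the mechanism looks like the sketch below (illustrative only, not the Thrift server's actual code; the client user name and address are placeholders):

import java.security.PrivilegedExceptionAction
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation
import org.apache.hadoop.security.authorize.ProxyUsers

object ImpersonationSketch {
  def main(args: Array[String]): Unit = {
    // The "real" user is whoever the server process runs as.
    val serviceUser = UserGroupInformation.getLoginUser
    // With impersonation (doAs) enabled, the server builds a proxy identity
    // for the connecting client user.
    val proxyUgi = UserGroupInformation.createProxyUser("ram", serviceUser)

    // Load hadoop.proxyuser.<real user>.{hosts,groups} from core-site.xml and
    // check them; this is the check that raises
    // "User: <real user> is not allowed to impersonate ram" when they don't match.
    ProxyUsers.refreshSuperUserGroupsConfiguration(new Configuration())
    ProxyUsers.authorize(proxyUgi, "127.0.0.1")

    // If the check passes, the actual work runs as the proxied user.
    proxyUgi.doAs(new PrivilegedExceptionAction[Unit] {
      override def run(): Unit =
        println("running as " + UserGroupInformation.getCurrentUser.getShortUserName)
    })
  }
}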

Full log:

ERROR [pool-19-thread-2] thriftserver.SparkExecuteStatementOperation: Error running hive query:
org.apache.hive.service.cli.HiveSQLException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: ram is not allowed to impersonate ram
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.runInternal(SparkExecuteStatementOperation.scala:259)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:171)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:182)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

Any idea what configuration I am missing?

Answer 1 (by 1aaf6o9v):

Set hive.server2.enable.doAs to true in hive-site.xml:

<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>

Also, if you want the user to be able to impersonate everyone and connect from any host (*), add the following properties to core-site.xml:

<property>
  <name>hadoop.proxyuser.ABC.groups</name>
  <value>*</value>
</property>

<property>
  <name>hadoop.proxyuser.ABC.hosts</name>
  <value>*</value>
</property>
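
Note that ABC above is a placeholder for the OS user the HiveServer2 / Spark Thrift Server process actually runs as; the error message "User: ram is not allowed to impersonate ram" suggests that user is ram here, whereas the question's core-site.xml only whitelists hive. After editing core-site.xml, restart the affected services (or refresh the proxy-user configuration) so the new settings take effect.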
