We are building a plain Hadoop 2.7.3 cluster with Hive and HBase plus Kerberos. We are using the Bigtop repo to simplify the setup.
The deployment script installs Hive and its components successfully, but even though we have started the metastore and HiveServer2, nothing is listening on port 10000 and we cannot connect with beeline. There are no errors, and it does not even create a hiveserver2.log file.
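For reference, a quick way to find out where Hive is actually writing its logs is to check the log4j configuration (a sketch; the /var/log/hive path is an assumption based on typical Bigtop packaging):

# Check which directory and file the Hive log4j configuration points to
grep -i 'hive.log' /etc/hive/conf/hive-log4j.properties
# Bigtop packages usually log under /var/log/hive (assumption)
ls -l /var/log/hive/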
ps -ef | grep hive shows the following output:
hive 9043 1 2 10:57 ? 00:00:23 /usr/lib/jvm/java-openjdk/bin/java -Xmx256m -Djava.security.krb5.conf=/etc/krb5.conf -Dhadoop.log.dir=/usr/lib/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str= -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/lib/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/lib/hive/lib/hive-service-1.2.1.jar org.apache.hadoop.hive.metastore.HiveMetaStore
hive 9751 1 2 11:04 ? 00:00:11 /usr/lib/jvm/java-openjdk/bin/java -Xmx256m -Djava.security.krb5.conf=/etc/krb5.conf -Dhadoop.log.dir=/usr/lib/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str= -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/lib/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/lib/hive/lib/hive-service-1.2.1.jar org.apache.hive.service.server.HiveServer2
root 10285 7469 0 11:13 pts/1 00:00:00 grep hive
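To double-check that nothing is bound to the HiveServer2 and metastore ports, the listening sockets can be inspected directly (a sketch; use netstat if ss is not available on the node):

# List listening TCP sockets for the HiveServer2 thrift port
ss -ltnp | grep 10000
# and for the metastore port
ss -ltnp | grep 9083
# older systems: netstat -tlnp | grep -E '10000|9083'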
Connecting with beeline:
[root@wnode55 ~]# beeline -u 'jdbc:hive2://wnode55.domain_name.com:10000/default;principal=hive/wnode55.domain_name.com@domain_name.COM'
ls: cannot access /usr/lib/spark/lib/spark-assembly-*.jar: No such file or directory
Connecting to jdbc:hive2://wnode55.domain_name.com:10000/default;principal=hive/wnode55.domain_name.com@domain_name.COM
Error: Could not open client transport with JDBC Uri: jdbc:hive2://wnode55.domain_name.com:10000/default;principal=hive/wnode55.domain_name.com@domain_name.COM: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Beeline version 1.2.1 by Apache Hive
0: jdbc:hive2://wnode55.domain_name.com:10000 (closed)>
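Since the error is Connection refused rather than a Kerberos or SASL failure, a raw TCP probe shows the same thing independently of JDBC (a sketch; nc or telnet may need to be installed):

# Probe the thrift port directly; "Connection refused" here means nothing is listening
nc -vz wnode55.domain_name.com 10000
telnet wnode55.domain_name.com 10000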
hive-site.xml:
[root@wnode55 ~]# cat /etc/hive/conf/hive-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Licensed to the Apache Software Foundation (ASF) under one or more -->
<!-- contributor license agreements. See the NOTICE file distributed with -->
<!-- this work for additional information regarding copyright ownership. -->
<!-- The ASF licenses this file to You under the Apache License, Version 2.0 -->
<!-- (the "License"); you may not use this file except in compliance with -->
<!-- the License. You may obtain a copy of the License at -->
<!-- -->
<!-- http://www.apache.org/licenses/LICENSE-2.0 -->
<!-- -->
<!-- Unless required by applicable law or agreed to in writing, software -->
<!-- distributed under the License is distributed on an "AS IS" BASIS, -->
<!-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -->
<!-- See the License for the specific language governing permissions and -->
<!-- limitations under the License. -->
<configuration>
<!-- Hive Configuration can either be stored in this file or in the hadoop configuration files -->
<!-- that are implied by Hadoop setup variables. -->
<!-- Aside from Hadoop setup variables - this file is provided as a convenience so that Hive -->
<!-- users do not have to edit hadoop configuration files (that may be managed as a centralized -->
<!-- resource). -->
<!-- Hive Execution Parameters -->
<property>
<name>hbase.zookeeper.quorum</name>
<value>wnode55.domain_name.com</value>
<description>http://wiki.apache.org/hadoop/Hive/HBaseIntegration</description>
</property>
<property>
<name>hive.execution.engine</name>
<value>mr</value>
</property>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:derby:;databaseName=/var/lib/hive/metastore/metastore_db;create=true</value>
<description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>org.apache.derby.jdbc.EmbeddedDriver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>hive.hwi.war.file</name>
<value>/usr/lib/hive/lib/hive-hwi.war</value>
<description>This is the WAR file with the jsp content for Hive Web Interface</description>
</property>
<property>
<name>hive.server2.allow.user.substitution</name>
<value>true</value>
</property>
<property>
<name>hive.server2.enable.doAs</name>
<value>true</value>
</property>
<property>
<name>hive.server2.thrift.port</name>
<value>10000</value>
</property>
<property>
<name>hive.server2.thrift.http.port</name>
<value>10001</value>
</property>
<property>
<name>hive.metastore.uris</name>
<value>thrift:/wnode55.domain_name.com:9083</value>
</property>
<property>
<name>hive.security.metastore.authorization.manager</name>
<value>org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider</value>
</property>
<property>
<name>hive.server2.authentication</name>
<value>KERBEROS</value>
</property>
<property>
<name>hive.server2.authentication.kerberos.principal</name>
<value>hive/_HOST@domain_name.COM</value>
</property>
<property>
<name>hive.server2.authentication.kerberos.keytab</name>
<value>/etc/hadoop/conf/hive.keytab</value>
</property>
</configuration>
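To see why HiveServer2 never binds the port even though the process stays up, it can be run in the foreground with console logging (a sketch; assumes the standard hive launcher script and that the packaged service has been stopped first):

# Run HiveServer2 in the foreground as the hive user with verbose console output
sudo -u hive hive --service hiveserver2 --hiveconf hive.root.logger=DEBUG,console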
Any help would be appreciated.