Big Data Hive Series: Hive Data Warehouse Deployment


I. Deployment Preparation

II. Deploying Hive

【Execute on host slave62】

1.  Create the Hive working directory, then upload and extract the Hive package

    mkdir -p /apps/svr/hive/
    tar -zxvf ~/apache-hive-2.1.1-bin.tar.gz -C /apps/svr/hive/
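
A quick listing (an optional check, not part of the original steps) confirms the archive extracted where expected:

    # Optional check: bin/, conf/ and lib/ should be present after extraction.
    ls /apps/svr/hive/apache-hive-2.1.1-bin/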

2.  Deployment

    cd /apps/svr/hive/apache-hive-2.1.1-bin/conf/
    cp hive-env.sh.template hive-env.sh
    cp hive-log4j2.properties.template hive-log4j2.properties

2.1.  Configure hive-env.sh

    vim hive-env.sh

export JAVA_HOME=/apps/svr/java/jdk1.8.0_172
export HADOOP_HOME=/apps/svr/hadoop/hadoop-2.7.3
export HIVE_CONF_DIR=/apps/svr/hive/apache-hive-2.1.1-bin/conf
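
All three paths must match the actual JDK, Hadoop, and Hive locations on slave62; a minimal sanity-check sketch (the paths are the ones used throughout this tutorial, adjust to your layout):

    # Optional: report any exported path that does not exist on this host.
    for d in /apps/svr/java/jdk1.8.0_172 \
             /apps/svr/hadoop/hadoop-2.7.3 \
             /apps/svr/hive/apache-hive-2.1.1-bin/conf; do
        [ -d "$d" ] || echo "missing: $d"
    done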

2.2.  Configure log4j2

    mkdir -p /apps/logs/hive
    vim hive-log4j2.properties

property.hive.log.dir = /apps/logs/hive/${sys:user.name}
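
Here ${sys:user.name} is a Log4j2 system-property lookup that resolves to the OS user running Hive, so each user writes logs to its own subdirectory under /apps/logs/hive.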

2.3.  Configure hive-site.xml

    vim hive-site.xml

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
	<!-- MySQL metastore configuration -->
	<property>
		<name>javax.jdo.option.ConnectionURL</name>
		<value>jdbc:mysql://localhost:3306/hive?characterEncoding=UTF-8&amp;useSSL=false</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionDriverName</name>
		<value>com.mysql.jdbc.Driver</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionUserName</name>
		<value>hive</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionPassword</name>
		<value>hive!@#123</value>
	</property>
	
	<!-- HDFS directories for Hive data storage -->
	<property>
		<name>hive.metastore.warehouse.dir</name>
		<value>hdfs://master60:9000/user/hive/warehouse</value>
	</property>
	<property>
		<name>hive.exec.scratchdir</name>
		<value>hdfs://master60:9000/tmp/hive/hive-${user.name}</value>
	</property>
	
	<!-- Local scratch directory for intermediate results of Hive DDL/DML jobs -->
	<property>
		<name>hive.exec.local.scratchdir</name>
		<value>/tmp/hive/iotmp</value>
	</property>
	
	<!-- Local temporary directory used when adding resources from a remote file system -->
	<property>
		<name>hive.downloaded.resources.dir</name>
		<value>/tmp/hive/iotmp</value>
	</property>
	
	<!-- Local directory for Hive query logs -->
	<property>
		<name>hive.querylog.location</name>
		<value>/tmp/${user.name}</value>
	</property>
	<!-- Top-level local directory for operation logs when operation logging is enabled -->
	<property>
		<name>hive.server2.logging.operation.log.location</name>
		<value>/tmp/${user.name}/operation_logs</value>
	</property>

	<!-- CLI -->
	<property>
		<name>hive.cli.print.header</name>
		<value>true</value>
	</property>
	<property>
		<name>hive.cli.print.current.db</name>
		<value>true</value>
	</property>
</configuration>
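
Before moving on, it is worth confirming the file is well-formed XML; one minimal check, assuming xmllint (from libxml2) is available on the host:

    xmllint --noout /apps/svr/hive/apache-hive-2.1.1-bin/conf/hive-site.xml \
        && echo "hive-site.xml is well-formed"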

2.4.  Upload and copy the required JARs

    cd /apps/svr/hive/apache-hive-2.1.1-bin/
    cp ~/jar/mysql-connector-java-5.1.46.jar ./lib/
    cp $HADOOP_HOME/share/hadoop/common/hadoop-common-2.7.3.jar ./lib/
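
A listing (an optional verification, not in the original steps) shows both JARs landed in place:

    # Run from the apache-hive-2.1.1-bin directory entered above.
    ls lib/ | grep -E 'mysql-connector|hadoop-common'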

2.5.  Create the Hive warehouse directories

【Execute on host master60】

    hdfs dfs -mkdir -p /tmp
    hdfs dfs -mkdir -p /user/hive/warehouse
    hdfs dfs -chmod g+w /tmp
    hdfs dfs -chmod g+w /user/hive/warehouse
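
The group-write bit matters because Hive sessions run by different users all write under these paths; the permissions can be confirmed with:

    # -d lists the directories themselves; expect drwxrwxr-x on both.
    hdfs dfs -ls -d /tmp /user/hive/warehouse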

2.6.  Create the Hive metastore database

Log in to MySQL on slave62 (the localhost in the JDBC URL above) as a privileged user and run:

create database hive default charset utf8 collate utf8_general_ci;
grant all on hive.* to 'hive'@'%' identified by 'hive!@#123';
flush privileges;
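
To verify the grant took effect, a test login as the new user (a minimal check, assuming the mysql client is installed on slave62):

    mysql -u hive -p'hive!@#123' -e 'show databases;'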

3.  Configure the Hive environment variables

3.1.  Configure .bash_profile

    vim ~/.bash_profile

# HIVE_HOME
export HIVE_HOME=/apps/svr/hive/apache-hive-2.1.1-bin
export HCAT_HOME=$HIVE_HOME/hcatalog/
export PATH=$PATH:$HIVE_HOME/bin

3.2.  Apply the changes immediately

    source ~/.bash_profile
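
A quick PATH check (optional) confirms the new variables are active in the current shell:

    # Both should resolve under /apps/svr/hive/apache-hive-2.1.1-bin/bin.
    which hive schematool
    echo $HIVE_HOME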

4.  Start and test Hive

4.1.  Initialize the metastore schema

    schematool -dbType mysql -initSchema
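
If initialization succeeds, the recorded schema version can be read back with schematool's -info option:

    schematool -dbType mysql -info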

4.2.  Start and test

    hive -e 'show tables'
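
Beyond listing tables, a short end-to-end smoke test (the table name here is hypothetical) exercises both the metastore and the HDFS warehouse directory:

    # Hypothetical smoke test: create, describe, and drop a throwaway table.
    hive -e "CREATE TABLE smoke_test (id INT, name STRING);
             SHOW CREATE TABLE smoke_test;
             DROP TABLE smoke_test;"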
