Error installing Hive in Ambari

bsxbgnwa asked on 2021-06-26, tagged Hive

I spun up a SAP Vora 1.2 image on AWS, configured the cluster, and tried to install Hive through Ambari. The installation fails while installing the HCat service:

2016-04-25 11:02:59,728 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.3.4.0-3485
2016-04-25 11:02:59,728 - Checking if need to create versioned conf dir /etc/hadoop/2.3.4.0-3485/0
2016-04-25 11:02:59,729 - call['conf-select create-conf-dir --package hadoop --stack-version 2.3.4.0-3485 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-04-25 11:02:59,848 - call returned (1, '/etc/hadoop/2.3.4.0-3485/0 exist already', '')
2016-04-25 11:02:59,848 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.3.4.0-3485 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-04-25 11:02:59,969 - checked_call returned (0, '/usr/hdp/2.3.4.0-3485/hadoop/conf -> /etc/hadoop/2.3.4.0-3485/0')
2016-04-25 11:02:59,969 - Ensuring that hadoop has the correct symlink structure
2016-04-25 11:02:59,969 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-04-25 11:02:59,970 - Group['spark'] {}
2016-04-25 11:02:59,971 - Group['hadoop'] {}
2016-04-25 11:02:59,972 - Group['users'] {}
2016-04-25 11:02:59,972 - User['hive'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-04-25 11:02:59,972 - Adding user User['hive']
2016-04-25 11:03:00,907 - User['zookeeper'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-04-25 11:03:00,908 - User['spark'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-04-25 11:03:00,908 - User['ams'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-04-25 11:03:00,909 - User['ambari-qa'] {'gid': 'hadoop', 'groups': [u'users']}
2016-04-25 11:03:00,910 - User['tez'] {'gid': 'hadoop', 'groups': [u'users']}
2016-04-25 11:03:00,910 - Adding user User['tez']
2016-04-25 11:03:01,032 - User['hdfs'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-04-25 11:03:01,033 - User['yarn'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-04-25 11:03:01,034 - User['mapred'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-04-25 11:03:01,034 - User['hcat'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-04-25 11:03:01,034 - Adding user User['hcat']
2016-04-25 11:03:01,293 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-04-25 11:03:01,295 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-04-25 11:03:01,398 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-04-25 11:03:01,398 - Group['hdfs'] {'ignore_failures': False}
2016-04-25 11:03:01,399 - User['hdfs'] {'ignore_failures': False, 'groups': [u'hadoop', u'hdfs']}
2016-04-25 11:03:01,399 - Directory['/etc/hadoop'] {'mode': 0755}
2016-04-25 11:03:01,414 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-04-25 11:03:01,415 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2016-04-25 11:03:01,425 - Repository['HDP-2.3'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.3.4.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-04-25 11:03:01,432 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.3]\nname=HDP-2.3\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.3.4.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-04-25 11:03:01,433 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-04-25 11:03:01,436 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.20]\nname=HDP-UTILS-1.1.0.20\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-04-25 11:03:01,437 - Package['unzip'] {}
2016-04-25 11:03:01,513 - Skipping installation of existing package unzip
2016-04-25 11:03:01,513 - Package['curl'] {}
2016-04-25 11:03:01,523 - Skipping installation of existing package curl
2016-04-25 11:03:01,523 - Package['hdp-select'] {}
2016-04-25 11:03:01,532 - Skipping installation of existing package hdp-select
2016-04-25 11:03:02,227 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.3.4.0-3485
2016-04-25 11:03:02,227 - Checking if need to create versioned conf dir /etc/hadoop/2.3.4.0-3485/0
2016-04-25 11:03:02,227 - call['conf-select create-conf-dir --package hadoop --stack-version 2.3.4.0-3485 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-04-25 11:03:02,346 - call returned (1, '/etc/hadoop/2.3.4.0-3485/0 exist already', '')
2016-04-25 11:03:02,346 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.3.4.0-3485 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-04-25 11:03:02,467 - checked_call returned (0, '/usr/hdp/2.3.4.0-3485/hadoop/conf -> /etc/hadoop/2.3.4.0-3485/0')
2016-04-25 11:03:02,468 - Ensuring that hadoop has the correct symlink structure
2016-04-25 11:03:02,468 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-04-25 11:03:02,597 - Failed to get extracted version with hdp-select
2016-04-25 11:03:02,603 - Package['mysql-connector-java'] {}
2016-04-25 11:03:02,677 - Installing package mysql-connector-java ('/usr/bin/yum -d 0 -e 0 -y install mysql-connector-java')

Python script has been killed due to timeout after waiting 1800 secs

Can you help me fix this? I have tried this image three times, on different instance sizes (micro, medium, and large), and I always get the same error.

yzxexxkh (answer 1)

You can install the required packages manually on the affected node (see the sketch after the list below):

mysql-connector-java
hive_2_3_*
atlas-metadata*
mysql-community-release
mysql-community-server
pig_2_3_*
datafu_2_3_*
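
A minimal sketch of that manual installation, assuming the node uses yum and can reach the HDP-2.3 and HDP-UTILS repositories that the log above shows were configured; the package name patterns are taken from the list above and may need adjusting to your exact stack build:

# Run as root (or via sudo) on the node where the HCat install timed out.
yum clean all
yum install -y mysql-connector-java
yum install -y mysql-community-release mysql-community-server
yum install -y 'hive_2_3_*' 'atlas-metadata*' 'pig_2_3_*' 'datafu_2_3_*'

Once the packages are in place, retry the failed Hive/HCat install from the Ambari UI; the yum step that previously ran past the 1800-second agent timeout should then complete quickly because the packages are already installed.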
