HiveServer2 process connection failure alert

n3schb8v  posted on 2021-07-15  in Hadoop

When a Sqoop import job puts heavy load on Hive, the following HiveServer2 connection-failure alert appears continuously on the Ambari server:
Connection failed on host master:10000 (Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 204, in execute
    ldap_password=ldap_password)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 84, in check_thrift_port_sasl
    timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
    returns=self.resource.returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 308, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", in _call
    raise ExecuteTimeoutException(err_msg)
ExecuteTimeoutException: Execution of 'ambari-sudo.sh su ambari-qa -l -s /bin/bash -c 'export PATH='"'"'/usr/sbin:/sbin:/usr/lib/ambari-server/:/usr/sbin:/sbin:/usr/lib/ambari-server/:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/bin/:/usr/bin/:/usr/lib/hive/bin/:/usr/sbin/'"'"' ; beeline -n hive -u '"'"'jdbc:hive2://master:10000/;transportMode=binary'"'"' 2>&1 | awk '"'"'{print}'"'"' | grep -i -e '"'"'Connected to:'"'"' -e '"'"'Transaction isolation:'"'"''' was killed due to timing out after 60 seconds)
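For context, the failing command in the exception is Ambari's HiveServer2 liveness probe: it runs beeline against the Thrift port and treats the banner lines "Connected to:" / "Transaction isolation:" as proof of a live connection; if nothing matches within 60 seconds, the alert fires. A minimal sketch of that filtering step, applied to canned beeline output (the connection string and user are taken from the traceback above; the sample output text is hypothetical):

```shell
# The probe (do not run against a loaded cluster; shown for reference only):
#   beeline -n hive -u 'jdbc:hive2://master:10000/;transportMode=binary' \
#     2>&1 | grep -i -e 'Connected to:' -e 'Transaction isolation:'
#
# Canned output resembling a successful beeline session, to show which
# lines the alert's grep accepts as evidence of a working HiveServer2:
sample_output='Connecting to jdbc:hive2://master:10000/
Connected to: Apache Hive
Transaction isolation: TRANSACTION_REPEATABLE_READ'

# Only the two banner lines match; the "Connecting to" line does not,
# so a beeline that hangs before the banner yields no output at all.
printf '%s\n' "$sample_output" \
  | grep -i -e 'Connected to:' -e 'Transaction isolation:'
```

When HiveServer2 is saturated by the Sqoop load, beeline stalls before printing the banner, the pipeline produces no match, and the 60-second kill strategy raises the ExecuteTimeoutException seen above.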

No answers yet!

