Error when starting the namenode

ovfsdjhp · posted 2021-06-04 in Hadoop
Follow (0) | Answers (4) | Views (864)

When I try to start Hadoop on the master node, I get the following output.

  [hduser@dellnode1 ~]$ start-dfs.sh
  starting namenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-namenode-dellnode1.library.out
  dellnode1.library: datanode running as process 5123. Stop it first.
  dellnode3.library: datanode running as process 4072. Stop it first.
  dellnode2.library: datanode running as process 4670. Stop it first.
  dellnode1.library: secondarynamenode running as process 5234. Stop it first.
  [hduser@dellnode1 ~]$ jps
  5696 Jps
  5123 DataNode
  5234 SecondaryNameNode

bxpogfeg1#

On a Mac (if installed via Homebrew), where 3.0.0 is the Hadoop version. On Linux, change the installation path accordingly (only the /usr/local/Cellar/ part changes).

  > /usr/local/Cellar/hadoop/3.0.0/sbin/stop-yarn.sh
  > /usr/local/Cellar/hadoop/3.0.0/sbin/stop-dfs.sh
  > /usr/local/Cellar/hadoop/3.0.0/sbin/stop-all.sh

Better for power users: add this alias at the end of your ~/.bashrc or ~/.zshrc (if you are a zsh user). Then just type hstop on the command line whenever you want to stop Hadoop and all related processes.

  alias hstop="/usr/local/Cellar/hadoop/3.0.0/sbin/stop-yarn.sh;/usr/local/Cellar/hadoop/3.0.0/sbin/stop-dfs.sh;/usr/local/Cellar/hadoop/3.0.0/sbin/stop-all.sh"

vwoqyblh2#

Today, while executing a Pig script, I ran into the same error mentioned in the question:

  starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-namenode-localhost.localdomain.out
  localhost: /home/training/.bashrc: line 10: /jdk1.7.0_10/bin: No such file or directory
  localhost: Warning: $HADOOP_HOME is deprecated.
  localhost:
  localhost: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-datanode-localhost.localdomain.out
  localhost: /home/training/.bashrc: line 10: /jdk1.7.0_10/bin: No such file or directory
  localhost: Warning: $HADOOP_HOME is deprecated.
  localhost:
  localhost: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-secondarynamenode-localhost.localdomain.out
  starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-jobtracker-localhost.localdomain.out
  localhost: /home/training/.bashrc: line 10: /jdk1.7.0_10/bin: No such file or directory
  localhost: Warning: $HADOOP_HOME is deprecated.
  localhost:
  localhost: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-tasktracker-localhost.localdomain.out

So, the answer is:

  [training@localhost bin]$ stop-all.sh

Then type:

  [training@localhost bin]$ start-all.sh

That fixed the problem. Now the Pig script runs with MapReduce!
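The stop/start cycle can be verified with jps, as the question itself does. Here is a minimal sketch of that check (the daemon_running helper is ours, and the sample listing mimics the question's jps output, where the NameNode is missing):

```shell
# Sample jps-style listing, copied from the question's output:
# the NameNode daemon is absent while DataNode and SecondaryNameNode run.
jps_output="5696 Jps
5123 DataNode
5234 SecondaryNameNode"

daemon_running() {
  # -w matches whole words only, so "NameNode" does not
  # accidentally match inside "SecondaryNameNode".
  echo "$jps_output" | grep -qw "$1"
}

if daemon_running NameNode; then
  echo "NameNode is up"
else
  echo "NameNode missing - run stop-all.sh, then start-all.sh"
fi
```

With the listing above, the check reports the NameNode as missing, which is exactly the situation where a full stop/start cycle helps.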


a11xaf1n3#

On newer versions of Hadoop, running stop-all.sh prints a deprecation notice. You should use stop-dfs.sh and stop-yarn.sh instead.


pcww981p4#

The message says "Stop it first", so:

Call stop-all.sh first.
Run jps.
Call start-all.sh (or start-dfs.sh and start-mapred.sh).
Run jps again (if the namenode does not show up, run "hadoop namenode" and check for errors).
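The "Stop it first" refusal comes from the start scripts finding a PID file whose recorded process is still alive. A minimal helper for inspecting such a file (the function name is ours; on Hadoop 1.x the PID files live under /tmp by default, configurable via HADOOP_PID_DIR):

```shell
# Report whether the process recorded in a Hadoop daemon PID file is
# still alive ("running"), gone ("stale"), or was never written ("missing").
pidfile_status() {
  local f="$1"
  [ -e "$f" ] || { echo "missing"; return; }
  local pid
  pid=$(cat "$f")
  if kill -0 "$pid" 2>/dev/null; then
    echo "running ($pid)"     # this is what triggers "Stop it first"
  else
    echo "stale ($pid)"       # safe to delete the file and start the daemon
  fi
}
```

A stale file can simply be removed; a live PID means the daemon really is running and must be stopped (e.g. with stop-all.sh) before start-dfs.sh will launch it again.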
