Syntax error near unexpected token `<' in hadoop-functions.sh?

nbysray5 · asked 2021-05-27 · in Hadoop

I have Hadoop 3.1.1 configured on a Mac Pro running OS X 10.14.2. When I run start-all.sh, I get the following errors:

$ sudo /usr/local/Cellar/hadoop/3.1.1/sbin/start-all.sh
Starting namenodes on [localhost]
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-functions.sh: line 398: syntax error near unexpected token `<'
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-functions.sh: line 398: `  done < <(for text in "${input[@]}"; do'
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 70: hadoop_deprecate_envvar: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 87: hadoop_bootstrap: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 104: hadoop_parse_args: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 105: shift: : numeric argument required
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 244: hadoop_need_reexec: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 252: hadoop_verify_user_perm: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/hdfs: line 213: hadoop_validate_classname: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/hdfs: line 214: hadoop_exit_with_usage: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 263: hadoop_add_client_opts: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 270: hadoop_subcommand_opts: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 273: hadoop_generic_java_subcmd_handler: command not found

The same problem occurs when starting the datanodes, secondary namenodes, resourcemanager, and nodemanagers.
I found a reference to a similar error online: https://issues.apache.org/jira/browse/hdfs-12571.

Update after some debugging: the root cause is that the bash process-substitution syntax `< <(command)` is not being accepted for some reason. The bash versions on the system (both /bin/bash and the Homebrew bash at /usr/local/bin/bash) handle it fine.
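The failing construct at line 398 can be reproduced in isolation. Below is a minimal sketch (variable and value names are made up for illustration): process substitution `< <(...)` is a bash-only feature, so the same loop that works under bash raises exactly this syntax error when the script is interpreted by a POSIX sh.

```shell
#!/usr/bin/env bash
# Minimal reproduction of the hadoop-functions.sh loop shape.
# Under bash this prints each element; run the same file with
# `sh repro.sh` instead and you get:
#   syntax error near unexpected token `<'
input=("alpha" "beta")
while read -r line; do
  echo "got: $line"
done < <(for text in "${input[@]}"; do
  echo "$text"
done)
```

This is why checking `bash --version` interactively can look fine while the script still fails: what matters is which interpreter actually executes the script (for example, the shebang line, or a wrapper invoking it via `sh`).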


bvhaajcl · answer 1

Perhaps you should change HDFS_NAMENODE_USER, HDFS_DATANODE_USER, and so on in hadoop-env.sh to the current user instead of root. Then, after running the sudo ./start-all.sh command, you may need to run hdfs namenode -format.
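As a sketch of that suggestion, the relevant exports in etc/hadoop/hadoop-env.sh might look like the following (using the current login user; the exact set of variables you need depends on which daemons you start):

```shell
# Hypothetical hadoop-env.sh fragment: run the HDFS and YARN daemons
# as the current user rather than root.
export HDFS_NAMENODE_USER="$(whoami)"
export HDFS_DATANODE_USER="$(whoami)"
export HDFS_SECONDARYNAMENODE_USER="$(whoami)"
export YARN_RESOURCEMANAGER_USER="$(whoami)"
export YARN_NODEMANAGER_USER="$(whoami)"
```

With these set, start-all.sh can typically be run without sudo, which also avoids root-owned files in the HDFS data directories.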
