Hadoop 2.2.0 64-bit installation, but it won't start

vc6uscn9 · posted 2021-06-04 in Hadoop

I am trying to set up a Hadoop 2.2.0 cluster on our servers. All the servers are 64-bit. I downloaded Hadoop 2.2.0 and all the configuration files are in place. When I run ./start-dfs.sh, I get the following error:

13/11/15 14:29:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.namenode]
sed: -e expression #1, char 6: unknown option to `s' have: ssh: Could not resolve hostname have: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
The authenticity of host 'namenode (192.168.1.62)' can't be established.
RSA key fingerprint is 65:f9:aa:7c:8f:fc:74:e4:c7:a2:f5:7f:d2:cd:55:d4.
Are you sure you want to continue connecting (yes/no)? VM: ssh: Could not resolve        hostname VM: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
...

Apart from the 64-bit issue, is anything else wrong? I have already set up passwordless login between the namenode and the datanodes. What do the other errors mean?


pgx2nnw8 1#

I ran into a similar problem and, even after following all of the suggestions above, could not solve it.
I finally realized that the configured hostname and the assigned IP address did not match.
My hostname, configured in /etc/hostname, is vagrant, but that hostname had no IP address assigned in /etc/hosts; /etc/hosts only contained an entry for localhost.
Once I updated /etc/hosts with entries for both localhost and vagrant, all of the problems above were resolved.
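
A minimal /etc/hosts sketch of what that fix looks like (the 192.168.33.10 address is only a placeholder for this example; put the machine's real IP next to its real hostname):

127.0.0.1       localhost
192.168.33.10   vagrant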


zbdgwd5y 2#

You can also export the variables in hadoop-env.sh:

vim /usr/local/hadoop/etc/hadoop/hadoop-env.sh

(/usr/local/hadoop is my Hadoop installation folder.)


# Hadoop variables

export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64 # your jdk install path
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"

o0lyfsai 3#

I think the only real problem here is the same as in this question, so the solution is the same as well:
Stop the JVM from printing the stack guard warning to stdout/stderr, because that is what breaks the HDFS start script.
Do it by replacing this line in your etc/hadoop/hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"

with:

export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"

(This solution was found on Sumit Chawla's blog.)


mbzjlibv 4#

You have three problems:

1) "Unable to load native-hadoop library", as @Nogard said; his answer fixes that one.

2) "The authenticity of host 'namenode (192.168.1.62)' can't be established." appears because SSH authentication is not set up. Do the following:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
scp ~/.ssh/authorized_keys your_install_user@192.168.1.62:/home/your_install_user/.ssh/

3) For "sed: -e expression #1, char 6: unknown option to `s' have: ssh: Could not resolve hostname have: Name or service not known HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known -c:",
try this: edit your .bash_profile or .bashrc and put this into it:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

and run source .bash_profile or source .bashrc to make the change take effect immediately.
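
A quick sanity check that the passwordless login and hostname resolution both work (assuming namenode is the master's hostname, as in the log above; this only verifies the setup, it is not part of the fix):

ssh namenode hostname

If it prints the remote hostname without asking for a password or a host confirmation, the key and hosts setup is correct.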


ecbunoof 5#

Add the following entries to .bashrc, where HADOOP_HOME is your Hadoop folder:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

In addition, execute the following commands:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

eblbsuwk 6#

The root cause is that the native library bundled with Hadoop is built for 32-bit. Solutions:
1) Set some environment variables in .bash_profile; see https://gist.github.com/ruo91/7154697, or
2) Rebuild the Hadoop native library; see http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/nativelibraries.html
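
To confirm which case you are in, you can check the bitness of the bundled library with file (the path below is the one from the question's log; adjust it to your own installation):

file /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0

If the output reports an ELF 32-bit shared object, the library has to be rebuilt (or replaced) to match a 64-bit JVM.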


zed5wv10 7#

The problem is not the native library; note that it is only a warning. Export the Hadoop variables mentioned above and it will work.


pbossiut 8#

Make sure your HADOOP_HOME and HADOOP_PREFIX are set correctly. I had this same problem. Passwordless SSH also needs to be set up correctly.
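
For example, in .bashrc (assuming the /usr/local/hadoop installation folder used in the earlier answers; substitute your own path):

export HADOOP_HOME=/usr/local/hadoop
export HADOOP_PREFIX=$HADOOP_HOME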
