I am trying to install Hadoop 2.2.0 on my computer (OS X 10.9.3).
I have configured the relevant files, and when I type hadoop version
I get:
Hadoop 2.2.0
Subversion https://svn.apache.org/repos/asf/hadoop/common -r 1529768
Compiled by hortonmu on 2013-10-07T06:28Z
Compiled with protoc 2.5.0
From source with checksum 79e53ce7994d1628b240f09af91e1af4
This command was run using /Users/hadoop/Downloads/hadoop/share/hadoop/common/hadoop-common-2.2.0.jar
So I believe this means Hadoop was installed successfully.
However, when I try to run
$HADOOP_PREFIX/bin/hadoop jar $HADOOP_PREFIX/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar \
    org.apache.hadoop.yarn.applications.distributedshell.Client \
    --jar $HADOOP_PREFIX/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.2.0.jar \
    --shell_command date \
    --num_containers 2 \
    --master_memory 1024
(In case you are wondering, the command above means: "With this command we tell Hadoop to run the Client class in hadoop-yarn-applications-distributedshell-2.2.0.jar, passing it the jar containing the ApplicationMaster definition (the same jar), the shell command to run on each host (date), the number of containers to spawn (2), and the amount of memory for the ApplicationMaster (1024 MB)." The value 1024 was arrived at by trial and error: I reran the program until it stopped failing because the ApplicationMaster used more memory than was allocated to it.)
I get the following error:
ERROR hdfs.DFSClient: Failed to close file /user/hadoop/DistributedShell/1/AppMaster.jar
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/hadoop/DistributedShell/1/AppMaster.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
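The message "There are 0 datanode(s) running" suggests HDFS has no live DataNode, so the client cannot write AppMaster.jar anywhere. A minimal way to check and (re)start the HDFS daemons, assuming a standard single-node setup where the sbin scripts live under $HADOOP_PREFIX, is sketched below; the exact script names are the stock Hadoop 2.2.0 ones, but paths may differ on your install:

```shell
# List the running Hadoop JVMs; a healthy single-node HDFS shows
# both a NameNode and a DataNode process.
jps

# If the DataNode is missing, try starting the HDFS daemons.
$HADOOP_PREFIX/sbin/start-dfs.sh

# Then re-check which daemons came up, and consult the DataNode
# log under $HADOOP_PREFIX/logs/ if it still fails to start
# (a common cause is a namespace-ID mismatch after reformatting
# the NameNode, fixed by clearing the DataNode's data directory).
jps
```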
Does anyone know how to fix this error?