I have Hadoop on Ubuntu 16.10 and everything works fine: I can upload input files to HDFS and run MapReduce jobs. But whenever I reboot the machine, all the HDFS blocks are reported as corrupt and the namenode starts in safe mode, so I have to
1) leave safe mode
2) delete all the corrupt blocks with
hdfs fsck / -delete
3) re-upload the input files
After that it works fine until the next reboot. Can anyone suggest a solution? Thanks.
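For reference, the recovery loop above as shell commands (the local input/ directory and the HDFS target path are placeholders for your own files):

hdfs dfsadmin -safemode leave            # 1) leave safe mode
hdfs fsck / -delete                      # 2) delete corrupt blocks; fsck needs a target path
hdfs dfs -put input/ /user/hduser/input  # 3) re-upload the input files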
2 answers

4bbkushb1#
I solved my problem. I used this link to check my configuration files: http://www.bogotobogo.com/hadoop/bigdata_hadoop_install_on_ubuntu_single_node_cluster.php
I had forgotten to run
sudo chown -R hduser:hadoop /usr/local/hadoop_tmp
on my HDFS directory.
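To verify the fix, something along these lines should do (assuming the single-node layout from the linked guide, where hadoop.tmp.dir points at /usr/local/hadoop_tmp):

# confirm which local directory HDFS stores its data in
grep -A1 hadoop.tmp.dir $HADOOP_HOME/etc/hadoop/core-site.xml
# the owner must be the user that runs the Hadoop daemons
ls -ld /usr/local/hadoop_tmp    # should report hduser:hadoop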
6fe3ivhb2#
Create a folder such as /dfs/ on your machine.
Open hdfs-site.xml (hdfs-default.xml documents the default value) and set the property "dfs.namenode.name.dir".
Example:
<property>
  <name>dfs.namenode.name.dir</name>
  <value>/dfs/</value>
  <description>Determines where on the local filesystem the DFS name node
    should store the name table (fsimage). If this is a comma-delimited list
    of directories then the name table is replicated in all of the
    directories, for redundancy.</description>
</property>
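This helps because dfs.namenode.name.dir defaults to file://${hadoop.tmp.dir}/dfs/name, and hadoop.tmp.dir defaults to a directory under /tmp, which Ubuntu clears on reboot; that matches the symptom of the metadata vanishing after every restart. A minimal sketch of applying the change, assuming the hduser:hadoop owner from the first answer (note that dfs.datanode.data.dir, where the blocks themselves live, has the same /tmp default and may need the same treatment):

# create the new metadata directory outside /tmp and hand it to the HDFS user
sudo mkdir -p /dfs
sudo chown -R hduser:hadoop /dfs
# add the <property> block above to hdfs-site.xml, then initialize the new
# directory (warning: formatting erases any existing HDFS metadata)
hdfs namenode -format
# restart the daemons so the namenode picks up the new location
stop-dfs.sh && start-dfs.sh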