I am hitting the following exception when running some jobs in Hadoop, even though HDFS shows that it has space available. Any information about this error would be helpful.
java.lang.RuntimeException: org.apache.hadoop.fs.FSError: java.io.IOException: No space left on device
at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:270)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:506)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:447)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: org.apache.hadoop.fs.FSError: java.io.IOException: No space left on device
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.write(RawLocalFileSystem.java:220)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:109)
at org.apache.hadoop.fs.FSDataOutputStream$Positio
2 Answers
hsgswve41#
Even if HDFS as a whole has space, the disks on some individual nodes may be full. You can check this from the DFS web UI by selecting the live datanodes view, which lists capacity and remaining space per node.
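If you would rather script that check than click through the web UI, below is a minimal sketch using the HDFS client API. It assumes the cluster configuration (core-site.xml / hdfs-site.xml) is on the classpath and that the default filesystem is HDFS; the class name DatanodeSpaceReport is just for illustration.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

// Hypothetical helper: prints capacity and remaining space for each datanode.
public class DatanodeSpaceReport {
    public static void main(String[] args) throws Exception {
        // Picks up the NameNode address from the config files on the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        if (!(fs instanceof DistributedFileSystem)) {
            System.err.println("Default filesystem is not HDFS: " + fs.getUri());
            return;
        }
        DistributedFileSystem dfs = (DistributedFileSystem) fs;
        // One DatanodeInfo per datanode known to the NameNode.
        for (DatanodeInfo node : dfs.getDataNodeStats()) {
            System.out.printf("%-30s capacity=%8.1f GB  remaining=%8.1f GB%n",
                    node.getHostName(),
                    node.getCapacity() / 1e9,
                    node.getRemaining() / 1e9);
        }
    }
}
```

If you have shell access to the cluster, roughly the same per-node numbers are also printed by `hadoop dfsadmin -report`.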
zf2sa74q2#
Possibly your logs are not being rotated. Check the /var/log/ directory, which contains the hadoop* directories; it may be completely full, leaving no space to write new log files.
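To confirm whether the log partition is the culprit, here is a small sketch that reports free space on the partition holding /var/log and the size of each hadoop* log directory. The /var/log path is an assumption; point it at wherever your Hadoop daemons actually write their logs.

```java
import java.io.File;
import java.io.FilenameFilter;

// Hypothetical helper: free space on the log partition plus size of hadoop* log dirs.
public class LogDiskCheck {
    public static void main(String[] args) {
        // /var/log is an assumption -- adjust to your actual Hadoop log location.
        File logRoot = new File("/var/log");

        System.out.printf("%s partition: %.1f GB free of %.1f GB%n",
                logRoot, logRoot.getUsableSpace() / 1e9, logRoot.getTotalSpace() / 1e9);

        File[] hadoopDirs = logRoot.listFiles(new FilenameFilter() {
            public boolean accept(File dir, String name) {
                return name.startsWith("hadoop");
            }
        });
        if (hadoopDirs != null) {
            for (File dir : hadoopDirs) {
                System.out.printf("%s: %.1f GB%n", dir, dirSize(dir) / 1e9);
            }
        }
    }

    // Recursively sums the sizes of all regular files under f.
    private static long dirSize(File f) {
        if (f.isFile()) {
            return f.length();
        }
        long total = 0;
        File[] children = f.listFiles();
        if (children != null) {
            for (File c : children) {
                total += dirSize(c);
            }
        }
        return total;
    }
}
```

If the partition really is full, removing or compressing old logs and enabling rotation (for example via Hadoop's log4j rolling appender or the system's logrotate) should free up space.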