Can't write to HDFS: WARN hdfs.DataStreamer - Unexpected EOF

kqlmhetl · posted 2021-05-27 · in Hadoop

I was following a tutorial, and while running on a single-node test cluster I suddenly could no longer run any MR jobs or write any data to HDFS. It worked fine before, but now I keep getting the error below (restarting did not help).
I can read and delete files on HDFS, but I cannot write to it.

  $ hdfs dfs -put war-and-peace.txt /user/hands-on/
  19/03/25 18:28:29 WARN hdfs.DataStreamer: Exception for BP-1098838250-127.0.0.1-1516469292616:blk_1073742374_1550
  java.io.EOFException: Unexpected EOF while trying to read response from server
          at org.apache.hadoop.hdfs.protocolPB.PBHelperClient.vintPrefixed(PBHelperClient.java:399)
          at org.apache.hadoop.hdfs.protocol.datatransfer.PipelineAck.readFields(PipelineAck.java:213)
          at org.apache.hadoop.hdfs.DataStreamer$ResponseProcessor.run(DataStreamer.java:1020)
  put: All datanodes [DatanodeInfoWithStorage[127.0.0.1:50010,DS-b90326de-a499-4a43-a66a-cc3da83ea966,DISK]] are bad. Aborting...

"hdfs dfsadmin -report" shows everything is fine and there is plenty of disk space. I am hardly running any workload at all, just a few tests with some test data.

  $ hdfs dfsadmin -report
  Configured Capacity: 52710469632 (49.09 GB)
  Present Capacity: 43335585007 (40.36 GB)
  DFS Remaining: 43334025216 (40.36 GB)
  DFS Used: 1559791 (1.49 MB)
  DFS Used%: 0.00%
  Under replicated blocks: 0
  Blocks with corrupt replicas: 0
  Missing blocks: 0
  Missing blocks (with replication factor 1): 0
  Pending deletion blocks: 0
  -------------------------------------------------
  Live datanodes (1):
  Name: 127.0.0.1:50010 (localhost)
  Hostname: localhost
  Decommission Status : Normal
  Configured Capacity: 52710469632 (49.09 GB)
  DFS Used: 1559791 (1.49 MB)
  Non DFS Used: 6690530065 (6.23 GB)
  DFS Remaining: 43334025216 (40.36 GB)
  DFS Used%: 0.00%
  DFS Remaining%: 82.21%
  Configured Cache Capacity: 0 (0 B)
  Cache Used: 0 (0 B)
  Cache Remaining: 0 (0 B)
  Cache Used%: 100.00%
  Cache Remaining%: 0.00%
  Xceivers: 2
  Last contact: Mon Mar 25 18:30:45 EDT 2019

Also, the NameNode web UI (port 50070) shows everything is healthy, and the logs don't report any errors either. What could it be, and how do I troubleshoot this properly?
CentOS Linux 6.9 minimal, Apache Hadoop 2.8.1
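Since the question asks how to troubleshoot this properly, here is a hedged sketch of the usual first checks for an "All datanodes ... are bad" failure on a single-node cluster. The log path and default port assume a stock Apache Hadoop 2.x install under $HADOOP_HOME; adjust both for your environment. This is a diagnostic checklist, not a fix.

```shell
#!/bin/sh
# Troubleshooting sketch for a DataNode write-pipeline failure.
# Assumes a default single-node Apache Hadoop 2.x layout; adjust paths/ports.

# Bail out cleanly if Hadoop is not on the PATH (e.g. wrong machine).
if ! command -v hdfs >/dev/null 2>&1; then
    echo "hdfs not found on PATH; run this on the cluster node"
    exit 0
fi

# 1. The client stack trace only says the DataNode closed the connection;
#    the actual reason is usually in the DataNode's own log.
tail -n 100 "${HADOOP_HOME:?set HADOOP_HOME first}"/logs/hadoop-*-datanode-*.log

# 2. Check the open-file limit for the user running the DataNode; a low
#    ulimit is a common cause of write pipelines dying while the node
#    still looks healthy in dfsadmin -report.
ulimit -n

# 3. Verify block health from the NameNode's point of view.
hdfs fsck / -files -blocks

# 4. Confirm the DataNode is actually listening on its data-transfer
#    port (50010 by default in Hadoop 2.x).
netstat -tln | grep 50010
```

If the DataNode log shows "Too many open files" or xceiver-limit errors, raising the nofile ulimit for the Hadoop user and restarting the DataNode is the usual next step.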

No answers yet.
