pydoop-hdfs IOException error

jhiyze9q · posted 2021-06-02 in Hadoop

I just want to use hdfs.open in the pyspark shell, but I get the following error:

    16/06/20 16:11:40 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    hdfsBuilderConnect(forceNewInstance=0, nn=xipcc01, port=8020, kerbTicketCachePath=(NULL), userName=(NULL)) error:
    java.io.IOException: No FileSystem for scheme: hdfs
        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2644)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2651)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
        at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:160)
        at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:157)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:157)
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/cloud/anaconda2/lib/python2.7/site-packages/pydoop/hdfs/__init__.py", line 121, in open
        fs = hdfs(host, port, user)
      File "/home/cloud/anaconda2/lib/python2.7/site-packages/pydoop/hdfs/fs.py", line 150, in __init__
        h, p, u, fs = _get_connection_info(host, port, user)
      File "/home/cloud/anaconda2/lib/python2.7/site-packages/pydoop/hdfs/fs.py", line 64, in _get_connection_info
        fs = core_hdfs_fs(host, port, user)
      File "/home/cloud/anaconda2/lib/python2.7/site-packages/pydoop/hdfs/core/__init__.py", line 57, in core_hdfs_fs
        return _CORE_MODULE.CoreHdfsFs(host, port, user)
    RuntimeError: (255, 'Unknown error 255')

Does anyone have an idea? In a plain Python shell I can use the hdfs.open function, but in pyspark it apparently cannot reach the namenode. I don't understand why it works in Python but not in pyspark.

Python 2.7 (Anaconda 4), Spark 1.6.0, Hadoop 2.4 (installed with Ambari)
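For what it's worth, "java.io.IOException: No FileSystem for scheme: hdfs" from libhdfs (which pydoop uses under the hood) usually means the JVM it starts cannot find the hadoop-hdfs jars, i.e. the CLASSPATH seen by the pyspark process differs from the one in your plain Python shell. A minimal sketch of the usual workaround, assuming `hadoop` is on your PATH and `/etc/hadoop/conf` is your Ambari-managed config directory (adjust both for your cluster):

```shell
# Point pydoop/libhdfs at the cluster configuration (core-site.xml, hdfs-site.xml).
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Put the full Hadoop jar set (including hadoop-hdfs) on the classpath that
# libhdfs's embedded JVM will see. On newer Hadoop releases you can add
# `--glob` so the wildcards are expanded to concrete jar paths.
export CLASSPATH=$(hadoop classpath)

# Then start pyspark from the same environment.
pyspark
```

The key point is that both environment variables must be set in the shell that launches pyspark; setting them only in the shell where plain Python works would explain the asymmetry you are seeing.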

No answers yet!

