I just want to use hdfs.open in the pyspark shell, but I get the following error:
16/06/20 16:11:40 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
hdfsBuilderConnect(forceNewInstance=0, nn=xipcc01, port=8020, kerbTicketCachePath=(NULL), userName=(NULL)) error:
java.io.IOException: No FileSystem for scheme: hdfs
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2644)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2651)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:160)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:157)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:157)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/cloud/anaconda2/lib/python2.7/site-packages/pydoop/hdfs/__init_ _.py", line 121, in open
fs = hdfs(host, port, user)
File "/home/cloud/anaconda2/lib/python2.7/site-packages/pydoop/hdfs/fs.py", line 150, in __init__
h, p, u, fs = _get_connection_info(host, port, user)
File "/home/cloud/anaconda2/lib/python2.7/site-packages/pydoop/hdfs/fs.py", line 64, in _get_connection_info
fs = core_hdfs_fs(host, port, user)
File "/home/cloud/anaconda2/lib/python2.7/site-packages/pydoop/hdfs/core/__ init__.py", line 57, in core_hdfs_fs
return _CORE_MODULE.CoreHdfsFs(host, port, user)
RuntimeError: (255, 'Unknown error 255')
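
For context, this is roughly what I type in the pyspark shell to trigger the traceback above (a minimal sketch; the file path is a placeholder, and the namenode host/port come from my Hadoop config, as shown in the hdfsBuilderConnect line):

    import pydoop.hdfs as hdfs

    # Fails in the pyspark shell with RuntimeError: (255, 'Unknown error 255'),
    # but the same call works in a plain Python shell.
    # "/user/cloud/some_file.txt" is just a placeholder path.
    with hdfs.open("/user/cloud/some_file.txt") as f:
        print f.read()
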
Does anyone have an idea? In plain Python I can use the hdfs.open function; is it that I cannot reach the namenode from pyspark? I don't understand why it works in Python but not in pyspark.
Python 2.7 (Anaconda 4), Spark 1.6.0, Hadoop 2.4 (installed with Ambari).
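
Since the error is "No FileSystem for scheme: hdfs", my guess is that the JVM started by libhdfs inside the pyspark shell does not see the jar that provides the hdfs:// FileSystem class, while the plain Python shell does. A quick check I can run in both shells to compare the environments (a diagnostic sketch; these are standard Hadoop/Spark environment variables, nothing specific to my setup):

    import os

    # Compare these between the plain Python shell and the pyspark shell.
    # "No FileSystem for scheme: hdfs" usually means the JVM that libhdfs
    # starts cannot find the hadoop-hdfs jar on its classpath.
    for var in ("CLASSPATH", "HADOOP_CONF_DIR", "HADOOP_HOME", "SPARK_DIST_CLASSPATH"):
        print var, "=", os.environ.get(var)
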