Apache Pig not connecting to HDFS

tv6aics1 · posted 2021-05-29 in Hadoop

I have Hadoop version 2.6.3 and pig-0.6.0, and all the daemons are running on a single-node cluster. After launching the pig command, Pig connects only to file:/// and not to HDFS. Can you tell me how to connect to HDFS? Below is the INFO log I can see:

    2016-01-10 20:58:30,431 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
    2016-01-10 20:58:30,650 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=

When I run the following command in the grunt shell:

    grunt> ls hdfs://localhost:54310/
    2016-01-10 21:05:41,059 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2999: Unexpected internal error. Wrong FS: hdfs://localhost:54310/, expected: file:///
    Details at logfile: /home/hguna/pig_1452488310172.log

I have no idea why it expects file:///

    ERROR 2999: Unexpected internal error. Wrong FS: hdfs://localhost:54310/, expected: file:///
    java.lang.IllegalArgumentException: Wrong FS: hdfs://localhost:54310/, expected: file:///
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:305)
        at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:357)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:643)
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.isContainer(HDataStorage.java:203)
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.asElement(HDataStorage.java:131)
        at org.apache.pig.tools.grunt.GruntParser.processLS(GruntParser.java:576)
        at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:304)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:168)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:144)
        at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:75)
        at org.apache.pig.Main.main(Main.java:352)

Have I configured Hadoop correctly, or have I gone wrong somewhere? Please let me know if there are any files I need to share. I have done a fair amount of research but could not fix it. By the way, I am new to Hadoop and Pig, so please help me. Thanks.

cuxqih21

Check the configuration in hadoop-site.xml, core-site.xml and mapred-site.xml: Pig falls back to the local file system (file:///) when it cannot find the cluster configuration on its classpath. Use PIG_CLASSPATH to specify additional classpath entries, for example to add the Hadoop configuration files (hadoop-site.xml, core-site.xml) to the classpath:

    export PIG_CLASSPATH=<path_to_hadoop_conf_dir>
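
A quick way to confirm that the configuration directory you point PIG_CLASSPATH at really names the HDFS NameNode is to ask Hadoop itself. This is only a sketch for the single-node setup from the question; the property names are the standard Hadoop ones, and the expected value is the URI shown in the question's error:

    # Should print the HDFS URI from core-site.xml, not file:///
    # (hdfs getconf ships with Hadoop 2.x)
    hdfs getconf -confKey fs.defaultFS
    # expected for this setup: hdfs://localhost:54310

    # core-site.xml in that directory should therefore contain roughly:
    #   <property>
    #     <name>fs.defaultFS</name>   <!-- fs.default.name on older Hadoop releases -->
    #     <value>hdfs://localhost:54310</value>
    #   </property>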

You should override the default classpath entries by setting PIG_USER_CLASSPATH_FIRST:

    export PIG_USER_CLASSPATH_FIRST=true

After that you can start the grunt shell.
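
Putting it together, a session on the single-node setup from the question might look like the sketch below; the configuration path is illustrative only and should be replaced with wherever your Hadoop 2.6.3 configuration files actually live:

    # Example path only -- use your real Hadoop configuration directory
    export PIG_CLASSPATH=/usr/local/hadoop/etc/hadoop
    export PIG_USER_CLASSPATH_FIRST=true

    # Start the grunt shell; the startup log should now report
    # "Connecting to hadoop file system at: hdfs://localhost:54310" instead of file:///
    pig

    # Inside grunt, listings should now resolve against HDFS:
    #   grunt> ls /
    #   grunt> ls hdfs://localhost:54310/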
