Hadoop under Cygwin: "Error: Could not find or load main class work\work"

lp0sw83n, posted 2021-06-04 in Hadoop

I am trying to set up a single node with Hadoop 1.2.1 on Windows 7 using Cygwin, following this tutorial. I can create the input directory and copy the .xml files into it without any trouble. The problem seems to be that when I run $ bin/hadoop jar hadoop-examples-*.jar grep input output 'dfs [a-z.]+' it throws "Error: Could not find or load main class work\work" at the command line. I have checked the source code (listed below; it looks like Python) and it does show a main method. I have also tried variations on the original command-line invocation, e.g. $ bin/hadoop jar hadoop-examples-*.jar main input output 'dfs [a-z.]+' and so on.
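For context, the steps up to that point (per the standalone quickstart in the Hadoop 1.x single-node setup guide) look roughly like this; the install directory is just an example:

$ cd hadoop-1.2.1          # example install path
$ mkdir input
$ cp conf/*.xml input
$ bin/hadoop jar hadoop-examples-*.jar grep input output 'dfs [a-z.]+'
$ cat output/*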
My questions are: why is Hadoop not reading this main method? How can I get it to pick the main method up? What is Cygwin telling me when it says "work\work"? And does it matter that the source code is written in Python and compiled into .jar format?

# Jython source: Python syntax driving Hadoop's Java classes (old mapred API).
from org.apache.hadoop.fs import Path
from org.apache.hadoop.io import *
from org.apache.hadoop.mapred import *

import sys
import getopt

class WordCountMap(Mapper, MapReduceBase):
    # Emit (word, 1) for every whitespace-separated token.
    one = IntWritable(1)

    def map(self, key, value, output, reporter):
        for w in value.toString().split():
            output.collect(Text(w), self.one)

class Summer(Reducer, MapReduceBase):
    # Sum the counts for each word (also used as the combiner).
    def reduce(self, key, values, output, reporter):
        sum = 0
        while values.hasNext():
            sum += values.next().get()
        output.collect(key, IntWritable(sum))

def printUsage(code):
    print "wordcount [-m <maps>] [-r <reduces>] <input> <output>"
    sys.exit(code)

def main(args):
    conf = JobConf(WordCountMap)
    conf.setJobName("wordcount")
    conf.setOutputKeyClass(Text)
    conf.setOutputValueClass(IntWritable)
    conf.setMapperClass(WordCountMap)
    conf.setCombinerClass(Summer)
    conf.setReducerClass(Summer)
    try:
        flags, other_args = getopt.getopt(args[1:], "m:r:")
    except getopt.GetoptError:
        printUsage(1)
    if len(other_args) != 2:
        printUsage(1)
    for f, v in flags:
        if f == "-m":
            conf.setNumMapTasks(int(v))
        elif f == "-r":
            conf.setNumReduceTasks(int(v))
    conf.setInputPath(Path(other_args[0]))
    conf.setOutputPath(Path(other_args[1]))
    JobClient.runJob(conf)

if __name__ == "__main__":
    main(sys.argv)
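For comparison only (this is not the code in the jar I am running), here is a minimal sketch of the same word-count driver written directly in Java against the old org.apache.hadoop.mapred API. The point is that the class containing main is what hadoop jar tries to launch, taken either from the jar's manifest or from the first argument after the jar name; the class name WordCount below is purely illustrative.

import java.io.IOException;
import java.util.Iterator;
import java.util.StringTokenizer;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.*;

public class WordCount {

    // Mapper: emit (word, 1) for every token in the input line.
    public static class Map extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private final Text word = new Text();

        public void map(LongWritable key, Text value,
                        OutputCollector<Text, IntWritable> output,
                        Reporter reporter) throws IOException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                output.collect(word, one);
            }
        }
    }

    // Reducer (also used as combiner): sum the counts for each word.
    public static class Reduce extends MapReduceBase
            implements Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values,
                           OutputCollector<Text, IntWritable> output,
                           Reporter reporter) throws IOException {
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }

    // The entry point that "hadoop jar" looks for.
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(WordCount.class);
        conf.setJobName("wordcount");
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);
        conf.setMapperClass(Map.class);
        conf.setCombinerClass(Reduce.class);
        conf.setReducerClass(Reduce.class);
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));
        JobClient.runJob(conf);
    }
}

Packaged into, say, wordcount.jar (a made-up name), it would be launched with $ bin/hadoop jar wordcount.jar WordCount input output.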

No answers yet.
