FI-WARE Hadoop not running a Hadoop "hello world" -> wordcount

Asked by 9lowa7mx on 2021-06-02, in Hadoop

I am trying to run a Hadoop "hello world", a code 100% based on the standard WordCount, but when I submit it as a job through the REST API, the FIWARE platform returns an error.
The code below runs fine on the private Hadoop cluster I use for testing, but not on the FI-WARE platform, and I do not know why.
The code is as follows:

  package smartive;

  import java.io.IOException;
  import java.util.*;
  import org.apache.hadoop.fs.Path;
  import org.apache.hadoop.conf.*;
  import org.apache.hadoop.io.*;
  import org.apache.hadoop.mapred.*;
  import org.apache.hadoop.util.*;

  public class Hello {

      // Mapper (old mapred API): emits (word, 1) for every token in each input line.
      public static class Map extends MapReduceBase implements Mapper<LongWritable, Text, Text, IntWritable> {
          private final static IntWritable one = new IntWritable(1);
          private Text word = new Text();

          public void map(LongWritable key, Text value, OutputCollector<Text, IntWritable> output, Reporter reporter) throws IOException {
              String line = value.toString();
              StringTokenizer tokenizer = new StringTokenizer(line);
              while (tokenizer.hasMoreTokens()) {
                  word.set(tokenizer.nextToken());
                  output.collect(word, one);
              }
          }
      }

      // Reducer (also used as combiner): sums the counts emitted for each word.
      public static class Reduce extends MapReduceBase implements Reducer<Text, IntWritable, Text, IntWritable> {
          public void reduce(Text key, Iterator<IntWritable> values, OutputCollector<Text, IntWritable> output, Reporter reporter) throws IOException {
              int sum = 0;
              while (values.hasNext()) {
                  sum += values.next().get();
              }
              output.collect(key, new IntWritable(sum));
          }
      }

      public static void main(String[] args) throws Exception {
          JobConf conf = new JobConf(Hello.class);

          conf.setJobName("Hello");
          conf.setOutputKeyClass(Text.class);
          conf.setOutputValueClass(IntWritable.class);
          conf.setMapperClass(Map.class);
          conf.setCombinerClass(Reduce.class);
          conf.setReducerClass(Reduce.class);
          conf.setInputFormat(TextInputFormat.class);
          conf.setOutputFormat(TextOutputFormat.class);

          FileInputFormat.setInputPaths(conf, new Path(args[0]));
          FileOutputFormat.setOutputPath(conf, new Path(args[1]));
          JobClient.runJob(conf);
      }
  }

I compiled the previous code with javac and packaged it into a jar whose entry point is set to smartive.Hello.
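For completeness, this is roughly how I build the jar; a minimal sketch, assuming the source sits at smartive/Hello.java and that the hadoop command is on the PATH to supply the client classpath:

    # Compile against the Hadoop client libraries and set smartive.Hello as the jar entry point
    mkdir -p classes
    javac -classpath "$(hadoop classpath)" -d classes smartive/Hello.java
    jar cfe Hello.jar smartive.Hello -C classes .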
Next, I run the following REST call against cosmos.lab.fiware:

curl -X POST "http://computing.cosmos.lab.fiware.org:12000/tidoop/v1/user/myuser/jobs" -d '{"jar":"Hello.jar","class_name":"Hello","lib_jars":"","input":"data/in","output":"data/out"}' -H "Content-Type: application/json" -H "X-Auth-Token: myOauth2token"

But the result I get is:

{"success":"false","error":1}

The expected response should look like this:

{"success":"true","job_id": "job_1460639183882_0001"}

However, I do not know how to debug this on the new Cosmos, since there is no SSH access and the response carries no error message.
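The only thing I can think of is to poke the REST API itself. A sketch, assuming the Tidoop API also answers GET on the same jobs resource used for submission (I have not confirmed this endpoint):

    # List the jobs registered for my user (assumed GET counterpart of the POST above)
    curl -X GET "http://computing.cosmos.lab.fiware.org:12000/tidoop/v1/user/myuser/jobs" -H "X-Auth-Token: myOauth2token"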
The file (test.txt) inside the data/in folder is fine, and both data and data/in have 777 permissions.
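To rule out a path problem I also list the input folder from outside; a sketch, assuming the storage node exposes HttpFS/WebHDFS at storage.cosmos.lab.fiware.org on port 14000 (adjust host and port if your deployment differs):

    # List data/in through WebHDFS to confirm test.txt is visible to the platform
    curl -X GET "http://storage.cosmos.lab.fiware.org:14000/webhdfs/v1/user/myuser/data/in?op=LISTSTATUS&user.name=myuser" -H "X-Auth-Token: myOauth2token"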
Does anyone know what I am doing wrong?
Thank you very much.
