Type mismatch in key from map even though I set the MapOutputKey class

vsikbqxv · posted 2021-06-04 in Hadoop

Even though I set the map output key class and the map output value class, I still get a type mismatch from the map. Here is my sample code.

public class NgramCount{
  protected final static String RAWCOUNTDIR = "raw-counts";
  public static class countMap extends Mapper<LongWritable, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable (1) ;

    public void map(LongWritable key, Text value, OutputCollector<Text, IntWritable> output, Reporter reporter) 
    throws IOException {
        String line = value.toString();
        output.collect(value,one);
    }
  }
  public static class countReduce extends Reducer<Text, IntWritable, Text, IntWritable> {
    public void reduce(Text key, Iterator<IntWritable> values, OutputCollector<Text, IntWritable> output, Reporter reporter)
    throws IOException{
        int sum = 0;
        while(values.hasNext()) {
            sum += values.next().get();
        }
        output.collect(key, new IntWritable(sum));
    }
  }

  public static void main(String[] args) throws Exception
  {
    Configuration conf = new Configuration();
    Job jobA = new Job(conf);
    jobA.setJarByClass(NgramCount.class);
    jobA.setOutputKeyClass(Text.class);
    // the values are counts(ints)
    jobA.setOutputValueClass(IntWritable.class);
    jobA.setMapperClass(NgramCount.countMap.class);
    jobA.setReducerClass(NgramCount.countReduce.class);

    jobA.setMapOutputKeyClass(Text.class);
    jobA.setMapOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(jobA, new Path(args[0]));
    FileOutputFormat.setOutputPath(jobA, new Path(RAWCOUNTDIR));
    jobA.waitForCompletion(true);
    System.out.println("Job1 finished.");
  }
}

This is the error I get.

15/01/17 14:16:21 INFO mapreduce.Job: Task Id : attempt_1421481783919_0005_m_000000_0, Status : FAILED
Error: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, received org.apache.hadoop.io.LongWritable
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1050)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:691)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.hadoop.mapreduce.Mapper.map(Mapper.java:124)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)

Please help me, I'm stuck here.

m0rkklqb 1#

Your mapper code is not right: you should be writing out the Text key (word), but it is never written.

private final static IntWritable one = new IntWritable(1);
private Text word = new Text();

public void map(LongWritable key, Text value, OutputCollector<Text, IntWritable> output, Reporter reporter)
throws IOException {
    String line = value.toString();
    word.set(line);
    // collect the Text key that was just built, not the raw input value
    output.collect(word, one);
}
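
For comparison, here is a minimal sketch of the same mapper written against the new org.apache.hadoop.mapreduce API, which is where the Mapper and Job classes used in the question come from (the standalone class name CountMap is only for illustration). The point of the sketch is the method signature: in the new API the method to override is map(LongWritable, Text, Context); a map method that takes an OutputCollector and a Reporter never overrides it, so the default identity map runs and emits the LongWritable input key, which matches the error reported above.

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class CountMap extends Mapper<LongWritable, Text, Text, IntWritable> {
    private final static IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    // The new API passes a Context instead of an OutputCollector/Reporter.
    // @Override makes the compiler complain if the signature does not match.
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        word.set(value.toString());   // emit the whole line as the Text key, as in the posted code
        context.write(word, ONE);
    }
}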
h7wcgrx3 2#

You are confusing the key types in your map and reduce functions: for example, when you write output.collect(value, one) you are telling the mapper to emit output with the (LongWritable key, Text value) signature, but value is of type Text, not LongWritable.
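
The posted reducer mixes the two APIs in the same way: it extends the new org.apache.hadoop.mapreduce.Reducer but declares reduce with an Iterator and an OutputCollector, so that method is never called either. A minimal sketch of the new-API form, again with an illustrative class name, could look like this:

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class CountReduce extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    // The new API hands the values in as an Iterable and writes through a Context.
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable v : values) {
            sum += v.get();
        }
        result.set(sum);
        context.write(key, result);
    }
}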
