Runtime error in a max-temperature MapReduce Java program

lxkprmvk · posted on 2021-05-29 in Hadoop
Follow (0) | Answers (4) | Views (361)

I am running a MapReduce job and I get the following error:

Error: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.IntWritable
        at test.temp$Mymapper.map(temp.java:1)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

The code is as follows:

package test;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
//import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class temp {
    public static class Mymapper extends Mapper<Object, Text, IntWritable,Text> {

        public void map(Object key, Text value,Context context) throws IOException, InterruptedException{

            int month=Integer.parseInt(value.toString().substring(17, 19));
            IntWritable mon=new IntWritable(month);
            String temp=value.toString().substring(27,31);
            String t=null;
            for(int i=0;i<temp.length();i++){
                if(temp.charAt(i)==',')
                        break;

                else
                    t=t+temp.charAt(i);
            }

            Text data=new Text(value.toString().substring(22, 26)+t);
            context.write(mon, data);
        }

    }

    public static class Myreducer extends  Reducer<IntWritable,Text,IntWritable,IntWritable> {

        public void reduce(IntWritable key,Iterable<Text> values,Context context) throws IOException, InterruptedException{
            String temp="";
            int max=0;
            for(Text t:values)
            {
                temp=t.toString();
                if(temp.substring(0, 4)=="TMAX"){

                    if(Integer.parseInt(temp.substring(4,temp.length()))>max){
                        max=Integer.parseInt(temp.substring(4,temp.length()));
                    }
                }
            }

            context.write(key,new IntWritable(max));
        }

        }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "temp");
        job.setJarByClass(temp.class);
        job.setMapperClass(Mymapper.class);
        job.setCombinerClass(Myreducer.class);
        job.setReducerClass(Myreducer.class);
        job.setOutputKeyClass(IntWritable.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        job.waitForCompletion(true);

        }

}

The input file is:
USC0037919000101,tmax,-78,,,6,USC0037919000101,tmax,-133,,,6,USC0037919000101,tmax,127,,,6
Please reply and help!


kh212irz1#

When you set the following in your driver:

job.setOutputKeyClass(IntWritable.class);
job.setOutputValueClass(IntWritable.class);

it defines the output classes for both the mapper and the reducer, not just for the reducer.
This means your mapper is expected to context.write(IntWritable, IntWritable), but you have coded context.write(IntWritable, Text).
Fix: when the map output types differ from the reduce output types, you have to set the mapper's output types explicitly. So add the following to your driver code:

job.setMapOutputKeyClass(IntWritable.class);
job.setMapOutputValueClass(Text.class);

cbjzeqam2#

You have set the following in your driver:

job.setOutputKeyClass(IntWritable.class);
job.setOutputValueClass(IntWritable.class);

This means that both the mapper's and the reducer's output key class should be IntWritable, and the output value class should be IntWritable as well.
The reducer is fine:

public static class Myreducer extends  Reducer<IntWritable,Text,IntWritable,IntWritable>

Here both the output key and the output value are IntWritable.
The mapper is the problem:

public static class Mymapper extends Mapper<Object, Text, IntWritable,Text>

Here the output key class is IntWritable, but the output value class is Text (IntWritable was expected).
If the mapper's output key/value classes differ from the reducer's output key/value classes, you have to add the following calls to the driver explicitly:

setMapOutputKeyClass();
setMapOutputValueClass();

Make the following changes to your code:
Set the map output key and value classes: in your case, since the mapper's and the reducer's output key and value classes differ, you need to set the following:

job.setMapOutputKeyClass(IntWritable.class);
job.setMapOutputValueClass(Text.class);

job.setOutputKeyClass(IntWritable.class);
job.setOutputValueClass(IntWritable.class);

Disable the combiner: since you are using your Reducer as the Combiner, the Combiner's output is IntWritable and IntWritable. The Reducer, however, expects its input to be IntWritable and Text. You would therefore get the following exception, because the value is an IntWritable instead of a Text:

Error: java.io.IOException: wrong value class: class org.apache.hadoop.io.IntWritable is not class org.apache.hadoop.io.Text

To get rid of this error, you need to disable the Combiner by removing this line:

job.setCombinerClass(Myreducer.class);

Do not use the reducer as the combiner: if you really do need a combiner, write a separate one whose output key/value types are IntWritable and Text.
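
A minimal sketch of what such a combiner could look like, assuming the values keep the "TMAX<value>" text format the mapper emits (the class name Mycombiner and the pass-through of non-TMAX records are my own illustration, not part of the original answers):

public static class Mycombiner extends Reducer<IntWritable,Text,IntWritable,Text> {

    @Override
    public void reduce(IntWritable key, Iterable<Text> values, Context context) throws IOException, InterruptedException {
        boolean seen = false;
        int max = Integer.MIN_VALUE;
        for (Text t : values) {
            String s = t.toString();
            if (s.startsWith("TMAX")) {
                // keep only the local maximum among the TMAX records for this key
                int v = Integer.parseInt(s.substring(4));
                if (!seen || v > max) {
                    max = v;
                    seen = true;
                }
            } else {
                // forward anything that is not a TMAX record unchanged
                context.write(key, t);
            }
        }
        if (seen) {
            // re-emit the local maximum in the same "TMAX<value>" text format the reducer parses
            context.write(key, new Text("TMAX" + max));
        }
    }
}

It would then be registered with job.setCombinerClass(Mycombiner.class); instead of Myreducer.class.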


vd2z7a6w3#

I think you are using TextInputFormat as the job's input format. It produces LongWritable/Text pairs, and Hadoop is deriving the map output classes from that.
Try setting the map output classes explicitly and removing the combiner:

job.setMapOutputKeyClass(IntWritable.class);
job.setMapOutputValueClass(Text.class);
// job.setCombinerClass(Myreducer.class);

The combiner will only work if the map and reduce output types are compatible!
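
For completeness, the input format this answer refers to can also be declared explicitly in the driver. A small sketch (TextInputFormat is simply Hadoop's default when no input format is set, so this line only makes the assumption visible):

import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

// TextInputFormat gives each map call a LongWritable byte offset as the key and
// the line contents as Text, which is presumably where the LongWritable in the
// ClassCastException comes from.
job.setInputFormatClass(TextInputFormat.class);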


b4wnujal4#

These are the changes I made.

public static void main(String[] args) throws Exception {

        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "temp");

        job.setJarByClass(temp.class);

        job.setMapperClass(Mymapper.class);
        job.setReducerClass(Myreducer.class);

        job.setMapOutputKeyClass(IntWritable.class);
        job.setMapOutputValueClass(Text.class);

        job.setOutputKeyClass(IntWritable.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.setNumReduceTasks(1);
        job.waitForCompletion(true);
    }

Output: 10 0
For an explanation, see manjunath ballur's post.
