Error when building a MapReduce program in Cloudera Hadoop

jckbn6z7 posted on 2021-06-03 in Hadoop

I am getting the following error when compiling a MapReduce program in Hadoop. I am using the Cloudera Hadoop distribution. testmr_classes is a folder and TestMR.java is the MapReduce source file.

[cloudera@localhost ~]$ echo `hadoop classpath`
/etc/hadoop/conf:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/.//*:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/.//*:/usr/lib/hadoop-yarn/lib/*:/usr/lib/hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/./:/usr/lib/hadoop-0.20-mapreduce/lib/*:/usr/lib/hadoop-0.20-mapreduce/.//*
[cloudera@localhost ~]$

[cloudera@localhost ~]$ javac -classpath `hadoop classpath`:. -d testmr_classes TestMR.java
TestMR.java:32: TestMR.Reduce is not abstract and does not override abstract method reduce(org.apache.hadoop.io.IntWritable,java.util.Iterator<org.apache.hadoop.io.Text>,org.apache.hadoop.mapred.OutputCollector<org.apache.hadoop.io.IntWritable,org.apache.hadoop.io.DoubleWritable>,org.apache.hadoop.mapred.Reporter) in org.apache.hadoop.mapred.Reducer
    public static class Reduce extends MapReduceBase implements Reducer<IntWritable,Text,IntWritable,DoubleWritable>
                  ^
1 error
[cloudera@localhost ~]$

Here are the contents of TestMR.java:

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.util.*;

public class TestMR 
{
    public static class Map extends MapReduceBase implements Mapper<IntWritable,Text,IntWritable,Text>
    {
        private IntWritable key = new IntWritable();
        private Text value = new Text();

        public void map(IntWritable key, Text line, OutputCollector<IntWritable, Text> output, Reporter reporter) throws IOException
        {
            String [] split = line.toString().split(",");
            key.set(Integer.parseInt(split[0]));

            if(split[2] == "Test")
            {
                value.set(split[4] + "," + split[7]);
                output.collect(key, value);
            }
        }
    }

    public static class Reduce extends MapReduceBase implements Reducer<IntWritable,Text,IntWritable,DoubleWritable>
    {
        public void reduce(IntWritable key, Iterable<Text> v, OutputCollector<IntWritable, DoubleWritable> output, Reporter reporter) throws IOException
        {
            Iterator values = v.iterator();
            while(values.hasNext())
            {
                String [] tmp_buf_1 = values.next().toString().split(",");
                String V1 = tmp_buf_1[0];
                String T1 = tmp_buf_1[1];

                if(!values.hasNext())   
                    break;  

                String [] tmp_buf_2 = values.next().toString().split(",");       
                String V2 = tmp_buf_2[0];
                String T2 = tmp_buf_2[1];           

                double dResult = (Double.parseDouble(V2) - Double.parseDouble(V1)) / (Double.parseDouble(T2) - Double.parseDouble(T1));

                output.collect(key, new DoubleWritable(dResult));
            }
        }
    }

    public static void main(String[] args) throws Exception
    {
        JobConf conf = new JobConf(TestMR.class);
        conf.setJobName("TestMapReduce");

        conf.setOutputKeyClass(IntWritable.class);
        conf.setOutputValueClass(DoubleWritable.class);

        conf.setMapperClass(Map.class);
        conf.setCombinerClass(Reduce.class);
        conf.setReducerClass(Reduce.class);

        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);

        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
    }
}

This is my first attempt at MapReduce, so I would appreciate it if someone could point out what I am missing here.


lx0bsm1f 1#

You need to specify the classpath yourself in place of "hadoop classpath", for example:

javac -classpath /opt/hadoop-1.1.2/hadoop-core-1.1.2.jar:/opt/hadoop-1.1.2/lib/commons-cli-1.2.jar -d testmr_classes TestMR.java

The exact paths depend on where Hadoop is installed.


mbskvtky 2#

Look closely at the second parameter of reduce() and at the error message: you wrote Iterable, but the interface expects an Iterator.
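In the old org.apache.hadoop.mapred API, Reducer.reduce() takes a java.util.Iterator as its second parameter (it is the newer org.apache.hadoop.mapreduce API that uses Iterable). A minimal sketch of the corrected inner class, keeping the body from the question unchanged:

    public static class Reduce extends MapReduceBase implements Reducer<IntWritable,Text,IntWritable,DoubleWritable>
    {
        // The old mapred Reducer interface declares reduce() with java.util.Iterator,
        // not Iterable, so this signature now matches the abstract method and compiles.
        @Override
        public void reduce(IntWritable key, Iterator<Text> values, OutputCollector<IntWritable, DoubleWritable> output, Reporter reporter) throws IOException
        {
            while(values.hasNext())
            {
                // Read a first (value, timestamp) pair from the comma-separated record.
                String [] tmp_buf_1 = values.next().toString().split(",");
                String V1 = tmp_buf_1[0];
                String T1 = tmp_buf_1[1];

                if(!values.hasNext())
                    break;

                // Read the next pair and compute the rate of change, as in the question.
                String [] tmp_buf_2 = values.next().toString().split(",");
                String V2 = tmp_buf_2[0];
                String T2 = tmp_buf_2[1];

                double dResult = (Double.parseDouble(V2) - Double.parseDouble(V1)) / (Double.parseDouble(T2) - Double.parseDouble(T1));

                output.collect(key, new DoubleWritable(dResult));
            }
        }
    }

java.util.Iterator is already covered by the existing "import java.util.*;", so no new import is needed. Adding @Override is optional, but it makes javac flag this kind of signature mismatch directly at the method instead of at the class declaration.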
