What does the cleanup(context) method do?

0pizxfdo · posted 2021-06-02 · in Hadoop

I don't understand what exactly the cleanup method in Hadoop does, or how it works. I have the following MapReduce code to compute the maximum, minimum, and mean of a set of numbers.

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class Statistics
{
    public static class Map extends Mapper<LongWritable, Text, Text, Text>
    {
        // Running aggregates over this mapper's input split, updated in map()
        // and emitted once in cleanup(). The declarations are not shown in the
        // original post; the types here are assumed.
        private double min = Double.MAX_VALUE;
        private double max = -Double.MAX_VALUE;
        private double linear_sum = 0;     // sum of the values
        private double quadratic_sum = 0;  // sum of the squared values
        private long count = 0;

        @Override
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException
        {
            /* code to calculate min, max, and mean from among a bunch of numbers */
        }

        // Runs once per mapper, after the last call to map() for its split.
        @Override
        public void cleanup(Context context) throws IOException, InterruptedException
        {
            // Partial minimum for this split.
            Text key_min = new Text();
            key_min.set("min");
            Text value_min = new Text();
            value_min.set(String.valueOf(min));
            context.write(key_min, value_min);

            // Partial maximum for this split.
            Text key_max = new Text();
            key_max.set("max");
            Text value_max = new Text();
            value_max.set(String.valueOf(max));
            context.write(key_max, value_max);

            // Partial sum and count, so the reducer can compute the overall mean.
            Text key_avg = new Text();
            key_avg.set("avg");
            Text value_avg = new Text();
            value_avg.set(String.valueOf(linear_sum) + "," + count);
            context.write(key_avg, value_avg);

            // Partial sum, count, and sum of squares, so the reducer can
            // compute the standard deviation.
            Text key_stddev = new Text();
            key_stddev.set("stddev");
            Text value_stddev = new Text();
            value_stddev.set(String.valueOf(linear_sum) + "," + count + "," + String.valueOf(quadratic_sum));
            context.write(key_stddev, value_stddev);
        }
    }
    public static class Reduce extends Reducer<Text, Text, Text, Text>
    {
        @Override
        public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException
        {
            /* code to further find min, max and mean from among the outputs of different mappers */
        }
    }
    }
    public static void main(String[] args) throws Exception 
    {
        /* driver program */
    }
}
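For reference, the elided map() body would typically just update the running fields and write nothing to the context. A minimal hypothetical sketch, assuming the input is one number per line (the parsing here is illustrative, not from the original post):

    // Hypothetical sketch of the elided map() body, assuming one number per line.
    @Override
    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException
    {
        double x = Double.parseDouble(value.toString().trim());
        min = Math.min(min, x);
        max = Math.max(max, x);
        linear_sum += x;
        quadratic_sum += x * x;
        count++;
        // Nothing is emitted here; the partial results are written
        // exactly once per mapper, in cleanup().
    }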

So what exactly is cleanup(Context context) doing here? I assumed it collects the output (key, value) pairs from the mappers and passes them to the reducer. On other sites I've read that MapReduce runs in the order setup -> map -> cleanup, and then setup -> reduce -> cleanup. Why doesn't this program use a setup method?
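(For reference: that setup -> map -> cleanup order comes from the framework's Mapper.run() method, which is invoked once per input split. Simplified from the Hadoop source, it looks roughly like this:

    // Simplified from org.apache.hadoop.mapreduce.Mapper.run():
    public void run(Context context) throws IOException, InterruptedException
    {
        setup(context);                      // called once, before any map() call
        try {
            while (context.nextKeyValue()) {
                map(context.getCurrentKey(), context.getCurrentValue(), context);
            }
        } finally {
            cleanup(context);                // called once, after the last map() call
        }
    }

Reducer.run() follows the same pattern around reduce(). setup() is just a hook for one-time initialization, such as reading configuration or opening side files; this program initializes its fields inline, so the default no-op setup() inherited from Mapper suffices.)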

7uhlpewt 1#

These values cannot be finalized in the mapper; each mapper only sees its own input split, so the partial results emitted in cleanup() must still be combined in the reduce step. See https://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduceTutorial.html#Reducer
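As an illustration (not the original poster's code), a reduce() body in the Reduce class above might combine the per-mapper partials like this; the key strings match what cleanup() emits:

    // Hypothetical sketch of a reduce() body combining per-mapper partials.
    @Override
    public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException
    {
        String k = key.toString();
        if (k.equals("min")) {
            double min = Double.MAX_VALUE;
            for (Text v : values)
                min = Math.min(min, Double.parseDouble(v.toString()));
            context.write(key, new Text(String.valueOf(min)));
        } else if (k.equals("max")) {
            double max = -Double.MAX_VALUE;
            for (Text v : values)
                max = Math.max(max, Double.parseDouble(v.toString()));
            context.write(key, new Text(String.valueOf(max)));
        } else if (k.equals("avg")) {
            double sum = 0;
            long count = 0;
            for (Text v : values) {          // each value is "partial_sum,partial_count"
                String[] parts = v.toString().split(",");
                sum += Double.parseDouble(parts[0]);
                count += Long.parseLong(parts[1]);
            }
            context.write(key, new Text(String.valueOf(sum / count)));
        }
        // "stddev" would be handled the same way from its three parts, using
        // variance = quadratic_sum / count - (linear_sum / count)^2.
    }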
