I have two mapper classes. I add the mappers with the ChainMapper.addMapper
method and set the reducer with the ChainReducer.setReducer method.
The ChainMapper.addMapper calls compile fine, but the
ChainReducer.setReducer call fails with the following compile error:
The method setReducer(Job, Class<? extends Reducer>, Class<?>, Class<?>, Class<?>, Class<?>, Configuration) in the type ChainReducer is not applicable for the arguments (JobConf, Class<FileComparisionReduce>, Class<LongWritable>, Class<Text>, Class<LongWritable>, Class<Text>, boolean, JobConf)
Here is my driver class:
package fileComparision;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.lib.ChainMapper;
import org.apache.hadoop.mapreduce.lib.chain.ChainReducer;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
public class DriverComparision extends Configured implements Tool {

    @Override
    public int run(String[] arg0) throws Exception {
        JobConf conf = new JobConf(true);
        conf.setJobName("Comaprision of 2 file ");

        JobConf Mapper1 = new JobConf(false);
        ChainMapper.addMapper(conf, FileComparisionMapper1.class, LongWritable.class, Text.class, LongWritable.class, Text.class, true, Mapper1);

        JobConf Mapper2 = new JobConf(false);
        ChainMapper.addMapper(conf, FileComparisionMapper2.class, LongWritable.class, Text.class, LongWritable.class, Text.class, true, Mapper2);

        JobConf Reduc = new JobConf(false);
        ChainReducer.setReducer(conf, FileComparisionReduce.class, LongWritable.class, Text.class, LongWritable.class, Text.class, true, Reduc);

        FileInputFormat.setInputPaths(conf, new Path(arg0[0]));
        FileOutputFormat.setOutputPath(conf, new Path(arg0[1]));

        conf.setMapOutputKeyClass(LongWritable.class);
        conf.setMapOutputValueClass(Text.class);
        conf.setOutputKeyClass(LongWritable.class);
        conf.setOutputValueClass(Text.class);

        JobClient.runJob(conf);
        return 0;
    }
}
I also tried removing the boolean parameter "true", but the error remains:
JobConf Reduc = new JobConf(false);
ChainReducer.setReducer(conf, FileComparisionReduce.class, LongWritable.class, Text.class, LongWritable.class, Text.class, true , Reduc);
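For context, the signature quoted in the error message belongs to the new-API chain class, while the call above passes the old-API argument list. Paraphrasing the two static setReducer variants (generic bounds abbreviated; the first is the one shown in the error, the second is the form my arguments follow):

// org.apache.hadoop.mapreduce.lib.chain.ChainReducer (new API) -- expects a Job and a Configuration:
public static void setReducer(Job job, Class<? extends Reducer> klass,
        Class<?> inputKeyClass, Class<?> inputValueClass,
        Class<?> outputKeyClass, Class<?> outputValueClass,
        Configuration reducerConf);

// org.apache.hadoop.mapred.lib.ChainReducer (old API) -- expects a JobConf, a byValue flag and a JobConf:
public static void setReducer(JobConf job, Class<? extends Reducer> klass,
        Class<?> inputKeyClass, Class<?> inputValueClass,
        Class<?> outputKeyClass, Class<?> outputValueClass,
        boolean byValue, JobConf reducerConf);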
1 Answer
Finally, I found the solution: I had imported the wrong package. I was importing
org.apache.hadoop.mapreduce.lib.chain.ChainReducer;
instead of
import org.apache.hadoop.mapred.lib.ChainReducer;
Switching to the mapred version makes the setReducer call compile, because the rest of the driver uses the old JobConf-based API.
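For reference, a minimal sketch of the corrected driver with the mapred import. It assumes FileComparisionMapper1, FileComparisionMapper2 and FileComparisionReduce are old-API (mapred) classes as in the question; the main method and the cleaned-up job name are illustrative additions and were not part of the original post:

package fileComparision;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.lib.ChainMapper;
import org.apache.hadoop.mapred.lib.ChainReducer;  // old mapred API, matches the JobConf-based calls below
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class DriverComparision extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        JobConf conf = new JobConf(true);
        conf.setJobName("Comparison of 2 files");

        // Chain the two mappers; each gets its own (empty) JobConf.
        ChainMapper.addMapper(conf, FileComparisionMapper1.class, LongWritable.class, Text.class,
                LongWritable.class, Text.class, true, new JobConf(false));
        ChainMapper.addMapper(conf, FileComparisionMapper2.class, LongWritable.class, Text.class,
                LongWritable.class, Text.class, true, new JobConf(false));

        // With the mapred ChainReducer import this call now matches
        // setReducer(JobConf, Class, Class, Class, Class, Class, boolean, JobConf).
        ChainReducer.setReducer(conf, FileComparisionReduce.class, LongWritable.class, Text.class,
                LongWritable.class, Text.class, true, new JobConf(false));

        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        conf.setMapOutputKeyClass(LongWritable.class);
        conf.setMapOutputValueClass(Text.class);
        conf.setOutputKeyClass(LongWritable.class);
        conf.setOutputValueClass(Text.class);

        JobClient.runJob(conf);
        return 0;
    }

    // Assumed entry point (not shown in the question) so the Tool-based driver can be launched.
    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new DriverComparision(), args));
    }
}

The underlying issue is that org.apache.hadoop.mapred.* (JobConf, JobClient) and org.apache.hadoop.mapreduce.* (Job, Configuration) are two separate APIs, and the chain classes from one cannot be mixed with a driver written against the other.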