I get an input file as an argument and copy it into HDFS, where I do some initial computation.
That file then serves as the input to my Mapper.
After some matrix calculations I need to emit two double arrays to the reducer, but when I do that it shows:
13/10/26 09:13:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/10/26 09:13:49 WARN conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
13/10/26 09:13:49 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
13/10/26 09:13:49 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/10/26 09:13:49 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
13/10/26 09:13:49 INFO input.FileInputFormat: Total input paths to process : 1
13/10/26 09:13:50 INFO mapred.LocalJobRunner: OutputCommitter set in config null
13/10/26 09:13:50 INFO mapred.JobClient: Running job: job_local1799392614_0001
13/10/26 09:13:50 INFO mapred.LocalJobRunner: OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
13/10/26 09:13:50 INFO mapred.LocalJobRunner: Waiting for map tasks
13/10/26 09:13:50 INFO mapred.LocalJobRunner: Starting task: attempt_local1799392614_0001_m_000000_0
13/10/26 09:13:50 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
13/10/26 09:13:50 INFO util.ProcessTree: setsid exited with exit code 0
13/10/26 09:13:50 INFO mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@40cdc794
13/10/26 09:13:50 INFO mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
13/10/26 09:13:50 INFO mapred.MapTask: io.sort.mb = 100
13/10/26 09:13:50 INFO mapred.MapTask: data buffer = 79691776/99614720
13/10/26 09:13:50 INFO mapred.MapTask: record buffer = 262144/327680
13/10/26 09:13:50 INFO mapred.LocalJobRunner: Map task executor complete.
13/10/26 09:13:50 WARN mapred.LocalJobRunner: job_local1799392614_0001
java.lang.Exception: java.lang.ClassCastException: class edu.Driver$DoubleArrayWritable
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:404)
Caused by: java.lang.ClassCastException: class edu.Driver$DoubleArrayWritable
at java.lang.Class.asSubclass(Class.java:3037)
at org.apache.hadoop.mapred.JobConf.getOutputKeyComparator(JobConf.java:819)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:836)
at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:376)
at org.apache.hadoop.mapred.MapTask.access$100(MapTask.java:85)
at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:584)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:656)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)
13/10/26 09:13:51 INFO mapred.JobClient: map 0% reduce 0%
13/10/26 09:13:51 INFO mapred.JobClient: Job complete: job_local1799392614_0001
13/10/26 09:13:51 INFO mapred.JobClient: Counters: 0
My Driver class is:
job.setJarByClass(Driver.class);
job.setMapperClass(Mapper.class);
job.setReducerClass(Reducer.class);
job.setMapOutputKeyClass(DoubleArrayWritable.class);
job.setMapOutputValueClass(DoubleArrayWritable.class);
job.setOutputKeyClass(TextOutputFormat.class);
job.setOutputValueClass(DoubleArrayWritable.class);
job.setInputFormatClass(TextInputFormat.class);
job.setOutputFormatClass(TextOutputFormat.class);
FileInputFormat.setInputPaths(job, "in/inputfile");
FileOutputFormat.setOutputPath(job,new Path(args[1]));
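(For reference, this snippet sits inside the driver's main() roughly like the following; the surrounding Configuration/Job boilerplate here is only a sketch of my setup, not the full code.)

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Driver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf);          // Job.getInstance(conf) on newer releases

        // ... the job.set* calls shown above go here ...

        FileInputFormat.setInputPaths(job, new Path("in/inputfile"));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}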
My 2-D array class is:
public static class DoubleArrayWritable extends TwoDArrayWritable {
    public DoubleArrayWritable() {
        super(DoubleWritable.class);
    }
}
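(As a sanity check, the class serializes fine on its own; a minimal sketch of a serialize/deserialize round trip using Hadoop's DataOutputBuffer/DataInputBuffer. The 2x2 matrix and its values are made up for illustration, and the sketch assumes the DoubleArrayWritable defined above is visible here, e.g. as Driver.DoubleArrayWritable.)

import org.apache.hadoop.io.DataInputBuffer;
import org.apache.hadoop.io.DataOutputBuffer;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.Writable;

public class DoubleArrayWritableCheck {
    public static void main(String[] args) throws Exception {
        // Build a small 2x2 matrix and wrap it.
        DoubleArrayWritable original = new DoubleArrayWritable();
        DoubleWritable[][] cells = new DoubleWritable[2][2];
        for (int i = 0; i < 2; i++)
            for (int j = 0; j < 2; j++)
                cells[i][j] = new DoubleWritable(i + 0.5 * j);
        original.set(cells);

        // Serialize into a buffer, then read back into a fresh instance.
        DataOutputBuffer out = new DataOutputBuffer();
        original.write(out);
        DataInputBuffer in = new DataInputBuffer();
        in.reset(out.getData(), out.getLength());
        DoubleArrayWritable copy = new DoubleArrayWritable();
        copy.readFields(in);

        // TwoDArrayWritable#get() returns Writable[][]; cast elements back to DoubleWritable.
        Writable[][] read = copy.get();
        System.out.println(((DoubleWritable) read[1][1]).get());   // prints 1.5
    }
}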
My Mapper class:
public class Mapper extends Mapper<Object, Text, DoubleArrayWritable, DoubleArrayWritable> {
    public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
        // E, ED and their dimensions come from the earlier matrix computation (not shown here).
        DoubleArrayWritable EArray = new DoubleArrayWritable();
        DoubleWritable[][] Edata = new DoubleWritable[Erow][Ecol];
        for (int k = 0; k < Erow; k++) {
            for (int j = 0; j < Ecol; j++) {
                Edata[k][j] = new DoubleWritable(E[k][j]);
            }
        }
        EArray.set(Edata);

        DoubleArrayWritable EDArray = new DoubleArrayWritable();
        DoubleWritable[][] EDdata = new DoubleWritable[EtransDerow][EtransDecol];
        for (int k1 = 0; k1 < EtransDerow; k1++) {
            for (int j1 = 0; j1 < EtransDecol; j1++) {
                EDdata[k1][j1] = new DoubleWritable(ED[k1][j1]);
            }
        }
        EDArray.set(EDdata);

        context.write(EArray, EDArray);
    }
}
The Reducer class:
public class Reducer extends Reducer<DoubleArrayWritable, DoubleArrayWritable, IntWritable, Text> {
    public void reduce(Iterable<DoubleArrayWritable> key,
                       Iterable<DoubleArrayWritable> values, Context context)
            throws IOException, InterruptedException {
        System.out.println("Entered into reducer. successfull");
    }
}
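(For reference, once the grouping works, the intention is to unpack the values roughly like this inside reduce(); the loop below is only a sketch, not the real computation.)

// Sketch of a reduce() body: unpack each emitted matrix back into plain doubles.
for (DoubleArrayWritable value : values) {
    Writable[][] cells = value.get();               // TwoDArrayWritable#get()
    for (int i = 0; i < cells.length; i++) {
        for (int j = 0; j < cells[i].length; j++) {
            double d = ((DoubleWritable) cells[i][j]).get();
            // ... use d in the reducer-side matrix computation ...
        }
    }
}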
Am I doing something wrong in the driver class? How can I figure it out?
UPDATE: Initially I tried emitting only one double array as the value, with an IntWritable as the key (a dummy value, just to check whether it works). That actually worked with the signatures below; I did get "Entered into reducer. successfull".
Driver code:
job.setJarByClass(Driver.class);
job.setMapperClass(Mapper.class);
job.setReducerClass(Reducer.class);
job.setMapOutputKeyClass(IntWritable.class);
job.setMapOutputValueClass(DoubleArrayWritable.class);
job.setOutputKeyClass(TextOutputFormat.class);
job.setOutputValueClass(DoubleArrayWritable.class);
job.setInputFormatClass(TextInputFormat.class);
job.setOutputFormatClass(TextOutputFormat.class);
FileInputFormat.setInputPaths(job, "in/inputfile");
FileOutputFormat.setOutputPath(job,new Path(args[1]));
Mapper:
public class Mapper extends
        Mapper<Object, Text, IntWritable, DoubleArrayWritable> {
    public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
        IntWritable clusterNumber = null;
        int count = 0;
        clusterNumber = new IntWritable(count);
        // EArray is the DoubleArrayWritable built from E as in the earlier mapper (construction omitted).
        context.write(clusterNumber, EArray);
    }
}
Reducer:
public class Reducer extends
        Reducer<IntWritable, DoubleArrayWritable, IntWritable, Text> {
    public void reduce(IntWritable key,
                       Iterable<DoubleArrayWritable> values, Context context)
            throws IOException, InterruptedException {
        System.out.println("Entered into reducer. successfull");
    }
}
Then I tried to emit both double arrays by changing the configuration: the map output key/value classes and the reducer output key/value classes. It looks like something is wrong with that configuration.
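(Note: the ClassCastException above is thrown from JobConf.getOutputKeyComparator, which casts the map output key class to WritableComparable, so whatever type is used as the map output key apparently has to be a WritableComparable rather than a plain Writable like TwoDArrayWritable. Just as an illustration, and not my actual code, a comparable key wrapper might look roughly like this; the class name and the ordering rule are made up.)

// Illustration only: a map output key must implement WritableComparable so the framework can sort it.
public static class DoubleArrayKey extends TwoDArrayWritable
        implements WritableComparable<DoubleArrayKey> {

    public DoubleArrayKey() {
        super(DoubleWritable.class);
    }

    @Override
    public int compareTo(DoubleArrayKey other) {
        // Some deterministic ordering is required; comparing the first cell is just a placeholder.
        double mine = ((DoubleWritable) get()[0][0]).get();
        double theirs = ((DoubleWritable) other.get()[0][0]).get();
        return Double.compare(mine, theirs);
    }
}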