Custom partitioner causes ArrayIndexOutOfBoundsException

klr1opcd · posted 2021-06-03 · in Hadoop

When I run the code, I get the following exception:

hadoop@hadoop:~/testPrograms$ hadoop jar cp.jar CustomPartition /test/test.txt /test/output33
15/03/03 16:33:33 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
15/03/03 16:33:33 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
15/03/03 16:33:33 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
15/03/03 16:33:33 INFO input.FileInputFormat: Total input paths to process : 1
15/03/03 16:33:34 INFO mapreduce.JobSubmitter: number of splits:1
15/03/03 16:33:34 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_local1055584612_0001
15/03/03 16:33:35 INFO mapreduce.Job: The url to track the job: http://localhost:8080/
15/03/03 16:33:35 INFO mapreduce.Job: Running job: job_local1055584612_0001
15/03/03 16:33:35 INFO mapred.LocalJobRunner: OutputCommitter set in config null
15/03/03 16:33:35 INFO mapred.LocalJobRunner: OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
15/03/03 16:33:35 INFO mapred.LocalJobRunner: Waiting for map tasks
15/03/03 16:33:35 INFO mapred.LocalJobRunner: Starting task: attempt_local1055584612_0001_m_000000_0
15/03/03 16:33:35 INFO mapred.Task:  Using ResourceCalculatorProcessTree : [ ]
15/03/03 16:33:35 INFO mapred.MapTask: Processing split: hdfs://node1/test/test.txt:0+107
15/03/03 16:33:35 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
15/03/03 16:33:35 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
15/03/03 16:33:35 INFO mapred.MapTask: soft limit at 83886080
15/03/03 16:33:35 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
15/03/03 16:33:35 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
15/03/03 16:33:35 INFO mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
15/03/03 16:33:35 INFO mapred.MapTask: Starting flush of map output
15/03/03 16:33:35 INFO mapred.LocalJobRunner: map task executor complete.
15/03/03 16:33:35 WARN mapred.LocalJobRunner: job_local1055584612_0001
java.lang.Exception: java.lang.ArrayIndexOutOfBoundsException: 2
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
Caused by: java.lang.ArrayIndexOutOfBoundsException: 2
    at CustomPartition$MapperClass.map(CustomPartition.java:27)
    at CustomPartition$MapperClass.map(CustomPartition.java:17)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
15/03/03 16:33:36 INFO mapreduce.Job: Job job_local1055584612_0001 running in uber mode : false
15/03/03 16:33:36 INFO mapreduce.Job:  map 0% reduce 0%
15/03/03 16:33:36 INFO mapreduce.Job: Job job_local1055584612_0001 failed with state FAILED due to: NA
15/03/03 16:33:36 INFO mapreduce.Job: Counters: 0

I am trying to partition the records by the game each person plays. The fields are separated by a tab, and after three fields a newline starts the next record.
My code:

public class CustomPartition {

      public static class MapperClass    extends Mapper<Object, Text, Text, Text>{

          public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
               String itr[] = value.toString().split("\t");
               String game=itr[2].toString();
               String nameGoals=itr[0]+"\t"+itr[1];
               context.write(new Text(game), new Text(nameGoals));
          }   
      }

      public static class GoalPartition extends Partitioner<Text, Text>   {

           @Override   
           public int getPartition(Text key,Text value, int numReduceTasks){

                if(key.toString().equals("football"))
                   {return 0;}

                else if(key.toString().equals("basketball"))
                   {return 1;}

                else // icehockey
                   {return 2;}
           }  
      }

     public static class ReducerClass extends Reducer<Text,Text,Text,Text> {

          @Override
          public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException {
               String name="";
               String game="";
               int maxGoals=0;
               for (Text val : values)
               {
                   String valTokens[]= val.toString().split("\t");
                   int goals = Integer.parseInt(valTokens[1]);

                   if(goals > maxGoals)
                   {
                        name = valTokens[0];
                        game = key.toString();
                        maxGoals = goals;
                        context.write(new Text(name), new Text ("game"+game+"score"+maxGoals));
                   }
               } 
          }
     }

     public static void main(String[] args) throws Exception {
          Configuration conf = new Configuration();
          Job job = Job.getInstance(conf, "custom partition");
          job.setJarByClass(CustomPartition.class);
          job.setMapperClass(MapperClass.class);
          job.setCombinerClass(ReducerClass.class);
          job.setPartitionerClass(GoalPartition.class);
          job.setReducerClass(ReducerClass.class);
          job.setOutputKeyClass(Text.class);
          job.setOutputValueClass(Text.class);
          FileInputFormat.addInputPath(job, new Path(args[0]));
          FileOutputFormat.setOutputPath(job, new Path(args[1]));
          System.exit(job.waitForCompletion(true) ? 0 : 1);
     }
}
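For context, a minimal standalone sketch of what the stack trace points at (the class and method names `SplitCheck` and `gameOf` are hypothetical, not part of the job): `split("\t")` on a line with fewer than two tabs returns an array with fewer than three elements, so `itr[2]` throws exactly `ArrayIndexOutOfBoundsException: 2`; a length guard avoids that. It also shows why comparing `Text.toString()` with `==` is unreliable: `==` tests object identity, not content.

```java
public class SplitCheck {
    // Mirrors MapperClass.map: itr[2] assumes every line has >= 3 tab-separated fields.
    static String gameOf(String line) {
        String[] itr = line.split("\t");
        if (itr.length < 3) {
            return null; // guard: a malformed line would otherwise throw at itr[2]
        }
        return itr[2];
    }

    public static void main(String[] args) {
        // A well-formed record: name, goals, game.
        System.out.println(gameOf("alice\t12\tfootball"));  // football

        // A space-separated (or blank) line splits into fewer than 3 fields;
        // without the guard, itr[2] throws ArrayIndexOutOfBoundsException: 2.
        System.out.println(gameOf("bob 7 basketball"));     // null

        // String comparison: == checks identity, equals() checks content.
        String key = new String("football");
        System.out.println(key == "football");       // false
        System.out.println(key.equals("football"));  // true
    }
}
```

A quick check like this against a few sample input lines would confirm whether the input file contains lines (for example, a trailing blank line) that do not have three tab-separated fields.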

No answers yet.