Usage of org.apache.hadoop.mapreduce.Mapper.run() with code examples

x33g5p2x · 2022-01-25 · Republished under: Other

This article collects code examples of the Java method org.apache.hadoop.mapreduce.Mapper.run() and shows how it is used in practice. The examples are taken from selected open-source projects on GitHub, Stack Overflow, Maven, and similar platforms, and should serve as useful references. Details of the method:
Package: org.apache.hadoop.mapreduce
Class: Mapper
Method: run

About Mapper.run

Expert users can override this method for more complete control over the execution of the Mapper.
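As a rough illustration of what such an override involves: Hadoop's default run() calls setup(context), invokes map() once per input record, then calls cleanup(context), and an override typically wraps or replaces that loop. The sketch below mimics that shape with hypothetical stand-in types; SimpleContext and SketchMapper are illustrative only, not Hadoop classes:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Hypothetical stand-in for Hadoop's Mapper.Context: iterates key/value
// records and collects emitted output.
class SimpleContext {
    private final Iterator<String[]> records;
    private String[] current;
    final List<String> output = new ArrayList<>();

    SimpleContext(List<String[]> records) { this.records = records.iterator(); }

    boolean nextKeyValue() {
        if (!records.hasNext()) return false;
        current = records.next();
        return true;
    }

    String getCurrentKey() { return current[0]; }
    String getCurrentValue() { return current[1]; }
    void write(String key, String value) { output.add(key + "=" + value); }
}

// Mirrors the shape of Hadoop's default Mapper.run(): setup, map per record, cleanup.
class SketchMapper {
    long recordsSeen = 0;

    protected void setup(SimpleContext ctx) { }
    protected void map(String key, String value, SimpleContext ctx) {
        ctx.write(key, value.toUpperCase());
    }
    protected void cleanup(SimpleContext ctx) { }

    // An "expert" override: the same loop as the default, plus record counting
    // and a finally block so cleanup() runs even if map() throws.
    public void run(SimpleContext ctx) {
        setup(ctx);
        try {
            while (ctx.nextKeyValue()) {
                recordsSeen++;
                map(ctx.getCurrentKey(), ctx.getCurrentValue(), ctx);
            }
        } finally {
            cleanup(ctx);
        }
    }
}
```

The real examples below follow the same idea: they override or wrap run() to add cancellation handling, custom hooks, or delegation around the standard loop.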

Code examples

Code example source: apache/ignite

/** {@inheritDoc} */
@Override public void run(Context ctx) throws IOException, InterruptedException {
    try {
        super.run(ctx);
    }
    catch (HadoopTaskCancelledException e) {
        cancelledTasks.incrementAndGet();
        throw e;
    }
}

Code example source: apache/ignite

mapper.run(new WrappedMapper().getMapContext(hadoopCtx));

Code example source: asakusafw/asakusafw

/**
 * Invokes {@code Mapper#run(Context)} internally.
 * Clients can override this method and implement a customized {@code run} method.
 * @param context current context
 * @throws IOException if the task fails due to an I/O error
 * @throws InterruptedException if task execution is interrupted
 */
protected void runInternal(Context context) throws IOException, InterruptedException {
    super.run(context);
}

Code example source: ShifuML/shifu

@Override
public void run() {
    try {
        mapper.run(subcontext);
    } catch (Throwable ie) {
        throwable = ie;
    } finally {
        try {
            reader.close();
        } catch (IOException ignore) {
        }
    }
}

Code example source: com.datasalt.pangool/pangool-core (the same code appears in datasalt/pangool on GitHub)

@Override
public void run(Context context) throws IOException, InterruptedException {
    // Find the InputProcessor from the TaggedInputSplit.
    if(delegate == null) {
        TaggedInputSplit inputSplit = (TaggedInputSplit) context.getInputSplit();
        log.info("[profile] Got input split. Going to look at DC.");
        delegate = InstancesDistributor.loadInstance(context.getConfiguration(),
            Mapper.class, inputSplit.getInputProcessorFile(), true);
        log.info("[profile] Finished. Calling run() on delegate.");
    }
    delegate.run(context);
}
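The Pangool example resolves the concrete Mapper lazily at runtime and then forwards run() to it. The minimal sketch below shows the same lazy-load-and-forward idea with plain reflection; Runner, WorkerRunner, and DelegatingRunner are hypothetical names for illustration, not Pangool API:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal interface standing in for the delegated-to Mapper.
interface Runner {
    void run(List<String> out);
}

// A concrete worker that the delegating wrapper will load by class name.
class WorkerRunner implements Runner {
    public void run(List<String> out) { out.add("ran WorkerRunner"); }
}

// Lazily instantiates its delegate on first use, then forwards run() --
// the same shape as the Pangool snippet's delegate field and null check.
class DelegatingRunner implements Runner {
    private final String delegateClassName;
    private Runner delegate;

    DelegatingRunner(String delegateClassName) {
        this.delegateClassName = delegateClassName;
    }

    public void run(List<String> out) {
        if (delegate == null) {
            try {
                delegate = (Runner) Class.forName(delegateClassName)
                        .getDeclaredConstructor().newInstance();
            } catch (ReflectiveOperationException e) {
                throw new IllegalStateException("cannot load delegate", e);
            }
        }
        delegate.run(out);
    }
}
```

Pangool's InstancesDistributor does the loading from the distributed cache rather than via Class.forName, but the lazy-initialize-then-forward structure of run() is the same.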

Code example source: com.github.jiayuhan-it/hadoop-mapreduce-client-core (identical code also appears in the io.hops, ch.cern.hadoop, io.prestosql.hadoop, and org.apache.hadoop/hadoop-mapred artifacts)

@SuppressWarnings("unchecked")
public void run(Context context)
        throws IOException, InterruptedException {
    setup(context);
    mapper.run(context);
    cleanup(context);
}

Code example source: asakusafw/asakusafw

@Override
public void run(Context context) throws IOException, InterruptedException {
    setup(context);
    mapper.run(context);
    cleanup(context);
}

Code example source: com.conversantmedia/mara-core

@Override
public void run(Context context) throws IOException, InterruptedException {
    setup(context);
    getDelegate(context).run(context);
    cleanup(context);
}

Code example source: com.github.jiayuhan-it/hadoop-mapreduce-client-core (identical code also appears in ch.cern.hadoop, org.apache.hadoop/hadoop-mapred, io.hops, io.prestosql.hadoop, and lintool/warcbase)

@SuppressWarnings("unchecked")
void runMapper(TaskInputOutputContext context, int index) throws IOException,
        InterruptedException {
    Mapper mapper = mappers.get(index);
    RecordReader rr = new ChainRecordReader(context);
    RecordWriter rw = new ChainRecordWriter(context);
    Mapper.Context mapperContext = createMapContext(rr, rw, context,
        getConf(index));
    mapper.run(mapperContext);
    rr.close();
    rw.close(context);
}
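The ChainMapper-style runMapper above wires mappers together by giving each one a record reader over the previous stage's output and a record writer feeding the next stage. A self-contained sketch of that piping idea, with a hypothetical SketchChain type standing in for Hadoop's ChainRecordReader/ChainRecordWriter machinery:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiFunction;

// Hypothetical stand-in for a mapper chain: each stage transforms key/value
// pairs, and its output list becomes the next stage's input, the way
// ChainMapper pipes stages through chained readers and writers.
class SketchChain {
    private final List<BiFunction<String, String, String[]>> stages = new ArrayList<>();

    SketchChain add(BiFunction<String, String, String[]> stage) {
        stages.add(stage);
        return this;
    }

    // Run every stage in order; stage i reads what stage i-1 wrote.
    List<String[]> run(List<String[]> input) {
        List<String[]> current = input;
        for (BiFunction<String, String, String[]> stage : stages) {
            List<String[]> next = new ArrayList<>();
            for (String[] record : current) {
                next.add(stage.apply(record[0], record[1]));
            }
            current = next;
        }
        return current;
    }
}
```

In the real ChainMapper each stage's run() is invoked with its own wrapped context, but the data flow is the same: the chain's first mapper sees the task input, and each later mapper sees its predecessor's output.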

代码示例来源:origin: ch.cern.hadoop/hadoop-mapreduce-client-core

  1. @SuppressWarnings("unchecked")
  2. void runMapper(TaskInputOutputContext context, int index) throws IOException,
  3. InterruptedException {
  4. Mapper mapper = mappers.get(index);
  5. RecordReader rr = new ChainRecordReader(context);
  6. RecordWriter rw = new ChainRecordWriter(context);
  7. Mapper.Context mapperContext = createMapContext(rr, rw, context,
  8. getConf(index));
  9. mapper.run(mapperContext);
  10. rr.close();
  11. rw.close(context);
  12. }

代码示例来源:origin: org.apache.hadoop/hadoop-mapred

  1. @SuppressWarnings("unchecked")
  2. void runMapper(TaskInputOutputContext context, int index) throws IOException,
  3. InterruptedException {
  4. Mapper mapper = mappers.get(index);
  5. RecordReader rr = new ChainRecordReader(context);
  6. RecordWriter rw = new ChainRecordWriter(context);
  7. Mapper.Context mapperContext = createMapContext(rr, rw, context,
  8. getConf(index));
  9. mapper.run(mapperContext);
  10. rr.close();
  11. rw.close(context);
  12. }

代码示例来源:origin: io.hops/hadoop-mapreduce-client-core

  1. @SuppressWarnings("unchecked")
  2. void runMapper(TaskInputOutputContext context, int index) throws IOException,
  3. InterruptedException {
  4. Mapper mapper = mappers.get(index);
  5. RecordReader rr = new ChainRecordReader(context);
  6. RecordWriter rw = new ChainRecordWriter(context);
  7. Mapper.Context mapperContext = createMapContext(rr, rw, context,
  8. getConf(index));
  9. mapper.run(mapperContext);
  10. rr.close();
  11. rw.close(context);
  12. }

代码示例来源:origin: lintool/warcbase

  1. @SuppressWarnings("unchecked")
  2. void runMapper(TaskInputOutputContext context, int index) throws IOException,
  3. InterruptedException {
  4. Mapper mapper = mappers.get(index);
  5. RecordReader rr = new ChainRecordReader(context);
  6. RecordWriter rw = new ChainRecordWriter(context);
  7. Mapper.Context mapperContext = createMapContext(rr, rw, context,
  8. getConf(index));
  9. mapper.run(mapperContext);
  10. rr.close();
  11. rw.close(context);
  12. }

代码示例来源:origin: io.prestosql.hadoop/hadoop-apache

  1. @SuppressWarnings("unchecked")
  2. void runMapper(TaskInputOutputContext context, int index) throws IOException,
  3. InterruptedException {
  4. Mapper mapper = mappers.get(index);
  5. RecordReader rr = new ChainRecordReader(context);
  6. RecordWriter rw = new ChainRecordWriter(context);
  7. Mapper.Context mapperContext = createMapContext(rr, rw, context,
  8. getConf(index));
  9. mapper.run(mapperContext);
  10. rr.close();
  11. rw.close(context);
  12. }

相关文章