Usage of org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.getNamedOutputValueClass() with code examples

x33g5p2x, reposted 2022-01-25, category: Other

This article collects Java code examples for the org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.getNamedOutputValueClass() method and shows how it is used in practice. The examples were extracted from selected projects on platforms such as GitHub, Stack Overflow, and Maven, so they should serve as useful references. Details of the method:
Package: org.apache.hadoop.mapreduce.lib.output
Class: MultipleOutputs
Method: getNamedOutputValueClass

About MultipleOutputs.getNamedOutputValueClass

No description is available.
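Although no description is given, the behaviour can be inferred from the examples below: MultipleOutputs.addNamedOutput records each named output's value class in the job configuration, and getNamedOutputValueClass reads it back. The following is a minimal stand-alone sketch of that round trip; the `HashMap` and the property key are illustrative stand-ins for Hadoop's `Configuration` and its internal key names, not the real implementation.

```java
import java.util.HashMap;
import java.util.Map;

public class NamedOutputSketch {
    // Stand-in for Hadoop's Configuration: a plain string-to-string map.
    private static final Map<String, String> conf = new HashMap<>();
    // Illustrative key prefix; the real property names are internal to MultipleOutputs.
    private static final String PREFIX = "multipleoutputs.namedOutput.";

    // Roughly what MultipleOutputs.addNamedOutput does for the value class:
    // store the class name under a per-named-output configuration key.
    static void addNamedOutput(String name, Class<?> valueClass) {
        conf.put(PREFIX + name + ".value", valueClass.getName());
    }

    // Roughly what getNamedOutputValueClass does: read the stored name
    // back and resolve it to a Class object.
    static Class<?> getNamedOutputValueClass(String name) {
        try {
            return Class.forName(conf.get(PREFIX + name + ".value"));
        } catch (ClassNotFoundException e) {
            throw new IllegalStateException("unknown named output: " + name, e);
        }
    }

    public static void main(String[] args) {
        addNamedOutput("text", String.class);
        System.out.println(getNamedOutputValueClass("text").getSimpleName()); // prints "String"
    }
}
```

In the real API the value class then flows into job.setOutputValueClass(...) when the per-output task context is built, as the examples below show.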

Code examples

Code example from: io.prestosql.hadoop/hadoop-apache

    private TaskAttemptContext getContext(String nameOutput) throws IOException {
      TaskAttemptContext taskContext = taskContexts.get(nameOutput);
      if (taskContext != null) {
        return taskContext;
      }
      // The following trick leverages the instantiation of a record writer via
      // the job thus supporting arbitrary output formats.
      Job job = Job.getInstance(context.getConfiguration());
      job.setOutputFormatClass(getNamedOutputFormatClass(context, nameOutput));
      job.setOutputKeyClass(getNamedOutputKeyClass(context, nameOutput));
      job.setOutputValueClass(getNamedOutputValueClass(context, nameOutput));
      taskContext = new TaskAttemptContextImpl(job.getConfiguration(), context
        .getTaskAttemptID(), new WrappedStatusReporter(context));
      taskContexts.put(nameOutput, taskContext);
      return taskContext;
    }
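All of the snippets in this article use the same lazy per-name cache: look up the context, return it if present, otherwise build it once and remember it. That pattern is independent of Hadoop and can be sketched on its own; the `ContextCache` class and the string "context" values below are illustrative, not part of the Hadoop API.

```java
import java.util.HashMap;
import java.util.Map;

public class ContextCache {
    private final Map<String, String> contexts = new HashMap<>();
    private int builds = 0; // counts how often the expensive branch runs

    // Same shape as getContext() above: return the cached entry if present,
    // otherwise construct it once and store it for later calls.
    String getContext(String name) {
        String ctx = contexts.get(name);
        if (ctx != null) {
            return ctx;
        }
        builds++; // the costly construction happens only once per name
        ctx = "context-for-" + name;
        contexts.put(name, ctx);
        return ctx;
    }

    int buildCount() {
        return builds;
    }

    public static void main(String[] args) {
        ContextCache cache = new ContextCache();
        cache.getContext("text");
        cache.getContext("text"); // served from the cache
        cache.getContext("seq");
        System.out.println(cache.buildCount()); // prints 2: one build per distinct name
    }
}
```

Building a TaskAttemptContext involves constructing a whole Job object, so caching it per named output avoids repeating that cost on every write to the same named output.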

Code example from: org.apache.hadoop/hadoop-mapred

    private TaskAttemptContext getContext(String nameOutput) throws IOException {
      TaskAttemptContext taskContext = taskContexts.get(nameOutput);
      if (taskContext != null) {
        return taskContext;
      }
      // The following trick leverages the instantiation of a record writer via
      // the job thus supporting arbitrary output formats.
      Job job = new Job(context.getConfiguration());
      job.setOutputFormatClass(getNamedOutputFormatClass(context, nameOutput));
      job.setOutputKeyClass(getNamedOutputKeyClass(context, nameOutput));
      job.setOutputValueClass(getNamedOutputValueClass(context, nameOutput));
      taskContext = new TaskAttemptContextImpl(job.getConfiguration(), context
        .getTaskAttemptID(), new WrappedStatusReporter(context));
      taskContexts.put(nameOutput, taskContext);
      return taskContext;
    }

Code example from: io.hops/hadoop-mapreduce-client-core

    private TaskAttemptContext getContext(String nameOutput) throws IOException {
      TaskAttemptContext taskContext = taskContexts.get(nameOutput);
      if (taskContext != null) {
        return taskContext;
      }
      // The following trick leverages the instantiation of a record writer via
      // the job thus supporting arbitrary output formats.
      Job job = Job.getInstance(context.getConfiguration());
      job.setOutputFormatClass(getNamedOutputFormatClass(context, nameOutput));
      job.setOutputKeyClass(getNamedOutputKeyClass(context, nameOutput));
      job.setOutputValueClass(getNamedOutputValueClass(context, nameOutput));
      taskContext = new TaskAttemptContextImpl(job.getConfiguration(), context
        .getTaskAttemptID(), new WrappedStatusReporter(context));
      taskContexts.put(nameOutput, taskContext);
      return taskContext;
    }

Code example from: ch.cern.hadoop/hadoop-mapreduce-client-core

    private TaskAttemptContext getContext(String nameOutput) throws IOException {
      TaskAttemptContext taskContext = taskContexts.get(nameOutput);
      if (taskContext != null) {
        return taskContext;
      }
      // The following trick leverages the instantiation of a record writer via
      // the job thus supporting arbitrary output formats.
      Job job = Job.getInstance(context.getConfiguration());
      job.setOutputFormatClass(getNamedOutputFormatClass(context, nameOutput));
      job.setOutputKeyClass(getNamedOutputKeyClass(context, nameOutput));
      job.setOutputValueClass(getNamedOutputValueClass(context, nameOutput));
      taskContext = new TaskAttemptContextImpl(job.getConfiguration(), context
        .getTaskAttemptID(), new WrappedStatusReporter(context));
      taskContexts.put(nameOutput, taskContext);
      return taskContext;
    }

Code example from: com.github.jiayuhan-it/hadoop-mapreduce-client-core

    private TaskAttemptContext getContext(String nameOutput) throws IOException {
      TaskAttemptContext taskContext = taskContexts.get(nameOutput);
      if (taskContext != null) {
        return taskContext;
      }
      // The following trick leverages the instantiation of a record writer via
      // the job thus supporting arbitrary output formats.
      Job job = Job.getInstance(context.getConfiguration());
      job.setOutputFormatClass(getNamedOutputFormatClass(context, nameOutput));
      job.setOutputKeyClass(getNamedOutputKeyClass(context, nameOutput));
      job.setOutputValueClass(getNamedOutputValueClass(context, nameOutput));
      taskContext = new TaskAttemptContextImpl(job.getConfiguration(), context
        .getTaskAttemptID(), new WrappedStatusReporter(context));
      taskContexts.put(nameOutput, taskContext);
      return taskContext;
    }
