I am working on AWS, trying to create an Oozie workflow for a map-only job using Hue. I chose the mapreduce action for it. After trying many approaches I still cannot get it to work, even though the same job runs fine from the CLI.
I created a directory named mapreduce dir in HDFS and placed driver.java and mapper.java in it. Under mapreduce dir I created a lib directory and put my runnable jar in it. I have attached a screenshot of the Hue interface.
Either I am missing something, or I can't seem to place the runnable jar in the right location.
I would also like to pass an extra argument, besides the input and output directories, in Hue. How can I do that?
My suspicion lies in this warning: 2015-11-06 14:56:57,679 WARN [main] org.apache.hadoop.mapreduce.JobSubmitter: No job jar file set. User classes may not be found. See Job or Job#setJar(String).
When I try to view the Oozie action log, I get the following message: No tasks found for job job_1446129655727_0306.
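For context: Oozie adds any jar placed in the workflow's lib/ directory to the task classpath, so my understanding is that the action Hue generates should correspond roughly to the sketch below (the action name, paths, and values are placeholders of mine, not the exact XML Hue produces):

```xml
<action name="decompress">
    <map-reduce>
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <!-- tell Oozie to use the new mapreduce API -->
            <property>
                <name>mapred.mapper.new-api</name>
                <value>true</value>
            </property>
            <property>
                <name>mapreduce.job.map.class</name>
                <value>DecompressMapper</value>
            </property>
            <!-- map-only job -->
            <property>
                <name>mapreduce.job.reduces</name>
                <value>0</value>
            </property>
            <property>
                <name>mapreduce.input.fileinputformat.inputdir</name>
                <value>${inputDir}</value>
            </property>
            <property>
                <name>mapreduce.output.fileoutputformat.outputdir</name>
                <value>${outputDir}</value>
            </property>
        </configuration>
    </map-reduce>
    <ok to="end"/>
    <error to="fail"/>
</action>
```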
### Update 1
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

/*
 * Driver class to decompress the zip files.
 */
public class DecompressJob extends Configured implements Tool {

    public static void main(String[] args) throws Exception {
        int res = ToolRunner.run(new Configuration(), new DecompressJob(), args);
        System.exit(res);
    }

    public int run(String[] args) throws Exception {
        // Use the configuration injected by ToolRunner instead of creating a
        // new one, so -D options and Oozie-supplied settings are preserved.
        Configuration conf = getConf();
        conf.set("unzip_files", args[2]);

        Job job = Job.getInstance(conf);
        job.setJobName("decompress job");

        // Delete the output directory if it already exists.
        FileSystem fs = FileSystem.get(conf);
        Path output = new Path(args[1]);
        if (fs.exists(output)) {
            fs.delete(output, true);
        }

        job.setJarByClass(DecompressJob.class);
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);
        job.setMapperClass(DecompressMapper.class);
        job.setNumReduceTasks(0); // map-only job

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, output);

        return job.waitForCompletion(true) ? 0 : 1;
    }
}
I also updated the screenshot, adding a few more properties. Posting the error log as well:
2015-11-07 02:43:31,074 INFO [main] org.apache.hadoop.mapred.Task: Using ResourceCalculatorProcessTree : [ ]
2015-11-07 02:43:31,110 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.lang.RuntimeException: java.lang.ClassNotFoundException: Class /user/Ajay/rad_unzip/DecompressMapper.class not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2074)
at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:751)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:171)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:166)
Caused by: java.lang.ClassNotFoundException: Class /user/uname/out/DecompressMapper.class not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1980)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2072)
... 8 more
2015-11-07 02:43:31,114 INFO [main] org.apache.hadoop.mapred.Task: Runnning cleanup for the task
2015-11-07 02:43:31,125 WARN [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not delete hdfs://uname/out/output/_temporary/1/_temporary/attempt_1446129655727_0336_m_000001_1
1 Answer
You should put the driver and the mapper in the same jar. To pass a new parameter, just click "Add property" and give it an arbitrary property name and value. In your MR program you can then fetch the value with the getConf().get("propertyName") method.
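A minimal sketch of the mapper side, assuming the property name "unzip_files" set in the driver above (the class body is illustrative, not the asker's actual code):

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class DecompressMapper extends Mapper<LongWritable, Text, LongWritable, Text> {

    private String unzipFiles;

    @Override
    protected void setup(Context context) {
        // Values added via Hue's "Add property" (or Configuration.set in the
        // driver) are available from the task-side configuration.
        unzipFiles = context.getConfiguration().get("unzip_files");
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // ... decompression logic would use unzipFiles here ...
        context.write(key, value);
    }
}
```

Note that task-side code reads the property from context.getConfiguration(), while getConf() works in the driver because it extends Configured.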