I'm using HVPI, an open-source Hadoop video processing interface, to process video with Hadoop and MapReduce in fully distributed mode. I split the video into frames, and I want to use the Xuggler API to build a new video from those frames. The map phase completes fine, but the reduce phase fails with java.lang.RuntimeException: error Operação não permitida (Operation not permitted). This happens because I'm trying to write the new video into a local directory on the master node, and I don't know how to do this on HDFS instead.
17/03/25 08:07:12 INFO client.RMProxy: Connecting to ResourceManager at evoido/192.168.25.11:8032
17/03/25 08:07:13 INFO Configuration.deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
17/03/25 08:07:13 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
17/03/25 08:29:50 INFO input.FileInputFormat: Total input paths to process : 1
17/03/25 08:29:51 INFO mapreduce.JobSubmitter: number of splits:1
17/03/25 08:29:51 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1490439401793_0001
17/03/25 08:29:52 INFO impl.YarnClientImpl: Submitted application application_1490439401793_0001
17/03/25 08:29:52 INFO mapreduce.Job: The url to track the job: http://evoido:8088/proxy/application_1490439401793_0001/
17/03/25 08:29:52 INFO mapreduce.Job: Running job: job_1490439401793_0001
17/03/25 08:30:28 INFO mapreduce.Job: Job job_1490439401793_0001 running in uber mode : false
17/03/25 08:30:28 INFO mapreduce.Job: map 0% reduce 0%
17/03/25 08:30:52 INFO mapreduce.Job: map 100% reduce 0%
17/03/25 08:30:52 INFO mapreduce.Job: Task Id : attempt_1490439401793_0001_m_000000_0, Status : FAILED
17/03/25 08:30:54 INFO mapreduce.Job: map 0% reduce 0%
17/03/25 08:37:40 INFO mapreduce.Job: map 68% reduce 0%
17/03/25 08:37:43 INFO mapreduce.Job: map 69% reduce 0%
17/03/25 08:37:52 INFO mapreduce.Job: map 73% reduce 0%
17/03/25 08:38:30 INFO mapreduce.Job: map 82% reduce 0%
17/03/25 08:39:26 INFO mapreduce.Job: map 100% reduce 0%
17/03/25 08:40:36 INFO mapreduce.Job: map 100% reduce 67%
17/03/25 08:40:39 INFO mapreduce.Job: Task Id : attempt_1490439401793_0001_r_000000_0, Status : FAILED
Error: java.lang.RuntimeException: error Operação não permitida, failed to write trailer to /home/idobrt/Vídeos/Result/
at com.xuggle.mediatool.MediaWriter.close(MediaWriter.java:1306)
at ads.ifba.edu.tcc.util.MediaWriter.close(MediaWriter.java:97)
at edu.bupt.videodatacenter.input.VideoRecordWriter.close(VideoRecordWriter.java:61)
at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.close(ReduceTask.java:550)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:629)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Here is my VideoRecordWriter implementation:
public class VideoRecordWriter extends RecordWriter<Text, ImageWritable> {

    private FileSystem fs;

    @Override
    public void close(TaskAttemptContext job) throws IOException, InterruptedException {
        Configuration conf = job.getConfiguration();
        Path outputPath = new Path(conf.get("mapred.output.dir"));
        fs = outputPath.getFileSystem(conf);
        // Finalize the container; this is where the trailer write fails.
        MediaWriter.initialize().close();
        //fs.copyFromLocalFile(new Path(MediaWriter.initialize().getVideoPath()), outputPath);
        fs.close();
    }

    @Override
    public void write(Text key, ImageWritable img) throws IOException, InterruptedException {
        //System.out.println("Key value: " + key.toString());
        // Size the video from the first frame, create the container once, then encode.
        MediaWriter.initialize().setDimentions(img.getBufferedImage());
        MediaWriter.initialize().creaVideoContainer();
        MediaWriter.initialize().create(img.getBufferedImage());
    }
}
public class MediaWriter {

    private MediaWriter() {
    }

    public static MediaWriter initialize() throws IOException {
        if (instance == null) {
            instance = new MediaWriter();
            /*
            fs = FileSystem.get(new Configuration());
            outputStream = fs.create(new Path("hdfs://evoido:9000/video/teste.mp4"));
            containerFormat = IContainerFormat.make();
            containerFormat.setOutputFormat("mpeg4", null, "video/ogg");
            writer.getContainer().setFormat(containerFormat);
            writer = ToolFactory.makeWriter(XugglerIO.map(outputStream));
            */
        }
        return instance;
    }

    // Takes the video dimensions from the first frame seen.
    public void setDimentions(BufferedImage img) {
        if ((WIDTH == 0) && (HEIGHT == 0)) {
            WIDTH = img.getWidth();
            HEIGHT = img.getHeight();
        }
    }

    public void setFileName(Text key) {
        if (fileName == null) {
            fileName = key.toString();
            VIDEO_NAME += fileName.substring(0, (fileName.lastIndexOf("_") - 4)) + ".mp4";
        }
    }

    public void creaVideoContainer() throws IOException {
        if (writer == null) {
            writer = ToolFactory.makeWriter(VIDEO_NAME);
            /*
            fs = FileSystem.get(new Configuration());
            outputStream = fs.create(new Path("hdfs://evoido:9000/video/teste.mp4"));
            containerFormat = IContainerFormat.make();
            containerFormat.setOutputFormat("mpeg4", null, "video/ogg");
            */
            // NOTE: containerFormat is still null here because its setup above is commented out.
            writer.getContainer().setFormat(containerFormat);
            writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_MPEG4, WIDTH, HEIGHT);
        }
    }

    public void create(BufferedImage img) {
        // We still need to figure out how to set the timestamp correctly.
        if (offset == 0) {
            offset = calcTimeStamp();
        }
        writer.encodeVideo(0, img, timeStamp, TimeUnit.NANOSECONDS);
        timeStamp += offset;
    }

    public void close() {
        writer.close();
    }

    public String getVideoPath() {
        return VIDEO_NAME;
    }

    public void setTime(long interval) {
        time += interval;
    }

    public void setQtdFrame(long frameNum) {
        qtdFrame = frameNum;
    }

    // Estimates the per-frame timestamp increment from the total duration and frame count.
    public long calcTimeStamp() {
        double interval = 0.0;
        double timeLong = Math.round(time / CONST);
        double result = (time / (double) qtdFrame) * 1000.0;
        if ((timeLong > 3600) && ((time % qtdFrame) != 0)) {
            interval = 1000.0;
            double overplus = timeLong / 3600.0;
            if (overplus >= 2) {
                interval *= overplus;
            }
            result += interval;
        }
        return (long) Math.round(result);
    }

    public void setFramerate(double frameR) {
        if (frameRate == 0) {
            frameRate = frameR;
        }
    }

    private static IMediaWriter writer;
    private static long nextFrameTime = 0;
    private static FileSystem fs;
    private static OutputStream outputStream;
    private static MediaWriter instance;
    private static IContainerFormat containerFormat;
    private static String VIDEO_NAME = "/home/idobrt/Vídeos/Result/";
    private static int WIDTH = 0;
    private static int HEIGHT = 0;
    private static String fileName = null;
    private static long timeStamp = 0;
    private static double time = 0;
    private static long qtdFrame = 0;
    private static long offset = 0;
    private static long startTime = 0;
    private static double frameRate = 0;
    private static double CONST = 1000000.0;
    private static double INTERVAL = 1000.0;
}
The problem is really just writer = ToolFactory.makeWriter(VIDEO_NAME);, because VIDEO_NAME is a local directory on the namenode. Does anyone know the right way to do this? I suppose the right way is to write the file to HDFS. If the job ran with the LocalJobRunner it would work, but I would lose the parallelism.
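For reference, here is a minimal sketch of the HDFS-streaming approach that the commented-out code above hints at. It assumes the hdfs://evoido:9000/video/teste.mp4 path from that code and Xuggler's XugglerIO.map(OutputStream) helper, and it sets the container format explicitly, before adding streams, since Xuggler can no longer infer the format from a file extension:

FileSystem fs = FileSystem.get(new Configuration());
OutputStream outputStream = fs.create(new Path("hdfs://evoido:9000/video/teste.mp4"));
// Map the HDFS output stream to a pseudo-URL that Xuggler can write to.
IMediaWriter writer = ToolFactory.makeWriter(XugglerIO.map(outputStream));
// With no file extension to inspect, the container format must be set
// explicitly, and before any stream is added.
IContainerFormat containerFormat = IContainerFormat.make();
containerFormat.setOutputFormat("mpeg4", null, "video/ogg");
writer.getContainer().setFormat(containerFormat);
writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_MPEG4, WIDTH, HEIGHT); // WIDTH/HEIGHT as in MediaWriter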
1 Answer
For now, I just save the file locally on the datanode where the reduce phase runs, and then copy that file to HDFS. It's not the best solution, but it works for the moment.
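That workaround is essentially the line that is commented out in VideoRecordWriter.close() in the question; a minimal sketch, assuming MediaWriter.getVideoPath() points at the finished local file:

@Override
public void close(TaskAttemptContext job) throws IOException, InterruptedException {
    Configuration conf = job.getConfiguration();
    Path outputPath = new Path(conf.get("mapred.output.dir"));
    FileSystem fs = outputPath.getFileSystem(conf);
    // Finish the video on the local disk of the node running the reducer...
    MediaWriter.initialize().close();
    // ...then copy it into the job's HDFS output directory.
    fs.copyFromLocalFile(new Path(MediaWriter.initialize().getVideoPath()), outputPath);
    // Note: fs is not closed here; FileSystem instances are cached and shared.
}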