I installed Hadoop in fully distributed mode, with one master node and three slave nodes. I am trying to execute a jar file named Tasks.jar, which takes arg[0] as the input directory and arg[1] as the output directory.
In my Hadoop environment, the input files are in the /input directory, and there is no /output directory.
I verified this with the hadoop fs -ls / command.
Now, when I try to execute the jar file with the following command:
hadoop jar Tasks.jar ProgrammingAssignment/Tasks /input /output
I get the following exception:
ubuntu@ip-172-31-5-213:~$ hadoop jar Tasks.jar ProgrammingAssignment/Tasks /input /output
16/10/14 02:26:23 INFO client.RMProxy: Connecting to ResourceManager at ec2-52-55-2-64.compute-1.amazonaws.com/172.31.5.213:8032
Exception in thread "main" org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://ec2-52-55-2-64.compute-1.amazonaws.com:9000/input already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at ProgrammingAssignment.Tasks.main(Tasks.java:96)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
2 Answers
From the trace, the output directory your code passes to the job resolves to hdfs://ec2-52-55-2-64.compute-1.amazonaws.com:9000/input, which is why this exception is thrown: the output path points at a directory that already exists. You need to change the output directory in your code, or rename the input/output directories in HDFS so they no longer collide.
Make sure /input is passed as the input directory and not as the output directory. Judging by the exception, which is raised from
org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs,
/input is being treated as the output directory.
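Since Tasks.java is not shown in the question, the following is only a minimal sketch of how the path wiring in a standard MapReduce driver is expected to look (class name, job name, and the mapper/reducer setup omitted here are illustrative assumptions, and the Hadoop client libraries must be on the classpath, so this is not runnable standalone). The point both answers make is visible in the two marked lines: args[0] must feed FileInputFormat and args[1] must feed FileOutputFormat; if they are swapped, checkOutputSpecs() validates the existing /input as the output path and throws exactly the FileAlreadyExistsException above.

```java
// Hypothetical driver sketch -- assumes the Hadoop client libraries are on
// the classpath. Names and structure are illustrative, not the asker's code.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Tasks {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "tasks");
        job.setJarByClass(Tasks.class);

        // args[0] is the INPUT directory, args[1] is the OUTPUT directory.
        // If these two calls are swapped, checkOutputSpecs() sees the
        // existing /input as the output path and throws
        // FileAlreadyExistsException at job submission time.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Note also that the output directory must not exist before the job runs; if a previous run left one behind, remove it first with `hadoop fs -rm -r /output` before resubmitting.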