Error running jobs with Hadoop 3.2.1 and YARN

Asked by mfuanj7w on 2021-05-31, in Hadoop

I'm having trouble running a MapReduce job with Hadoop and YARN. Below is the execution log for this command:

  hadoop jar /opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.2.1.jar grep /user/hadoop output2 'dfs[a-z.]+'
  2020-04-19 01:37:36,960 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
  2020-04-19 01:37:37,864 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/hadoop/.staging/job_1587269102661_0005
  2020-04-19 01:37:38,136 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
  2020-04-19 01:37:38,299 INFO input.FileInputFormat: Total input files to process : 10
  2020-04-19 01:37:38,368 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
  2020-04-19 01:37:38,431 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
  2020-04-19 01:37:38,446 INFO mapreduce.JobSubmitter: number of splits:10
  2020-04-19 01:37:38,635 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
  2020-04-19 01:37:38,656 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1587269102661_0005
  2020-04-19 01:37:38,657 INFO mapreduce.JobSubmitter: Executing with tokens: []
  2020-04-19 01:37:38,920 INFO conf.Configuration: resource-types.xml not found
  2020-04-19 01:37:38,920 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
  2020-04-19 01:37:39,001 INFO impl.YarnClientImpl: Submitted application application_1587269102661_0005
  2020-04-19 01:37:39,064 INFO mapreduce.Job: The url to track the job: http://lucas-Inspiron-N4050:8088/proxy/application_1587269102661_0005/
  2020-04-19 01:37:39,065 INFO mapreduce.Job: Running job: job_1587269102661_0005
  2020-04-19 01:37:42,105 INFO mapreduce.Job: Job job_1587269102661_0005 running in uber mode : false
  2020-04-19 01:37:42,106 INFO mapreduce.Job: map 0% reduce 0%
  2020-04-19 01:37:42,124 INFO mapreduce.Job: Job job_1587269102661_0005 failed with state FAILED due to: Application application_1587269102661_0005 failed 2 times due to AM Container for appattempt_1587269102661_0005_000002 exited with exitCode: 127
  Failing this attempt.Diagnostics: [2020-04-19 01:37:41.103]Exception from container-launch.
  Container id: container_1587269102661_0005_02_000001
  Exit code: 127
  [2020-04-19 01:37:41.106]Container exited with a non-zero exit code 127. Error file: prelaunch.err.
  Last 4096 bytes of prelaunch.err :
  Last 4096 bytes of stderr :
  /bin/bash: /bin/java: No such file or directory
  [2020-04-19 01:37:41.106]Container exited with a non-zero exit code 127. Error file: prelaunch.err.
  Last 4096 bytes of prelaunch.err :
  Last 4096 bytes of stderr :
  /bin/bash: /bin/java: No such file or directory
  For more detailed output, check the application tracking page: http://lucas-Inspiron-N4050:8088/cluster/app/application_1587269102661_0005 Then click on links to logs of each attempt.
  . Failing the application.
  2020-04-19 01:37:42,150 INFO mapreduce.Job: Counters: 0
  2020-04-19 01:37:42,184 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
  2020-04-19 01:37:42,229 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/hadoop/.staging/job_1587269102661_0006
  2020-04-19 01:37:42,286 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
  2020-04-19 01:37:42,336 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/hadoop/.staging/job_1587269102661_0006
  org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://localhost:9000/user/hadoop/grep-temp-280692674
  at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:332)
  at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:274)
  at org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat.listStatus(SequenceFileInputFormat.java:59)
  at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:396)
  at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:310)
  at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:327)
  at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:200)
  at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
  at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:422)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
  at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
  at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
  at org.apache.hadoop.examples.Grep.run(Grep.java:94)
  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
  at org.apache.hadoop.examples.Grep.main(Grep.java:103)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
  at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
  at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
  at org.apache.hadoop.util.RunJar.main(RunJar.java:236)

My environment variables:

  export JAVA_HOME=/opt/jdkhadoop
  export PATH=$PATH:$JAVA_HOME/bin
  export HADOOP_HOME=/opt/hadoop
  export HADOOP_INSTALL=$HADOOP_HOME
  export HADOOP_COMMON_HOME=$HADOOP_HOME
  export HADOOP_MAPRED_HOME=$HADOOP_HOME
  export HADOOP_HDFS_HOME=$HADOOP_HOME
  export YARN_HOME=$HADOOP_HOME
  export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
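For context on the container error above (`/bin/bash: /bin/java: No such file or directory`, exit code 127): YARN containers are not launched from a login shell, so exports from a shell profile are not necessarily visible to them. A common setup, sketched here as an assumption based on the JDK path in my exports (`/opt/jdkhadoop`), is to also set JAVA_HOME directly in Hadoop's own environment file:

```shell
# Config fragment for $HADOOP_HOME/etc/hadoop/hadoop-env.sh
# (i.e. /opt/hadoop/etc/hadoop/hadoop-env.sh in this setup).
# Assumes the JDK actually lives at /opt/jdkhadoop, as in the exports above.
export JAVA_HOME=/opt/jdkhadoop

# Alternative workaround sometimes suggested: symlink java to the exact
# path the container tried to execute. Hypothetical, verify $JAVA_HOME first.
# sudo ln -s "$JAVA_HOME/bin/java" /bin/java
```

After changing hadoop-env.sh, the daemons would need a restart (stop-yarn.sh / start-yarn.sh) for containers to pick up the new value.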

I'm using Hadoop 3.2.1. Can anyone help me?
