Java: specified destination directory does not exist

xzlaal3s  posted 2021-05-29 in Hadoop

I am trying to create a workflow using the Oozie dashboard that the Hue interface provides. To take it step by step, my workflow has only one Java action. The relevant code for this Java step is the following:

```java
import java.io.IOException;
import java.util.List;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class InputPathsCalculator {
    private static final Logger LOGGER = LoggerFactory.getLogger(InputPathsCalculator.class);

    public static void main(String[] args) throws IOException {
        System.out.println("sout-ing");
        LOGGER.info("putting something in the log");

        JobConf jobConf = new JobConf();
        jobConf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

        // fileSystem is used below but was not declared in the posted snippet;
        // presumably it was obtained along these lines:
        FileSystem fileSystem = FileSystem.get(jobConf);

        Path outputPath = new Path(args[1]);
        List<Path> inputPaths = calculateInputPaths(args[0], jobConf); // implemented elsewhere, tested separately

        FileUtil.copy(fileSystem,
                      inputPaths.toArray(new Path[0]),
                      fileSystem,
                      outputPath,
                      false,   // deleteSource
                      true,    // overwrite
                      jobConf);
    }
}
```
`calculateInputPaths(...)` is a method that has been tested separately and works fine. The arguments I pass to it are a configuration file and a `String` with the value `/usr/myUser/outputs/`.

I have two problems:
1. I can't see anything in the logs: neither what I print to the console nor what I write through the logger.
2. The `outputs` directory exists, but I get the following stack trace:

```
org.apache.oozie.action.hadoop.JavaMainException: java.io.IOException: `/user/eliasg/outputs/output': specified destination directory does not exist
	at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:58)
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:39)
	at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:36)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:226)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.IOException: `/user/eliasg/outputs/output': specified destination directory does not exist
	at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:306)
	at com.ig.hadoop.jsonextractor.InputPathsCalculator.main(InputPathsCalculator.java:37)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:55)
	... 15 more
```

I think that for point 2 my `jobConf` is missing something that would let it work with HDFS, but I have no idea what. As for point 1, I am completely lost.
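For what it's worth, a quick way to check what filesystem the `JobConf` actually resolves to could look roughly like this (just an illustrative sketch, not part of the original job; the class name is made up):

```java
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;

public class FsCheck {
    public static void main(String[] args) throws Exception {
        JobConf jobConf = new JobConf();
        jobConf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

        // If this prints file:///, scheme-less paths like /user/... are being
        // resolved against the local filesystem rather than HDFS.
        System.out.println(FileSystem.get(jobConf).getUri());
    }
}
```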

0sgqnhkj 1#

I was looking in the wrong place. The logs are there, but apparently I had to look at the map task container logs, not at the application's log list.
I found out that adding `hdfs-site.xml` to the `jobConf` is not enough. There is a property that is not enabled by default and needs to be set. That property is: `jobConf.set("fs.default.name", String.format("hdfs://%1$s", jobConf.get("dfs.nameservices")));`
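Put together, a minimal sketch of that fix might look like the following (the class name and the printed check are illustrative; `fs.default.name` is the key given above, with `fs.defaultFS` being its newer equivalent):

```java
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;

public class JobConfFix {  // illustrative class name
    public static void main(String[] args) throws Exception {
        JobConf jobConf = new JobConf();
        jobConf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

        // Point the default filesystem at the HDFS nameservice declared in hdfs-site.xml.
        // Without this, scheme-less paths such as /user/... resolve against the local
        // filesystem, which is why the destination directory appeared not to exist.
        jobConf.set("fs.default.name",
                String.format("hdfs://%1$s", jobConf.get("dfs.nameservices")));

        // Sanity check: should now print hdfs://<nameservice> instead of file:///
        FileSystem fs = FileSystem.get(jobConf);
        System.out.println("Default filesystem: " + fs.getUri());
    }
}
```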
