Java — problem when using the new Distributed Cache API

wfauudbj · published 2021-06-02 in Hadoop

I am trying to run a Hadoop program that uses the new Distributed Cache API, and I am stuck on the error messages below.

14/11/04 10:54:36 WARN fs.FileUtil: Command 'ln -s /tmp/hadoop-hduser/mapred/local/1415078671812/normal_small /home/yogi/Desktop/normal_small' failed 1 with: ln: failed to create symbolic link ‘/home/yogi/Desktop/normal_small’: Permission denied

14/11/04 10:54:36 WARN mapred.LocalDistributedCacheManager: Failed to create symlink: /tmp/hadoop-hduser/mapred/local/1415078671812/normal_small <- /home/yogi/Desktop/normal_small

java.io.FileNotFoundException: hdfs:/master:54310/usr/local/hadoop/input/normal_small (No such file or directory)

I never mention /home/yogi/Desktop/normal_small anywhere in my code, so I cannot understand where it is trying to access that file from.
Also, how should I specify the input file path in the driver class to resolve the FileNotFoundException?
Below are snippets from my mapper and driver classes:
Mapper:

private BufferedReader in = null;
private FileReader fr = null;
private List<String> list = new ArrayList<String>();

@Override
protected void setup(Context context)
        throws IOException, InterruptedException {
    Configuration conf = context.getConfiguration();
    URI[] cacheFiles = context.getCacheFiles();

    try {
        fr = new FileReader(cacheFiles[0].toString());
        in = new BufferedReader(fr);
        String str;
        while ((str = in.readLine()) != null) {
            list.add(str);
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        // Guard against NullPointerException when the open itself failed.
        if (in != null) {
            in.close();
        }
        if (fr != null) {
            fr.close();
        }
    }
}
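One likely cause (my assumption; the post does not confirm it): `cacheFiles[0].toString()` yields the full `hdfs://` URI, and `FileReader` only understands local file paths, so the open fails. The framework symlinks each cache file into the task's working directory under its base name (or the URI's `#` fragment, if one was given), so the mapper can open it as a plain local file. A minimal sketch that derives that local name from the cache URI (class and method names here are hypothetical):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.net.URI;

public class CacheFileName {
    // Returns the name under which the framework symlinks the cache file
    // into the task's working directory: the "#fragment" alias when one
    // was supplied, otherwise the file's base name.
    static String localName(URI cacheFile) {
        if (cacheFile.getFragment() != null) {
            return cacheFile.getFragment();
        }
        String path = cacheFile.getPath();
        return path.substring(path.lastIndexOf('/') + 1);
    }

    // Opens the cached file as a local file via its symlink name,
    // instead of passing the full hdfs:// URI to FileReader.
    static BufferedReader open(URI cacheFile) throws IOException {
        return new BufferedReader(new FileReader(localName(cacheFile)));
    }
}
```

With this approach, `setup()` would call `CacheFileName.open(cacheFiles[0])` rather than `new FileReader(cacheFiles[0].toString())`.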

public void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {

    FileOutputStream fos = new FileOutputStream("output");
    ObjectOutputStream oos = new ObjectOutputStream(fos);
    oos.writeObject(list); // write the cached list to "output"

    BufferedReader br = new BufferedReader(new FileReader("output"));
    String line = br.readLine();
    .........
}

Driver:

Job job = Job.getInstance(getConf());
job.setJobName("wordcount");
job.setJarByClass(driver.class);        
job.addCacheFile(new Path("hdfs://master:54310/usr/local/hadoop/input/normal_small").toUri());
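A common driver-side pattern (a sketch using the path from the post; `DriverSketch` and `cacheUri` are my own names) is to add the cache file as a `java.net.URI` with a `#` fragment, which names the task-local symlink the framework creates:

```java
import java.net.URI;

public class DriverSketch {
    // Builds the URI that would be handed to job.addCacheFile(uri).
    // The part after '#' becomes the name of the local symlink created
    // in each task's working directory.
    static URI cacheUri(String hdfsFile, String linkName) {
        return URI.create(hdfsFile + "#" + linkName);
    }

    public static void main(String[] args) {
        URI uri = cacheUri("hdfs://master:54310/usr/local/hadoop/input/normal_small",
                           "normal_small");
        // In the driver: job.addCacheFile(uri);
        System.out.println(uri.getFragment()); // prints "normal_small"
    }
}
```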

ijxebb2r1#

When a file is added to the distributed cache, a temporary directory is created for it. Change the ownership of that directory to the current user.
