I'm trying to run a basic crawl. The command from the NutchTutorial is: bin/crawl urls -dir crawl -depth 3 -topN 5
(after completing all the setup steps)
I'm running on Windows, so I installed Cygwin64 as the runtime environment.
When I run bin/nutch from the Nutch home directory I see no problems, but when I try to run the crawl as above, I get the following error:
Injector: starting at 2014-11-29 11:31:35
Injector: crawlDb: -dir/crawldb
Injector: urlDir: urls
Injector: Converting injected urls to crawl db entries.
Injector: java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Eran\mapred\staging\Eran996102549\.staging to 0700
        at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:691)
        at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:664)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:514)
        at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:349)
        at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:193)
        at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:126)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:942)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Unknown Source)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:910)
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1353)
        at org.apache.nutch.crawl.Injector.inject(Injector.java:324)
        at org.apache.nutch.crawl.Injector.run(Injector.java:380)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.nutch.crawl.Injector.main(Injector.java:370)
The tutorial doesn't mention this error. What should I do?
1 Answer
This is a permissions problem. You should grant read, write, and execute permissions on the folder that the
<name>hadoop.tmp.dir</name>
property points to (its value is set in the Hadoop configuration file core-site.xml). Hope this helps,
勒库克多
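As a sketch of what the answer describes: the property lives in Hadoop's core-site.xml, and you would point it at a directory your Cygwin user fully owns. The path below is an assumed example, not taken from the original answer:

```xml
<!-- conf/core-site.xml: point Hadoop's scratch space at a directory
     the current user controls. The path is an assumed example. -->
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/tmp/hadoop-${user.name}</value>
    <description>Base directory for Hadoop's temporary files.</description>
  </property>
</configuration>
```

After editing the file, you can grant the permissions from the Cygwin shell with something like `chmod -R 700 /tmp/hadoop-<your-user>` (hypothetical path; substitute the directory you configured above).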