I'm trying to work through the Hadoop MapReduce "WordCount" tutorial. I copied the source file exactly as given (except that I omitted the package declaration), but I don't think the problem is with the program code itself; it seems to be with the way Hadoop is set up on my machine. Here is the command I ran (from the hadoop-2.5.1/bin directory):
hadoop jar ../../TestProgram/HadoopTest.jar WordCount ../../TestProgram/input ../../TestProgram/output2
The exception is:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
Here is the full output:
'C:\Program' is not recognized as an internal or external command,
operable program or batch file.
14/10/31 13:52:30 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/10/31 13:52:30 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
14/10/31 13:52:30 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
14/10/31 13:52:30 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
14/10/31 13:52:30 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
14/10/31 13:52:31 INFO mapred.FileInputFormat: Total input paths to process : 2
14/10/31 13:52:31 INFO mapreduce.JobSubmitter: number of splits:2
14/10/31 13:52:31 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_local600999744_0001
14/10/31 13:52:31 WARN conf.Configuration: file:/tmp/hadoop-Kenny/mapred/staging/Kenny600999744/.staging/job_local600999744_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.
14/10/31 13:52:31 WARN conf.Configuration: file:/tmp/hadoop-Kenny/mapred/staging/Kenny600999744/.staging/job_local600999744_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.
14/10/31 13:52:31 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-Kenny/mapred/staging/Kenny600999744/.staging/job_local600999744_0001
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:570)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:173)
at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:160)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:94)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:285)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:344)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:115)
at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:131)
at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:163)
at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:731)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:432)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:833)
at WordCount.main(WordCount.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-common-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-app-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-common-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-core-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-hs-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-hs-plugins-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-jobclient-2.5.1-tests.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-jobclient-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-shuffle-2.5.1.jar]: it still exists.
I've heard that similar problems can be caused by environment variables not being set correctly, so I added this at the beginning of hadoop-env.cmd:
set HADOOP_PREFIX=C:\hadoop\hadoop-2.5.1
set HADOOP_HOME=%HADOOP_PREFIX%
set HADOOP_CONF_DIR=%HADOOP_PREFIX%\etc\hadoop
set YARN_CONF_DIR=%HADOOP_CONF_DIR%
set PATH=%PATH%;%HADOOP_PREFIX%\bin
I also set these variables manually before running the command, but I still get the same error. Does anyone know what I'm doing wrong?
1 Answer
The "'C:\Program' is not recognized as an internal or external command..." message means that the command fails when it expands your JAVA_HOME path, because the path contains a space.
Try something like C:\Progra~1\... (the 8.3 short name) instead of C:\Program Files.
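For example, a minimal sketch of that change in hadoop-env.cmd; the JDK folder name below is only an assumption and should be replaced with your actual installation path:
rem Use the 8.3 short path so JAVA_HOME contains no spaces.
rem The JDK version/folder here is just an example -- adjust to your install.
set JAVA_HOME=C:\Progra~1\Java\jdk1.7.0_71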
Even with that fixed, I believe it will still throw the native library exception, because it won't be able to find the winutils.exe and hadoop.dll files in the bin folder.
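As a rough sketch of the usual fix, assuming you have obtained winutils.exe and hadoop.dll built for Hadoop 2.5.1 on your architecture (the download folder below is just a placeholder):
rem Copy the Windows native binaries into Hadoop's bin directory.
copy C:\Downloads\hadoop-win-native\winutils.exe %HADOOP_HOME%\bin
copy C:\Downloads\hadoop-win-native\hadoop.dll %HADOOP_HOME%\bin
rem Make sure that bin directory is on PATH so the JVM can load hadoop.dll.
set PATH=%PATH%;%HADOOP_HOME%\bin
Open a new command prompt afterwards so the updated environment takes effect before rerunning the job.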