GridGain error: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/JobContext

Asked by i1icjdpr on 2021-06-03 in Hadoop

I have configured gridgain-hadoop-os-6.6.2.zip and followed the steps described in docs/hadoop_readme.pdf. I started GridGain with the bin/ggstart.sh command, and now I am running a simple wordcount job on GridGain with hadoop-2.2.0, using the command:

  hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/*-mapreduce-examples-*.jar wordcount /input /output

Steps I tried:
Step 1: Extracted hadoop-2.2.0 and gridgain-hadoop-os-6.6.2.zip into the /usr/local folder and renamed the GridGain folder to "gridgain".
Step 2: Set export GRIDGAIN_HOME=/usr/local/gridgain, plus the paths for hadoop-2.2.0 and JAVA_HOME:

  # Set Hadoop-related environment variables
  export HADOOP_PREFIX=/usr/local/hadoop-2.2.0
  export HADOOP_HOME=/usr/local/hadoop-2.2.0
  export HADOOP_MAPRED_HOME=/usr/local/hadoop-2.2.0
  export HADOOP_COMMON_HOME=/usr/local/hadoop-2.2.0
  export HADOOP_HDFS_HOME=/usr/local/hadoop-2.2.0
  export YARN_HOME=/usr/local/hadoop-2.2.0
  export HADOOP_CONF_DIR=/usr/local/hadoop-2.2.0/etc/hadoop
  export GRIDGAIN_HADOOP_CLASSPATH='/usr/local/hadoop-2.2.0/lib/*:/usr/local/hadoop-2.2.0/lib/*:/usr/local/hadoop-2.2.0/lib/*'

Step 3:
Now I ran the command bin/setup-hadoop.sh ... and answered each prompt.
Step 4:
Started GridGain with the command bin/ggstart.sh.
Step 5:
Then I created the directory and uploaded a file using:

  hadoop fs -mkdir /input
  hadoop fs -copyFromLocal $HADOOP_HOME/README.txt /input/WORD_COUNT_ME.txt

Step 6:
I get an error when running this command:

  hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/*-mapreduce-examples-*.jar wordcount /input /output

The error is:

  15/02/22 12:49:13 INFO Configuration.deprecation: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
  15/02/22 12:49:13 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_091ebfbd-2993-475f-a506-28280dbbf891_0002
  15/02/22 12:49:13 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/hduser/.staging/job_091ebfbd-2993-475f-a506-28280dbbf891_0002
  java.lang.NullPointerException
      at org.gridgain.client.hadoop.GridHadoopClientProtocol.processStatus(GridHadoopClientProtocol.java:329)
      at org.gridgain.client.hadoop.GridHadoopClientProtocol.submitJob(GridHadoopClientProtocol.java:115)
      at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:430)
      at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
      at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:415)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
      at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
      at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
      at org.apache.hadoop.examples.WordCount.main(WordCount.java:84)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:606)
      at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
      at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
      at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:606)
      at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

The GridGain console error is:

  sLdrId=a0b8610bb41-091ebfbd-2993-475f-a506-28280dbbf891, userVer=0, loc=true, sampleClsName=java.lang.String, pendingUndeploy=false, undeployed=false, usage=0]], taskClsName=o.g.g.kernal.processors.hadoop.proto.GridHadoopProtocolSubmitJobTask, sesId=e129610bb41-091ebfbd-2993-475f-a506-28280dbbf891, startTime=1424589553332, endTime=9223372036854775807, taskNodeId=091ebfbd-2993-475f-a506-28280dbbf891, clsLdr=sun.misc.Launcher$AppClassLoader@1bdcbb2, closed=false, cpSpi=null, failSpi=null, loadSpi=null, usage=1, fullSup=false, subjId=091ebfbd-2993-475f-a506-28280dbbf891], jobId=f129610bb41-091ebfbd-2993-475f-a506-28280dbbf891]]
  java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/JobContext
      at java.lang.Class.getDeclaredConstructors0(Native Method)
      at java.lang.Class.privateGetDeclaredConstructors(Class.java:2585)
      at java.lang.Class.getConstructor0(Class.java:2885)
      at java.lang.Class.getConstructor(Class.java:1723)
      at org.gridgain.grid.hadoop.GridHadoopDefaultJobInfo.createJob(GridHadoopDefaultJobInfo.java:107)
      at org.gridgain.grid.kernal.processors.hadoop.jobtracker.GridHadoopJobTracker.job(GridHadoopJobTracker.java:959)
      at org.gridgain.grid.kernal.processors.hadoop.jobtracker.GridHadoopJobTracker.submit(GridHadoopJobTracker.java:222)
      at org.gridgain.grid.kernal.processors.hadoop.GridHadoopProcessor.submit(GridHadoopProcessor.java:188)
      at org.gridgain.grid.kernal.processors.hadoop.GridHadoopImpl.submit(GridHadoopImpl.java:73)
      at org.gridgain.grid.kernal.processors.hadoop.proto.GridHadoopProtocolSubmitJobTask.run(GridHadoopProtocolSubmitJobTask.java:54)
      at org.gridgain.grid.kernal.processors.hadoop.proto.GridHadoopProtocolSubmitJobTask.run(GridHadoopProtocolSubmitJobTask.java:37)
      at org.gridgain.grid.kernal.processors.hadoop.proto.GridHadoopProtocolTaskAdapter$Job.execute(GridHadoopProtocolTaskAdapter.java:95)
      at org.gridgain.grid.kernal.processors.job.GridJobWorker$2.call(GridJobWorker.java:484)
      at org.gridgain.grid.util.GridUtils.wrapThreadLoader(GridUtils.java:6136)
      at org.gridgain.grid.kernal.processors.job.GridJobWorker.execute0(GridJobWorker.java:478)
      at org.gridgain.grid.kernal.processors.job.GridJobWorker.body(GridJobWorker.java:429)
      at org.gridgain.grid.util.worker.GridWorker.run(GridWorker.java:151)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      at java.lang.Thread.run(Thread.java:745)
  Caused by: java.lang.ClassNotFoundException: Failed to load class: org.apache.hadoop.mapreduce.JobContext
      at org.gridgain.grid.kernal.processors.hadoop.GridHadoopClassLoader.loadClass(GridHadoopClassLoader.java:125)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
      ... 20 more
  Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.JobContext
      at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
      at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
      at java.security.AccessController.doPrivileged(Native Method)
      at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
      at org.gridgain.grid.kernal.processors.hadoop.GridHadoopClassLoader.loadClassExplicitly(GridHadoopClassLoader.java:196)
      at org.gridgain.grid.kernal.processors.hadoop.GridHadoopClassLoader.loadClass(GridHadoopClassLoader.java:106)
      ... 21 more

Please help....
EDIT:

  raj@ubuntu:~$ hadoop classpath
  /usr/local/hadoop-2.2.0/etc/hadoop:/usr/local/hadoop-2.2.0/share/hadoop/common/lib/*:/usr/local/hadoop-2.2.0/share/hadoop/common/*:/usr/local/hadoop-2.2.0/share/hadoop/hdfs:/usr/local/hadoop-2.2.0/share/hadoop/hdfs/lib/*:/usr/local/hadoop-2.2.0/share/hadoop/hdfs/*:/usr/local/hadoop-2.2.0/share/hadoop/yarn/lib/*:/usr/local/hadoop-2.2.0/share/hadoop/yarn/*:/usr/local/hadoop-2.2.0/share/hadoop/mapreduce/lib/*:/usr/local/hadoop-2.2.0/share/hadoop/mapreduce/*:/usr/local/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
  raj@ubuntu:~$ jps
  3529 GridCommandLineStartup
  3646 Jps
  raj@ubuntu:~$ echo $GRIDGAIN_HOME
  /usr/local/gridgain
  raj@ubuntu:~$ echo $HADOOP_HOME
  /usr/local/hadoop-2.2.0
  raj@ubuntu:~$ hadoop version
  Hadoop 2.2.0
  Subversion https://svn.apache.org/repos/asf/hadoop/common -r 1529768
  Compiled by hortonmu on 2013-10-07T06:28Z
  Compiled with protoc 2.5.0
  From source with checksum 79e53ce7994d1628b240f09af91e1af4
  This command was run using /usr/local/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar
  raj@ubuntu:~$ cd /usr/local/hadoop-2.2.0/share/hadoop/mapreduce
  raj@ubuntu:/usr/local/hadoop-2.2.0/share/hadoop/mapreduce$ ls
  hadoop-mapreduce-client-app-2.2.0.jar hadoop-mapreduce-client-hs-2.2.0.jar hadoop-mapreduce-client-jobclient-2.2.0-tests.jar lib
  hadoop-mapreduce-client-common-2.2.0.jar hadoop-mapreduce-client-hs-plugins-2.2.0.jar hadoop-mapreduce-client-shuffle-2.2.0.jar lib-examples
  hadoop-mapreduce-client-core-2.2.0.jar hadoop-mapreduce-client-jobclient-2.2.0.jar hadoop-mapreduce-examples-2.2.0.jar sources
  raj@ubuntu:/usr/local/hadoop-2.2.0/share/hadoop/mapreduce$

Answer 1 (ndh0cuux):

Thanks a lot, Ivan, for your help and support. The solution you gave got me past this problem nicely.
The issue was that I had set the other Hadoop-related environment variables. These alone are enough:

  JAVA_HOME, HADOOP_HOME and GRIDGAIN_HOME
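
For reference, a minimal ~/.bashrc fragment that keeps only these variables might look like the sketch below; the JAVA_HOME path is an assumption for illustration, so adjust it to your local JDK:

  # Minimal environment for running Hadoop jobs on GridGain
  export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64   # illustrative path, not from the original post
  export HADOOP_HOME=/usr/local/hadoop-2.2.0
  export GRIDGAIN_HOME=/usr/local/gridgain
  # Put the hadoop and GridGain scripts on PATH
  export PATH=$PATH:$HADOOP_HOME/bin:$GRIDGAIN_HOME/bin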

Answer 2 (a0x5cqrl):

I configured exactly the versions you mention (gridgain-hadoop-os-6.6.2.zip + hadoop-2.2.0), and the "wordcount" example runs fine.
[UPD after analyzing the question author's logs:]
Raju, thanks for the detailed logs. The cause of the problem is that the environment variables are set incorrectly:

  export HADOOP_MAPRED_HOME=${HADOOP_HOME}
  export HADOOP_COMMON_HOME=${HADOOP_HOME}
  export HADOOP_HDFS_HOME=${HADOOP_HOME}

You explicitly set all of these variables to the ${HADOOP_HOME} value, which is wrong. This causes GG to compose an incorrect Hadoop classpath, as the GG node log below shows:

  +++ HADOOP_PREFIX=/usr/local/hadoop-2.2.0
  +++ [[ -z /usr/local/hadoop-2.2.0 ]]
  +++ '[' -z /usr/local/hadoop-2.2.0 ']'
  +++ HADOOP_COMMON_HOME=/usr/local/hadoop-2.2.0
  +++ HADOOP_HDFS_HOME=/usr/local/hadoop-2.2.0
  +++ HADOOP_MAPRED_HOME=/usr/local/hadoop-2.2.0
  +++ GRIDGAIN_HADOOP_CLASSPATH='/usr/local/hadoop-2.2.0/lib/*:/usr/local/hadoop-2.2.0/lib/*:/usr/local/hadoop-2.2.0/lib/*'

So, to fix the problem, do not set the unnecessary env variables. JAVA_HOME and HADOOP_HOME are enough; nothing else is needed.
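
As a rough sketch of the cleanup (assuming the extra exports live in your current shell or profile), one possible sequence removes them, restarts the GridGain node so it recomputes the Hadoop classpath, and re-runs the example:

  # Remove the extra variables from the current shell, and also delete the
  # corresponding export lines from ~/.bashrc (or wherever they were set)
  unset HADOOP_PREFIX HADOOP_MAPRED_HOME HADOOP_COMMON_HOME \
        HADOOP_HDFS_HOME YARN_HOME HADOOP_CONF_DIR GRIDGAIN_HADOOP_CLASSPATH

  # Restart the GridGain node (in its own terminal, as before),
  # then re-run the wordcount example from another terminal
  $GRIDGAIN_HOME/bin/ggstart.sh
  hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/*-mapreduce-examples-*.jar wordcount /input /output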
