Hadoop error when starting ResourceManager and NodeManager

Asked by rqmkfv5c on 2021-06-01 in Hadoop

I am trying to set up Hadoop 3 (alpha3) as a single-node cluster (pseudo-distributed), following the Apache guide. I tried to run the example MapReduce job, but the connection is refused every time. After running sbin/start-all.sh I see these exceptions in the ResourceManager log (and likewise in the NodeManager log):

xxxx-xx-xx xx:xx:xx,xxx INFO org.apache.commons.beanutils.FluentPropertyBeanIntrospector: Error when creating PropertyDescriptor for public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)! Ignoring this property.
xxxx-xx-xx xx:xx:xx,xxx DEBUG org.apache.commons.beanutils.FluentPropertyBeanIntrospector: Exception is:
java.beans.IntrospectionException: bad write method arg count: public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)
    at java.desktop/java.beans.PropertyDescriptor.findPropertyType(PropertyDescriptor.java:696)
    at java.desktop/java.beans.PropertyDescriptor.setWriteMethod(PropertyDescriptor.java:356)
    at java.desktop/java.beans.PropertyDescriptor.<init>(PropertyDescriptor.java:142)
    at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.createFluentPropertyDescritor(FluentPropertyBeanIntrospector.java:178)
    at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.introspect(FluentPropertyBeanIntrospector.java:141)
    at org.apache.commons.beanutils.PropertyUtilsBean.fetchIntrospectionData(PropertyUtilsBean.java:2245)
    at org.apache.commons.beanutils.PropertyUtilsBean.getIntrospectionData(PropertyUtilsBean.java:2226)
    at org.apache.commons.beanutils.PropertyUtilsBean.getPropertyDescriptor(PropertyUtilsBean.java:954)
    at org.apache.commons.beanutils.PropertyUtilsBean.isWriteable(PropertyUtilsBean.java:1478)
    at org.apache.commons.configuration2.beanutils.BeanHelper.isPropertyWriteable(BeanHelper.java:521)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initProperty(BeanHelper.java:357)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBeanProperties(BeanHelper.java:273)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBean(BeanHelper.java:192)
    at org.apache.commons.configuration2.beanutils.BeanHelper$BeanCreationContextImpl.initBean(BeanHelper.java:669)
    at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.initBeanInstance(DefaultBeanFactory.java:162)
    at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.createBean(DefaultBeanFactory.java:116)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:459)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:479)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:492)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResultInstance(BasicConfigurationBuilder.java:447)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResult(BasicConfigurationBuilder.java:417)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.getConfiguration(BasicConfigurationBuilder.java:285)
    at org.apache.hadoop.metrics2.impl.MetricsConfig.loadFirst(MetricsConfig.java:119)
    at org.apache.hadoop.metrics2.impl.MetricsConfig.create(MetricsConfig.java:98)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configure(MetricsSystemImpl.java:478)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(MetricsSystemImpl.java:188)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:163)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.init(DefaultMetricsSystem.java:62)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.initialize(DefaultMetricsSystem.java:58)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceInit(ResourceManager.java:678)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.createAndInitActiveServices(ResourceManager.java:1129)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceInit(ResourceManager.java:315)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1407)

And then later in the file:

xxxx-xx-xx xx:xx:xx,xxx FATAL org.apache.hadoop.yarn.server.resourcemanager.ResourceManager: Error starting ResourceManager
java.lang.ExceptionInInitializerError
    at com.google.inject.internal.cglib.reflect.$FastClassEmitter.<init>(FastClassEmitter.java:67)
    at com.google.inject.internal.cglib.reflect.$FastClass$Generator.generateClass(FastClass.java:72)
    at com.google.inject.internal.cglib.core.$DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:25)
    at com.google.inject.internal.cglib.core.$AbstractClassGenerator.create(AbstractClassGenerator.java:216)
    at com.google.inject.internal.cglib.reflect.$FastClass$Generator.create(FastClass.java:64)
    at com.google.inject.internal.BytecodeGen.newFastClass(BytecodeGen.java:204)
    at com.google.inject.internal.ProviderMethod$FastClassProviderMethod.<init>(ProviderMethod.java:256)
    at com.google.inject.internal.ProviderMethod.create(ProviderMethod.java:71)
    at com.google.inject.internal.ProviderMethodsModule.createProviderMethod(ProviderMethodsModule.java:275)
    at com.google.inject.internal.ProviderMethodsModule.getProviderMethods(ProviderMethodsModule.java:144)
    at com.google.inject.internal.ProviderMethodsModule.configure(ProviderMethodsModule.java:123)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:349)
    at com.google.inject.AbstractModule.install(AbstractModule.java:122)
    at com.google.inject.servlet.ServletModule.configure(ServletModule.java:52)
    at com.google.inject.AbstractModule.configure(AbstractModule.java:62)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340)
    at com.google.inject.spi.Elements.getElements(Elements.java:110)
    at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:138)
    at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:104)
    at com.google.inject.Guice.createInjector(Guice.java:96)
    at com.google.inject.Guice.createInjector(Guice.java:73)
    at com.google.inject.Guice.createInjector(Guice.java:62)
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.build(WebApps.java:332)
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:377)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1116)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1218)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1408)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make protected final java.lang.Class java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain) throws java.lang.ClassFormatError accessible: module java.base does not "opens java.lang" to unnamed module @173f73e7
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:337)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:281)
    at java.base/java.lang.reflect.Method.checkCanSetAccessible(Method.java:197)
    at java.base/java.lang.reflect.Method.setAccessible(Method.java:191)
    at com.google.inject.internal.cglib.core.$ReflectUtils$2.run(ReflectUtils.java:56)
    at java.base/java.security.AccessController.doPrivileged(Native Method)
    at com.google.inject.internal.cglib.core.$ReflectUtils.<clinit>(ReflectUtils.java:46)
    ... 29 more
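The Caused by at the bottom is the JDK 9+ module system refusing the reflective ClassLoader.defineClass call made by Guice's bundled cglib. The clean fix is to run Hadoop on Java 8, but for reference, JDK 9+ has a standard --add-opens flag for exactly this kind of refusal. As a sketch only (whether this is sufficient for this Hadoop build is an assumption, not something the thread verified), it could be passed via hadoop-env.sh:

```shell
# hadoop-env.sh — workaround sketch, NOT verified against Hadoop 3 alpha3.
# --add-opens is a standard JDK 9+ option; it opens java.lang to unnamed
# modules so cglib's setAccessible call on defineClass is permitted.
export HADOOP_OPTS="${HADOOP_OPTS} --add-opens java.base/java.lang=ALL-UNNAMED"
```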

Here is my core-site.xml:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
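As a side note, fs.default.name is the deprecated spelling of this key; Hadoop 2+ still honors it but logs a deprecation warning, and it is unrelated to the exceptions above. The current equivalent would be:

```xml
<!-- equivalent core-site.xml using the non-deprecated key fs.defaultFS -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```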

hdfs-site.xml:

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

mapred-site.xml:

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

And yarn-site.xml:

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.env-whitelist</name>
    <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
  </property>
</configuration>

I have no idea what is causing these exceptions; any help would be appreciated.
EDIT: adding hadoop-env.sh:

export JAVA_HOME=/usr/local/jdk-9
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_OS_TYPE=${HADOOP_OS_TYPE:-$(uname -s)}
case ${HADOOP_OS_TYPE} in
  Darwin*)
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.realm= "
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.kdc= "
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.conf= "
  ;;
esac
export HADOOP_ROOT_LOGGER=DEBUG,console
export HADOOP_DAEMON_ROOT_LOGGER=DEBUG,RFA
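Note the very first line: JAVA_HOME points at a JDK 9 install (/usr/local/jdk-9), which matches the module-system InaccessibleObjectException in the ResourceManager log. A sketch of the corrected line, assuming the Java 8 JDK were unpacked to /usr/local/jdk1.8.0_181 (that path is a hypothetical example; use wherever your JDK 8 actually lives):

```shell
# hadoop-env.sh — point Hadoop at a Java 8 JDK instead of JDK 9.
# /usr/local/jdk1.8.0_181 is an assumed install path; adjust to yours.
export JAVA_HOME=/usr/local/jdk1.8.0_181
```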
Answer 1 (ztmd8pv5):

My problem was that I was using Java 11 with Hadoop.
So what I did was:
1. rm /Library/Java/* (remove the existing JDKs)
2. Download Java 8: https://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html
3. Install the Java 8 JDK, and
4. Fix JAVA_HOME in hadoop-env.sh
5. stop-all.sh
6. start-dfs.sh
7. start-yarn.sh
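The steps above as a single sketch (the /Library/Java path is the answerer's macOS layout and the daemon scripts are assumed to live in $HADOOP_HOME/sbin; adjust for your system, and note step 1 is destructive):

```
# Sketch of answer 1's steps; run from $HADOOP_HOME.
sudo rm -rf /Library/Java/JavaVirtualMachines/*   # remove the old JDKs (destructive!)
# ...install the Java 8 JDK from the Oracle download page,
# fix JAVA_HOME in etc/hadoop/hadoop-env.sh, then restart:
sbin/stop-all.sh      # stop any half-started daemons
sbin/start-dfs.sh     # bring up NameNode / DataNode / SecondaryNameNode
sbin/start-yarn.sh    # bring up ResourceManager / NodeManager
```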

Answer 2 (yptwkmov):

As @tk421 mentioned in the comments, Java 9 is not yet compatible with Hadoop 3 (and possibly not with any Hadoop version).
https://issues.apache.org/jira/browse/hadoop-11123
I changed to Java 8u181 and now both daemons start:

hadoop@hadoop:/usr/local/hadoop$ sbin/start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as hadoop in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
Starting datanodes
Starting secondary namenodes [hadoop]
Starting resourcemanager
Starting nodemanagers
hadoop@hadoop:/usr/local/hadoop$ jps
8756 SecondaryNameNode
8389 NameNode
9173 NodeManager
9030 ResourceManager
8535 DataNode
9515 Jps
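A quick way to confirm which Java generation is on the PATH before restarting: pre-9 JDKs report themselves as "1.8.x" in the `java -version` banner, while 9+ reports the major version directly. A small sketch of parsing that banner (the banner string below is a sample, not output captured from this machine):

```shell
# Extract the major Java version from a `java -version` banner line.
# In practice you would capture it with: banner=$(java -version 2>&1 | head -n1)
banner='java version "1.8.0_181"'
major=$(printf '%s\n' "$banner" | sed -E 's/.*"([0-9]+)\..*/\1/')
# pre-9 JDKs report "1.x", so map 1.8 -> 8
if [ "$major" = "1" ]; then
  major=$(printf '%s\n' "$banner" | sed -E 's/.*"1\.([0-9]+).*/\1/')
fi
echo "$major"   # prints 8 for this sample banner; a JDK 11 banner yields 11
```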