Hadoop: Could not find or load main class

j5fpnvbx · posted on 2021-05-29 · in Hadoop
Follow (0) | Answers (2) | Views (795)

I tried to install Hadoop following this video:
https://www.youtube.com/watch?v=ctohsz0sb1e&t=126s
When I ran the last command,

start-all.sh

I got this message:

This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: namenode running as process 6283. Stop it first.
localhost: starting datanode, logging to /home/myname/hadoop-2.7.3/logs/hadoop-myname-datanode-MYNAME.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: secondarynamenode running as process 6379. Stop it first.
starting yarn daemons
starting resourcemanager, logging to /home/myname/hadoop-2.7.3/logs/yarn-myname-resourcemanager-MYNAME.out
Error: Could not find or load main class org.apache.hadoop.yarn.server.resourcemanager.ResourceManager
localhost: starting nodemanager, logging to /home/myname/hadoop-2.7.3/logs/yarn-myname-nodemanager-MYNAME.out
localhost: Error: Could not find or load main class org.apache.hadoop.yarn.server.nodemanager.NodeManager
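"Could not find or load main class" for the ResourceManager/NodeManager usually means the YARN server jars are not on the classpath, often because the tarball was not fully extracted. A quick check can be sketched like this (`check_yarn_jars` is a helper name invented here; the path assumes the standard 2.7.3 tarball layout):

```shell
# Sketch: verify the YARN jars were actually extracted under the install.
# If nothing is found, re-download or re-extract the Hadoop tarball.
check_yarn_jars() {
  local install="$1"
  if ls "$install"/share/hadoop/yarn/*.jar >/dev/null 2>&1; then
    echo "yarn jars found"
  else
    echo "yarn jars missing - re-download or re-extract the tarball"
  fi
}
check_yarn_jars /home/myname/hadoop-2.7.3
```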

My .bashrc file:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HADOOP_INSTALL=/home/myname/hadoop-2.7.3
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL 
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib/native"
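Note that these exports only take effect in shells started after the edit; in the current shell you have to reload the file. A minimal sanity check, with the exports repeated inline here so the snippet is self-contained (in practice, just `source ~/.bashrc`):

```shell
# Simulating the relevant .bashrc exports so the check stands alone;
# in a real session, run `source ~/.bashrc` instead of re-exporting.
export HADOOP_INSTALL=/home/myname/hadoop-2.7.3
export YARN_HOME=$HADOOP_INSTALL
export PATH=$PATH:$HADOOP_INSTALL/sbin
echo "$YARN_HOME"   # expect /home/myname/hadoop-2.7.3
```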

My hdfs-site.xml:

<configuration>  
 <property>  
  <name>dfs.replication</name>  
  <value>1</value>  
  <description>Default block replication.  
  The actual number of replications can be specified when the file is created.   
  The default is used if replication is not specified in create time.  
  </description>  
 </property>  
 <property>
  <name>dfs.namenode.name.dir</name>
  <value>file:/home/myname/hadoop-2.7.3/etc/hadoop/hadoop_store/hdfs/namenode</value>
 </property>
 <property>
  <name>dfs.datanode.data.dir</name>
  <value>file:/home/myname/hadoop-2.7.3/etc/hadoop/hadoop_store/hdfs/datanode</value>
 </property>
</configuration>
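One thing worth checking with this config: the `dfs.namenode.name.dir` and `dfs.datanode.data.dir` directories must exist before the first start, and the NameNode must be formatted once. A sketch using the exact paths from the hdfs-site.xml above (keeping HDFS storage under `etc/hadoop` is unusual but works):

```shell
# Create the storage directories referenced in hdfs-site.xml.
HDFS_STORE=/home/myname/hadoop-2.7.3/etc/hadoop/hadoop_store/hdfs
mkdir -p "$HDFS_STORE/namenode" "$HDFS_STORE/datanode"
# One-time initialization (destroys existing HDFS metadata - first run only):
# hdfs namenode -format
```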

My core-site.xml:

<configuration>  
 <property>  
  <name>hadoop.tmp.dir</name>  
  <value>/home/myname/hadoop-2.7.3/tmp</value>  
  <description>A base for other temporary directories.</description>  
 </property>  

 <property>  
  <name>fs.default.name</name>  
  <value>hdfs://localhost:54310</value>  
  <description>The name of the default file system.  A URI whose  
  scheme and authority determine the FileSystem implementation.  The  
  uri's scheme determines the config property (fs.SCHEME.impl) naming  
  the FileSystem implementation class.  The uri's authority is used to  
  determine the host, port, etc. for a filesystem.</description>    
 </property>  
</configuration>
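A side note: `fs.default.name` still works in Hadoop 2.7.3 but is deprecated; the current key is `fs.defaultFS`. It is not the cause of the error here, but using the newer name avoids the deprecation warning in the logs. An equivalent property block would be:

```xml
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:54310</value>
</property>
```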

My mapred-site.xml:

<configuration>  
 <property>  
  <name>mapred.job.tracker</name>  
  <value>localhost:54311</value>  
  <description>The host and port that the MapReduce job tracker runs  
  at.  If "local", then jobs are run in-process as a single map  
  and reduce task.  
  </description>  
 </property>  
</configuration>

I have tried many things, but the error is still there.
Any ideas?


b1payxdu1#

Add the following line to your .bashrc file:

export HADOOP_PREFIX=/path_to_hadoop_location
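Here `path_to_hadoop_location` is a placeholder for the actual install directory. With the path from the question, it would look like this (followed by a reload and a daemon restart, shown as comments since they depend on a running cluster):

```shell
# Using the install path from the question (replace with your own):
export HADOOP_PREFIX=/home/myname/hadoop-2.7.3
echo "$HADOOP_PREFIX"
# Then pick up the change and restart the YARN daemons:
#   source ~/.bashrc
#   stop-yarn.sh && start-yarn.sh
```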

hwazgwia2#

When configuring Hadoop, you must also set up the yarn-site.xml file:

<configuration>
 <property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
 </property>
</configuration>

Also add this to mapred-site.xml:

 <property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
 </property>

I think adding these properties should solve the problem.
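Before restarting the daemons, it can help to confirm the property actually made it into the config file. A small sketch (`has_property` is a helper name invented here; it is demonstrated against an inline copy of the yarn-site.xml snippet rather than the real file):

```shell
# Check whether a <name>...</name> entry appears in a Hadoop config file.
has_property() { grep -q "<name>$2</name>" "$1" && echo yes || echo no; }

# Demo on an inline copy of the yarn-site.xml snippet:
cat > /tmp/yarn-site-check.xml <<'EOF'
<configuration>
 <property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
 </property>
</configuration>
EOF
has_property /tmp/yarn-site-check.xml yarn.nodemanager.aux-services
```

If it prints `no`, the edit went to the wrong file (e.g. a second copy of the config directory). After that, restart YARN with `stop-yarn.sh` and `start-yarn.sh` so the new properties are picked up.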
