Java cannot start the NameNode after running start-dfs.sh (Hadoop 2.7.1)

While setting up a local pseudo-distributed Hadoop environment, I get the following error when I try to start my NameNode with start-dfs.sh:

"Could not find or load main class org.apache.hadoop.hdfs.tools.GetConf"

My Java version is as follows:

java version "1.7.0_85"
OpenJDK Runtime Environment (IcedTea 2.6.1) (7u85-2.6.1-5ubuntu0.14.04.1)
OpenJDK 64-Bit Server VM (build 24.85-b03, mixed mode)

I also changed the following line in hadoop-env.sh under /usr/local/hadoop-2.7.1/etc/hadoop:

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

In etc/hadoop/core-site.xml, I put:

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

In etc/hadoop/hdfs-site.xml, I put:

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

I also edited my /home/hduser/.bashrc file, adding the following lines (all paths are correct):

#HADOOP VARIABLES START
export HADOOP_PREFIX =/usr/local/hadoop-2.7.1
export HADOOP_HOME=/usr/local/hadoop-2.7.1
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=${HADOOP_HOME}
export HADOOP_COMMON_HOME=${HADOOP_HOME}
export HADOOP_HDFS_HOME=${HADOOP_HOME}
export YARN_HOME=${HADOOP_HOME}
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_PREFIX}/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_PREFIX}/lib/native"
export HADOOP_CLASSPATH=$JAVA_HOME/lib/tools.jar
#HADOOP VARIABLES END

When I run start-dfs.sh, only the DataNode shows up; when I run start-all.sh, only the NodeManager and DataNode show up:

6098 NodeManager
5691 DataNode
6267 Jps

Nothing is shown at http://localhost:*****/ either.


1 answer in total

  1. # Answer #1

    First format the NameNode with the command hadoop namenode -format, then try starting it from the terminal with ./hadoop-daemon.sh start namenode.
    Use the jps command to check that it is running.
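
    A minimal sketch of that sequence (assuming Hadoop 2.7.1 is installed under /usr/local/hadoop-2.7.1, as in the question) might look like this:

    # format HDFS metadata for a fresh pseudo-distributed setup
    # (this erases any existing NameNode metadata)
    cd /usr/local/hadoop-2.7.1
    bin/hadoop namenode -format

    # start only the NameNode daemon, then verify it with jps
    sbin/hadoop-daemon.sh start namenode
    jps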

    core-site.xml:

    <configuration> 
      <property> 
        <name>fs.default.name</name> 
        <value>hdfs://localhost:9000</value> 
      </property> 
    </configuration> 
    

    hdfs-site.xml:

    <configuration> 
     <property> 
      <name>dfs.replication</name> 
      <value>1</value> 
     </property> 
     <property> 
      <name>dfs.namenode.name.dir</name> 
      <value>/path/hadoop/namenode</value> 
     </property> 
     <property> 
      <name>dfs.datanode.data.dir</name> 
      <value>/path/hadoop/datanode</value> 
    </property> 
    </configuration>
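
    One practical note: dfs.namenode.name.dir and dfs.datanode.data.dir point at local directories, and the user running HDFS needs write access to them. A small sketch, keeping the placeholder /path/hadoop/... paths from the answer and assuming the daemons run as hduser (the user from the question):

    # create the metadata/data directories referenced in hdfs-site.xml
    # and make them writable by the HDFS user (hduser is an assumption)
    sudo mkdir -p /path/hadoop/namenode /path/hadoop/datanode
    sudo chown -R hduser:hduser /path/hadoop

    # re-format the NameNode so it initializes the new name directory,
    # then restart HDFS and check which daemons came up
    cd /usr/local/hadoop-2.7.1
    bin/hadoop namenode -format
    sbin/start-dfs.sh
    jps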