Cannot find start-all.sh in Hadoop installation

Date: 2016-03-11 11:38:10

Tags: hadoop installation ubuntu-14.04

I am trying to set up Hadoop on my local machine and am following this. I have also set the Hadoop home.

This is the command I am trying to run:

hduser@ubuntu:~$ /usr/local/hadoop/bin/start-all.sh

This is the error I get:

-su: /usr/local/hadoop/bin/start-all.sh: No such file or directory

Here is what I added to my $HOME/.bashrc file:

# Set Hadoop-related environment variables
export HADOOP_HOME=/usr/local/hadoop

# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/lib/jvm/java-8-oracle

# Some convenient aliases and functions for running Hadoop-related commands
unalias fs &> /dev/null
alias fs="hadoop fs"
unalias hls &> /dev/null
alias hls="fs -ls"

# If you have LZO compression enabled in your Hadoop cluster and
# compress job outputs with LZOP (not covered in this tutorial):
# Conveniently inspect an LZOP compressed file from the command
# line; run via:
#
# $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
#
# Requires installed 'lzop' command.
#
lzohead () {
    hadoop fs -cat "$1" | lzop -dc | head -1000 | less
}

# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/bin
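
The edits to .bashrc only take effect in a new shell; a minimal sanity check after reloading it (a sketch, assuming the paths above):

# Reload the updated .bashrc in the current shell
source ~/.bashrc

# Verify the variables resolve to the expected paths
echo $HADOOP_HOME   # expected: /usr/local/hadoop
echo $JAVA_HOME     # expected: /usr/lib/jvm/java-8-oracle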

EDIT: After trying the solution given by mahendra, I get the following output:

This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-mmt-HP-ProBook-430-G3.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-mmt-HP-ProBook-430-G3.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-mmt-HP-ProBook-430-G3.out
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-mmt-HP-ProBook-430-G3.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-mmt-HP-ProBook-430-G3.out
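
The deprecation notice means the combined launcher still works, but Hadoop 2.x prefers starting HDFS and YARN separately; a minimal sketch of that sequence (same /usr/local/hadoop layout assumed):

# Start the HDFS daemons (NameNode, DataNode, SecondaryNameNode)
/usr/local/hadoop/sbin/start-dfs.sh

# Start the YARN daemons (ResourceManager, NodeManager)
/usr/local/hadoop/sbin/start-yarn.sh

# List running Java daemons (jps ships with the JDK) to confirm everything came up
jps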

1 Answer:

Answer 0 (score: 8)

Try running:

hduser@ubuntu:~$ /usr/local/hadoop/sbin/start-all.sh

This is because start-all.sh and stop-all.sh are located in the sbin directory, while the hadoop binary is located in the bin directory.
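
You can confirm this split on your own installation; a quick check (paths assumed from the question):

# The start/stop scripts live under sbin/
ls /usr/local/hadoop/sbin | grep -E 'start|stop'

# The hadoop binary itself lives under bin/
ls /usr/local/hadoop/bin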

Also update your .bashrc:

export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

so that you can access start-all.sh directly.
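
With sbin on the PATH (after re-sourcing .bashrc or opening a new shell), the script resolves by name; for example:

which start-all.sh   # expected: /usr/local/hadoop/sbin/start-all.sh
start-all.sh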