Hadoop: error when running start-all.sh

Time: 2016-03-17 01:11:09

Tags: hadoop

When I run start-all.sh, the following error occurs:

yunweiguo@172.16.192.134's password: 
172.16.192.135: bash: line 0: cd: /Users/yunweiguo/hadoop/hadoop-1.2.1/libexec/..: No such file or directory
172.16.192.135: bash: /Users/yunweiguo/hadoop/hadoop-1.2.1/bin/hadoop-daemon.sh: No such file or directory
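The log shows that the slave node 172.16.192.135 cannot find the Hadoop installation at the same path the master uses. Before touching environment variables, it may help to confirm whether the install actually exists at that path on each node. A minimal sketch, using the path copied from the error log above (`HADOOP_PREFIX` is just a name introduced here for illustration):

```shell
# Hypothetical check: does the daemon script exist where start-all.sh expects it?
# Run this on (or via ssh to) each slave node; the path is from the error log.
HADOOP_PREFIX=/Users/yunweiguo/hadoop/hadoop-1.2.1
if [ -x "$HADOOP_PREFIX/bin/hadoop-daemon.sh" ]; then
    echo "hadoop-daemon.sh found"
else
    echo "hadoop-daemon.sh missing"
fi
```

If the script is missing, Hadoop needs to be installed at the same absolute path on every node, because start-all.sh invokes the same script paths on each slave over ssh.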

1 answer:

Answer 0 (score: 0)

I suspect your .bashrc is not set up correctly. Follow these steps:

Run `vi $HOME/.bashrc` and add the following lines at the end of the file (change the Hadoop home to your own installation path):

    # Set Hadoop-related environment variables
    export HADOOP_HOME=/usr/local/hadoop

    # Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
    export JAVA_HOME=/usr/lib/jvm/java-6-sun

    # Some convenient aliases and functions for running Hadoop-related commands
    unalias fs &> /dev/null
    alias fs="hadoop fs"
    unalias hls &> /dev/null
    alias hls="fs -ls"

    # If you have LZO compression enabled in your Hadoop cluster and
    # compress job outputs with LZOP (not covered in this tutorial):
    # Conveniently inspect an LZOP compressed file from the command
    # line; run via:
    #
    # $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
    #
    # Requires installed 'lzop' command.
    lzohead () {
        hadoop fs -cat "$1" | lzop -dc | head -1000 | less
    }

    # Add Hadoop bin/ directory to PATH
    export PATH=$PATH:$HADOOP_HOME/bin