Hadoop start-all.sh error: No such file or directory

Date: 2014-04-01 20:47:41

Tags: bash hadoop

I ran into this problem when trying to start the namenode after successfully creating it. To me, it looks like the scripts are trying to log to a file that does not exist. How do I change the settings so that the script logs go to the correct directory?

bash-3.2$ start-all.sh
starting namenode, logging to /usr/local/bin/../logs/hadoop-Yili-namenode-wifi169-116.bucknell.edu.out
nice: /usr/local/bin/../bin/hadoop: No such file or directory
localhost: starting datanode, logging to /usr/local/bin/../logs/hadoop-Yili-datanode-wifi169-116.bucknell.edu.out
localhost: nice: /usr/local/bin/../bin/hadoop: No such file or directory
localhost: starting secondarynamenode, logging to /usr/local/bin/../logs/hadoop-Yili-secondarynamenode-wifi169-116.bucknell.edu.out
localhost: nice: /usr/local/bin/../bin/hadoop: No such file or directory
starting jobtracker, logging to /usr/local/bin/../logs/hadoop-Yili-jobtracker-wifi169-116.bucknell.edu.out
nice: /usr/local/bin/../bin/hadoop: No such file or directory
localhost: starting tasktracker, logging to /usr/local/bin/../logs/hadoop-Yili-tasktracker-wifi169-116.bucknell.edu.out
localhost: nice: /usr/local/bin/../bin/hadoop: No such file or directory

1 Answer:

Answer 0 (score: 1):

Try running which hadoop. If this command gives you output, then your HADOOP_HOME is already set in your .bashrc file.
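
As a quick illustration, the check might look like this (the paths below are only an example, assuming hadoop were installed under /opt/hadoop; your output will differ):

bash-3.2$ which hadoop
/opt/hadoop/bin/hadoop
bash-3.2$ echo $HADOOP_HOME
/opt/hadoop

If which hadoop prints nothing, the hadoop binary is not on your PATH, which is consistent with the errors above, where the start scripts cannot find bin/hadoop.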

If it is not set, edit the .bashrc file in your home directory and add the following statements, assuming you installed hadoop in /opt/hadoop. It may be in a different location on your system.

HADOOP_HOME=/opt/hadoop
export HADOOP_HOME
PATH=$PATH:$HADOOP_HOME/bin
export PATH
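
After saving .bashrc, reload it in your current shell and re-run the start script, for example (the output path shown is illustrative):

bash-3.2$ source ~/.bashrc
bash-3.2$ which hadoop
/opt/hadoop/bin/hadoop
bash-3.2$ start-all.sh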

This should help you.