Running Hadoop on a multi-node cluster does not work

Date: 2013-01-10 17:16:40

Tags: java hadoop

I installed HDFS and had it working across 3 machines. I then tried to add 5 more PCs to the existing cluster, but when I start Hadoop on the master node afterwards I get the errors shown below.

[hduser@dellnode1 ~]$ start-all.sh
starting namenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-namenode-dellnode1.pictlibrary.out
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /usr/local/hadoop/bin/../logs/hadoop-hduser-namenode-dellnode1.pictlibrary.log (Permission denied)
    at java.io.FileOutputStream.openAppend(Native Method)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:207)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:131)
    at org.apache.log4j.FileAppender.setFile(FileAppender.java:290)
    at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:164)
    at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:216)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:257)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:133)
dellnode3.pictlibrary: datanode running as process 4856. Stop it first.
dellnode1.pictlibrary: starting datanode, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-datanode-dellnode1.pictlibrary.out
dellnode2.pictlibrary: starting datanode, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-datanode-dellnode2.pictlibrary.out
dellnode1.pictlibrary: starting secondarynamenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-secondarynamenode-dellnode1.pictlibrary.out
starting jobtracker, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-jobtracker-dellnode1.pictlibrary.out
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /usr/local/hadoop/bin/../logs/hadoop-hduser-jobtracker-dellnode1.pictlibrary.log (Permission denied)
    at java.io.FileOutputStream.openAppend(Native Method)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:207)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:131)
    at org.apache.log4j.FileAppender.setFile(FileAppender.java:290)
    at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:164)
    at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:216)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:257)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:133)
dellnode3.pictlibrary: starting tasktracker, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-tasktracker-dellnode3.pictlibrary.out
dellnode1.pictlibrary: starting tasktracker, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-tasktracker-dellnode1.pictlibrary.out
dellnode2.pictlibrary: starting tasktracker, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-tasktracker-dellnode2.pictlibrary.out

All of the machines are running Fedora 17.

1 Answer:

Answer 0 (score: 1)

I would manually create the log file with something like

sudo touch /usr/local/hadoop/logs/hadoop-hduser-namenode-dellnode1.pictlibrary.log

since the path in your error,

/usr/local/hadoop/bin/../logs/hadoop-hduser-namenode-dellnode1.pictlibrary.log

resolves to /usr/local/hadoop/logs/ once the bin/../ part is normalized.
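
If you want to double-check where the bin/../ path really points before creating anything, you can let the shell canonicalize it. This is just a sanity check; the output shown is what I would expect on a standard layout, not copied from your cluster:

readlink -f /usr/local/hadoop/bin/../logs/hadoop-hduser-namenode-dellnode1.pictlibrary.log
# should print /usr/local/hadoop/logs/hadoop-hduser-namenode-dellnode1.pictlibrary.log
# (readlink -f is part of coreutils on Fedora 17)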

Then make sure the file is owned and writable by the hduser account that runs start-all.sh:

sudo chown hduser /usr/local/hadoop/logs/hadoop-hduser-namenode-dellnode1.pictlibrary.log
sudo chmod 750 /usr/local/hadoop/logs/hadoop-hduser-namenode-dellnode1.pictlibrary.log

Then try again. It should work this time ;-)
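
Since the jobtracker log fails with the same permission error, it may be less work to fix the ownership of the whole logs directory in one go instead of touching each file. A rough sketch, assuming hduser is the account that should own everything under /usr/local/hadoop/logs:

# as root (or via sudo): hand the resolved logs directory to hduser so
# every daemon (namenode, jobtracker, datanodes, ...) can create its .log/.out files
sudo chown -R hduser /usr/local/hadoop/logs
sudo chmod -R u+rwX /usr/local/hadoop/logs

# back as hduser: the datanode on dellnode3 is already running, so stop
# everything before starting the cluster again
stop-all.sh
start-all.sh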