Hadoop DataNode not starting

Date: 2014-02-19 18:15:59

Tags: hadoop

I am using Hadoop version 1.2.1 on a single node. When I try to start all the daemons on Linux with bin/start-all.sh, the DataNode fails to start. The DataNode's log file shows:

2014-02-19 12:27:41,085 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = HH-Xeon-2/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.2.1
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152; compiled by 'mattf' on Mon Jul 22 15:23:09 PDT 2013
STARTUP_MSG:   java = 1.7.0_25
************************************************************/
2014-02-19 12:27:41,280 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2014-02-19 12:27:41,294 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source MetricsSystem,sub=Stats registered.
2014-02-19 12:27:41,296 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2014-02-19 12:27:41,296 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2014-02-19 12:27:41,519 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi registered.
2014-02-19 12:27:41,524 WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already exists!
2014-02-19 12:27:46,472 INFO org.apache.hadoop.hdfs.server.common.Storage: Cannot access storage directory /app/hadoop/tmp2
2014-02-19 12:27:46,477 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory /app/hadoop/tmp2 does not exist
2014-02-19 12:27:46,582 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: All specified directories are not accessible or do not exist.
        at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:139)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:414)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:321)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1712)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1651)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1669)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1795)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1812)

2014-02-19 12:27:46,583 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
"

The directory does exist and its permissions are set with chmod 755. With 777 or 775, the DataNode log still reports an error saying the permissions are not correct. Can anyone help me solve this problem? Thanks.
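For context, below is a rough sketch of the checks this error usually points to. It assumes dfs.data.dir in conf/hdfs-site.xml is set to /app/hadoop/tmp2 (the path from the log), and that the daemons run as a dedicated Hadoop user; the user name hduser and group hadoop are assumptions and may differ on your setup.

# confirm which path the DataNode is configured to use (dfs.data.dir in Hadoop 1.x)
grep -A 1 dfs.data.dir conf/hdfs-site.xml

# check that the directory exists and is owned by the user that starts the daemons
ls -ld /app/hadoop/tmp2

# if ownership is wrong, create the directory and hand it to the Hadoop user
# (hduser:hadoop is an assumption; use whatever account runs start-all.sh)
sudo mkdir -p /app/hadoop/tmp2
sudo chown -R hduser:hadoop /app/hadoop/tmp2
sudo chmod 755 /app/hadoop/tmp2

# then try starting just the DataNode again and re-check its log
bin/hadoop-daemon.sh start datanode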

0 Answers:

No answers