hadoop fs -put: No such file or directory

Date: 2018-02-18 10:49:02

Tags: filesystems hadoop

In Hadoop 3.0, issuing the command on the terminal to copy a file from the local file system to HDFS shows an error.
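The question does not include the exact command; a typical invocation of the kind described might look like this (the file and user names are placeholders, not from the original post):

```shell
# Hypothetical sketch (requires a running Hadoop 3.x cluster; names are placeholders)
hadoop fs -put sample.jpg /user/Amit/sample.jpg
```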

import os

# List everything in the current directory
nameList = os.listdir()

# Keep only the .jpg files
newList = []
for name in nameList:
    if name.endswith(".jpg"):
        newList.append(name)

# Indices (into newList) of the files to rename
toChange = [5, 1, 3]
toChange.sort()

# Rename each selected file to its index, keeping the .jpg extension
for j in toChange:
    oldName = newList[j]
    newName = str(j) + ".jpg"
    os.rename(oldName, newName)
    print(newName)

However, I have checked that the directory hadoop-3.0.0/hadoop2_data/hdfs/datanode exists and has the appropriate access permissions. Both issuing the command and uploading the file from the web browser fail; the errors are shown below.

hadoop-3.0.0/hadoop2_data/hdfs/datanode': No such file or directory: `hdfs://localhost:9000/user/Amit/hadoop-3.0.0/hadoop2_data/hdfs/datanode'

Please help resolve the issue. The upload from the web browser fails with:

"Couldn't find datanode to write file. Forbidden"

Attaching core-site.xml:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
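One possible cause of the error above, offered only as a hedged guess since the question is unanswered: with fs.defaultFS set as in this core-site.xml, a relative destination path given to hadoop fs -put resolves under /user/<username> in HDFS (here /user/Amit), and the copy fails if that directory does not exist. A sketch of the usual check, assuming the hdfs CLI is on the PATH:

```shell
# Hypothetical sketch: create the HDFS home directory before the first put
hdfs dfs -mkdir -p /user/Amit   # create /user/Amit if it is missing
hdfs dfs -ls /user              # verify the directory now exists
```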

0 Answers:

No answers yet.