Unable to write files to HDFS

Date: 2015-08-03 07:25:14

Tags: hadoop hdfs

I have installed hadoop-2.6.0 and verified that all the Hadoop daemons are running. I can create or copy directories in HDFS, but I cannot copy a file into HDFS.

Command:

bin/hadoop fs -copyFromLocal /home/130853/Hadoop_Data/abc /trial/abc

It throws the following exception:

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(II[BI[BIILjava/lang/String;JZ)V
    at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(Native Method)
    at org.apache.hadoop.util.NativeCrc32.calculateChunkedSumsByteArray(NativeCrc32.java:86)
    at org.apache.hadoop.util.DataChecksum.calculateChunkedSums(DataChecksum.java:430)
    at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:202)
    at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:163)
    at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:144)
    at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:2217)
    at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
    at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
    at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:54)
    at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:112)
    at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:466)
    at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:391)
    at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:328)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:263)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:248)
    at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:306)
    at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:278)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:243)
    at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:260)
    at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:244)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:220)
    at org.apache.hadoop.fs.shell.CopyCommands$Put.processArguments(CopyCommands.java:267)
    at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:190)
    at org.apache.hadoop.fs.shell.Command.run(Command.java:154)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)

Can anyone help?

1 answer:

Answer 0 (score: 0)

Which Linux system are you on, 64-bit or 32-bit?
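If you are not sure, uname -m prints the machine architecture:

    # x86_64 means 64-bit; i686/i386 means 32-bit
    uname -m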

If it is 64-bit, I suggest recompiling Hadoop from source so that the native libraries are built for your platform (a build sketch follows after these two cases).

If it is 32-bit, I suggest switching to a 64-bit system.
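For the recompile route, here is a minimal sketch of the standard native build, assuming a 64-bit Linux machine with a JDK, Maven, protobuf 2.5.0, cmake, and the zlib/openssl development headers already installed (the paths and version below are illustrative, adjust them to your setup):

    # Unpack the Hadoop source release and build with the native profile
    tar -xzf hadoop-2.6.0-src.tar.gz
    cd hadoop-2.6.0-src
    mvn package -Pdist,native -DskipTests -Dtar

    # Replace the bundled native libs with the freshly built 64-bit ones
    cp -r hadoop-dist/target/hadoop-2.6.0/lib/native/* $HADOOP_HOME/lib/native/

The -Pnative profile is what triggers compilation of libhadoop.so for the local architecture.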

Try running the command:

hadoop checknative -a

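On a healthy install, checknative should report hadoop: true along with a path to libhadoop.so; a column of false entries means the JVM is not finding the native libraries at all. In that case it is worth checking that the library path is actually exported, for example (paths are illustrative):

    # Point Hadoop and the JVM at the native library directory
    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
    export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"

    # Confirm the shared object matches your architecture (expect "64-bit" on x86_64)
    file $HADOOP_HOME/lib/native/libhadoop.so.1.0.0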