Hadoop MapReduce error: Mkdirs failed to create file; job failed

Date: 2016-03-12 14:13:04

Tags: java eclipse hadoop mapreduce

I am trying to run the C4.5 algorithm on Hadoop, but I have run into a problem and am stuck on the following error. I have all the necessary permissions. Can anyone help me?

java.lang.Exception: java.io.IOException: Mkdirs failed to create file:/usr/local/hadoop/1/output10/_temporary/0/_temporary/attempt_local960306821_0001_r_000000_0 (exists=false, cwd=file:/home/brina/workspace/C4.5Hadoop)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.io.IOException: Mkdirs failed to create file:/usr/local/hadoop/1/output10/_temporary/0/_temporary/attempt_local960306821_0001_r_000000_0 (exists=false, cwd=file:/home/brina/workspace/C4.5Hadoop)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:442)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:428)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:908)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:801)
    at org.apache.hadoop.mapred.TextOutputFormat.getRecordWriter(TextOutputFormat.java:123)
    at org.apache.hadoop.mapred.ReduceTask$OldTrackingRecordWriter.<init>(ReduceTask.java:484)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:414)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
2016-03-12 19:08:04,332 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1386)) - Job job_local960306821_0001 failed with state FAILED due to: NA
2016-03-12 19:08:04,492 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1391)) - Counters: 33
    File System Counters
        FILE: Number of bytes read=523
        FILE: Number of bytes written=249822
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
    Map-Reduce Framework
        Map input records=14
        Map output records=56
        Map output bytes=863
        Map output materialized bytes=981
        Input split bytes=93
        Combine input records=0
        Combine output records=0
        Reduce input groups=0
        Reduce shuffle bytes=981
        Reduce input records=0
        Reduce output records=0
        Spilled Records=56
        Shuffled Maps =1
        Failed Shuffles=0
        Merged Map outputs=1
        GC time elapsed (ms)=0
        CPU time spent (ms)=0
        Physical memory (bytes) snapshot=0
        Virtual memory (bytes) snapshot=0
        Total committed heap usage (bytes)=188743680
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters 
        Bytes Read=374
    File Output Format Counters 
        Bytes Written=0



Exception in thread "main" java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:836)
    at C45.run(C45.java:192)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at C45.main(C45.java:53)

2 Answers:

Answer 0 (score: 1):

(Copied from the comments, in case anyone else runs into this issue)

Based on the log line

Mkdirs failed to create file:/usr/local/hadoop/1/output10/_temporary/0/_temporary/attempt_local960306821_0001_r_000000_0 (exists=false, cwd=file:/home/brina/workspace/C4.5Hadoop)

the problem is not in HDFS but in the local file system. You therefore need to adjust the write permissions on your node.
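
For context: when a job runs under LocalJobRunner (the usual case when it is launched directly from Eclipse), unqualified paths are resolved against the local file system, which is why the error shows file:/usr/local/hadoop/1/output10. Below is a minimal, hypothetical driver sketch, not the asker's actual C45.java, showing how the output path could be qualified with an hdfs:// URI so the job writes to HDFS instead of the local disk. The localhost:9000 NameNode address and the /user/brina paths are assumptions for illustration only.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobConf;

    // Hypothetical driver configuration (not the asker's code).
    // With no fs.defaultFS and unqualified paths, LocalJobRunner resolves
    // the output against file:/, so the reducer tries to mkdir on local disk.
    public class C45DriverSketch {
        public static JobConf buildConf() {
            Configuration base = new Configuration();
            base.set("fs.defaultFS", "hdfs://localhost:9000"); // assumed NameNode address
            JobConf conf = new JobConf(base, C45DriverSketch.class);
            // Fully qualified URIs make the target file system explicit.
            FileInputFormat.setInputPaths(conf, new Path("hdfs://localhost:9000/user/brina/input"));
            FileOutputFormat.setOutputPath(conf, new Path("hdfs://localhost:9000/user/brina/output10"));
            return conf;
        }
    }

If the output really is meant to stay on the local file system, then adjusting the write permissions on /usr/local/hadoop/1, as this answer says, is the fix.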

Answer 1 (score: 0):

I also ran into this problem and granted permissions as follows:

$ sudo chown -R user:group /usr

After that, the job could create the files.
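
Note that chown -R user:group /usr changes ownership of everything under /usr; limiting it to the directory from the error message (here /usr/local/hadoop/1) is usually safer. As a quick sanity check before re-running the job, a hypothetical helper like the one below (not part of the answer) can confirm that the directory is writable by the current user:

    import java.io.File;

    // Hypothetical pre-flight check: confirm the local directory from the
    // error message exists and is writable before submitting the job.
    public class OutputDirCheck {
        public static void main(String[] args) {
            File parent = new File("/usr/local/hadoop/1"); // path taken from the stack trace
            System.out.println(parent + ": exists=" + parent.exists()
                    + ", writable=" + parent.canWrite());
        }
    }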
