XML processing failure in Hadoop

Posted: 2013-03-26 08:47:00

Tags: xml hadoop mapreduce

I really enjoy this area. Following http://java.dzone.com/articles/hadoop-practice, I wrote https://github.com/studhadoop/xmlparsing-hadoop/blob/master/XmlParser11.java.

I created the jar file and then ran the MapReduce program. My XML input file is:

<configuration>
    <property>
            <name>dfs.replication</name>
            <value>1</value>
            <type>tr</type>
    </property>
</configuration>

root# javac -classpath /var/root/hadoop-1.0.4/hadoop-core-1.0.4.jar -d xml11 XmlParser11.java

root# jar -cvf /var/root/xmlparser11/xmlparser.jar -C xml11/ .

root# bin/hadoop jar /var/root/xmlparser11/xmlparser.jar com.org.XmlParser11 /user/root/xmlfiles/conf.xml /user/root/xmlfiles-outputjava3

UPDATE

13/03/30 09:39:58 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/03/30 09:39:58 INFO input.FileInputFormat: Total input paths to process : 1
13/03/30 09:39:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/03/30 09:39:58 WARN snappy.LoadSnappy: Snappy native library not loaded
13/03/30 09:39:58 INFO mapred.JobClient: Running job: job_201303300855_0004
13/03/30 09:39:59 INFO mapred.JobClient:  map 0% reduce 0%
13/03/30 09:40:13 INFO mapred.JobClient: Task Id : attempt_201303300855_0004_m_000000_0, Status : FAILED
java.io.IOException: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved java.lang.String
    at com.org.XmlParser11$Map.map(XmlParser11.java:186)
    at com.org.XmlParser11$Map.map(XmlParser11.java:148)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved java.lang.String
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1014)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:691)
    at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
    at com.org.XmlParser11$Map.map(XmlParser11.java:184)
    ... 9 more

attempt_201303300855_0004_m_000000_0: '<property>
attempt_201303300855_0004_m_000000_0:             <name>dfs.replication</name>
attempt_201303300855_0004_m_000000_0:                 <value>1</value>
attempt_201303300855_0004_m_000000_0:             </property>'
13/03/30 09:40:19 INFO mapred.JobClient: Task Id : attempt_201303300855_0004_m_000000_1, Status : FAILED
java.io.IOException: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved java.lang.String
    at com.org.XmlParser11$Map.map(XmlParser11.java:186)
    at com.org.XmlParser11$Map.map(XmlParser11.java:148)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved java.lang.String
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1014)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:691)
    at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
    at com.org.XmlParser11$Map.map(XmlParser11.java:184)
    ... 9 more

attempt_201303300855_0004_m_000000_1: '<property>
attempt_201303300855_0004_m_000000_1:             <name>dfs.replication</name>
attempt_201303300855_0004_m_000000_1:                 <value>1</value>
attempt_201303300855_0004_m_000000_1:             </property>'
13/03/30 09:40:25 INFO mapred.JobClient: Task Id : attempt_201303300855_0004_m_000000_2, Status : FAILED
java.io.IOException: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved java.lang.String
    at com.org.XmlParser11$Map.map(XmlParser11.java:186)
    at com.org.XmlParser11$Map.map(XmlParser11.java:148)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, recieved java.lang.String
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1014)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:691)
    at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
    at com.org.XmlParser11$Map.map(XmlParser11.java:184)
    ... 9 more

attempt_201303300855_0004_m_000000_2: '<property>
attempt_201303300855_0004_m_000000_2:             <name>dfs.replication</name>
attempt_201303300855_0004_m_000000_2:                 <value>1</value>
attempt_201303300855_0004_m_000000_2:             </property>'
13/03/30 09:40:37 INFO mapred.JobClient: Job complete: job_201303300855_0004
13/03/30 09:40:37 INFO mapred.JobClient: Counters: 7
13/03/30 09:40:37 INFO mapred.JobClient:   Job Counters 
13/03/30 09:40:37 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=27296
13/03/30 09:40:37 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/03/30 09:40:37 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/03/30 09:40:37 INFO mapred.JobClient:     Launched map tasks=4
13/03/30 09:40:37 INFO mapred.JobClient:     Data-local map tasks=4
13/03/30 09:40:37 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/03/30 09:40:37 INFO mapred.JobClient:     Failed map tasks=1

What is wrong with the MapReduce code, and how can I fix it?

2 Answers:

Answer 0 (score: 0)

Run the following command and check

root# bin/hadoop fs -cat /user/root/xmlfiles-outputjava3/part-r-00000

whether you get the desired output. What you have posted above is only the standard console output that MapReduce prints while running; the job's results are written to HDFS.

UPDATE

You need to add System.out.println statements

if (currentElement.equalsIgnoreCase("name")) {
    propertyName += reader.getText();      // accumulate the text of the <name> element
    System.out.println(propertyName);      // debug: print the captured property name
} else if (currentElement.equalsIgnoreCase("value")) {
    propertyValue += reader.getText();     // accumulate the text of the <value> element
    System.out.println(propertyValue);     // debug: print the captured property value
}

so that you can see whether the property name and value are actually being set. If they are not, you need to find out why.

UPDATE 2

context.write(propertyName.trim(), propertyValue.trim());

propertyName and propertyValue are Strings, but you have declared the Mapper to output Text for both the key and the value.

Change it like this:

Text nameText = new Text();
Text valueText = new Text();
nameText.set(propertyName.trim());    // Text.set(String) copies the string into the writable
valueText.set(propertyValue.trim());
context.write(nameText, valueText);
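
Equivalently, since org.apache.hadoop.io.Text also has a String constructor, the two writables can be created inline (a minimal sketch; the rest of XmlParser11.java is assumed unchanged):

// Text(String) builds the writable directly from the trimmed string
context.write(new Text(propertyName.trim()), new Text(propertyValue.trim()));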

Answer 1 (score: 0)

Please use a SAX parser or Hadoop Streaming to parse the XML. The following links may also be useful for reference (a small SAX sketch is shown after the links):

http://xmlandhadoop.blogspot.in/ http://www.undercloud.org/?p=408
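
For illustration only, here is a minimal, self-contained sketch of extracting the name and value from one <property> record with the JDK's built-in SAX parser; the class and method names are hypothetical and not taken from XmlParser11.java:

import java.io.StringReader;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

// Hypothetical handler that collects the <name> and <value> text of a single <property> record.
public class PropertyHandler extends DefaultHandler {
    private final StringBuilder text = new StringBuilder();
    public String name = "";
    public String value = "";

    @Override
    public void startElement(String uri, String localName, String qName, Attributes attrs) {
        text.setLength(0);                  // reset the character buffer for each new element
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        text.append(ch, start, length);     // accumulate the element's text content
    }

    @Override
    public void endElement(String uri, String localName, String qName) {
        if ("name".equalsIgnoreCase(qName)) {
            name = text.toString().trim();
        } else if ("value".equalsIgnoreCase(qName)) {
            value = text.toString().trim();
        }
    }

    // Parse one XML record, e.g. the chunk a mapper receives as its value.
    public static PropertyHandler parse(String xmlRecord) throws Exception {
        PropertyHandler handler = new PropertyHandler();
        SAXParserFactory.newInstance().newSAXParser()
                .parse(new InputSource(new StringReader(xmlRecord)), handler);
        return handler;
    }
}

Inside map(), the extracted strings would then be wrapped in Text before calling context.write, exactly as in the fix above.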

Regards, Sudhakar Reddy
