SparkR-Mongo connector: querying a subdocument

Date: 2016-08-16 08:56:56

Tags: mongodb apache-spark connector

I have run all the examples from the Mongo-Spark connector documentation (https://docs.mongodb.com/spark-connector/sparkR/) without any problem, but when I try a query against a document that contains a subdocument, it fails; apparently Spark SQL is not prepared for this kind of query:

result <- sql(sqlContext, "SELECT DOCUMENT.SUBDOCUMENT FROM TABLE")

ERROR:

com.mongodb.spark.exceptions.MongoTypeConversionException: Cannot cast INT32 into a ConflictType (value: BsonInt32{value=171609012})
        at com.mongodb.spark.sql.MapFunctions$.com$mongodb$spark$sql$MapFunctions$$convertToDataType(MapFunctions.scala:79)
        at com.mongodb.spark.sql.MapFunctions$$anonfun$3.apply(MapFunctions.scala:38)
        at com.mongodb.spark.sql.MapFunctions$$anonfun$3.apply(MapFunctions.scala:36)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
        at com.mongodb.spark.sql.MapFunctions$.documentToRow(MapFunctions.scala:36)
        at com.mongodb.spark.sql.MapFunctions$.castToStructType(MapFunctions.scala:108)
        at com.mongodb.spark.sql.MapFunctions$.com$mongodb$spark$sql$MapFunctions$$convertToDataType(MapFunctions.scala:74)

I had previously registered the table as follows:

registerTempTable(schema, "TABLE")

I guess the key question is how to register the Mongo subdocument as a table.
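For what it's worth, the `ConflictType` in the error usually means the connector's schema inference sampled documents that disagree on a field's type. One possible workaround (an untested sketch, not a confirmed fix) is to pass an explicit schema to `read.df` instead of relying on inference; the names `DOCUMENT`/`SUBDOCUMENT`, the `int` type, and the existing `sqlContext` are assumptions taken from the question:

```r
# Untested sketch: read from MongoDB with an explicit schema rather than
# letting the connector infer one. Schema inference can produce a
# ConflictType when sampled documents have different types for a field.
# Field names and the int type are assumptions for illustration; this
# assumes a SparkR version whose structField accepts complex type strings.
docSchema <- structType(
  structField("DOCUMENT", "struct<SUBDOCUMENT:int>")
)

df <- read.df(sqlContext,
              source = "com.mongodb.spark.sql.DefaultSource",
              schema = docSchema)
registerTempTable(df, "TABLE")

result <- sql(sqlContext, "SELECT DOCUMENT.SUBDOCUMENT FROM TABLE")
```

If the conflicting types are genuine in the data, another avenue (again, unverified here) is the connector's `sampleSize` read option, which controls how many documents schema inference examines.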

Does anyone have a solution?

0 Answers:

No answers yet.