java.lang.ClassCastException: java.lang.String cannot be cast to scala.collection.Seq

Asked: 2019-10-12 14:25:05

Tags: scala apache-spark

I am doing something like this:

val domainList = data1.select("columnname", "domainvalues")
  .where(col("domainvalues").isNotNull)
  .map(r => (r.getString(0), r.getList[String](1).asScala.toList))
  .collect()

The type of domainList should be Array[(String, List[String])].

For the input DataFrame:

+-------------+----------------------------------------+
|columnname   |domainvalues                            |
+-------------+----------------------------------------+
|predchurnrisk|Very High,High,Medium,Low               |
|userstatus   |Active,Lapsed,Renew                     |
|predinmarket |Very High,High,Medium,Low               |
|predsegmentid|High flyers,Watching Pennies,Big pockets|
|usergender   |Male,Female,Others                      |
+-------------+----------------------------------------+
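
For reference, a minimal sketch of how a DataFrame of this shape could be reproduced (assuming an existing SparkSession named spark; the comma-separated string layout of domainvalues is an assumption based on the display above):

import spark.implicits._

// domainvalues is assumed to be a plain comma-separated String column
val data1 = Seq(
  ("predchurnrisk", "Very High,High,Medium,Low"),
  ("userstatus",    "Active,Lapsed,Renew"),
  ("predinmarket",  "Very High,High,Medium,Low"),
  ("predsegmentid", "High flyers,Watching Pennies,Big pockets"),
  ("usergender",    "Male,Female,Others")
).toDF("columnname", "domainvalues")

data1.printSchema()
// root
//  |-- columnname: string (nullable = true)
//  |-- domainvalues: string (nullable = true)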

The error I am getting is:

java.lang.ClassCastException: java.lang.String cannot be cast to scala.collection.Seq
    at org.apache.spark.sql.Row$class.getSeq(Row.scala:283)
    at org.apache.spark.sql.catalyst.expressions.GenericRow.getSeq(rows.scala:166)
    at org.apache.spark.sql.Row$class.getList(Row.scala:291)
    at org.apache.spark.sql.catalyst.expressions.GenericRow.getList(rows.scala:166)
    at com.fis.sdi.ade.batch.SFTP.Test$$anonfun$6.apply(Test.scala:53)
    at com.fis.sdi.ade.batch.SFTP.Test$$anonfun$6.apply(Test.scala:53)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.mapelements_doConsume_0$(Unknown Source)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.deserializetoobject_doConsume_0$(Unknown Source)

How should I fix this?

1 Answer:

Answer 0 (score: 0)

It looks like your second column holds plain string values, which you can verify with df.printSchema(). In that case, you can split on the comma to get the list:
.map(r => (r.getString(0), r.getString(1).split(",").toList)).collect()
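
Putting it together, a sketch assuming the same data1 DataFrame and an existing SparkSession named spark (its implicits provide the Encoder that map needs):

import org.apache.spark.sql.functions.col
import spark.implicits._  // Encoder for (String, List[String])

val domainList: Array[(String, List[String])] =
  data1.select("columnname", "domainvalues")
    .where(col("domainvalues").isNotNull)
    .map(r => (r.getString(0), r.getString(1).split(",").toList))
    .collect()

split returns an Array[String], so .toList converts each row's value into the List[String] you expect in domainList.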