Error: pyspark.sql.utils.AnalysisException: u"unresolved operator 'Project ['coalesce(scalar-subquery#2375 [], 0)

Asked: 2016-11-02 06:15:17

Tags: subquery apache-spark-sql spark-dataframe pyspark-sql

I am running a query with Spark SQL in PySpark 2.0; the code looks like this:

spark.sql("select coalesce((select f2.ChargeAmt from Fact_CMCharges where f2.BldgID=f.BldgID limit 1),0)as chargval from Fact_CMCharges f join CMRECC l on l.BLDGID=f.BldgID").show()

After running it, Spark throws this exception:

pyspark.sql.utils.AnalysisException: u"unresolved operator 'Project ['coalesce(scalar-subquery#2375 [], 0) AS chargval#2376];"

How can I solve this problem? Thanks in advance. 格利扬
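
For context, this kind of "unresolved operator 'Project" message generally means the analyzer could not handle the correlated scalar subquery in that form; Spark 2.0 places tight restrictions on correlated scalar subqueries (they are expected to return a single aggregated value rather than rely on LIMIT). A minimal sketch of one possible rewrite, assuming the same table and column names from the question and that MAX() is an acceptable stand-in for picking a single ChargeAmt per building, is to pre-aggregate and LEFT JOIN instead:

from pyspark.sql import SparkSession

# Assumes the Fact_CMCharges and CMRECC tables from the question are already
# registered as tables or temp views.
spark = SparkSession.builder.appName("chargval-sketch").getOrCreate()

# Pre-aggregate one ChargeAmt per BldgID and LEFT JOIN it, instead of using a
# correlated scalar subquery with LIMIT. MAX() is an arbitrary stand-in for
# "limit 1", which does not specify which row gets returned anyway.
spark.sql("""
    SELECT COALESCE(c.ChargeAmt, 0) AS chargval
    FROM Fact_CMCharges f
    JOIN CMRECC l
      ON l.BLDGID = f.BldgID
    LEFT JOIN (
        SELECT BldgID, MAX(ChargeAmt) AS ChargeAmt
        FROM Fact_CMCharges
        GROUP BY BldgID
    ) c
      ON c.BldgID = f.BldgID
""").show()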

0 Answers:

No answers yet.