Table or view not found when using registerTempTable

Asked: 2017-01-10 12:58:09

Tags: apache-spark pyspark spark-dataframe pyspark-sql

So I am running the following in the pyspark shell:

>>> data = spark.read.csv("annotations_000", header=False, mode="DROPMALFORMED", schema=schema)
>>> data.show(3)
+----------+--------------------+--------------------+---------+---------+--------+-----------------+
|   item_id|           review_id|                text|   aspect|sentiment|comments| annotation_round|
+----------+--------------------+--------------------+---------+---------+--------+-----------------+
|9999900031|9999900031/custom...|Just came back to...|breakfast|        3|    null|ASE_OpeNER_round2|
|9999900031|9999900031/custom...|Just came back to...|    staff|        3|    null|ASE_OpeNER_round2|
|9999900031|9999900031/custom...|The hotel was loc...|    noise|        2|    null|ASE_OpeNER_round2|
+----------+--------------------+--------------------+---------+---------+--------+-----------------+
>>> data.registerTempTable("temp")
>>> df = sqlContext.sql("select first(item_id), review_id, first(text), concat_ws(';', collect_list(aspect)) as aspect from temp group by review_id")
>>> df.show(3)
+---------------------+--------------------+--------------------+--------------------+
|first(item_id, false)|           review_id|  first(text, false)|              aspect|
+---------------------+--------------------+--------------------+--------------------+
|               100012|100012/tripadviso...|We stayed here la...|          staff;room| 
|               100013|100013/tripadviso...|We stayed for two...|           breakfast|
|               100031|100031/tripadviso...|We stayed two nig...|noise;breakfast;room|
+---------------------+--------------------+--------------------+--------------------+

and it works perfectly with the shell's sqlContext variable.
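(For reference, the schema variable used in the read above is never shown in the question; a hypothetical definition matching the displayed columns, with types guessed from the output, might look like:)

from pyspark.sql.types import StructType, StructField, StringType, IntegerType

# Hypothetical schema reconstructed from the columns shown above;
# the original definition is not part of the question.
schema = StructType([
    StructField("item_id", StringType()),
    StructField("review_id", StringType()),
    StructField("text", StringType()),
    StructField("aspect", StringType()),
    StructField("sentiment", IntegerType()),
    StructField("comments", StringType()),
    StructField("annotation_round", StringType()),
])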

But when I write it as a script:

from pyspark import SparkContext
from pyspark.sql import SparkSession, SQLContext

sc = SparkContext(appName="AspectDetector")
spark = SparkSession(sc)
sqlContext = SQLContext(sc)
# same read as in the shell session above
data = spark.read.csv("annotations_000", header=False, mode="DROPMALFORMED", schema=schema)
data.registerTempTable("temp")
df = sqlContext.sql("select first(item_id), review_id, first(text), concat_ws(';', collect_list(aspect)) as aspect from temp group by review_id")

and run it, I get the following:

pyspark.sql.utils.AnalysisException: u'Table or view not found: temp; line 1 pos 99'

How is that possible? Am I doing something wrong in how I instantiate the sqlContext?

1 Answer:

Answer 0 (score: 4):

First, you need to initialize Spark with Hive support, for example:

spark = SparkSession.builder \
    .master("yarn") \
    .appName("AspectDetector") \
    .enableHiveSupport() \
    .getOrCreate()

# Build the SQLContext on top of the same session, so both see the same
# temporary tables (SQLContext takes the SparkContext as its first argument).
sqlContext = SQLContext(spark.sparkContext, spark)
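Putting it together, a minimal sketch of the full script under this approach (reusing the read call from the question; the schema definition is assumed to exist as above) might be:

from pyspark.sql import SparkSession, SQLContext

spark = SparkSession.builder \
    .master("yarn") \
    .appName("AspectDetector") \
    .enableHiveSupport() \
    .getOrCreate()
sqlContext = SQLContext(spark.sparkContext, spark)

# Same read as in the shell session; schema is assumed to be defined above.
data = spark.read.csv("annotations_000", header=False,
                      mode="DROPMALFORMED", schema=schema)

# Register the DataFrame as a temp table in this session...
data.registerTempTable("temp")

# ...and query it through the SQLContext bound to that same session.
df = sqlContext.sql("select first(item_id), review_id, first(text), "
                    "concat_ws(';', collect_list(aspect)) as aspect "
                    "from temp group by review_id")
df.show(3)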

However, you then need to use sqlContext.sql() to run your queries, not spark.sql().

I found this confusing too, but I think it is because when you do data.registerTempTable("temp"), the table is registered in the spark context rather than in the sqlContext. If you want to query a Hive table, you should still use sqlContext.sql().
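As a side note, registerTempTable has been deprecated since Spark 2.0 in favor of createOrReplaceTempView; a minimal sketch of the equivalent call:

# registerTempTable was deprecated in Spark 2.0; the equivalent call is:
data.createOrReplaceTempView("temp")

# A temp view lives in the catalog of the session that created it,
# so querying through that same session will always find it.
df = spark.sql("select * from temp limit 3")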
