How do I update the schema of a Hive ORC table?

Date: 2017-05-25 15:55:46

Tags: hadoop hive apache-spark-sql pyspark-sql orc

I created a Hive external table stored in ORC format.

create external table test
(first_name string,
last_name string)
partitioned by (year int, month int)
stored as orc location '/usr/tmp/orc_files';

I then inserted some data into that location. When I ran a select query on this table, I got the correct results.
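A minimal sketch of what inserting that data might have looked like (the post does not show the actual statement; the partition values 2017/5 and the sample row are hypothetical, and INSERT ... VALUES requires Hive 0.14+):

```sql
-- Hypothetical load of one row into a single partition of the external
-- table; the real post may instead have copied ORC files into the location.
INSERT INTO TABLE test PARTITION (year = 2017, month = 5)
VALUES ('John', 'Doe');
```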

Next, I added a new column to the table with:

alter table test add columns(middle_name string);
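For context, a partitioned Hive table records a schema per partition in addition to the table-level schema, and plain ADD COLUMNS only updates the latter. A hedged sketch of inspecting the result and, on Hive 1.1+, propagating the change to existing partitions (the CASCADE clause is a real Hive option, but applying it here is a suggestion, not something shown in the post):

```sql
-- Show the table-level schema; middle_name should now appear after last_name.
DESCRIBE test;

-- On Hive 1.1+, CASCADE also updates the metadata of existing partitions,
-- which plain ADD COLUMNS leaves unchanged.
ALTER TABLE test ADD COLUMNS (middle_name string) CASCADE;
```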

Now, when I try

select * from test;

I get the following error:

17/05/25 11:52:23 INFO ParseDriver: Parsing command: select * from test
17/05/25 11:52:23 INFO ParseDriver: Parse Completed
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/sql/context.py", line 580, in sql
    return DataFrame(self._ssql_ctx.sql(sqlQuery), self)
  File "/usr/hdp/2.5.0.0-1245/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 813, in __call__
  File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/sql/utils.py", line 45, in deco
    return f(*a, **kw)
  File "/usr/hdp/2.5.0.0-1245/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o55.sql.
: java.lang.AssertionError: assertion failed
        at scala.Predef$.assert(Predef.scala:165)
        at org.apache.spark.sql.execution.datasources.LogicalRelation$$anonfun$1.apply(LogicalRelation.scala:39)

Can someone help me? Why can't I change the schema of an ORC table?

0 Answers:

No answers