Delete from a Hive table using Spark

Time: 2016-01-19 18:01:22

Tags: apache-spark hive

I am using Hive 1.2.1 and Spark 1.6, and the problem is that I cannot run a simple delete statement against a Hive table from the spark shell. Since Hive has supported ACID since 0.14, I expected it to work from Spark as well.
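For reference, the hiveContext used in the session below is not shown being created; in a Spark 1.6 spark-shell it would typically be set up along these lines (a sketch, assuming spark-shell provides sc and that hive-site.xml is on the classpath):

import org.apache.spark.sql.hive.HiveContext

// spark-shell already defines sc; HiveContext reads hive-site.xml from the
// classpath and talks to the same metastore as the Hive CLI.
val hiveContext = new HiveContext(sc)

// Reads go through Spark SQL's HiveQL parser and work fine...
hiveContext.sql("select * from testdb.test").show()

// ...but DELETE has no parse rule there, which produces the error below.
hiveContext.sql("delete from testdb.test where id=2")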

 16/01/19 12:44:24 INFO hive.metastore: Connected to metastore.


 scala> hiveContext.sql("delete from testdb.test where id=2");


 16/01/19 12:44:51 INFO parse.ParseDriver: Parsing command: delete from testdb.test where id=2
 16/01/19 12:44:52 INFO parse.ParseDriver: Parse Completed

 org.apache.spark.sql.AnalysisException:
 Unsupported language features in query: delete from testdb.test where id=2
 TOK_DELETE_FROM 1, 0,12, 12
   TOK_TABNAME 1, 4,6, 12
    testdb 1, 4,4, 12
     test 1, 6,6, 19
     ......

 scala.NotImplementedError: No parse rules for TOK_DELETE_FROM:
 TOK_DELETE_FROM 1, 0,12, 12
 TOK_TABNAME 1, 4,6, 12
  testdb 1, 4,4, 12
  ......

1 Answer:

Answer 0 (score: 1)

You can run Hive through the command line from inside Scala and execute the delete there.

import scala.sys.process._
val cmd = "hive -e \"delete from testdb.test where id=2\"" // Your command
val output = cmd.!! // Captures the output
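
Note that cmd.!! throws an exception when hive exits with a non-zero status, and the delete will still fail inside Hive unless testdb.test is an ACID table (bucketed, stored as ORC, with 'transactional'='true'), per the Hive 0.14+ requirements the question mentions. A slightly more defensive variant using the same scala.sys.process API is sketched below; the ProcessLogger handling is an illustration, not part of the original answer:

import scala.sys.process._

// Pass the command as a Seq so the HiveQL string is a single argument.
val delete = Seq("hive", "-e", "delete from testdb.test where id=2")

val out = new StringBuilder
val err = new StringBuilder

// `!` returns the exit code instead of throwing on failure;
// ProcessLogger collects stdout and stderr line by line.
val exitCode = delete ! ProcessLogger(line => out.append(line).append('\n'),
                                      line => err.append(line).append('\n'))

if (exitCode != 0) {
  println(s"hive -e failed with exit code $exitCode:\n$err")
} else {
  println(out.toString)
}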

See also: Execute external command