Invalid cache type exception in Shark

Asked: 2014-05-07 12:23:55

Tags: scala hive hiveql apache-spark shark-sql

I am trying to create a cached table in shark-0.8.0. Following the documentation (https://github.com/amplab/shark/wiki/Shark-User-Guide), I created the table as follows:

CREATE TABLE mydata_cached (
  artist string,
  title string,
  track_id string,
  similars array<array<string>>,
  tags array<array<string>>
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
TBLPROPERTIES('shark.cache' = 'MEMORY');

The table is created, and I can load data into it with the LOAD DATA command. But when I query the table, even a simple SELECT COUNT(1) statement fails with the following error:

shark> select count(1) from mydata_cached;
shark.memstore2.CacheType$InvalidCacheTypeException: Invalid string representation of cache type MEMORY
    at shark.memstore2.CacheType$.fromString(CacheType.scala:48)
    at shark.execution.TableScanOperator.execute(TableScanOperator.scala:119)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
    at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:47)
    at shark.execution.Operator.executeParents(Operator.scala:115)
    at shark.execution.UnaryOperator.execute(Operator.scala:187)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
    at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:47)
    at shark.execution.Operator.executeParents(Operator.scala:115)
    at shark.execution.UnaryOperator.execute(Operator.scala:187)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
    at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:47)
    at shark.execution.Operator.executeParents(Operator.scala:115)
    at shark.execution.UnaryOperator.execute(Operator.scala:187)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
    at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:47)
    at shark.execution.Operator.executeParents(Operator.scala:115)
    at org.apache.hadoop.hive.ql.exec.GroupByPostShuffleOperator.execute(GroupByPostShuffleOperator.scala:194)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
    at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:47)
    at shark.execution.Operator.executeParents(Operator.scala:115)
    at shark.execution.UnaryOperator.execute(Operator.scala:187)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at shark.execution.Operator$$anonfun$executeParents$1.apply(Operator.scala:115)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
    at scala.collection.mutable.ArrayBuffer.map(ArrayBuffer.scala:47)
    at shark.execution.Operator.executeParents(Operator.scala:115)
    at shark.execution.FileSinkOperator.execute(FileSinkOperator.scala:120)
    at shark.execution.SparkTask.execute(SparkTask.scala:101)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:134)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1312)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1104)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:937)
    at shark.SharkCliDriver.processCmd(SharkCliDriver.scala:294)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:341)
    at shark.SharkCliDriver$.main(SharkCliDriver.scala:203)
    at shark.SharkCliDriver.main(SharkCliDriver.scala)
FAILED: Execution Error, return code -101 from shark.execution.SparkTask

According to the code on GitHub (https://github.com/amplab/shark/blob/master/src/main/scala/shark/memstore2/CacheType.scala), the option MEMORY is valid. I have also tried the MEMORY_ONLY option, and it gives me the same error. Any suggestions or ideas about what is going wrong here?

Thanks, TM

1 Answer:

Answer 0 (score: 1)

You need:

TBLPROPERTIES('shark.cache' = 'MEMORY_ONLY')
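
Putting the fix into the original DDL, the full statement would look like the sketch below. Note the version mismatch behind the error: the CacheType.scala the asker linked is from the master branch, where MEMORY is a recognized value, but the 0.8.0 release apparently only accepts MEMORY_ONLY, which is why CacheType.fromString rejects the string at query time.

```sql
-- Same table as in the question, with the cache type string
-- the 0.8.0 release recognizes (MEMORY_ONLY instead of MEMORY).
CREATE TABLE mydata_cached (
  artist string,
  title string,
  track_id string,
  similars array<array<string>>,
  tags array<array<string>>
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
TBLPROPERTIES('shark.cache' = 'MEMORY_ONLY');
```

Since the TBLPROPERTIES value is read when the table is scanned rather than only at creation time, a table created with the bad value may also be repairable in place with ALTER TABLE ... SET TBLPROPERTIES('shark.cache' = 'MEMORY_ONLY') instead of dropping and recreating it, though this is untested here.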