Hive JSON SerDe for ORC or RC Format

Asked: 2017-04-05 01:54:06

Tags: hive hive-serde

Is it possible to use a JSON SerDe with the RC or ORC file formats? I am trying to insert into a Hive table whose file format is ORC, and store the data on an Azure blob as serialized JSON.

2 answers:

Answer 0 (score: 1)

Apparently not. Writing a JSON string into an ORC location and reading it back shows why:

insert overwrite local directory '/home/cloudera/local/mytable' 
stored as orc 
select '{"mycol":123,"mystring":"Hello"}'
;

create external table verify_data (rec string) 
stored as orc 
location 'file:///home/cloudera/local/mytable'
;

select * from verify_data
;
  

REC
  {"mycol":123,"mystring":"Hello"}

create external table mytable (myint int,mystring string)
row format serde 'org.apache.hive.hcatalog.data.JsonSerDe' 
stored as orc
location 'file:///home/cloudera/local/mytable'
;
  

myint mystring
  Failed with exception java.io.IOException: java.lang.ClassCastException:
  org.apache.hadoop.hive.ql.io.orc.OrcStruct cannot be cast to org.apache.hadoop.io.Text

JsonSerDe.java

...
import org.apache.hadoop.io.Text;
...

  @Override
  public Object deserialize(Writable blob) throws SerDeException {

    // The ORC reader hands the SerDe an OrcStruct, not Text,
    // which is what triggers the ClassCastException above
    Text t = (Text) blob;
  ...
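For contrast, the same SerDe does work when the storage format hands it Text records, i.e. plain text files. A minimal sketch (the table name and path here are illustrative, not from the question):

```sql
-- JsonSerDe over TEXTFILE: each line reaches deserialize() as a Text record,
-- so the cast succeeds and the JSON fields are parsed into columns
create external table mytable_text (myint int, mystring string)
row format serde 'org.apache.hive.hcatalog.data.JsonSerDe'
stored as textfile
location 'file:///home/cloudera/local/mytable_text';
```

This is why the usual pattern is JSON-in-text for the raw layer, with a separate conversion into ORC if columnar storage is needed.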

Answer 1 (score: 0)

You can do this with a conversion step of some kind, e.g. a staging step that produces ORC files in a target directory and then mounts a Hive table with the same schema on top of them after bucketing. Like below.

CREATE EXTERNAL TABLE my_fact_orc
(
  mycol STRING,
  mystring INT
)
PARTITIONED BY (dt string)
CLUSTERED BY (some_id) INTO 64 BUCKETS
STORED AS ORC
LOCATION 's3://dev/my_fact_orc'
TBLPROPERTIES ('orc.compress'='SNAPPY');

ALTER TABLE my_fact_orc ADD IF NOT EXISTS PARTITION (dt='2017-09-07') LOCATION 's3://dev/my_fact_orc/dt=2017-09-07';

ALTER TABLE my_fact_orc PARTITION (dt='2017-09-07') SET FILEFORMAT ORC;

SELECT * FROM my_fact_orc WHERE dt='2017-09-07' LIMIT 5;
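The conversion step itself is not shown above. One common shape for it, sketched here with an assumed JSON staging table `my_fact_json` (its name, location, and column types are illustrative and should mirror `my_fact_orc`), is to declare the raw JSON with the JSON SerDe over text files and let an INSERT ... SELECT rewrite the rows as ORC:

```sql
-- Staging table: raw JSON lines, read via the JSON SerDe (TEXTFILE, not ORC)
CREATE EXTERNAL TABLE my_fact_json
(
  mycol STRING,   -- column types should mirror my_fact_orc
  mystring INT
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
STORED AS TEXTFILE
LOCATION 's3://dev/my_fact_json';

-- Conversion: Hive deserializes the JSON rows and re-serializes them as ORC
INSERT OVERWRITE TABLE my_fact_orc PARTITION (dt='2017-09-07')
SELECT mycol, mystring
FROM my_fact_json;
```

With this split, the JSON SerDe only ever sees text records, and the ORC table is populated by Hive's own writers, so no SerDe/file-format mismatch arises.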