How to export a table to the local file system?

Asked: 2017-03-02 17:05:57

Tags: java hive hdfs hiveql

I'm trying to export the results of a table in Hive, using this command:

bline --hiveconf hive.mapred.mode=nonstrict --outputformat=csv2 -e "select * from db.table;">~/table.csv

(bline is an alias for beeline -u address + some options)

The query finishes, but then it gives me

error java.lang.OutOfMemoryError: GC overhead limit exceeded

Am I exporting the table correctly, or is there a better way to export a table from Hive?

2 answers:

Answer 0 (score: 1)

Since your table is stored as text, you can simply use get / getmerge to copy the files from HDFS to the local file system.
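For reference, a minimal sketch of both options, assuming the default warehouse path that also appears in the show create table output below (adjust the HDFS path to your own table's location):

hdfs dfs -get /user/hive/warehouse/mytable .            # copy the table directory, file by file, into the current directory
hdfs dfs -getmerge /user/hive/warehouse/mytable mytable.txt   # concatenate all files of the table into one local file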

Demo

Hive

create table mytable (i int,s string,d date);

insert into mytable values 
    (1,'hello','2017-03-01')
   ,(2,'world','2017-03-02')
;

select * from mytable
;

mytable.i   mytable.s   mytable.d
1   hello   2017-03-01
2   world   2017-03-02

show create table mytable;

CREATE TABLE `mytable`(
  `i` int, 
  `s` string, 
  `d` date)
ROW FORMAT SERDE 
  'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' 
STORED AS INPUTFORMAT 
  'org.apache.hadoop.mapred.TextInputFormat' 
OUTPUTFORMAT 
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  'hdfs://localhost:8020/user/hive/warehouse/mytable'
.
.
.

bash

hdfs dfs -getmerge /user/hive/warehouse/mytable mytable.txt

cat mytable.txt 

1hello2017-03-01
2world2017-03-02

P.S. There is an invisible delimiter between the columns: the SOH character, ASCII value 1.

<mytable.txt tr $'\x01' ','
1,hello,2017-03-01
2,world,2017-03-02
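Putting the two steps together, a sketch of a one-liner that streams the table out of HDFS and converts the SOH delimiter to commas on the fly (again assuming the default warehouse location used above):

hdfs dfs -cat /user/hive/warehouse/mytable/* | tr $'\x01' ',' > mytable.csv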

Answer 1 (score: 0)

Since you want a different delimiter, there is a cleaner solution:

insert overwrite local directory '/tmp/mytable' 
row format delimited
fields terminated by ','
select * from mytable
;

LanguageManual DML
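If you prefer to drive this from the shell like in the original command, a sketch using the asker's bline alias (any beeline connection works the same way); note that with HiveServer2 the "local" directory ends up on the machine running HiveServer2, not necessarily on the client:

bline -e "insert overwrite local directory '/tmp/mytable'
          row format delimited fields terminated by ','
          select * from db.table;"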

Demo

Hive

create table mytable (i int,s string,d date);

insert into mytable values 
    (1,'hello','2017-03-01')
   ,(2,'world','2017-03-02')
;

select * from mytable
;

mytable.i   mytable.s   mytable.d
1   hello   2017-03-01
2   world   2017-03-02

insert overwrite local directory '/tmp/mytable' 
row format delimited
fields terminated by ','
select * from mytable
;

bash

cat /tmp/mytable/*
1,hello,2017-03-01
2,world,2017-03-02
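The output lands as one or more files under /tmp/mytable; if you need a single CSV, a simple way to combine them is to concatenate the directory contents (mytable.csv is just an illustrative name):

cat /tmp/mytable/* > mytable.csv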