FAILED: ParseException line 1:21 cannot recognize input near '&lt;EOF&gt;' '&lt;EOF&gt;' '&lt;EOF&gt;' in table name

Date: 2019-07-26 03:37:23

Tags: sql command-line hive escaping hiveql

Command:

hive -e "use xxx;DROP TABLE IF EXISTS `xxx.flashsaleeventproducts_hist`;CREATE EXTERNAL TABLE `xxx.flashsaleeventproducts_hist`(`event_id` string,`group_code` string,`id` string,`is_deleted` int,`price` int,`price_guide` int,`product_code` int,`product_id` string,`quantity_each_person_limit` int,`quantity_limit_plan` int,`sort_num` int,`update_time` bigint,`meta_offset` bigint,`meta_status` int,`meta_start_time` bigint)PARTITIONED BY(`cur_date` string,`cur_hour` string) ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat' OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'LOCATION '/data/ods/xxx/flashsaleeventproducts_hist';msck repair table flashsaleevents_hist;"

Error:


xxx.flashsaleeventproducts_hist: command not found
xxx.flashsaleeventproducts_hist: command not found
event_id: command not found
group_code: command not found
is_deleted: command not found

Command 'price' not found, did you mean:

  command 'rice' from deb golang-rice

Try: sudo apt install <deb name>

price_guide: command not found
product_code: command not found
product_id: command not found
quantity_each_person_limit: command not found
quantity_limit_plan: command not found
sort_num: command not found
update_time: command not found
meta_offset: command not found
meta_status: command not found
meta_start_time: command not found
cur_date: command not found
cur_hour: command not found
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hive/conf/hive-env.sh: line 24: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hbase/hbase-protocol.jar: Permission denied
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hive/conf/hive-env.sh: line 25: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hbase/hbase-hadoop-compat.jar: Permission denied
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hive/conf/hive-env.sh: line 26: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hbase/hbase-hadoop2-compat.jar: Permission denied
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hive/conf/hive-env.sh: line 27: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hbase/lib/htrace-core.jar: Permission denied
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hive/conf/hive-env.sh: line 28: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hbase/hbase-client.jar: Permission denied
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hive/conf/hive-env.sh: line 29: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hbase/hbase-server.jar: Permission denied
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hadoop/libexec/hadoop-functions.sh: line 2331: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: bad substitution
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hadoop/libexec/hadoop-functions.sh: line 2426: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: bad substitution
WARNING: Use "yarn jar" to launch YARN applications.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/hive-common-2.1.1-cdh6.2.0.jar!/hive-log4j2.properties Async: false
OK
Time taken: 2.277 seconds
NoViableAltException(-1@[199:1: tableName : (db= identifier DOT tab= identifier -> ^( TOK_TABNAME $db $tab) |tab= identifier -> ^( TOK_TABNAME $tab) );])
        at org.antlr.runtime.DFA.noViableAlt(DFA.java:158)
        at org.antlr.runtime.DFA.predict(DFA.java:144)
        at org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.tableName(HiveParser_FromClauseParser.java:3821)
        at org.apache.hadoop.hive.ql.parse.HiveParser.tableName(HiveParser.java:40055)
        at org.apache.hadoop.hive.ql.parse.HiveParser.dropTableStatement(HiveParser.java:6887)
        at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:3126)
        at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:2266)
        at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1318)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:218)
        at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:75)
        at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:68)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:564)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1425)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1493)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1339)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1328)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:409)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:342)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:800)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:772)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:699)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:313)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:227)
FAILED: ParseException line 1:21 cannot recognize input near '<EOF>' '<EOF>' '<EOF>' in table name


Note: the SQL itself is correct; running it directly in the Hive CLI does not raise any error.

I suspect the problem is some special character in the SQL, but I can't figure out which one.

A more readable view of the SQL:

use xxx;
DROP TABLE IF EXISTS `xxx.flashsaleeventproducts`;
CREATE EXTERNAL TABLE `xxx.flashsaleeventproducts`(
`id` string,
`event_id` string,
`product_id` string,
`sort_num` int,
`price_guide` int,
`price` int,
`quantity_limit_plan` int,
`quantity_each_person_limit` int,
`is_deleted` int,
`update_time` bigint,
`group_code` string,
`product_code` int
)PARTITIONED BY(`cur_date` string,`cur_hour` string) 
ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'  
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat' 
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'
LOCATION '/data/ods/xxx/flashsaleeventproducts';
msck repair table flashsaleeventproducts;

2 Answers:

Answer 0 (score: 0)

Please find the script below to connect to Hive:

import subprocess
import sys

# Run a Hive query through the shell and write its stdout to a CSV file.
query = """ hive -e "set hive.cli.print.header=true;use db;select * from somehivetable;" """

with open("query_result.csv", "w") as out_result_file:
    p = subprocess.Popen(query, shell=True, stdout=out_result_file, stderr=subprocess.PIPE)
    stdout, stderr = p.communicate()

# communicate() returns bytes; decode stderr before printing on failure.
if p.returncode != 0:
    print(stderr.decode())
    sys.exit(1)
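
Note that this script runs the command with shell=True, so a query containing backticks would be mangled by the shell in exactly the same way as in the question. A minimal sketch that sidesteps this (file name and statements are placeholders) writes the SQL to a file and runs it with hive -f, so the shell never parses the statements:

# Placeholder file name and SQL; the quoted 'EOF' delimiter stops bash from
# expanding anything inside the here-document, so the backticks stay literal.
cat > /tmp/ddl.hql <<'EOF'
use db;
DROP TABLE IF EXISTS `somehivetable`;
EOF
hive -f /tmp/ddl.hql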

Answer 1 (score: 0)

I found that the problem is caused by the backtick character: `.

Removing that character solves the problem.

The only remaining question is why the backtick works fine in the Hive CLI but fails with hive -e "xxx".
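
The reason is that the shell, not Hive, sees the backticks first: inside double quotes bash still performs command substitution, so every `...` segment is executed as a shell command (which is where all the "xxx: command not found" messages come from) and replaced by its empty output before hive -e ever receives the string. The Hive CLI reads the statement directly, with no shell in between, so the backticks survive there. If you want to keep the backticks with hive -e, a minimal sketch (statement shortened for illustration) is to escape each one:

# Escaped backticks (\`) pass through bash's double-quote processing as literal
# characters, so Hive receives the quoted identifiers intact.
hive -e "use xxx; DROP TABLE IF EXISTS \`xxx.flashsaleeventproducts\`;"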
