prestodb hive sql query error

Time: 2013-11-21 05:25:52

Tags: hadoop hive presto

Hi everyone, I have configured Presto to use Hive.
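
For context, a minimal etc/catalog/hive.properties for this kind of setup would look roughly like the following; the connector name and metastore URI are assumptions for the CDH4-bundled plugin, not necessarily my exact values:

# Hive connector bundled with the CDH4 client libraries
connector.name=hive-cdh4
# Thrift URI of the Hive metastore (placeholder host/port)
hive.metastore.uri=thrift://localhost:9083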

I can see the results of SHOW TABLES, and the books table appears in the list.

I can also see all of the column details for the books table.

I have a books table and I am able to query it through Hive and see the results.

For example:

hive> select * from books;

But when I run the same query through Presto, I get the error below.

Please guide me.

Error

presto:default> select * from books;

Query 20131121_025845_00004_qqe25, FAILED, 1 node
Splits: 1 total, 0 done (0.00%)
0:00 [0 rows, 0B] [0 rows/s, 0B/s]

Query 20131121_025845_00004_qqe25 failed: java.io.IOException: Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "ubuntu/192.168.56.101"; destination host is: "localhost":54310;
presto:default>

Exception on the server


45_00004_qqe25.1
java.lang.RuntimeException: java.io.IOException: Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "ubuntu/192.168.56.101"; destination host is: "localhost":54310;
        at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-15.0.jar:na]
        at com.facebook.presto.hive.HiveSplitIterable$HiveSplitQueue.computeNext(HiveSplitIterable.java:433) ~[na:na]
        at com.facebook.presto.hive.HiveSplitIterable$HiveSplitQueue.computeNext(HiveSplitIterable.java:392) ~[na:na]
        at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143) ~[guava-15.0.jar:na]
        at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138) ~[guava-15.0.jar:na]
        at com.facebook.presto.execution.SqlStageExecution.startTasks(SqlStageExecution.java:463) [presto-main-0.52.jar:0.52]
        at com.facebook.presto.execution.SqlStageExecution.access$300(SqlStageExecution.java:80) [presto-main-0.52.jar:0.52]
        at com.facebook.presto.execution.SqlStageExecution$5.run(SqlStageExecution.java:435) [presto-main-0.52.jar:0.52]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [na:1.7.0_45]
        at java.util.concurrent.FutureTask.run(FutureTask.java:262) [na:1.7.0_45]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_45]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_45]
        at java.lang.Thread.run(Thread.java:744) [na:1.7.0_45]
Caused by: java.io.IOException: Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "ubuntu/192.168.56.101"; destination host is: "localhost":54310;
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763) ~[na:na]
        at org.apache.hadoop.ipc.Client.call(Client.java:1229) ~[na:na]
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202) ~[na:na]
        at com.sun.proxy.$Proxy155.getListing(Unknown Source) ~[na:na]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_45]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_45]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_45]
        at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_45]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164) ~[na:na]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83) ~[na:na]
        at com.sun.proxy.$Proxy155.getListing(Unknown Source) ~[na:na]
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:441) ~[na:na]
        at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1526) ~[na:na]
        at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1509) ~[na:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:406) ~[na:na]
        at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1462) ~[na:na]
        at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1502) ~[na:na]
        at com.facebook.presto.hive.ForwardingFileSystem.listStatus(ForwardingFileSystem.java:298) ~[na:na]
        at com.facebook.presto.hive.ForwardingFileSystem.listStatus(ForwardingFileSystem.java:298) ~[na:na]
        at com.facebook.presto.hive.FileSystemWrapper$3.listStatus(FileSystemWrapper.java:146) ~[na:na]
        at org.apache.hadoop.fs.FileSystem$4.<init>(FileSystem.java:1778) ~[na:na]
        at org.apache.hadoop.fs.FileSystem.listLocatedStatus(FileSystem.java:1777) ~[na:na]
        at org.apache.hadoop.fs.FileSystem.listLocatedStatus(FileSystem.java:1760) ~[na:na]
        at com.facebook.presto.hive.util.AsyncRecursiveWalker$1.run(AsyncRecursiveWalker.java:58) ~[na:na]
        at com.facebook.presto.hive.util.SuspendingExecutor$1.run(SuspendingExecutor.java:67) ~[na:na]
        at com.facebook.presto.hive.util.BoundedExecutor.executeOrMerge(BoundedExecutor.java:82) ~[na:na]
        at com.facebook.presto.hive.util.BoundedExecutor.access$000(BoundedExecutor.java:41) ~[na:na]
        at com.facebook.presto.hive.util.BoundedExecutor$1.run(BoundedExecutor.java:53) ~[na:na]
        ... 3 common frames omitted
Caused by: java.io.IOException: Broken pipe
        at sun.nio.ch.FileDispatcherImpl.write0(Native Method) ~[na:1.7.0_45]
        at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) ~[na:1.7.0_45]
        at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) ~[na:1.7.0_45]
        at sun.nio.ch.IOUtil.write(IOUtil.java:65) ~[na:1.7.0_45]
        at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:487) ~[na:1.7.0_45]
        at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:62) ~[na:na]
        at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:143) ~[na:na]
        at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:153) ~[na:na]
        at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:114) ~[na:na]
        at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[na:1.7.0_45]
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[na:1.7.0_45]
        at java.io.DataOutputStream.flush(DataOutputStream.java:123) ~[na:1.7.0_45]
        at org.apache.hadoop.ipc.Client$Connection$3.run(Client.java:897) ~[na:na]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [na:1.7.0_45]
        at java.util.concurrent.FutureTask.run(FutureTask.java:262) [na:1.7.0_45]
        ... 3 common frames omitted
2013-11-20T21:58:45.915-0500    DEBUG   task-notification-1     com.facebook.presto.execution.TaskStateMachine  Task 20131121_025845_00004_qqe25.0.0 is CANCELED

1 Answer:

Answer 0 (score: 0)

I got an answer from the Google group: https://groups.google.com/forum/#!topic/presto-users/lVLvMGP1sKE

Dain Sundstrom
Nov 8: This is an error inside the HDFS client (org.apache.hadoop.ipc.Client:941 in my copy of the code). After a quick look at that code, it appears to mean that the client could not parse the server's response. My guess is that the client we bundle with the presto-hive-cdh4 plugin is not compatible with your version of Hadoop. That plugin ships with Cloudera Hadoop version 2.0.0-cdh4.3.0. Which version are you using?
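
A quick way to check this is the standard hadoop version command, comparing its output against the CDH4 release bundled with the presto-hive-cdh4 plugin (a sketch; the version string in the comment is only an example):

$ hadoop version
# Prints the local Hadoop build string, e.g. "Hadoop 2.0.0-cdh4.3.0".
# If the local build comes from a different Hadoop line than the one the
# plugin bundles, the client/server incompatibility described above is the
# likely cause.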