Starting the Thrift server fails with "file not found: /tmp/hive/"

Asked: 2017-12-28 19:28:13

Tags: hadoop cassandra hive datastax thrift

When we try to start the Thrift server on our cluster with this command, we get the error below:

dse -u my_usser -p my_password spark-sql-thriftserver start

After running the command, the following message is displayed:

# starting org.apache.spark.sql.hive.thriftserver.HiveThriftServer2, logging to /root/spark-thrift-server/spark-root-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1-node3.domain.com.out

When I look at the output in that log file, the error is:


WARN  2017-12-29 01:44:01,875 org.apache.spark.SparkContext: Using an existing SparkContext; some configuration may not take effect.
ERROR 2017-12-29 01:44:05,379 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application
java.lang.RuntimeException: com.datastax.bdp.fs.model.NoSuchFileException: File not found: /tmp/hive/
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522) ~[hive-exec-1.2.1.spark2.jar:1.2.1.spark2]
        at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:189) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_151]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_151]
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_151]
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_151]
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
        at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
        at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
        at org.apache.spark.sql.hive.HiveSessionState.metadataHive$lzycompute(HiveSessionState.scala:43) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
        at org.apache.spark.sql.hive.HiveSessionState.metadataHive(HiveSessionState.scala:43) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
        at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:62) ~[spark-hive-thriftserver_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:81) ~[spark-hive-thriftserver_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]

I have already created that directory manually, but it doesn't work :(
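One thing worth noting: the exception class is `com.datastax.bdp.fs.model.NoSuchFileException`, i.e. it comes from the DSE filesystem (DSEFS) layer, which suggests the Hive scratch directory is expected to exist inside DSEFS rather than on the local disk, so a local `mkdir /tmp/hive` would not be seen. A hedged sketch of what creating it in DSEFS might look like, assuming the `dse fs` shell and its `mkdir`/`chmod`/`ls` commands are available in this DSE version:

```shell
# The stack trace points at com.datastax.bdp.fs (DSEFS), so /tmp/hive
# likely has to exist in DSEFS, not on the local filesystem.
# These commands assume the `dse fs` one-shot shell syntax; adjust to
# your DSE version and authentication setup.
dse fs "mkdir /tmp/hive"
dse fs "chmod 777 /tmp/hive"   # Hive expects a world-writable scratch dir
dse fs "ls /tmp"               # verify the directory is visible in DSEFS
```

If the cluster requires authentication, the same `-u`/`-p` flags used for `spark-sql-thriftserver` would presumably be needed here as well.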

0 Answers:

There are no answers yet.