java.io.IOException: Failed to open native connection to Cassandra when using --properties-file

Asked: 2018-02-15 09:26:27

Tags: scala apache-spark cassandra datastax

spark-submit works fine on the DataStax cluster when I don't pass --properties-file:

dse -u user -p password spark-submit --master spark://ip:7077 --properties-file sparkpropsfile.conf --total-executor-cores 5 --class engine.Driver spark-app.jar tablename App.properties

But when I submit the Spark job with --properties-file, I get the exception below.
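(A likely factor: when spark-submit is given an explicit `--properties-file`, it loads that file *instead of* the cluster's default `spark-defaults.conf`, so settings that DSE normally injects, such as the Cassandra contact point and credentials, are no longer picked up automatically. A minimal sketch of what the custom `sparkpropsfile.conf` would then need to carry itself; the host IP is taken from the stack trace and the credential values are placeholders, not verified settings:)

```properties
# Cassandra contact point -- with a custom --properties-file the DSE defaults
# are bypassed, so set it explicitly (10.73.76.15 is from the stack trace)
spark.cassandra.connection.host  10.73.76.15
spark.cassandra.connection.port  9042

# Credentials normally supplied via `dse -u ... -p ... spark-submit`
spark.cassandra.auth.username    user
spark.cassandra.auth.password    password
```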

Exception in thread "main" java.io.IOException: Failed to open native connection to Cassandra at {10.73.76.15}:9042
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:160)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:146)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:146)
        at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
        at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
        at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79)
        at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:107)
        at org.apache.spark.deploy.SparkNodeConfiguration$.apply(SparkNodeConfiguration.scala:38)
        at org.apache.spark.deploy.SparkConfigurator.sparkNodeConfiguration$lzycompute(SparkConfigurator.scala:64)
        at org.apache.spark.deploy.SparkConfigurator.sparkNodeConfiguration(SparkConfigurator.scala:64)
        at org.apache.spark.deploy.SparkConfigurator.dseDriverProps$lzycompute(SparkConfigurator.scala:126)
        at org.apache.spark.deploy.SparkConfigurator.dseDriverProps(SparkConfigurator.scala:110)
        at org.apache.spark.deploy.SparkConfigurator.<init>(SparkConfigurator.scala:98)
        at org.apache.spark.deploy.DseSparkArgsPreprocessor.sparkConfigurator$lzycompute(DseSparkArgsPreprocessor.scala:62)
        at org.apache.spark.deploy.DseSparkArgsPreprocessor.sparkConfigurator(DseSparkArgsPreprocessor.scala:62)
        at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs$lzycompute(DseSparkArgsPreprocessor.scala:70)
        at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs(DseSparkArgsPreprocessor.scala:65)
        at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:25)
        at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala)
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /10.73.76.15:9042 (com.datastax.driver.core.exceptions.TransportException: [/10.73.76.15:9042] Channel has been closed))
        at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:233)
        at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:79)
        at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1424)
        at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:403)
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:153)
        ... 18 more

Can someone help with this?

Thanks, Chandra

0 Answers
