Submitting a Spark job remotely to a standalone cluster

Date: 2021-07-13 08:07:17

Tags: apache-spark pyspark

I have a Spark cluster with 1 master node and 4 worker nodes.
When I try to submit my application from a worker node, I get the following exception:

    Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
    : java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 128 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
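
For context, this BindException generally means the address Spark chose for the 'sparkDriver' service does not belong to any network interface on the submitting machine (for example, a stale hostname entry that resolves to a non-local IP). A minimal diagnostic sketch to check what the submitting host resolves its own name to:

    import socket

    # If this resolves to an address that is not configured on a local
    # interface, Spark's attempt to bind the driver service will fail.
    hostname = socket.gethostname()
    print(hostname, socket.gethostbyname(hostname))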

Here is my code:

    import findspark
    findspark.init()  # locate the Spark installation before importing pyspark

    from pyspark import SparkConf, SparkContext
    from pyspark.sql import SparkSession

    conf = SparkConf()
    conf.setMaster(f'spark://{SPARK_MASTER_HOST}:{SPARK_PORT}').setAppName(application_name)
    conf.set('spark.driver.host', SPARK_DRIVER_HOST)
    sc = SparkContext(conf=conf)
    spark = SparkSession(sc)
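
The exception itself suggests setting spark.driver.bindAddress explicitly: spark.driver.host is the address advertised to executors, while spark.driver.bindAddress is the local interface the driver actually binds to. A minimal sketch of that configuration, where the master URL and IP addresses are placeholders to be replaced with real values:

    from pyspark import SparkConf, SparkContext

    # Placeholder values; substitute the real master URL and the submitting
    # machine's IP address as seen from the cluster.
    SPARK_MASTER_HOST = '10.0.0.1'
    SPARK_PORT = 7077
    SPARK_DRIVER_HOST = '10.0.0.50'

    conf = SparkConf()
    conf.setMaster(f'spark://{SPARK_MASTER_HOST}:{SPARK_PORT}').setAppName('remote-submit-test')
    # Address executors use to connect back to the driver; must be routable
    # from the worker nodes.
    conf.set('spark.driver.host', SPARK_DRIVER_HOST)
    # Local interface the 'sparkDriver' service binds to; 0.0.0.0 binds all
    # interfaces, a common workaround behind NAT or in containers.
    conf.set('spark.driver.bindAddress', '0.0.0.0')
    sc = SparkContext(conf=conf)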

Is this because, in standalone mode, we cannot submit applications remotely?

0 Answers:

No answers yet.