SparkException: Only one SparkContext may be running in this JVM

Date: 2017-01-11 11:42:31

Tags: apache-spark

I am submitting a simple Spark pipeline:

./bin/spark-submit --class com.example.ExamplePipeline --master local pipeline-1.0.0-SNAPSHOT.jar
...
17/01/11 12:34:24 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
com.example.ExamplePipeline.createExecutionContext(ExamplePipeline.java:72)
com.example.ExamplePipeline.exec(ExamplePipeline.java:115)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1702)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1641)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1570)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:476)
org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:303)
org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:299)
org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:755)
    at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2257)
    at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2239)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2239)
    at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:2325)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:2197)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
    at com.example.ExamplePipeline.createExecutionContext(ExamplePipeline.java:72)
    at com.example.ExamplePipeline.exec(ExamplePipeline.java:115)
    at com.example.ExamplePipeline.main(ExamplePipeline.java:144)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/01/11 12:34:24 INFO ReceiverTracker: Sent stop signal to all 1 receivers
17/01/11 12:34:24 INFO StreamingContext: Invoking stop(stopGracefully=false) from shutdown hook

It looks like another context is already running, so this one shuts down. I can't find anything else running, and this setup used to work in the same environment.

1 Answer:

Answer 0 (score: 5)

You can only have one SparkContext instance per JVM, unless you set spark.driver.allowMultipleContexts = true (not recommended: it is intended for tests, not production).
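For completeness, a minimal sketch of how that flag would be set on a SparkConf (the app name and master here are illustrative, and again this is a test-only escape hatch, not a fix):

SparkConf conf = new SparkConf()
        .setAppName("ExamplePipeline")
        .setMaster("local[*]")
        // Test-only: tolerate multiple SparkContexts in one JVM.
        .set("spark.driver.allowMultipleContexts", "true");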

If you do this:

JavaStreamingContext ssc = new JavaStreamingContext(conf, window);

Spark will create a new SparkContext, and the StreamingContext will then use that newly created SparkContext. If you already created a SparkContext before the StreamingContext, this exception is thrown. As far as I can tell from your stack trace, you are using this constructor.
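Putting the two pieces together, a minimal sketch of the pattern that triggers the exception might look like this (the class, variable names, and batch interval are assumptions, not taken from your code):

SparkConf conf = new SparkConf().setAppName("ExamplePipeline").setMaster("local[*]");

// First SparkContext created in this JVM.
JavaSparkContext sparkContext = new JavaSparkContext(conf);

// This constructor builds a *second* SparkContext from conf, which throws
// "Only one SparkContext may be running in this JVM".
JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(10));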

To avoid this exception, you can use:

JavaStreamingContext ssc = new JavaStreamingContext(sparkContext, window);
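A fuller sketch of the fixed driver, assuming a local master and a placeholder batch interval of Durations.seconds(10) (both are illustrative, not taken from your pipeline):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class ExamplePipeline {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setAppName("ExamplePipeline").setMaster("local[*]");

        // The one and only SparkContext in this JVM.
        JavaSparkContext sparkContext = new JavaSparkContext(conf);

        // Reuse the existing context instead of letting the streaming
        // constructor create a second SparkContext from a SparkConf.
        JavaStreamingContext ssc = new JavaStreamingContext(sparkContext, Durations.seconds(10));

        // ... define the streaming pipeline here ...

        ssc.start();
        ssc.awaitTermination();
    }
}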