SparkException with JavaStreamingContext.getOrCreate(): Only one SparkContext may be running in this JVM

Asked: 2017-01-18 15:15:55

Tags: apache-spark spark-streaming spark-submit

Related to this question, I got the hint that the getOrCreate idiom should be used to avoid this problem. But when I try:

JavaStreamingContextFactory contextFactory = new JavaStreamingContextFactory() {

    @Override
    public JavaStreamingContext create() {
        final SparkConf conf = new SparkConf().setAppName(NAME);
        return new JavaStreamingContext(conf, Durations.seconds(BATCH_SPAN));
    }

};

final JavaStreamingContext context = JavaStreamingContext.getOrCreate("/tmp/"+NAME, contextFactory);

I'm still getting:

Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
org.example.ExamplePipeline$1.create(ExamplePipeline.java:56)
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:706)
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:705)
scala.Option.getOrElse(Option.scala:120)
org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:864)
org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:705)
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
org.example.ExamplePipeline.createExecutionContext(ExamplePipeline.java:70)
org.example.ExamplePipeline.exec(ExamplePipeline.java:116)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1702)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1641)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1570)
    at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2257)
    at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2239)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2239)
    at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:2325)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:2197)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
    at org.example.ExamplePipeline$1.create(ExamplePipeline.java:56)
    at org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:706)
    at org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:705)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:864)
    at org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:705)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
    at org.example.ExamplePipeline.createExecutionContext(ExamplePipeline.java:70)
    at org.example.ExamplePipeline.exec(ExamplePipeline.java:116)
    at org.example.ExamplePipeline.main(ExamplePipeline.java:157)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:786)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:123)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Any idea what I'm doing wrong?

Thanks in advance.

1 Answer:

Answer 0 (score: 0)

According to this question, I think I should do this instead:

SparkConf conf = new SparkConf().setAppName(NAME);
JavaSparkContext ctx = JavaSparkContext.fromSparkContext(SparkContext.getOrCreate(conf));
JavaStreamingContext context = new JavaStreamingContext(ctx, Durations.seconds(BATCH_SPAN));

Right?
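
If checkpoint-based recovery is still wanted, the two approaches can also be combined: keep JavaStreamingContext.getOrCreate(), but have the factory reuse any SparkContext already active in the JVM via SparkContext.getOrCreate(conf) instead of constructing a new one. Below is a minimal sketch along those lines, assuming the Spark 1.x Streaming API used in the question; NAME, BATCH_SPAN, and the /tmp checkpoint path are placeholders mirroring the question's code, not values from the original project.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.api.java.JavaStreamingContextFactory;

public final class StreamingContextBootstrap {

    // Placeholders standing in for the question's NAME and BATCH_SPAN.
    private static final String NAME = "example-pipeline";
    private static final long BATCH_SPAN = 10L;
    private static final String CHECKPOINT_DIR = "/tmp/" + NAME;

    public static JavaStreamingContext createContext() {
        JavaStreamingContextFactory factory = new JavaStreamingContextFactory() {
            @Override
            public JavaStreamingContext create() {
                SparkConf conf = new SparkConf().setAppName(NAME);
                // Reuse the SparkContext that is already running in this JVM
                // (if any) rather than constructing a second one, which is
                // what raises the SPARK-2243 error.
                JavaSparkContext jsc =
                        JavaSparkContext.fromSparkContext(SparkContext.getOrCreate(conf));
                JavaStreamingContext jssc =
                        new JavaStreamingContext(jsc, Durations.seconds(BATCH_SPAN));
                jssc.checkpoint(CHECKPOINT_DIR);
                return jssc;
            }
        };
        // Recovers from the checkpoint directory if one exists; otherwise the
        // factory above builds a fresh streaming context.
        return JavaStreamingContext.getOrCreate(CHECKPOINT_DIR, factory);
    }
}
```

Note that when a valid checkpoint exists, getOrCreate rebuilds the context from it and the factory is not invoked, and reusing an existing SparkContext only helps if that context was created with a configuration compatible with the streaming job.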