Checkpoint SQLContext NullPointerException issue

Asked: 2016-03-31 17:11:46

Tags: apache-spark apache-spark-sql spark-streaming

I am using checkpointing in my application, and when my application restarts after a failure, I get a NullPointerException on the SQLContext.
I think the application is unable to recover the SQLContext because of serialization/deserialization issues. Isn't SQLContext serializable?

Here is my code:

    //Driver class
    final JavaSparkContext javaSparkCtx = new JavaSparkContext(conf);
    final SQLContext sqlContext = new SQLContext(javaSparkCtx);

    JavaStreamingContextFactory javaStreamingContextFactory = new JavaStreamingContextFactory() {
        @Override
        public JavaStreamingContext create() { // only executed the first time
            JavaStreamingContext jssc = new JavaStreamingContext(javaSparkCtx, Durations.minutes(1));
            jssc.checkpoint(CHECKPOINT_DIRECTORY);

            HashMap<String, String> kafkaParams = new HashMap<String, String>();
            kafkaParams.put("metadata.broker.list",
                            "abc.xyz.localdomain:6667");
            //....
            JavaDStream<String> fullMsg = messages
                                          .map(new MapFunction());

            fullMsg.foreachRDD(new SomeClass(sqlContext));
            return jssc;
        }
    };
    }

    //Closure class
    public class SomeClass implements Serializable, Function<JavaRDD<String>, Void> {
        SQLContext sqlContext;

        public SomeClass(SQLContext sqlContext) {
            this.sqlContext = sqlContext;
        }

        public void doSomething() {
            this.sqlContext.createDataFrame(); // here is the NullPointerException
        }
        //.......
    }
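The NullPointerException after a checkpoint restore can be reproduced without Spark at all. Below is a hypothetical plain-Java sketch (`Holder` and its `context` field are illustrative stand-ins, not Spark classes): state that Java serialization does not carry across, such as a transient field, comes back as null after deserialization, which mirrors what happens to a context object revived from a checkpoint.

```java
import java.io.*;

// Hypothetical stand-in for a context-holding closure like SomeClass above.
class Holder implements Serializable {
    // Transient fields (Spark contexts hold transient internal state) are
    // skipped during serialization and deserialize back as null.
    transient String context;

    Holder(String context) { this.context = context; }

    // Serialize to bytes and read the object back, as a checkpoint restore would.
    static Holder roundTrip(Holder h) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(bos);
        oos.writeObject(h);
        oos.flush();
        ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
        return (Holder) ois.readObject();
    }
}

public class Main {
    public static void main(String[] args) throws Exception {
        Holder restored = Holder.roundTrip(new Holder("live-context"));
        // The transient field did not survive the round trip; any use of it
        // now throws NullPointerException, just like sqlContext here.
        System.out.println(restored.context == null); // prints "true"
    }
}
```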

1 Answer:

Answer 0 (score: 4)

SQLContext is Serializable because Spark SQL needs to use the SQLContext internally on the executor side. However, you should not serialize it into the Streaming checkpoint. Instead, you should get it from the RDD, like SQLContext sqlContext = SQLContext.getOrCreate(rdd.context());

For more details, see the Streaming documentation: http://spark.apache.org/docs/1.6.1/streaming-programming-guide.html#dataframe-and-sql-operations
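Applied to the code in the question, this means dropping the SQLContext field from the closure and looking the context up inside the function body instead. A minimal sketch, assuming Spark 1.6's Java API (the `call` signature comes from `Function<JavaRDD<String>, Void>`; the body is illustrative):

```java
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.SQLContext;

// Revised closure class: no SQLContext field to serialize into the checkpoint.
public class SomeClass implements Function<JavaRDD<String>, Void> {
    @Override
    public Void call(JavaRDD<String> rdd) {
        // Re-acquire (or lazily create) the singleton SQLContext from the
        // RDD's SparkContext instead of relying on checkpointed state.
        SQLContext sqlContext = SQLContext.getOrCreate(rdd.context());
        // ... use sqlContext.createDataFrame(...) here ...
        return null;
    }
}
```

Because `SQLContext.getOrCreate` returns an existing context when one is already registered, this works both on the first run and after recovery from the checkpoint directory.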