Cannot be cast to org.apache.spark.serializer.Serializer

Date: 2015-10-16 01:59:41

Tags: java serialization apache-spark

I am trying to resolve a Spark serialization problem with HashMaps in Java. I am following the link Save Spark Dataframe into Elasticsearch - Can't handle type exception.

I am now running into the following problem:

java.lang.ClassCastException: com.spark.util.umf.MyKryoRegistrator cannot be cast to org.apache.spark.serializer.Serializer
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:259)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:270)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at com.spark.util.umf.MyMain.main(MyMain.java:46)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:480)
15/10/16 01:47:22 INFO yarn.ApplicationMaster: Final app status:
FAILED, exitCode: 15, (reason: User class threw exception:
com.spark.util.umf.MyKryoRegistrator cannot be cast to
org.apache.spark.serializer.Serializer)

I created the Kryo registrator as follows:

import java.io.Serializable;
import org.apache.spark.serializer.KryoRegistrator;
import com.esotericsoftware.kryo.Kryo;

public class MyKryoRegistrator implements KryoRegistrator, Serializable {
    @Override
    public void registerClasses(Kryo kryo) {
        // Product POJO associated to a product Row from the DataFrame            
        kryo.register(MyRecord.class); 
    }
}

Main method:

public static void main(String args[]){

    SparkConf sConf= new SparkConf().setAppName("SparkTestJob");
    sConf.set( "spark.driver.allowMultipleContexts", "true");
    //Kryo kryo = new Kryo();;
    //kryo.setDefaultSerializer(MyRecord.class);
    //my.registerClasses(kryo);
    sConf.set("spark.serializer","com.spark.util.umf.MyKryoRegistrator");

    [...]
}

1 Answer:

Answer 0 (score: 3)

As per the answer provided in the link you mentioned in your question, you can see that I defined both of these parameters:

spark.serializer
spark.kryo.registrator

So you have to set both parameters.

If you set the registrator without setting the serializer, the Kryo serializer will not be enabled.
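For reference, here is a minimal sketch of the corrected configuration, assuming the registrator class from the question is com.spark.util.umf.MyKryoRegistrator:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class MyMain {
    public static void main(String[] args) {
        SparkConf sConf = new SparkConf().setAppName("SparkTestJob");
        // spark.serializer must point to Spark's KryoSerializer,
        // not to your registrator class (that caused the ClassCastException)
        sConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
        // spark.kryo.registrator points to the custom registrator that registers MyRecord
        sConf.set("spark.kryo.registrator", "com.spark.util.umf.MyKryoRegistrator");

        JavaSparkContext sc = new JavaSparkContext(sConf);
        // ... rest of the job ...
        sc.close();
    }
}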