Case class with 200 attributes gets java.lang.StackOverflowError and the message "That entry seems to have slain the compiler."

Date: 2018-04-16 21:29:52

Tags: scala apache-spark

In spark-shell, trying to create a Scala case class with 200 attributes does not work; I get an error saying the entry has slain the compiler. I was given a suggestion to increase the JVM thread stack size (-Xss):

spark-shell --conf "spark.driver.extraJavaOptions=-Xss2M" --conf "spark.executor.extraJavaOptions=-Xss2M"
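For reference, here is a short sketch that generates the kind of wide definition involved (the class name Wide and the field names f1..f200 are placeholders, not my real schema):

// Build the source of a 200-field case class and print it,
// so it can be pasted into spark-shell as a single definition.
val fields = (1 to 200).map(i => s"f$i: String").mkString(", ")
println(s"case class Wide($fields)")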

Increasing the stack size helped at first, but when I pass more arguments to spark-shell for Kafka and HBase:

spark-shell --conf "spark.driver.extraJavaOptions=-Xss4M" \
  --conf "spark.executor.extraJavaOptions=-Xss4M" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=/etc/kafka/conf/kafka_jaas.conf" \
  --driver-java-options "-Djava.security.auth.login.config=/etc/kafka/conf/kafka_jaas.conf" \
  -classpath /usr/hdp/current/hbase-client/lib/hbase-common.jar:/usr/hdp/current/hbase-client/lib/hbase-client.jar:/usr/hdp/current/hbase-client/lib/hbase-server.jar:/usr/hdp/current/hbase-client/lib/hbase-protocol.jar:shc-core-1.1.1-2.1-s_2.11.jar:spark-streaming-kafka-0-10_2.11-2.0.0.jar:/usr/hdp/2.6.1.0-129/kafka/libs/kafka-clients-0.10.1.2.6.1.0-129.jar \
  --files /etc/hbase/conf/hbase-site.xml

the same error keeps occurring:

java.lang.StackOverflowError
    at scala.tools.nsc.typechecker.Typers$Typer.normalTypedApply$1(Typers.scala:4504)
    at scala.tools.nsc.typechecker.Typers$Typer.typedApply$1(Typers.scala:4580)
    at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5343)
.............

That entry seems to have slain the compiler.  Shall I replay
your session? I can re-run each line except the last one.
[y/n]
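One thing I notice in the longer command (an assumption on my part, not a confirmed cause): when the same --conf key is passed twice, spark-shell keeps only the last value, so the second spark.executor.extraJavaOptions entry with the JAAS option likely replaces the earlier -Xss4M. I believe --driver-java-options also maps to spark.driver.extraJavaOptions and takes precedence over the --conf form, which would drop -Xss4M on the driver as well. A merged sketch that keeps both flags in a single value (classpath and --files as in the command above):

spark-shell \
  --conf "spark.executor.extraJavaOptions=-Xss4M -Djava.security.auth.login.config=/etc/kafka/conf/kafka_jaas.conf" \
  --driver-java-options "-Xss4M -Djava.security.auth.login.config=/etc/kafka/conf/kafka_jaas.conf" \
  -classpath ... --files /etc/hbase/conf/hbase-site.xml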

0 Answers:
