hadoop / yarn / spark executor memory increase

Date: 2018-03-31 12:47:29

Tags: apache-spark hadoop yarn

When I run spark-submit with --master yarn-cluster --num-executors 7 --driver-memory 10g --executor-memory 16g --executor-cores 5, I get the error below. I don't know where to change the heap size; I suspect it is set in a YARN configuration file somewhere. Please advise.
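
For reference, the full command presumably looked something like the following; the application class and jar are placeholders, since they were not given in the post:

spark-submit \
  --master yarn-cluster \
  --num-executors 7 \
  --driver-memory 10g \
  --executor-memory 16g \
  --executor-cores 5 \
  --class com.example.MyApp \
  my-application.jar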

Error:

Invalid maximum heap size: -Xmx10g
The specified size exceeds the maximum representable size.
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
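
As a note on the error itself: this message comes from the JVM rather than from YARN, and it usually means the java binary that spark-submit invokes is a 32-bit JVM, which cannot address a 10g heap. A quick check, assuming java is on the PATH:

java -version
# a 64-bit JVM reports "64-Bit Server VM" in this output;
# if that string is absent, the JVM is 32-bit and -Xmx10g cannot be honored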

0 answers:

No answers yet.