Unable to call 'spark-submit' from Scala via a system call because the '--jars' argument (with a * wildcard) is not expanded

Asked: 2018-10-30 21:13:27

Tags: linux scala shell apache-spark spark-submit

This 'spark-submit' call works perfectly fine when executed in a shell:

/bin/bash -c '/local/spark-2.3.1-bin-hadoop2.7/bin/spark-submit --class analytics.tiger.agents.spark.Orsp --master spark://analytics.broadinstitute.org:7077 --deploy-mode client --executor-memory 1024m --conf spark.app.id=Orsp --conf spark.executor.extraJavaOptions=-Dconfig.file=/home/unix/analytics/TigerETL3/application.conf --conf spark.driver.extraJavaOptions=-Dconfig.file=/home/unix/analytics/TigerETL3/application.conf --jars "/home/unix/analytics/TigerETL3/spark-jars/*.jar" /home/unix/analytics/TigerETL3/spark-agents.jar'

But when I simply turn it into a system call from Scala like this:

val cmd = Seq("/bin/bash", "-c", s"""/local/spark-2.3.1-bin-hadoop2.7/bin/spark-submit --class analytics.tiger.agents.spark.Orsp --master spark://analytics.broadinstitute.org:7077 --deploy-mode client --executor-memory 1024m --conf spark.app.id=Orsp --conf spark.executor.extraJavaOptions=-Dconfig.file=/home/unix/analytics/TigerETL3/application.conf --conf spark.driver.extraJavaOptions=-Dconfig.file=/home/unix/analytics/TigerETL3/application.conf --jars "/home/unix/analytics/TigerETL3/spark-jars/*.jar" /home/unix/analytics/TigerETL3/spark-agents.jar""")
import scala.sys.process._
val log = cmd.lineStream.toList
println(log.mkString)

it throws an error:

Warning: Local jar /home/unix/analytics/TigerETL3/spark-jars/*.jar does not exist, skipping.
Exception in thread "main" java.lang.NoClassDefFoundError: scalikejdbc/DB
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
        at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
        at java.lang.Class.getMethod0(Class.java:3018)
        at java.lang.Class.getMethod(Class.java:1784)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:739)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: scalikejdbc.DB
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 10 more

The exception suggests that the *.jar pattern is not being expanded for some reason (even though it works fine in the shell). Enumerating all the jars in a comma-separated list is not appealing; it would be a monster, 187 jars. I've tried every trick I could think of, but failed, and have been stuck on this, frustrated, for quite a while.
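(The obvious fallback would be to build that comma-separated list from the directory contents in Scala rather than by hand. A rough, untested sketch, assuming all the jars sit directly under spark-jars/:

import java.io.File

// Build a comma-separated --jars value by listing the directory ourselves,
// instead of relying on the * wildcard being expanded.
val jarsDir = new File("/home/unix/analytics/TigerETL3/spark-jars")
val jarsCsv = jarsDir.listFiles()
  .filter(_.getName.endsWith(".jar"))
  .map(_.getAbsolutePath)
  .mkString(",")
// jarsCsv would then replace "/home/unix/analytics/TigerETL3/spark-jars/*.jar" after --jars.

But that still feels like papering over whatever is actually going wrong.)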

Help is appreciated! Thanks

2 answers:

Answer 0 (score: 1)

You need to remove the double quotes "" when specifying --jars. Can you try this?

val cmd = Seq("/bin/bash", "-c", s"""/local/spark-2.3.1-bin-hadoop2.7/bin/spark-submit --class analytics.tiger.agents.spark.Orsp --master spark://analytics.broadinstitute.org:7077 --deploy-mode client --executor-memory 1024m --conf spark.app.id=Orsp --conf spark.executor.extraJavaOptions=-Dconfig.file=/home/unix/analytics/TigerETL3/application.conf --conf spark.driver.extraJavaOptions=-Dconfig.file=/home/unix/analytics/TigerETL3/application.conf --jars /home/unix/analytics/TigerETL3/spark-jars/*.jar /home/unix/analytics/TigerETL3/spark-agents.jar""")
import scala.sys.process._
val log = cmd.lineStream.toList
println(log.mkString)

Answer 1 (score: 1)

OK, I figured it out. I had to read through Spark's scripts to realize that if SPARK_HOME and JAVA_HOME are missing, Spark goes through a series of steps to try to infer them. My original Scala command (double quotes included) was perfectly fine; I just needed to define those two variables:

val cmd = Seq("/bin/bash", "-c", s"""JAVA_HOME=/broad/software/free/Linux/redhat_7_x86_64/pkgs/jdk1.8.0_121 SPARK_HOME=/local/spark-2.3.1-bin-hadoop2.7 /local/spark-2.3.1-bin-hadoop2.7/bin/spark-submit --class analytics.tiger.agents.spark.Orsp --master spark://analytics.broadinstitute.org:7077 --deploy-mode client --executor-memory 1024m --conf spark.app.id=Orsp --conf spark.executor.extraJavaOptions=-Dconfig.file=/home/unix/analytics/TigerETL3/application.conf --conf spark.driver.extraJavaOptions=-Dconfig.file=/home/unix/analytics/TigerETL3/application.conf --jars "/home/unix/analytics/TigerETL3/spark-jars/*.jar" /home/unix/analytics/TigerETL3/spark-agents.jar""")

It works like a charm.
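For the record, an equivalent variant (a sketch, not something I've verified) is to hand those two variables to the child process through the extraEnv parameter of scala.sys.process.Process instead of prefixing them on the command line, which also lets you drop the bash -c wrapper:

import scala.sys.process._

// Same spark-submit invocation, but SPARK_HOME and JAVA_HOME are passed as
// environment variables of the child process rather than inline on the command.
val sparkSubmit = Seq(
  "/local/spark-2.3.1-bin-hadoop2.7/bin/spark-submit",
  "--class", "analytics.tiger.agents.spark.Orsp",
  "--master", "spark://analytics.broadinstitute.org:7077",
  "--deploy-mode", "client",
  "--executor-memory", "1024m",
  "--conf", "spark.app.id=Orsp",
  "--conf", "spark.executor.extraJavaOptions=-Dconfig.file=/home/unix/analytics/TigerETL3/application.conf",
  "--conf", "spark.driver.extraJavaOptions=-Dconfig.file=/home/unix/analytics/TigerETL3/application.conf",
  "--jars", "/home/unix/analytics/TigerETL3/spark-jars/*.jar",
  "/home/unix/analytics/TigerETL3/spark-agents.jar")

val log = Process(sparkSubmit, None,
  "SPARK_HOME" -> "/local/spark-2.3.1-bin-hadoop2.7",
  "JAVA_HOME"  -> "/broad/software/free/Linux/redhat_7_x86_64/pkgs/jdk1.8.0_121"
).lineStream.toList
println(log.mkString("\n"))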