SparkLauncher: java.lang.NoSuchMethodError: org.yaml.snakeyaml.Yaml.&lt;init&gt;

Date: 2018-08-06 15:11:59

Tags: java docker apache-spark spring-boot yaml

Good morning, colleagues. I have developed an application based on SparkLauncher that runs an executable jar in which 5 operations are executed; each operation depends on a specific variable. I have a main Hadoop cluster, spark-2.3.0-hadoop2.6.5, and there everything works fine. Part of my working code:

 private void runSparkJob(String pathToJar, final LocalDate startDate, final LocalDate endDate) {
        if (executionInProgress.get()) {
            LOGGER.warn("Execution already in progress");
            return;
        }
        Process sparkProcess = null;
        try {
            LOGGER.info("Create SparkLauncher. SparkHome: [{}]. JarPath: [{}].", sparkHome, vmJarPath);
            executionInProgress.set(true);
            sparkProcess = new SparkLauncher()
                    .setAppName(activeOperationProfile)
                    .setSparkHome(sparkHome) //sparkHome folder on main cluster
                    .setAppResource(pathToJar) // jar with 5 operations
                    .setConf(SparkLauncher.DRIVER_EXTRA_JAVA_OPTIONS,
                            String.format("-Drunner.operation-profile=%1$s -Doperation.startDate=%2$s -Doperation.endDate=%3$s", activeOperationProfile, startDate,endDate))
                    .setConf(SparkLauncher.DRIVER_MEMORY, "12G")
                    .redirectToLog(LOGGER.getName())
                    .setMaster("yarn")
                    .launch();

            sparkProcess.waitFor();
            int exitCode = sparkProcess.exitValue();
            if (exitCode != 0) {
                throw new RuntimeException("Illegal exit code. Expected: [0]. Actual: [" + exitCode + "]");
            }

        } catch (IOException | InterruptedException e) {
            LOGGER.error("Error occurred while running SparkApplication.", e);
            throw new RuntimeException(e);
        } finally {
            if (sparkProcess != null && sparkProcess.isAlive()) {
                LOGGER.warn("Process still alive. Try to kill");
                sparkProcess.destroy();
            }
            executionInProgress.set(false);
        }
    }
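
For context, a minimal call site might look like the sketch below; the dates are placeholders for a reporting window, and the jar path is the one that appears in the logs further down.

    // Hypothetical invocation; dates are illustrative, not from the real service.
    runSparkJob("/opt/bigtv/bin/multirating-bigdata-operations-MASTER-SNAPSHOT.jar",
            LocalDate.of(2018, 7, 1),
            LocalDate.of(2018, 7, 31));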

I then started a Docker container with a downloaded spark-2.3.0-hadoop2.6 inside; the testers need this container. I changed the master to .setMaster("local"), added a new profile with the paths to sparkHome and the jar with the operations, and packaged the jar without shading (I tried shading, but it did not work for me). When I try to run my SparkLauncher application now, I get this exception:

  

    2018-08-06 14:47:53,150 INFO  [n.m.m.b.r.SparkBaseOperationsRunner.runSparkJob] 105: Create SparkLauncher. SparkHome: [/opt/bigtv/spark/spark-2.3.0-bin-hadoop2.6]. JarPath: [/opt/bigtv/bin/multirating-bigdata-operations-MASTER-SNAPSHOT.jar].
    2018-08-06 14:47:54 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    2018-08-06 14:47:57 ERROR SpringApplication:842 - Application run failed
    java.lang.NoSuchMethodError: org.yaml.snakeyaml.Yaml.<init>(.../resolver/Resolver;)V
        at org.springframework.boot.env.OriginTrackedYamlLoader.createYaml(OriginTrackedYamlLoader.java:70)
        at org.springframework.beans.factory.config.YamlProcessor.process(YamlProcessor.java:139)
        at org.springframework.boot.env.OriginTrackedYamlLoader.load(OriginTrackedYamlLoader.java:75)
        at org.springframework.boot.env.YamlPropertySourceLoader.load(YamlPropertySourceLoader.java:50)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.loadDocuments(ConfigFileApplicationListener.java:547)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:517)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.loadForFileExtension(ConfigFileApplicationListener.java:496)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:464)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.lambda$null$6(ConfigFileApplicationListener.java:446)
        at java.lang.Iterable.forEach(Iterable.java:75)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.lambda$load$7(ConfigFileApplicationListener.java:445)
        at java.lang.Iterable.forEach(Iterable.java:75)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:442)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:330)
        at org.springframework.boot.context.config.ConfigFileApplicationListener.addPropertySources(ConfigFileApplicationListener.java:212)
        at org.springframework.boot.context.config.ConfigFileApplicationListener.postProcessEnvironment(ConfigFileApplicationListener.java:195)
        at org.springframework.boot.context.config.ConfigFileApplicationListener.onApplicationEnvironmentPreparedEvent(ConfigFileApplicationListener.java:182)
        at org.springframework.boot.context.config.ConfigFileApplicationListener.onApplicationEvent(ConfigFileApplicationListener.java:168)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:172)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:165)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:139)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:127)
        at org.springframework.boot.context.event.EventPublishingRunListener.environmentPrepared(EventPublishingRunListener.java:74)
        at org.springframework.boot.SpringApplicationRunListeners.environmentPrepared(SpringApplicationRunListeners.java:54)
        at org.springframework.boot.SpringApplication.prepareEnvironment(SpringApplication.java:358)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:317)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:1255)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:1243)
        at net.mediascope.multirating.bigdata.operations.OperationRunner.main(OperationRunner.java:21)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
        at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    2018-08-06 14:47:57 INFO ShutdownHookManager:54 - Shutdown hook called
    2018-08-06 14:47:57 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-55b54924-e628-43fe-9e43-ed34d7f35a8b
    2018-08-06 14:47:57,151 INFO [o.s.b.a.l.ConditionEvaluationReportLoggingListener.logAutoConfigurationReport] 101:

     

Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.

In my project I use snakeyaml 1.19, which comes from Spring 5.0; no other dependencies pull it in. I don't understand what the problem is. Perhaps, when I set up the Docker container manually, something besides Spark needs to be installed.
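
To pin down which snakeyaml jar actually wins on the driver classpath, a small diagnostic like the sketch below could be packaged and run in the same container. It uses only standard JDK calls; the class name is made up for illustration.

    import org.yaml.snakeyaml.Yaml;
    import java.security.CodeSource;

    // Hypothetical helper: prints the jar that supplied the Yaml class at runtime.
    public class SnakeYamlLocator {
        public static void main(String[] args) {
            // getCodeSource() may be null for classes from the bootstrap classloader
            CodeSource src = Yaml.class.getProtectionDomain().getCodeSource();
            System.out.println("snakeyaml loaded from: "
                    + (src == null ? "<bootstrap/unknown>" : src.getLocation()));
        }
    }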

The pom from the module with the operations:

<dependencies>
        <dependency>
            <groupId>net.mediascope</groupId>
            <artifactId>multirating-bigdata-core</artifactId>
            <version>${project.version}</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-log4j2</artifactId>
        </dependency>
        <!-- Data Base -->
        <dependency>
            <groupId>org.jdbi</groupId>
            <artifactId>jdbi</artifactId>
            <version>2.71</version>
        </dependency>

        <dependency>
            <groupId>com.microsoft.sqlserver</groupId>
            <artifactId>sqljdbc42</artifactId>
            <version>4.2</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.codehaus.janino</groupId>
                    <artifactId>commons-compiler</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
        </dependency>
        <dependency>
            <groupId>net.sourceforge.jtds</groupId>
            <artifactId>jtds</artifactId>
            <version>1.3.1</version>
        </dependency>
    </dependencies>

    <profiles>
        <profile>
            <id>local</id>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.springframework.boot</groupId>
                        <artifactId>spring-boot-maven-plugin</artifactId>
                        <configuration>
                            <profiles>
                                <profile>${profile.active}</profile>
                            </profiles>
                            <executable>true</executable>
                        </configuration>
                    </plugin>
                </plugins>
            </build>
        </profile>
        <profile>
            <id>hadoop</id>
            <build>
                <!-- Needed to adapt the Spring Boot application for launching via Spark -->
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-shade-plugin</artifactId>
                        <version>2.3</version>
                        <executions>
                            <execution>
                                <phase>package</phase>
                                <goals>
                                    <goal>shade</goal>
                                </goals>
                                <configuration>
                                    <transformers>
                                        <transformer
                                                implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                            <resource>META-INF/spring.handlers</resource>
                                        </transformer>
                                        <transformer
                                                implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                            <resource>META-INF/spring.schemas</resource>
                                        </transformer>
                                        <transformer
                                                implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                            <resource>META-INF/spring.provides</resource>
                                        </transformer>
                                        <transformer
                                                implementation="org.springframework.boot.maven.PropertiesMergingResourceTransformer">
                                            <resource>META-INF/spring.factories</resource>
                                        </transformer>
                                        <transformer
                                                implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                            <mainClass>${start-class}</mainClass>
                                        </transformer>
                                    </transformers>
                                </configuration>
                            </execution>
                        </executions>
                    </plugin>
                </plugins>
            </build>
        </profile>
    </profiles>

1 Answer:

Answer 0 (score: 0):

I found the solution. The jars folder of the original Spark package ships with snakeyaml 1.15; I replaced it with 1.19 and now everything works.
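
For reference, an alternative that avoids patching the distribution might be to prepend the newer snakeyaml to the driver classpath from the launcher itself. This is only a sketch under assumptions: the snakeyaml jar path is hypothetical, and spark.driver.userClassPathFirst is marked experimental in the Spark docs, so behavior may vary by deploy mode.

    // Sketch: prefer the application's snakeyaml 1.19 over the 1.15 bundled
    // in $SPARK_HOME/jars. The snakeyaml jar path below is hypothetical.
    Process process = new SparkLauncher()
            .setSparkHome(sparkHome)
            .setAppResource(pathToJar)
            .setMaster("local")
            // Prepend a newer snakeyaml to the driver classpath
            .setConf(SparkLauncher.DRIVER_EXTRA_CLASSPATH, "/opt/libs/snakeyaml-1.19.jar")
            // Experimental flag: give user jars precedence over Spark's own jars
            .setConf("spark.driver.userClassPathFirst", "true")
            .launch();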