Exception in thread "streaming-start" java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;

Asked: 2018-01-31 19:11:18

Tags: scala apache-spark spark-streaming

I have looked at all the questions related to this error, and in every one of them it is quite clear that the poster was cross-compiling with two versions of Scala. In my case I made sure I have only one version, 2.11, yet I still get the same error. Any help is appreciated, thanks. My Spark environment:

   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0.2.5.3.0-37
  /_/   
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)

My pom.xml:

<properties>
    <spark.version>2.2.1</spark.version>
    <scala.version>2.11.8</scala.version>
    <scala.library.version>2.11.8</scala.library.version>
    <scala.binary.version>2.11</scala.binary.version>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
    <java.source.version>1.7</java.source.version>
    <java.compile.version>1.7</java.compile.version>
    <kafka.version>0-10</kafka.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe.scala-logging</groupId>
        <artifactId>scala-logging-slf4j_${scala.binary.version}</artifactId>
        <version>2.1.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-${kafka.version}_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.library.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>0.11.0.2</version>
    </dependency>
</dependencies>

Here is the exception:

Exception in thread "streaming-start" java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream$$anonfun$start$1.apply(DirectKafkaInputDStream.scala:246)
at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream$$anonfun$start$1.apply(DirectKafkaInputDStream.scala:245)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.mutable.AbstractSet.scala$collection$SetLike$$super$map(Set.scala:45)
at scala.collection.SetLike$class.map(SetLike.scala:93)
at scala.collection.mutable.AbstractSet.map(Set.scala:45)
at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.start(DirectKafkaInputDStream.scala:245)
at org.apache.spark.streaming.DStreamGraph$$anonfun$start$5.apply(DStreamGraph.scala:47)
at org.apache.spark.streaming.DStreamGraph$$anonfun$start$5.apply(DStreamGraph.scala:47)
at scala.collection.parallel.mutable.ParArray$ParArrayIterator.foreach_quick(ParArray.scala:145)
at scala.collection.parallel.mutable.ParArray$ParArrayIterator.foreach(ParArray.scala:138)
at scala.collection.parallel.ParIterableLike$Foreach.leaf(ParIterableLike.scala:975)
at scala.collection.parallel.Task$$anonfun$tryLeaf$1.apply$mcV$sp(Tasks.scala:54)
at scala.collection.parallel.Task$$anonfun$tryLeaf$1.apply(Tasks.scala:53)
at scala.collection.parallel.Task$$anonfun$tryLeaf$1.apply(Tasks.scala:53)
at scala.collection.parallel.Task$class.tryLeaf(Tasks.scala:56)
at scala.collection.parallel.ParIterableLike$Foreach.tryLeaf(ParIterableLike.scala:972)
at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask$class.compute(Tasks.scala:165)
at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.compute(Tasks.scala:514)
at scala.concurrent.forkjoin.RecursiveAction.exec(RecursiveAction.java:160)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

When I grep for "_2.1" in the output of mvn dependency:tree -Dverbose, I don't see any reference to 2.10:

    [INFO] +- org.apache.spark:spark-core_2.11:jar:2.2.1:compile
[INFO] |  +- com.twitter:chill_2.11:jar:0.8.0:compile
[INFO] |  +- org.apache.spark:spark-launcher_2.11:jar:2.2.1:compile
[INFO] |  |  +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] |  +- org.apache.spark:spark-network-common_2.11:jar:2.2.1:compile
[INFO] |  +- org.apache.spark:spark-network-shuffle_2.11:jar:2.2.1:compile
[INFO] |  |  +- (org.apache.spark:spark-network-common_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] |  +- org.apache.spark:spark-unsafe_2.11:jar:2.2.1:compile
[INFO] |  |  +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] |  |  +- (com.twitter:chill_2.11:jar:0.8.0:compile - omitted for duplicate)
[INFO] |  +- org.json4s:json4s-jackson_2.11:jar:3.2.11:compile
[INFO] |  |  +- org.json4s:json4s-core_2.11:jar:3.2.11:compile
[INFO] |  |  |  +- org.json4s:json4s-ast_2.11:jar:3.2.11:compile
[INFO] |  |  |        +- org.scala-lang.modules:scala-xml_2.11:jar:1.0.1:compile
[INFO] |  |  |        \- (org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.1:compile - omitted for conflict with 1.0.4)
[INFO] |  +- com.fasterxml.jackson.module:jackson-module-scala_2.11:jar:2.6.5:compile
[INFO] |  +- org.apache.spark:spark-tags_2.11:jar:2.2.1:compile
[INFO] +- org.apache.spark:spark-sql_2.11:jar:2.2.1:compile
[INFO] |  +- org.apache.spark:spark-sketch_2.11:jar:2.2.1:compile
[INFO] |  |  +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] |  +- (org.apache.spark:spark-core_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] |  +- org.apache.spark:spark-catalyst_2.11:jar:2.2.1:compile
[INFO] |  |  +- (org.apache.spark:spark-core_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] |  |  +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] |  |  +- (org.apache.spark:spark-unsafe_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] |  |  +- (org.apache.spark:spark-sketch_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] |  +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] +- org.apache.spark:spark-hive_2.11:jar:2.2.1:compile
[INFO] |  +- (org.apache.spark:spark-core_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] |  +- (org.apache.spark:spark-sql_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] +- com.typesafe.scala-logging:scala-logging-slf4j_2.11:jar:2.1.2:compile
[INFO] |  +- com.typesafe.scala-logging:scala-logging-api_2.11:jar:2.1.2:compile
[INFO] +- org.apache.spark:spark-streaming-kafka-0-10_2.11:jar:2.2.1:compile
[INFO] |  +- org.apache.kafka:kafka_2.11:jar:0.10.0.1:compile
[INFO] |  |  +- org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.4:compile
[INFO] |  +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] +- org.apache.spark:spark-streaming_2.11:jar:2.2.1:compile
[INFO] |  +- (org.apache.spark:spark-core_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] |  +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
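A check like this can also be automated with the maven-enforcer-plugin's bannedDependencies rule, so the build fails as soon as any _2.10 artifact appears anywhere in the transitive tree instead of having to eyeball it. This is only a sketch; whether partial wildcards such as *_2.10 are accepted in the exclude patterns depends on the enforcer version, so verify against the version you actually use:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-enforcer-plugin</artifactId>
        <version>1.4.1</version>
        <executions>
            <execution>
                <id>ban-mixed-scala-versions</id>
                <goals>
                    <goal>enforce</goal>
                </goals>
                <configuration>
                    <rules>
                        <bannedDependencies>
                            <!-- check the whole transitive tree, not just direct dependencies -->
                            <searchTransitive>true</searchTransitive>
                            <excludes>
                                <!-- any artifact cross-built for Scala 2.10 (partial-wildcard support is version-dependent) -->
                                <exclude>*:*_2.10</exclude>
                                <!-- a stray 2.10 scala-library itself -->
                                <exclude>org.scala-lang:scala-library:[2.10.0,2.11.0)</exclude>
                            </excludes>
                        </bannedDependencies>
                    </rules>
                    <fail>true</fail>
                </configuration>
            </execution>
        </executions>
    </plugin>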

Also, I should mention that I run this with spark-submit on the Spark server as an uber jar. The uber jar includes the jars listed below. I added the scala jar as a last resort while trying to fix the problem, but it makes no difference whether it is included or not.

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.1.0</version>
            <configuration>
                <shadedArtifactAttached>false</shadedArtifactAttached>
                <keepDependenciesWithProvidedScope>false</keepDependenciesWithProvidedScope>
                <artifactSet>
                    <includes>
                        <include>org.apache.kafka:spark*</include>
                        <include>org.apache.spark:spark-streaming-kafka-${kafka.version}_${scala.binary.version}
                        </include>
                        <include>org.apache.kafka:kafka_${scala.binary.version}</include>
                        <include>org.apache.kafka:kafka-clients</include>
                        <include>org.apache.spark:*</include>
                        <include>org.scala-lang:scala-library</include>
                    </includes>
                    <excludes>
                        <exclude>org.apache.hadoop:*</exclude>
                        <exclude>com.fasterxml:*</exclude>
                    </excludes>
                </artifactSet>
                <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>META-INF/services/javax.ws.rs.ext.Providers</resource>
                    </transformer>
                </transformers>
            </configuration>
            <executions>
                <execution>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
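Since the job is submitted with spark-submit to a cluster that already ships its own Spark and Scala (2.0.0 with Scala 2.11.8, per the banner above), one common way to avoid this kind of classpath clash is to not bundle those classes at all and mark the Spark modules and scala-library as provided. This is only a sketch of that alternative, reusing the property names from the pom above:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
        <!-- supplied by the cluster at runtime; kept out of the uber jar -->
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.library.version}</version>
        <scope>provided</scope>
    </dependency>

The spark-streaming-kafka-0-10 connector and kafka-clients are not part of the Spark distribution, so those would still have to be shaded into the jar; only the core Spark modules and scala-library are candidates for provided scope here.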

2 answers:

Answer 0 (score: 2)

Scala 2.11 does not work with Java 7: https://scala-lang.org/download/2.11.8.html. Please update Java to 8.
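For reference, the corresponding Maven change would look roughly like this, reusing the java.source.version and java.compile.version properties from the pom above and pointing the maven-compiler-plugin at them; only the 1.8 values are new:

    <properties>
        <!-- bump both properties from 1.7 to 1.8 so the compiler targets Java 8 -->
        <java.source.version>1.8</java.source.version>
        <java.compile.version>1.8</java.compile.version>
    </properties>

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.1</version>
        <configuration>
            <source>${java.source.version}</source>
            <target>${java.compile.version}</target>
        </configuration>
    </plugin>

Note that the spark-shell banner in the question shows Java 1.7.0_67 on the cluster, so the runtime JVM would have to move to Java 8 as well, not just the compiler target.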

Answer 1 (score: 0)

In the end I got it working for my particular environment. The changes I made were: Scala 2.10.6, Java 1.7, Spark 2.0.0.

For completeness, here is my pom.xml:

<properties>
    <spark.version>2.0.0</spark.version>
    <scala.version>2.10.6</scala.version>
    <scala.library.version>2.10.6</scala.library.version>
    <scala.binary.version>2.10</scala.binary.version>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
    <java.source.version>1.7</java.source.version>
    <java.compile.version>1.7</java.compile.version>
    <kafka.version>0-10</kafka.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>com.typesafe.scala-logging</groupId>
        <artifactId>scala-logging-slf4j_${scala.binary.version}</artifactId>
        <version>2.1.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-${kafka.version}_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.library.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>0.11.0.2</version>
    </dependency>
</dependencies>

<build>
    <sourceDirectory>src/main/java</sourceDirectory>
    <testSourceDirectory>src/test/java</testSourceDirectory>
    <resources>
        <resource>
            <directory>src/main/resources</directory>
        </resource>
    </resources>

    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.1.0</version>
            <configuration>
                <shadedArtifactAttached>false</shadedArtifactAttached>
                <keepDependenciesWithProvidedScope>false</keepDependenciesWithProvidedScope>
                <artifactSet>
                    <includes>
                        <include>org.apache.kafka:spark*</include>
                        <include>org.apache.spark:spark-streaming-kafka-${kafka.version}_${scala.binary.version}
                        </include>
                        <include>org.apache.kafka:kafka_${scala.binary.version}</include>
                        <include>org.apache.kafka:kafka-clients</include>
                        <include>org.apache.spark:*</include>
                        <include>org.scala-lang:scala-library</include>
                    </includes>
                    <excludes>
                        <exclude>org.apache.hadoop:*</exclude>
                        <exclude>com.fasterxml:*</exclude>
                    </excludes>
                </artifactSet>
                <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>META-INF/services/javax.ws.rs.ext.Providers</resource>
                    </transformer>
                </transformers>
            </configuration>
            <executions>
                <execution>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.1</version>
            <configuration>
                <source>${java.source.version}</source>
                <target>${java.compile.version}</target>
            </configuration>
        </plugin>

        <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <version>2.15.2</version>
            <executions>
                <execution>
                    <id>compile</id>
                    <goals>
                        <goal>compile</goal>
                    </goals>
                    <phase>compile</phase>
                </execution>
                <execution>
                    <id>test-compile</id>
                    <goals>
                        <goal>testCompile</goal>
                    </goals>
                    <phase>test-compile</phase>
                </execution>
                <execution>
                    <phase>process-resources</phase>
                    <goals>
                        <goal>compile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-eclipse-plugin</artifactId>
            <version>2.9</version>
            <configuration>
                <sourceIncludes>
                    <sourceInclude>**/*.scala</sourceInclude>
                </sourceIncludes>
                <projectNameTemplate>[artifactId]</projectNameTemplate>
                <projectnatures>
                    <projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
                    <projectnature>org.eclipse.m2e.core.maven2Nature</projectnature>
                    <projectnature>org.eclipse.jdt.core.javanature</projectnature>
                </projectnatures>
                <buildcommands>
                    <buildcommand>org.eclipse.m2e.core.maven2Builder</buildcommand>
                    <buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
                </buildcommands>
                <classpathContainers>
                    <classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
                </classpathContainers>
                <excludes>
                    <exclude>org.scala-lang:scala-library</exclude>
                    <exclude>org.scala-lang:scala-compiler</exclude>
                </excludes>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-jar-plugin</artifactId>
            <version>2.6</version>
            <configuration>
                <archive>
                    <manifestEntries>
                        <Implementation-Version>${project.version}</Implementation-Version>
                        <SCM-Revision>1.0</SCM-Revision>
                    </manifestEntries>
                </archive>
            </configuration>
        </plugin>
    </plugins>
</build>