Guava is not backwards compatible

Date: 2018-05-31 06:43:26

Tags: maven pom.xml guava grpc sparkcore

My jar has dependencies (spark-core and io.grpc) that each pull in a different version of Guava: io.grpc uses 19.0, while spark-core depends on Hadoop MapReduce, which uses 11.0.2. The two versions are incompatible with each other, because some methods present in one version are missing from the other. How should I handle the Maven dependencies in this situation?

Dependencies from the POM:

        <dependency>
            <groupId>com.trueaccord.scalapb</groupId>
            <artifactId>scalapb-runtime_${scala.compat.version}</artifactId>
            <version>0.6.3</version>
        </dependency>
        <dependency>
            <groupId>com.trueaccord.scalapb</groupId>
            <artifactId>scalapb-runtime-grpc_${scala.compat.version}</artifactId>
            <version>0.6.3</version>
        </dependency>
        <dependency>
            <groupId>com.trueaccord.scalapb</groupId>
            <artifactId>compilerplugin_${scala.compat.version}</artifactId>
            <version>0.6.3</version>
        </dependency>
        <dependency>
            <groupId>io.grpc</groupId>
            <artifactId>grpc-netty-shaded</artifactId>
            <version>1.10.1</version>
        </dependency>
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-tcnative-boringssl-static</artifactId>
            <version>2.0.5.Final</version>
        </dependency>

        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>23.0</version>
        </dependency>


        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.2.1</version>
            <exclusions>
                <exclusion>
                    <artifactId>asm</artifactId>
                    <groupId>asm</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.11</artifactId>
            <version>2.2.1</version>
        </dependency>
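
A first thing to try (this is a general Maven technique, not something from the original post) is to force a single Guava version for the whole build with `dependencyManagement`, which overrides the versions that spark-core and grpc pull in transitively. The sketch below pins 19.0, the version grpc needs; whether this actually works depends on the Hadoop code path only calling Guava APIs that still exist in 19.0, which is exactly what the question says may not hold:

```xml
<dependencyManagement>
    <dependencies>
        <!-- Force every module and transitive dependency to resolve
             this single Guava version instead of its own. -->
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>19.0</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```

If the job then fails at runtime with `NoSuchMethodError` from Hadoop classes, the pinned version is too new for Hadoop and version pinning alone cannot resolve the conflict.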

The problem feels more like a deadlock between the earlier and the latest versions than a single bug.
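When no single Guava version satisfies both sides, a common way out of this deadlock is to relocate Guava's packages with the maven-shade-plugin, so your grpc code uses a renamed copy of the newer Guava while Spark/Hadoop keep their own at runtime. A minimal sketch (the plugin version and the `shaded.` prefix here are illustrative choices, not taken from the original post):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.1.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <!-- Rewrite com.google.common.* in the shaded jar
                         (and all references to it) to a private package,
                         so it cannot clash with the Guava that Spark's
                         classpath provides at runtime. -->
                    <relocation>
                        <pattern>com.google.common</pattern>
                        <shadedPattern>shaded.com.google.common</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>
```

With this in place, it usually also makes sense to mark the Spark artifacts as `<scope>provided</scope>` so they are not bundled into the shaded jar at all; note that grpc-netty-shaded already ships its own relocated copy of Netty for the same reason.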

0 Answers:

No answers