Cannot compile a Scala & Java project: transitive dependencies not found

Date: 2019-05-27 12:08:21

Tags: scala maven apache-spark compilation scala-maven-plugin

I can't figure out what I'm missing. I'm trying to compile a Java & Scala project, and luckily I was able to reproduce my problem in a simple dummy project. I'm using the scala-maven-plugin (https://github.com/davidB/scala-maven-plugin) and also based my setup on this question, which deals with Scala & Java multi-module projects.

I have a root project and two modules, m1 and m2.
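For reference, the layout (as reflected in the poms and in the build log further down) is:

prj_multi_modules/pom.xml      (root, packaging pom)
prj_multi_modules/m1/pom.xml   (depends on m2)
prj_multi_modules/m2/pom.xml   (declares the Spark/Hadoop dependencies)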

m1 pom:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>samples.scala-maven-plugin</groupId>
        <artifactId>prj_multi_modules</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <relativePath>../pom.xml</relativePath>
    </parent>
    <artifactId>m1</artifactId>
    <packaging>jar</packaging>
    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
        </dependency>
        <dependency>
            <groupId>samples.scala-maven-plugin</groupId>
            <artifactId>m2</artifactId>
            <version>0.0.1-SNAPSHOT</version>
        </dependency>
    </dependencies>
</project>

m2 pom:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>samples.scala-maven-plugin</groupId>
        <artifactId>prj_multi_modules</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <relativePath>../pom.xml</relativePath>
    </parent>
    <artifactId>m2</artifactId>
    <packaging>jar</packaging>


    <properties>
        <cdh.version>cdh5.7.6</cdh.version>
        <build.prop.dir>..</build.prop.dir>
        <scala.version>2.11.8</scala.version>
        <scala.version.major>2.11</scala.version.major>
        <spark.version>2.3.0.cloudera4</spark.version>
        <hadoop.version>2.6.0-${cdh.version}</hadoop.version>
    </properties>


    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.version.major}</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_${scala.version.major}</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_${scala.version.major}</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency> <!-- remove this for spark 2.3? -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_${scala.version.major}</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>com.github.scopt</groupId>
            <artifactId>scopt_${scala.version.major}</artifactId>
            <version>3.3.0</version>
        </dependency>
        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-csv_${scala.version.major}</artifactId>
            <version>1.5.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-csv</artifactId>
            <version>1.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>com.holdenkarau</groupId>
            <artifactId>spark-testing-base_${scala.version.major}</artifactId>
            <version>1.6.0_0.7.4</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

root pom:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>

    <repositories>
        <repository>
            <id>cloudera</id>
            <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
        </repository>
    </repositories>

  <groupId>samples.scala-maven-plugin</groupId>
  <artifactId>prj_multi_modules</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>pom</packaging>

  <properties>
    <cdh.version>cdh5.7.6</cdh.version>
    <build.prop.dir>..</build.prop.dir>
    <scala.version>2.11.8</scala.version>
    <scala.version.major>2.11</scala.version.major>
    <spark.version>2.3.0.cloudera4</spark.version>
    <hadoop.version>2.6.0-${cdh.version}</hadoop.version>
  </properties>


  <dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
  </dependencies>
  </dependencyManagement>

  <build>
    <pluginManagement>
      <plugins>
        <plugin>
          <groupId>net.alchim31.maven</groupId>
          <artifactId>scala-maven-plugin</artifactId>
          <version>3.2.2</version>
          <executions>
            <execution>
              <goals>
                <goal>compile</goal>
                <goal>testCompile</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </pluginManagement>
    <plugins>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
      </plugin>
    </plugins>
  </build>

  <modules>
      <module>m1</module>
      <module>m2</module>
  </modules>
</project>

My goal is very simple: I want m1 to pick up all the Spark dependencies transitively from the m2 module, so that Spark code compiles in m1.

Here is the Spark code in m1:

package p1

import org.apache.spark.sql.SparkSession

class MyClass(spark: SparkSession) {
  import spark.implicits._


  def run(input: String) = {
    val df = spark.read.parquet(input)

    df.show()
  }

}

object MyClass {

  case class Params(input: String = "")

  def main(args: Array[String]): Unit = {

    val parser = new scopt.OptionParser[Params](usageHeader) {
      opt[String]('i', "input") required() action { (s, params) => params.copy(input = s) }
    }

    parser.parse(args, Params()) match {
      case Some(p) =>
        val spark = SparkSession.builder
          .appName("spark2 example")
          .getOrCreate()

        val job = new Spark2Exp(spark)
        job.run(p.input)

        spark.stop()
      case None =>
        System.exit(1)
    }
  }
  val usageHeader: String = "n/a"
}

m2 compiles fine. When compiling m1, all the Java compilation works, but when it gets to compiling the Scala code it cannot find the Spark dependencies, as if they were not being passed transitively between the dependent modules.

Also, if I copy all of m2's Spark dependencies into the m1 pom, everything works.
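To be concrete, the workaround is simply repeating the same dependency blocks in m1's pom, for example (identical coordinates and versions to the m2 pom above, with ${spark.version} and ${scala.version.major} inherited from the root pom; this duplication is exactly what I'm trying to avoid):

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.version.major}</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_${scala.version.major}</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <!-- ...and likewise spark-mllib, spark-hive, hadoop-common, hadoop-client -->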

I can't see what I'm doing wrong. I'm fairly sure this is a scala-maven issue, because the same scenario works trivially in plain Java.

Adding the compilation error:

"C:\Program Files\Java\jdk1.8.0_151\bin\java" -Dmaven.multiModuleProjectDirectory=C:\dev\scala-maven-plugin "-Dmaven.home=C:\Program Files\JetBrains\IntelliJ IDEA 2017.3\plugins\maven\lib\maven3" "-Dclassworlds.conf=C:\Program Files\JetBrains\IntelliJ IDEA 2017.3\plugins\maven\lib\maven3\bin\m2.conf" "-javaagent:C:\Program Files\JetBrains\IntelliJ IDEA 2017.3\lib\idea_rt.jar=55662:C:\Program Files\JetBrains\IntelliJ IDEA 2017.3\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\JetBrains\IntelliJ IDEA 2017.3\plugins\maven\lib\maven3\boot\plexus-classworlds-2.5.2.jar" org.codehaus.classworlds.Launcher -Didea.version=2017.3 clean install
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] prj_multi_modules
[INFO] m2
[INFO] m1
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building prj_multi_modules 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ prj_multi_modules ---
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (default) @ prj_multi_modules ---
[INFO] No sources to compile
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (default) @ prj_multi_modules ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ prj_multi_modules ---
[INFO] Installing C:\dev\scala-maven-plugin\samples\prj_multi_modules\pom.xml to C:\Users\raphael.peretz\.m2\repository\samples\scala-maven-plugin\prj_multi_modules\0.0.1-SNAPSHOT\prj_multi_modules-0.0.1-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building m2 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ m2 ---
[INFO] Deleting C:\dev\scala-maven-plugin\samples\prj_multi_modules\m2\target
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ m2 ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory C:\dev\scala-maven-plugin\samples\prj_multi_modules\m2\src\main\resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ m2 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (default) @ m2 ---
[WARNING]  Expected all dependencies to require Scala version: 2.11.8
[WARNING]  samples.scala-maven-plugin:m2:0.0.1-SNAPSHOT requires scala version: 2.11.8
[WARNING]  com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.8
[WARNING]  org.apache.spark:spark-core_2.11:2.3.0.cloudera4 requires scala version: 2.11.8
[WARNING]  org.json4s:json4s-jackson_2.11:3.2.11 requires scala version: 2.11.8
[WARNING]  org.json4s:json4s-core_2.11:3.2.11 requires scala version: 2.11.8
[WARNING]  org.json4s:json4s-ast_2.11:3.2.11 requires scala version: 2.11.8
[WARNING]  org.json4s:json4s-core_2.11:3.2.11 requires scala version: 2.11.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ m2 ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory C:\dev\scala-maven-plugin\samples\prj_multi_modules\m2\src\test\resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ m2 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (default) @ m2 ---
[WARNING]  Expected all dependencies to require Scala version: 2.11.8
[WARNING]  samples.scala-maven-plugin:m2:0.0.1-SNAPSHOT requires scala version: 2.11.8
[WARNING]  com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.8
[WARNING]  org.apache.spark:spark-core_2.11:2.3.0.cloudera4 requires scala version: 2.11.8
[WARNING]  org.json4s:json4s-jackson_2.11:3.2.11 requires scala version: 2.11.8
[WARNING]  org.json4s:json4s-core_2.11:3.2.11 requires scala version: 2.11.8
[WARNING]  org.json4s:json4s-ast_2.11:3.2.11 requires scala version: 2.11.8
[WARNING]  org.json4s:json4s-core_2.11:3.2.11 requires scala version: 2.11.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ m2 ---
[INFO] No tests to run.
[INFO] 
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ m2 ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: C:\dev\scala-maven-plugin\samples\prj_multi_modules\m2\target\m2-0.0.1-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ m2 ---
[INFO] Installing C:\dev\scala-maven-plugin\samples\prj_multi_modules\m2\target\m2-0.0.1-SNAPSHOT.jar to C:\Users\raphael.peretz\.m2\repository\samples\scala-maven-plugin\m2\0.0.1-SNAPSHOT\m2-0.0.1-SNAPSHOT.jar
[INFO] Installing C:\dev\scala-maven-plugin\samples\prj_multi_modules\m2\pom.xml to C:\Users\raphael.peretz\.m2\repository\samples\scala-maven-plugin\m2\0.0.1-SNAPSHOT\m2-0.0.1-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building m1 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ m1 ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ m1 ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\src\main\resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ m1 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (default) @ m1 ---
[INFO] C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\src\main\scala:-1: info: compiling
[INFO] Compiling 1 source files to C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\target\classes at 1558957820236
[ERROR] C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\src\main\scala\p1\MyClass.scala:3: error: object spark is not a member of package org.apache
[ERROR] import org.apache.spark.sql.SparkSession
[ERROR]                   ^
[ERROR] C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\src\main\scala\p1\MyClass.scala:5: error: not found: type SparkSession
[ERROR] class MyClass(spark: SparkSession) {
[ERROR]                      ^
[ERROR] C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\src\main\scala\p1\MyClass.scala:29: error: not found: value SparkSession
[ERROR]         val spark = SparkSession.builder
[ERROR]                     ^
[ERROR] C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\src\main\scala\p1\MyClass.scala:33: error: not found: type Spark2Exp
[ERROR]         val job = new Spark2Exp(spark)
[ERROR]                       ^
[ERROR] four errors found
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] prj_multi_modules .................................. SUCCESS [  1.435 s]
[INFO] m2 ................................................. SUCCESS [ 22.342 s]
[INFO] m1 ................................................. FAILURE [  3.497 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 27.379 s
[INFO] Finished at: 2019-05-27T14:50:23+03:00
[INFO] Final Memory: 52M/927M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (default) on project m1: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :m1

Process finished with exit code 1

1 answer:

Answer 0 (score: 0)

Since Spark is needed in both m1 and m2, why not take care of the Spark dependencies in the root pom file?

You can then, of course, remove the duplicated dependencies from m2.
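A minimal sketch of that suggestion (untested, reusing the properties already defined in the root pom): declare the Spark artifacts directly in the root pom's <dependencies> so that both m1 and m2 inherit them, for example:

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_${scala.version.major}</artifactId>
      <version>${spark.version}</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_${scala.version.major}</artifactId>
      <version>${spark.version}</version>
      <scope>provided</scope>
    </dependency>
    <!-- ...spark-mllib, spark-hive, hadoop-common, hadoop-client as needed -->
  </dependencies>

This also matches Maven's dependency-scope rules: provided-scope dependencies are not propagated transitively, which would explain why m1 never sees Spark through its dependency on m2 in the first place.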
