Integration testing of a simple Spark application

Date: 2018-11-26 15:01:37

Tags: scala apache-spark integration-testing scalatest

I need to write some unit and integration tests for a small research project. I am working with a simple Spark application that reads data from a file and outputs the number of characters in the file. I am using ScalaTest for the unit tests, but I cannot come up with integration tests for this project. According to the project flow, I need to run the unit tests, package a jar file, and then use that jar to run the integration tests. I have a file with the data as a test resource. Should this file be packaged together with the source code, or should it be put in a separate location? And what kinds of integration tests can I write for this application?
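On the resource question, a common convention is to keep the file under src/test/resources so the build places it on the test classpath, and to resolve it from the classpath at test time rather than hard-coding a relative path. A minimal sketch, assuming the test_data.txt used by the tests below:

// Sketch: resolve the test data file from the classpath; assumes the file
// lives in src/test/resources and that a SparkContext is already in scope.
val resourcePath = getClass.getResource("/test_data.txt").getPath
val lines = sparkContext.textFile(resourcePath)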

The simple Spark application looks like this:

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.SparkSession

object SparkExample {

  // Reads a text file into an RDD with one element per line.
  def readFile(sparkContext: SparkContext, fileName: String): RDD[String] = {
    sparkContext.textFile(fileName)
  }

  // Maps each line to its character count.
  def mapStringToLength(data: RDD[String]): RDD[Int] = {
    data.map(fileData => fileData.length)
  }

  // Prints each per-line character count (on the executors).
  def printIntFileData(data: RDD[Int]): Unit = {
    data.foreach(fileString =>
      println(fileString.toString)
    )
  }

  // Prints each line of the file (on the executors).
  def printFileData(data: RDD[String]): Unit = {
    data.foreach(fileString =>
      println(fileString)
    )
  }

  def main(args: Array[String]): Unit = {

    val spark = SparkSession
      .builder
      .master("local[*]")
      .appName("TestApp")
      .getOrCreate()

    val dataFromFile = readFile(spark.sparkContext, args(0))

    println("\nAll the data:")

    val dataToInt = mapStringToLength(dataFromFile)

    printFileData(dataFromFile)
    printIntFileData(dataToInt)

    spark.stop()
  }
}

The unit tests I have written:

import org.apache.hadoop.mapred.InvalidInputException
import org.apache.spark.SparkContext
import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfter, FunSuite, Matchers}

class SparkExampleTest extends FunSuite with BeforeAndAfter with Matchers {

  val master = "local"
  val appName = "TestApp"
  var sparkContext: SparkContext = _
  val fileContent = "This is the text only for the test purposes. There is no sense in it completely. This is the test of the Spark Application"
  val fileName = "src/test/resources/test_data.txt"
  val noPathFileName = "test_data.txt"
  val errorFileName = "test_data1.txt"

  before {
    val sparkSession = SparkSession
      .builder
      .master(master)
      .appName(appName)
      .getOrCreate()
    sparkContext = sparkSession.sparkContext
  }

  test("SparkExample.readFile"){
    assert(SparkExample.readFile(sparkContext, fileName).collect() sameElements Array(fileContent))
  }

  test("SparkExample.mapStringToLength"){
    val stringLength = fileContent.length
    val rdd = sparkContext.makeRDD(Array(fileContent))

    assert(SparkExample.mapStringToLength(rdd).collect() sameElements Array(stringLength))
  }

  test("SparkExample.mapStringToLength Negative"){
    val stringLength = fileContent.length
    val rdd = sparkContext.makeRDD(Array(fileContent + " "))

    assert(SparkExample.mapStringToLength(rdd).collect() != Array(stringLength))
  }


  test("SparkExample.readFile does not throw Exception"){
    noException should be thrownBy SparkExample.readFile(sparkContext, fileName).collect()
  }

  test("SparkExample.readFile throws InvalidInputException without filePath"){
    an[InvalidInputException] should be thrownBy SparkExample.readFile(sparkContext, noPathFileName).collect()
  }

  test("SparkExample.readFile throws InvalidInputException with wrong filename"){
    an[InvalidInputException] should be thrownBy SparkExample.readFile(sparkContext, errorFileName).collect()
  }
}
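Since the project flow packages a jar before the integration stage, one way to drive that stage from ScalaTest is to shell out to spark-submit and assert on the exit code. A rough sketch, where the class name comes from the code above but the jar path is an assumption about the build layout:

import org.scalatest.FunSuite

import scala.sys.process._

// Rough end-to-end sketch: run the packaged jar via spark-submit and check
// that it exits cleanly. The jar path and Scala version are assumptions.
class SparkExampleJarIT extends FunSuite {

  test("packaged jar processes the sample file") {
    val exitCode = Seq(
      "spark-submit",
      "--master", "local[*]",
      "--class", "SparkExample",
      "target/scala-2.11/spark-example_2.11-0.1.jar", // assumed artifact path
      "src/test/resources/test_data.txt"
    ).!
    assert(exitCode == 0)
  }
}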

1 Answer:

Answer 0 (score: 0)

Spark Testing Base is the way to go; it is basically a lightweight embedded Spark for your tests. It is probably more on the "integration test" side than unit testing, but you can track code coverage and the like as well: https://github.com/holdenk/spark-testing-base
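As an illustration, a minimal test built on spark-testing-base might look like the sketch below: the SharedSparkContext trait provides a suite-wide SparkContext as sc, so no before block is needed. The test data here is made up.

import com.holdenkarau.spark.testing.SharedSparkContext
import org.scalatest.FunSuite

// Minimal sketch using spark-testing-base: SharedSparkContext supplies a
// shared SparkContext (`sc`), replacing the manual setup in the suite above.
class SparkExampleSharedContextTest extends FunSuite with SharedSparkContext {

  test("mapStringToLength counts characters per line") {
    val input = sc.parallelize(Seq("abc", "de"))
    assert(SparkExample.mapStringToLength(input).collect() sameElements Array(3, 2))
  }
}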
