Best practice for creating a SparkSession object in Scala to use both in unit tests and spark-submit

Joo-Won Jung · Jul 31, 2017 · Viewed 15.1k times

I am trying to write a transform method from DataFrame to DataFrame, and I also want to test it with ScalaTest.

As you know, in Spark 2.x with Scala API, you can create SparkSession object as follows:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .config("spark.master", "local[2]")
  .getOrCreate()

This code works fine in unit tests. But when I run it with spark-submit, the cluster options are not applied. For example,

spark-submit --master yarn --deploy-mode client --num-executors 10 ...

does not create any executors.

I have found that the spark-submit arguments are applied when I remove the config("spark.master", "local[2]") part of the above code. But without the master setting, the unit test code does not work.
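For illustration, the pattern that lets the spark-submit arguments take effect is to leave the master unset in the production entry point (a minimal sketch; the MyJob name is made up):

import org.apache.spark.sql.SparkSession

// Hypothetical entry point: no master is hardcoded, so --master,
// --num-executors etc. passed to spark-submit take effect.
object MyJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("my-job")
      .getOrCreate() // picks up the configuration supplied by spark-submit
    try {
      // ... run the actual transformations here ...
    } finally {
      spark.stop()
    }
  }
}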

I tried to split the spark (SparkSession) object creation between the test code and the main code. But there are many code blocks that need spark, for example import spark.implicits._ and spark.createDataFrame(rdd, schema).
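For context, such code typically has roughly this shape (a sketch with hypothetical names), where the session is passed implicitly so that the caller, whether a test or a spark-submit main, decides how it is created:

import org.apache.spark.sql.{DataFrame, SparkSession}

// Hypothetical transform: takes the session implicitly so the caller
// controls how and where the SparkSession is built.
object Transforms {
  def withDoubled(df: DataFrame)(implicit spark: SparkSession): DataFrame = {
    import spark.implicits._ // needed for the $"..." column syntax
    df.withColumn("doubled", $"value" * 2)
  }
}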

Is there a best practice for writing code that creates the spark object for both testing and running with spark-submit?

Answer

Rick Moritz · Jul 31, 2017

One way is to create a trait which provides the SparkContext/SparkSession, and use that in your test cases, like so:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{SQLContext, SparkSession}

trait SparkTestContext {
  private val master = "local[*]"
  private val appName = "testing"
  // Windows only: location of winutils.exe for the Hadoop libraries
  System.setProperty("hadoop.home.dir", "c:\\winutils\\")
  private val conf: SparkConf = new SparkConf()
    .setMaster(master)
    .setAppName(appName)
    .set("spark.driver.allowMultipleContexts", "false")
    .set("spark.ui.enabled", "false") // no Spark UI needed during tests

  val ss: SparkSession = SparkSession.builder().config(conf).enableHiveSupport().getOrCreate()
  val sc: SparkContext = ss.sparkContext
  val sqlContext: SQLContext = ss.sqlContext
}

Your test class header then looks like this, for example:

class TestWithSparkTest extends BaseSpec with SparkTestContext with Matchers {
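To illustrate how the trait is used end to end, a full spec might look like this (a sketch; DoublingSpec and the column names are made up, and plain ScalaTest FlatSpec stands in for BaseSpec):

import org.scalatest.{FlatSpec, Matchers}

// Hypothetical spec: the SparkSession comes from the SparkTestContext trait above.
class DoublingSpec extends FlatSpec with SparkTestContext with Matchers {
  import ss.implicits._ // enables .toDF on local collections and typed Datasets

  "withDoubled" should "double the value column" in {
    val input = Seq(1, 2, 3).toDF("value")
    val result = input.withColumn("doubled", $"value" * 2)
    result.select("doubled").as[Int].collect() should contain theSameElementsAs Seq(2, 4, 6)
  }
}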