Not able to import Spark Implicits in ScalaTest

himanshuIIITian · May 21, 2017

I am writing test cases for Spark using ScalaTest.

import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FlatSpec}

class ClassNameSpec extends FlatSpec with BeforeAndAfterAll {
  var spark: SparkSession = _
  var className: ClassName = _

  override def beforeAll(): Unit = {
    spark = SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
    className = new ClassName(spark)
  }

  it should "return data" in {
    import spark.implicits._
    val result = className.getData(input)

    assert(result.count() == 3)
  }

  override def afterAll(): Unit = {
    spark.stop()
  }
}

When I try to compile the test suite, it gives me the following error:

stable identifier required, but ClassNameSpec.this.spark.implicits found.
[error]     import spark.implicits._
[error]                  ^
[error] one error found
[error] (test:compileIncremental) Compilation failed

I am not able to understand why I cannot import spark.implicits._ in a test suite.

Any help is appreciated!

Answer

Assaf Mendelson · May 21, 2017

To do an import you need a "stable identifier", as the error message says. In practice this means the prefix of the import must be a val (or an object), not a var: a var can be reassigned at any time, so the compiler cannot treat spark.implicits as a stable path. Since you defined spark as a var, Scala can't resolve the import.

To solve this you can simply do something like:

val spark2 = spark
import spark2.implicits._
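
This works because spark2 is a val and therefore a stable identifier; the original var can stay as it is.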

or instead change the original var to a val initialized at its declaration (dropping the assignment in beforeAll), e.g. a lazy val so the session is only created on first use:

lazy val spark: SparkSession = SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
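
For context, here is a minimal sketch of the whole suite with the lazy val fix applied. The ClassName and getData names are carried over from the question; the stub implementation and the Dataset[String] input are hypothetical placeholders, since the question does not show them, and the test clause names its subject explicitly, which FlatSpec needs when no earlier clause has introduced one:

import org.apache.spark.sql.{Dataset, SparkSession}
import org.scalatest.{BeforeAndAfterAll, FlatSpec}

// Hypothetical stub, included only so the sketch compiles on its own;
// in the real project ClassName is the class under test
class ClassName(spark: SparkSession) {
  def getData(input: Dataset[String]): Dataset[String] = input
}

class ClassNameSpec extends FlatSpec with BeforeAndAfterAll {
  // A (lazy) val is a stable identifier, so spark.implicits._ can be imported
  lazy val spark: SparkSession =
    SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
  var className: ClassName = _

  override def beforeAll(): Unit = {
    className = new ClassName(spark)
  }

  "ClassName" should "return data" in {
    import spark.implicits._ // compiles now: the prefix is a val

    // Hypothetical placeholder input; adapt to getData's real signature
    val input = Seq("a", "b", "c").toDS()
    val result = className.getData(input)

    assert(result.count() == 3)
  }

  override def afterAll(): Unit = {
    spark.stop()
  }
}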