I am running a Spark program in IntelliJ and getting this error: "object apache is not a member of package org".
I have used these import statements in the code:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
These imports also fail at the sbt prompt, so the corresponding library appears to be missing, but I am not sure how to add it or where it should go.
Make sure you have entries like this in your build.sbt:
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.1.0",
"org.apache.spark" %% "spark-sql" % "2.1.0"
)
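Once those dependencies resolve, a minimal program like the sketch below should compile and run with the imports from your question; it only assumes the build.sbt above, and the object name SparkCheck is just a placeholder.

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object SparkCheck {
  def main(args: Array[String]): Unit = {
    // Run locally so no cluster is needed; "local[*]" uses all available cores.
    val conf = new SparkConf().setAppName("spark-check").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // Trivial job: if this compiles and prints 5, spark-core resolved correctly.
    println(sc.parallelize(1 to 5).count())
    sc.stop()
  }
}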
Then make sure IntelliJ picks up these libraries, either by enabling "auto-import" or by clicking the refresh button in the SBT panel.
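If IntelliJ still flags the imports after a refresh, it can help to rule out IDE issues by building from the command line first. Assuming build.sbt sits at the project root, these standard sbt commands fetch the dependencies and compile the sources:

sbt update
sbt compile

If that succeeds, the problem is only IntelliJ's project setup (re-import the SBT project); if it fails, the problem is in the build definition itself.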