How to enable or disable Hive support in spark-shell through Spark property (Spark 1.6)?

Krishna Reddy · Jul 20, 2017 · Viewed 21.7k times

Is there any configuration property we can set to explicitly disable/enable Hive support through spark-shell in Spark 1.6? I tried to get all the sqlContext configuration properties with

sqlContext.getAllConfs.foreach(println)

but I am not sure which property is actually required to disable/enable Hive support. Or is there any other way to do this?

Answer

Jacek Laskowski · Jul 20, 2017

Spark >= 2.0

Enabling and disabling the Hive context is possible with the configuration property spark.sql.catalogImplementation.

Possible values for spark.sql.catalogImplementation are in-memory and hive.

SPARK-16013 Add option to disable HiveContext in spark-shell/pyspark
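For example, assuming a standard Spark 2.x installation with spark-shell on the PATH, the catalog implementation can be chosen when the shell is launched (the property is read when the session is created, so it cannot be changed afterwards at runtime):

```shell
# Start spark-shell without Hive support (in-memory catalog):
spark-shell --conf spark.sql.catalogImplementation=in-memory

# Inside the shell, the active setting can be checked with:
#   spark.conf.get("spark.sql.catalogImplementation")
```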


Spark < 2.0

Such a Spark property is not available in Spark 1.6.

One way to work around it is to remove the Hive-related jars, which in turn disables Hive support in Spark (Spark enables Hive support when the required Hive classes are available on the classpath).
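A minimal sketch of that workaround, assuming you build Spark 1.6 from source: the Hive classes are only bundled when the -Phive build profile is enabled, so a build without it produces a spark-shell whose sqlContext falls back to a plain SQLContext instead of a HiveContext:

```shell
# Build Spark 1.6 without the Hive profile (no -Phive / -Phive-thriftserver),
# so the Hive classes are absent from the assembly:
./build/mvn -DskipTests clean package

# In the resulting spark-shell, verify which context was created:
#   sqlContext.getClass.getName
#   (org.apache.spark.sql.SQLContext rather than
#    org.apache.spark.sql.hive.HiveContext)
```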