Is it possible to get the current Spark context settings in PySpark?

whisperstream · May 31, 2015

I'm trying to get the path to `spark.worker.dir` for the current `SparkContext`.

If I explicitly set it as a config param, I can read it back out of `SparkConf`, but is there any way to access the complete config (including all defaults) using PySpark?
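For context, a minimal sketch of the "set it explicitly, read it back" case the question describes (the app name and the `/tmp/spark-worker` path are made-up examples):

```python
from pyspark import SparkConf, SparkContext

# Set the value explicitly, then read it back from the context's SparkConf.
conf = SparkConf().setAppName("conf-demo").set("spark.worker.dir", "/tmp/spark-worker")
sc = SparkContext(conf=conf)

print(sc.getConf().get("spark.worker.dir"))  # -> /tmp/spark-worker
```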

Answer

Kevad · Jul 11, 2017

Spark 2.1+

`spark.sparkContext.getConf().getAll()`, where `spark` is your `SparkSession`. This returns a list of (key, value) tuples with all the settings that have been configured on the context.
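A runnable sketch of this, assuming a Spark 2.1+ environment (in a shell or notebook `spark` usually exists already; otherwise build one as shown):

```python
from pyspark.sql import SparkSession

# Get (or create) a SparkSession.
spark = SparkSession.builder.appName("conf-demo").getOrCreate()

# getAll() returns a list of (key, value) tuples, one per configured setting.
for key, value in spark.sparkContext.getConf().getAll():
    print(key, "=", value)

# Fetch a single key, with a fallback for when it was never set.
print(spark.sparkContext.getConf().get("spark.worker.dir", "<not set>"))
```

One caveat: `getAll()` reflects settings that were explicitly set (by your code, spark-submit, or the environment); options left entirely at their built-in defaults may not appear in the list.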