Zeppelin: How to restart sparkContext in zeppelin

eatSleepCode · Nov 11, 2016 · Viewed 15.1k times

I am using the isolated mode of Zeppelin's Spark interpreter; in this mode it starts a separate job on the Spark cluster for each notebook. I want to kill that job from Zeppelin once the notebook execution is complete. To do this I called `sc.stop()`, which stopped the SparkContext, and the job was also removed from the Spark cluster. But the next time I run the notebook, the SparkContext does not start again. How can I get it back?

Answer

user6022341 · Nov 11, 2016

It's a bit counter-intuitive, but you need to restart the interpreter from the interpreter menu tab instead of stopping the SparkContext directly:

  • Go to the interpreter list.

    (screenshot: interpreter list)

  • Find the Spark interpreter and click restart in the upper-right corner:

    (screenshot: Spark interpreter)
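If you want to script the same restart rather than click the button, Zeppelin also exposes it through its REST API (`PUT /api/interpreter/setting/restart/{settingId}`). Below is a minimal sketch using only the standard library; the host URL and the `"spark"` setting id are assumptions for illustration — the real setting ids can be listed via `GET /api/interpreter/setting` on your server.

```python
# Hedged sketch: restart a Zeppelin interpreter setting over REST,
# equivalent to clicking "restart" in the interpreter menu.
# Assumes a Zeppelin server reachable at `host`; the setting id
# (e.g. "spark") must match one returned by GET /api/interpreter/setting.
import urllib.request


def restart_url(host: str, setting_id: str) -> str:
    """Build the restart endpoint for an interpreter setting."""
    return f"{host}/api/interpreter/setting/restart/{setting_id}"


def restart_interpreter(host: str, setting_id: str):
    """Issue the PUT request that triggers the restart."""
    req = urllib.request.Request(restart_url(host, setting_id), method="PUT")
    return urllib.request.urlopen(req)  # raises urllib.error.HTTPError on failure


# Example (requires a running Zeppelin server):
# restart_interpreter("http://localhost:8080", "spark")
```

After the restart, re-running a paragraph in the notebook lazily creates a fresh SparkContext, just as it does after restarting from the UI.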