Running Spark on Linux: $JAVA_HOME not set error

Marc Zaharescu · Aug 3, 2016

I am trying to configure spark-2.0.0-bin-hadoop2.7 on Ubuntu 16.04.1 LTS. I have set

export JAVA_HOME=/home/marc/jdk1.8.0_101
export SCALA_HOME=/home/marc/scala-2.11.8
export SPARK_HOME=/home/marc/spark-2.0.0-bin-hadoop2.7
export PATH=$PATH:$SCALA_HOME/bin:$JAVA_HOME/bin

at the end of .bashrc, and I also included these lines in the start-all.sh file in the spark/sbin folder.

When I type echo $JAVA_HOME, it prints the correct path: /home/marc/jdk1.8.0_101

But when I call sbin/start-all.sh, it fails with the following error:

localhost: failed to launch org.apache.spark.deploy.worker.Worker: localhost: JAVA_HOME is not set

I have looked at similar questions, but I couldn't find a solution to this problem. Any help would be much appreciated.

Answer

Haoran Yang · Dec 16, 2018

You need to modify the file spark-config.sh in the sbin directory and add your JAVA_HOME there. The launch scripts that start the Worker daemons do not necessarily run through your interactive .bashrc, so the worker process may never see the JAVA_HOME you exported there; spark-config.sh is sourced by those scripts, so an export placed in it is picked up.
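A minimal sketch of the edit, using the JDK path from the question (adjust it to your own installation):

```shell
# Add this line near the top of $SPARK_HOME/sbin/spark-config.sh
# (the path below is the one from the question; replace with your JDK path):
export JAVA_HOME=/home/marc/jdk1.8.0_101
```

Then restart the cluster with sbin/stop-all.sh followed by sbin/start-all.sh. Note that Spark's documented place for per-machine environment settings is conf/spark-env.sh, so adding the same export there is an alternative worth trying if you prefer not to edit the sbin scripts.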