How do I run multiple Spark applications in parallel on a standalone master

Sankalp · Apr 20, 2017 · Viewed 8.7k times

Using the Spark (1.6.1) standalone master, I need to run multiple applications on the same Spark master. All applications submitted after the first one stay in the WAITING state. I also observed that the running application holds all the cores across the workers. I already tried limiting this with SPARK_EXECUTOR_CORES, but that setting is read only in YARN mode, while I am running on a standalone master. I tried running many workers on the same master, but every time the first submitted application consumes all the workers.
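For reference, this is roughly what I tried (a sketch; as far as I understand, SPARK_EXECUTOR_CORES is only read by YARN deployments, so the standalone master ignores it):

# in conf/spark-env.sh -- read in YARN mode only, ignored by the standalone master
export SPARK_EXECUTOR_CORES=1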

Answer

user8117884 · Jun 6, 2017

I was having the same problem on a Spark standalone cluster.

What I found is that, by default, the standalone scheduler gives a single job all of the available resources, so nothing is left for anyone else. We need to cap the resources per application so that there is room to run other jobs as well.
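In standalone mode, the property that caps the total number of cores an application may take is spark.cores.max; without it there is no per-application limit. A minimal sketch, assuming the default conf/spark-defaults.conf location, that caps every application submitted through this installation:

# conf/spark-defaults.conf -- applies to every application submitted via this spark-submit
spark.cores.max        1
spark.executor.memory  1g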

Below is the command I am using to submit a Spark job, passing the same limits on the command line instead:

bin/spark-submit \
  --class classname \
  --master spark://hjvm1:6066 \
  --deploy-mode cluster \
  --driver-memory 500M \
  --conf spark.executor.memory=1g \
  --conf spark.cores.max=1 \
  /data/test.jar
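Alternatively, if an application does not set spark.cores.max itself, the standalone master falls back to spark.deploy.defaultCores, which is unlimited by default. One way to lower that default (a sketch; the value 1 is just an example) is to set it in the master's environment and restart the master:

# conf/spark-env.sh on the master node; restart the master after changing this
export SPARK_MASTER_OPTS="-Dspark.deploy.defaultCores=1"

With either setting in place, the first job takes only its capped share of cores, and later submissions leave the WAITING state and run in parallel.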