Spark Kill Running Application

B.Mr.W. · Apr 10, 2015 · Viewed 166.4k times

I have a running Spark application that occupies all the cores, so my other applications won't be allocated any resources.

I did some quick research, and people suggested using YARN kill or /bin/spark-class to kill the application. However, I am using a CDH distribution where /bin/spark-class doesn't exist at all, and killing the application through YARN doesn't work either.


Can anyone help me with this?

Answer

Gérald Reinhart · May 15, 2015
  • Copy the application ID from the Spark scheduler, for instance application_1428487296152_25597.
  • Connect to the server that launched the job.
  • Run yarn application -kill application_1428487296152_25597 (see the full sequence below).
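Putting the steps together, here is a minimal sketch of the shell session. The application ID is the example from the answer and will differ on your cluster; `yarn application -list` is the standard YARN CLI way to find it if you can't reach the Spark scheduler UI.

```bash
# List running YARN applications to find the application ID
yarn application -list

# Kill the application by its ID (example ID from above; replace with yours)
yarn application -kill application_1428487296152_25597
```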