Terminating a Spark step in AWS

Daniel Imberman · Jan 26, 2016 · Viewed 8.2k times · Source

I want to set up a series of Spark steps on an EMR Spark cluster and terminate the current step if it's taking too long. However, when I SSH into the master node and run hadoop job -list, the master node seems to believe that there are no jobs running. I don't want to terminate the cluster, because doing so would force me to buy a whole new hour of whatever cluster I'm running. Can anyone please help me terminate a Spark step in EMR without terminating the entire cluster?
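
For reference, the steps are submitted roughly like this (a sketch; the cluster id, step name, class, and jar path below are placeholders, not the actual values):

aws emr add-steps --cluster-id j-XXXXXXXXXXXX --steps Type=Spark,Name="long-running step",ActionOnFailure=CONTINUE,Args=[--class,com.example.MyJob,s3://my-bucket/my-job.jar]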

Answer

Erik Schmiegelow · Jan 26, 2016

That's easy:

yarn application -kill [application id]

You can list your running applications with

yarn application -list
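
Putting it together on the master node (the application id below is illustrative, not taken from the original post):

# SSH to the EMR master node, then list only the running YARN applications
yarn application -list -appStates RUNNING

# Kill the long-running Spark step by its application id
yarn application -kill application_1453834237043_0007

Each Spark step on EMR runs as a YARN application rather than a classic MapReduce job, which is why hadoop job -list shows nothing. Killing the YARN application ends that step while the cluster itself keeps running.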