how to kill hadoop jobs

Frank · Jul 12, 2012

I want to kill all my Hadoop jobs automatically when my code encounters an unhandled exception. What is the best practice for doing this?

Thanks

Answer

dr0i · Feb 13, 2014

Depending on the version, do:

version <2.3.0

Kill a hadoop job:

hadoop job -kill $jobId

You can get a list of all jobIds by running:

hadoop job -list
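If you want to kill every listed job in one go, you can pipe the listing into the kill command. A minimal sketch, assuming the `hadoop job -list` output format where each data row starts with a `job_...` id in the first column (verify against your version's actual output); the sample listing and job ids below are made up for illustration:

```shell
# Hypothetical helper: pull job ids out of `hadoop job -list` output.
extract_job_ids() {
  awk '$1 ~ /^job_/ {print $1}'
}

# The real kill loop needs a live cluster, so it is shown commented out:
# hadoop job -list | extract_job_ids | while read -r id; do
#   hadoop job -kill "$id"
# done

# Demo of the extraction step on a fabricated sample listing:
sample='Total jobs:2
JobId	State	StartTime	UserName	Priority	SchedulingInfo
job_201207121330_0001	RUNNING	1342102000000	frank	NORMAL	NA
job_201207121330_0002	RUNNING	1342102100000	frank	NORMAL	NA'

printf '%s\n' "$sample" | extract_job_ids
```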

version >=2.3.0

Kill a hadoop job:

yarn application -kill $ApplicationId

You can get a list of all ApplicationIds by running:

yarn application -list
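To get the "kill automatically on failure" behaviour the question asks about, one option is a shell driver that traps its own exit and kills the application. A minimal sketch, assuming your job is launched from a shell script and you have captured the ApplicationId; the id and the `run_driver` name are placeholders, and an `echo` stands in for the real `yarn application -kill` call so the sketch runs without a cluster:

```shell
# Sketch: trap the driver's exit so any unhandled error triggers the kill.
run_driver() (
  set -e
  APP_ID="application_1392300000000_0001"   # placeholder id
  # Stub standing in for: yarn application -kill "$APP_ID"
  trap 'echo "yarn application -kill $APP_ID"' EXIT
  false   # simulate the driver hitting an unhandled error
)

run_driver || true   # the trap fires and prints the kill command
```

In a real driver you would replace the `echo` in the trap with the actual `yarn application -kill "$APP_ID"` invocation.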