in Big Data Hadoop & Spark by (19k points)

I have a running Spark application that occupies all the cores, so my other applications won't be allocated any resources.

I did some quick research, and people suggested using YARN kill or /bin/spark-class to kill the application. However, I am using a CDH version, /bin/spark-class doesn't exist at all, and killing the application through YARN doesn't work either.


Can anyone help me with this?

1 Answer

by (33.1k points)

To kill a running Spark application:

  • Copy the application ID from the Spark scheduler, for instance application_1428487296152_25597.
  • Connect to the server from which the job was launched.
  • Run yarn application -kill with that ID (see the example below).
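For example, assuming the application is still in the RUNNING state, you can first list the running applications to confirm the ID, then kill it (the application ID below is the one from this question; substitute your own):

  # list applications currently running on YARN
  yarn application -list -appStates RUNNING

  # kill the Spark application by its ID
  yarn application -kill application_1428487296152_25597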
Hope this answer helps you!
