in Big Data Hadoop & Spark by (11.4k points)

Running applications in Apache Spark can be checked from the Spark master's web interface.

My question: how can we check running applications from the terminal? Is there a command that returns an application's status?

1 Answer

by (32.3k points)

I would suggest using spark-submit --status (as described in Mastering Apache Spark 2.0):

spark-submit --status [submission ID]
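Note that --status applies to cluster deploy mode, where the launcher returns a submission ID you can query later. A minimal sketch of the flow, assuming a standalone master at spark://master:7077 (placeholder host); the JSON shown is a hand-written sample of the launcher's response:

```shell
# Sample launcher output from a cluster-mode submission; the submissionId
# value here is illustrative, not real.
RESPONSE='{ "action" : "CreateSubmissionResponse", "submissionId" : "driver-20240101123456-0001", "success" : true }'

# Pull the driver's submission ID out of the launcher output.
SUBMISSION_ID=$(echo "$RESPONSE" | grep -o 'driver-[0-9-]*')
echo "$SUBMISSION_ID"

# Query its status from the terminal (only when spark-submit is on PATH
# and a master is actually reachable; spark://master:7077 is a placeholder).
if command -v spark-submit >/dev/null 2>&1; then
  spark-submit --master spark://master:7077 --status "$SUBMISSION_ID" || true
fi
```

In client deploy mode there is no submission ID, which is why --status is tied to cluster submissions on a standalone or Mesos master.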

For reference, here is the relevant check in spark-submit's source (SparkSubmitArguments.scala), which shows why the flag only works against certain masters:

if (!master.startsWith("spark://") && !master.startsWith("mesos://")) {
  SparkSubmit.printErrorAndExit(
    "Requesting submission statuses is only supported in standalone or Mesos mode!")
}
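In standalone mode, spark-submit --status queries the master's REST endpoint and prints a status response. A sketch of reading the driver state out of such a response from the terminal; the JSON below is a hand-written sample, and the field names (driverState, submissionId) follow the shape of Spark's standalone-mode status response:

```shell
# Sample status response, similar to what spark-submit --status prints
# in standalone mode (values are illustrative).
STATUS_RESPONSE='{
  "action" : "SubmissionStatusResponse",
  "driverState" : "RUNNING",
  "submissionId" : "driver-20240101123456-0001",
  "success" : true
}'

# Extract the driver state field with grep/sed.
DRIVER_STATE=$(echo "$STATUS_RESPONSE" | grep '"driverState"' | sed 's/.*: *"\([A-Z]*\)".*/\1/')
echo "$DRIVER_STATE"
```

A RUNNING state means the driver is still executing; terminal states such as FINISHED or FAILED indicate the application has ended.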

