
To check running applications in Apache Spark, you can use the master's web interface at:

http://<master>:8080

My question is: how can we check running applications from the terminal? Is there any command that returns an application's status?

1 Answer


I would suggest using spark-submit --status (as described in Mastering Apache Spark 2.0):

spark-submit --status [submission ID]
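Under the hood, spark-submit --status queries the standalone master's REST submission server (by default on port 6066) and reports the driver's state. As a minimal sketch of what parsing such a status response can look like, here is a hedged example; the sample payload and its field names (submissionId, driverState) are illustrative of the standalone REST status response and may vary by Spark version:

```python
import json

# Hypothetical example of a status response from the standalone master's
# REST endpoint (http://<master>:6066/v1/submissions/status/<submissionId>),
# which is the kind of request spark-submit --status makes. The submission ID
# and exact field set here are illustrative, not captured from a real cluster.
sample_response = """
{
  "action": "SubmissionStatusResponse",
  "submissionId": "driver-20200101000000-0001",
  "success": true,
  "driverState": "RUNNING"
}
"""

def driver_state(response_text):
    """Extract the driver state from a status response, if present."""
    payload = json.loads(response_text)
    return payload.get("driverState", "UNKNOWN")

print(driver_state(sample_response))  # prints RUNNING
```

Note that the submission ID passed to --status is the one printed when you submit in cluster mode; client-mode submissions do not get one.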

For reference, see the relevant check in the spark-submit source code, which shows that this option only works against a standalone or Mesos master:

if (!master.startsWith("spark://") && !master.startsWith("mesos://")) {
  SparkSubmit.printErrorAndExit(
    "Requesting submission statuses is only supported in standalone or Mesos mode!")
}
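Another terminal-friendly option on a standalone cluster is the master web UI's JSON view (the same http://<master>:8080 page, served as JSON at /json), which you can fetch with curl and parse. The sketch below assumes that endpoint returns an "activeapps" list with "id", "name", and "state" fields; the sample payload is illustrative and the exact fields may vary by Spark version:

```python
import json

# Hypothetical sample of the standalone master's /json output
# (e.g. curl http://<master>:8080/json); field names are assumptions
# based on the standalone master UI, not captured from a live cluster.
sample = """
{
  "url": "spark://master:7077",
  "activeapps": [
    {"id": "app-20200101000000-0000", "name": "MyJob", "state": "RUNNING"}
  ],
  "completedapps": []
}
"""

def running_apps(master_json):
    """Return (id, name, state) tuples for currently active applications."""
    payload = json.loads(master_json)
    return [(app["id"], app["name"], app["state"])
            for app in payload.get("activeapps", [])]

for app_id, name, state in running_apps(sample):
    print(app_id, name, state)
```

In practice you would pipe the curl output into a script like this (or use jq directly) to list running applications without opening a browser.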
