
We are running a Spark job via spark-submit, and I can see that the job will be re-submitted in the case of failure.

How can I stop it from making attempt #2 in the case of a YARN container failure, or whatever the exception may be?


1 Answer


The number of retries is controlled by the following settings, which determine the maximum number of ApplicationMaster attempts YARN will make before the attempt, and with it the entire Spark application, is considered failed:

  • spark.yarn.maxAppAttempts, defined in Spark's YARN config as:

    private[spark] val MAX_APP_ATTEMPTS = ConfigBuilder("spark.yarn.maxAppAttempts")
      .doc("Maximum number of AM attempts before failing the app.")
      .intConf
      .createOptional

  • yarn.resourcemanager.am.max-attempts, the limit on the YARN ResourceManager side (see the sketch below for how the two interact)
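To make the relationship between the two settings concrete, here is a minimal Scala sketch (not Spark's actual source) of the resolution logic: the YARN ResourceManager setting acts as the hard cap, and spark.yarn.maxAppAttempts, when set, can only lower the effective number of attempts.

import scala.math

object MaxAttemptsResolution {
  // spark.yarn.maxAppAttempts is optional; when absent, YARN's
  // yarn.resourcemanager.am.max-attempts (default 2) applies as-is.
  def effectiveMaxAttempts(sparkMaxAppAttempts: Option[Int], yarnAmMaxAttempts: Int): Int =
    sparkMaxAppAttempts.fold(yarnAmMaxAttempts)(math.min(_, yarnAmMaxAttempts))

  def main(args: Array[String]): Unit = {
    println(effectiveMaxAttempts(Some(1), 2)) // 1 -> a single attempt, no re-submission
    println(effectiveMaxAttempts(None, 2))    // 2 -> YARN default, one retry after failure
  }
}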

One solution to your problem is to set the YARN max attempts as a command-line argument:

spark-submit --conf spark.yarn.maxAppAttempts=1 <application_name>
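Alternatively, the same property can be set in spark-defaults.conf or programmatically. A hedged sketch of the programmatic form is below; note that it only takes effect if the property is visible at the moment the YARN application is submitted (e.g. yarn-client mode), so for cluster mode the --conf flag above is the reliable route. "SingleAttemptApp" is just a placeholder name.

import org.apache.spark.sql.SparkSession

object SingleAttemptApp {
  def main(args: Array[String]): Unit = {
    // Equivalent to --conf spark.yarn.maxAppAttempts=1 on the command line.
    val spark = SparkSession.builder()
      .appName("SingleAttemptApp")
      .config("spark.yarn.maxAppAttempts", "1")
      .getOrCreate()

    // ... your job logic ...

    spark.stop()
  }
}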
