We are running a Spark job via spark-submit, and I can see that the job will be re-submitted in the case of failure.
How can I stop it from making attempt #2 in case of a YARN container failure or any other exception?
The number of retries is controlled by the following settings, i.e. the maximum number of ApplicationMaster registration attempts with YARN before the entire Spark application is considered failed:
spark.yarn.maxAppAttempts - Spark's own setting. Have a look at MAX_APP_ATTEMPTS:
private[spark] val MAX_APP_ATTEMPTS = ConfigBuilder("spark.yarn.maxAppAttempts") .doc("Maximum number of AM attempts before failing the app.") .intConf .createOptional
private[spark] val MAX_APP_ATTEMPTS = ConfigBuilder("spark.yarn.maxAppAttempts")
.doc("Maximum number of AM attempts before failing the app.")
.intConf
.createOptional
yarn.resourcemanager.am.max-attempts - YARN's own setting, with a default of 2.
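The two settings work together: the Spark-side value, when set, is used but cannot exceed the cluster-wide YARN limit. A simplified sketch of that resolution (an illustration, not Spark's literal source code):

// Simplified sketch: spark.yarn.maxAppAttempts, when set, is used but is
// capped by the cluster-wide yarn.resourcemanager.am.max-attempts.
def effectiveMaxAttempts(sparkMaxAttempts: Option[Int], yarnMaxAttempts: Int): Int =
  sparkMaxAttempts match {
    case Some(n) => math.min(n, yarnMaxAttempts)
    case None    => yarnMaxAttempts
  }

// With spark.yarn.maxAppAttempts=1 and the YARN default of 2, the job gets a single attempt.
assert(effectiveMaxAttempts(Some(1), 2) == 1)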
One solution for your problem is to set spark.yarn.maxAppAttempts as a command-line argument:
spark-submit --conf spark.yarn.maxAppAttempts=1 <application_name>
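If you prefer to keep it out of the launch command, the same property can be set programmatically when building the SparkConf. Note it must be in effect before the application is submitted to YARN, so this works when the driver itself registers the application (client mode); in cluster mode pass it at submit time as above. A minimal sketch, with a placeholder application name:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Set spark.yarn.maxAppAttempts=1 so the application is not retried after its first attempt.
val conf = new SparkConf()
  .setAppName("my-app")                  // placeholder application name
  .set("spark.yarn.maxAppAttempts", "1")

val spark = SparkSession.builder().config(conf).getOrCreate()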