in Big Data Hadoop & Spark by (11.4k points)

I want to change the Typesafe config of a Spark job between dev and prod environments. It seems to me that the easiest way to accomplish this is to pass -Dconfig.resource=ENVNAME to the job; the Typesafe Config library will then do the rest.

Is there a way to pass that option directly to the job? Or is there a better way to change the job's config at runtime?
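
For context, the environment-specific resources on my classpath look something like this (the job.input.path key is just for illustration):

# dev.conf (selected with -Dconfig.resource=dev.conf)
include "application"
job.input.path = "hdfs:///data/dev/input"

# prod.conf (selected with -Dconfig.resource=prod.conf)
include "application"
job.input.path = "hdfs:///data/prod/input"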

1 Answer

by (32.3k points)

Change your spark-submit command line by adding these three options:

--files <location_to_your_app.conf>

--conf 'spark.executor.extraJavaOptions=-Dconfig.resource=app.conf'

--conf 'spark.driver.extraJavaOptions=-Dconfig.resource=app.conf'
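
Note that Typesafe Config's config.resource property expects the full resource name, including the .conf extension, not just the basename, and that files shipped with --files land in each container's working directory, which YARN puts on the classpath. Put together, a complete cluster-mode submission might look like the sketch below; the master, main class, jar name, and paths are placeholders rather than anything from the question:

spark-submit \
  --class com.example.MyJob \
  --master yarn \
  --deploy-mode cluster \
  --files /path/to/app.conf \
  --conf 'spark.executor.extraJavaOptions=-Dconfig.resource=app.conf' \
  --conf 'spark.driver.extraJavaOptions=-Dconfig.resource=app.conf' \
  my-job.jar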

Note: passing spark.driver.extraJavaOptions through --conf will not work when spark-submit runs the driver in client mode, because the driver JVM has already started by the time that setting is read. Use --driver-java-options "-Dconfig.resource=app.conf" instead.
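
A client-mode submission would therefore look more like the sketch below (again with placeholder names). Since the driver runs on the submitting machine, app.conf must be on the local classpath there; alternatively, you can point at the file directly with Typesafe Config's config.file property, as shown:

spark-submit \
  --class com.example.MyJob \
  --master yarn \
  --deploy-mode client \
  --files /path/to/app.conf \
  --conf 'spark.executor.extraJavaOptions=-Dconfig.resource=app.conf' \
  --driver-java-options '-Dconfig.file=/path/to/app.conf' \
  my-job.jar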

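For completeness, here is a minimal sketch of the job side; the object name and the job.input.path key are invented for illustration:

import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spark.sql.SparkSession

object MyJob {
  def main(args: Array[String]): Unit = {
    // ConfigFactory.load() honours the -Dconfig.resource / -Dconfig.file
    // system properties, so the same jar reads dev or prod settings
    // depending on what was passed to spark-submit.
    val conf: Config = ConfigFactory.load()
    val inputPath = conf.getString("job.input.path")

    val spark = SparkSession.builder().appName("MyJob").getOrCreate()
    println(s"row count: ${spark.read.parquet(inputPath).count()}")
    spark.stop()
  }
}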
