in Big Data Hadoop & Spark by (6.5k points)

Can anyone explain the driver memory in Spark?

1 Answer

by (11.3k points)

Driver memory in Spark defaults to 1 GB, but it can be changed with the --driver-memory flag of spark-submit (or the spark.driver.memory configuration property). The driver memory is the memory used by the driver program itself: it holds the SparkContext, job and task scheduling metadata, and any results pulled back to the driver (for example, via collect()). Executor memory, by contrast, is the memory allocated on the worker nodes, where RDD partitions are cached and tasks actually execute.
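As a minimal sketch, here is how both settings are typically passed to spark-submit; the class name and JAR path are placeholders, not real artifacts:

```shell
# Raise the driver heap from the 1 GB default to 4 GB and give
# each executor 2 GB. Class name and JAR are hypothetical examples.
spark-submit \
  --class com.example.MyApp \
  --driver-memory 4g \
  --executor-memory 2g \
  my-app.jar
```

Note that in client mode the driver JVM has already started by the time application code runs, so --driver-memory must be set on the command line or in spark-defaults.conf rather than programmatically via SparkConf.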


