0 votes
1 view
in Big Data Hadoop & Spark by (6.5k points)

Can anyone explain the driver memory in Spark?

1 Answer

0 votes
by (11.3k points)

Driver memory in Spark is 1 GB by default, but this can be changed with the `--driver-memory` flag of `spark-submit` (or the `spark.driver.memory` configuration property). The driver memory holds what the driver program itself needs: the job's metadata and scheduling state, plus any results brought back to the driver by actions such as `collect()`. Executor memory, by contrast, is the memory allocated on the worker nodes to execute tasks and cache RDD partitions.
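For example, a typical `spark-submit` invocation might set both memory sizes like this (the application file `my_app.py` and the sizes chosen are placeholders for illustration):

```shell
# Give the driver 4 GB and each executor 2 GB
# (equivalent to setting spark.driver.memory / spark.executor.memory)
spark-submit \
  --driver-memory 4g \
  --executor-memory 2g \
  my_app.py
```

Note that in client mode the driver JVM has already started by the time a `SparkConf` is read, so `spark.driver.memory` should be set on the command line or in `spark-defaults.conf` rather than programmatically.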

If you are looking for an online course to learn Spark, check out this Spark Training program by Intellipaat.
