0 votes
in Big Data Hadoop & Spark by (11.4k points)

I am configuring an Apache Spark cluster.

When I run the cluster with 1 master and 3 slaves, I see this on the master monitoring page:

2.0 GB (512.0 MB Used)
2.0 GB (512.0 MB Used)
6.0 GB (512.0 MB Used)

I want to increase the used memory for the workers but I could not find the right config for this.

1 Answer

0 votes
by (32.3k points)

For Spark 1.1.1+, to set the maximum memory available to workers, set SPARK_WORKER_MEMORY in conf/spark-env.sh, for example:

export SPARK_WORKER_MEMORY=6g
Also, if you have not created the config file yet, copy the template file first:

cp conf/spark-env.sh.template conf/spark-env.sh

Then make the change, and don't forget to source it:

source conf/spark-env.sh
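For reference, the relevant part of conf/spark-env.sh might look like the sketch below. The 6g value just mirrors the 6.0 GB worker shown above; pick a value that fits your machines, and treat SPARK_WORKER_CORES here as an optional illustration, not something the question requires:

```shell
# conf/spark-env.sh -- read by the standalone worker start scripts

# Total memory this node's worker may hand out to executors
export SPARK_WORKER_MEMORY=6g

# Optional: number of cores the worker may use on this node
export SPARK_WORKER_CORES=4
```

One caveat: SPARK_WORKER_MEMORY only raises the memory a worker can *offer*. The "512.0 MB Used" figure on the master page comes from the executors, which each request spark.executor.memory (default 512m in Spark 1.x), so to increase the *used* memory you will likely also need to raise spark.executor.memory in conf/spark-defaults.conf or when submitting your application. Finally, restart the workers (e.g. sbin/stop-all.sh then sbin/start-all.sh) so they pick up the new settings.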
