
in Big Data Hadoop & Spark by (11.4k points)

I am configuring an Apache Spark cluster.

When I run the cluster with 1 master and 3 slaves, I see this on the master monitor page:

Memory
2.0 GB (512.0 MB Used)
2.0 GB (512.0 MB Used)
6.0 GB (512.0 MB Used)


I want to increase the memory used by the workers, but I could not find the right configuration for this.

1 Answer

by (32.3k points)

For Spark 1.1.1 and later, to set the maximum memory for the workers, add the following to conf/spark-env.sh:

export SPARK_EXECUTOR_MEMORY=2G
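Note that SPARK_EXECUTOR_MEMORY controls how much memory each executor requests, while the per-worker totals shown on the master page (the 2.0 GB figures above) come from how much each worker offers, which is set by SPARK_WORKER_MEMORY. A minimal conf/spark-env.sh sketch combining the two (the 4g/2g values below are just placeholders) could look like:

# conf/spark-env.sh -- sketch, values are placeholders
export SPARK_WORKER_MEMORY=4g      # total memory a worker can hand out to its executors
export SPARK_EXECUTOR_MEMORY=2g    # memory requested by each executor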

Also, if you have not created the config file yet, copy the template file first:

cp conf/spark-env.sh.template conf/spark-env.sh

Then make the change and don't forget to source it:

source conf/spark-env.sh
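Keep in mind that changes to spark-env.sh normally take effect when the workers are restarted (for example via sbin/stop-all.sh and sbin/start-all.sh), so the new totals only appear on the master page after a restart. If you just want more executor memory for a single application rather than cluster-wide, it can also be passed at submit time; the master URL and application file below are placeholders:

# Per-application override (sketch; master URL and app file are placeholders)
./bin/spark-submit --master spark://master-host:7077 --executor-memory 2G my_app.py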
