in Big Data Hadoop & Spark by (55.5k points)
Which algorithm is used to remove old and unused RDD to release more memory in spark?

1 Answer

by (119k points)

Spark uses the LRU (Least Recently Used) algorithm to remove old and unused RDDs and free up memory.

Spark automatically monitors cache usage and evicts the least recently used RDD partitions when memory is needed. If you want to remove a specific RDD manually instead of waiting for eviction, you can call the RDD.unpersist() method.
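The eviction policy can be sketched with a toy LRU cache in plain Python. This is only an illustration of the least-recently-used idea, not Spark's actual BlockManager; the class and method names here are hypothetical:

```python
from collections import OrderedDict

class LRUBlockCache:
    """Toy LRU cache illustrating how least-recently-used cached
    partitions get evicted when capacity is exceeded.
    (Hypothetical sketch -- not Spark's real implementation.)"""

    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()  # order of keys tracks recency

    def get(self, block_id):
        if block_id not in self.blocks:
            return None
        self.blocks.move_to_end(block_id)  # mark as most recently used
        return self.blocks[block_id]

    def put(self, block_id, data):
        if block_id in self.blocks:
            self.blocks.move_to_end(block_id)
        self.blocks[block_id] = data
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)  # evict least recently used

    def unpersist(self, block_id):
        # explicit removal, analogous in spirit to RDD.unpersist()
        self.blocks.pop(block_id, None)

cache = LRUBlockCache(capacity=2)
cache.put("rdd_1", "partition data A")
cache.put("rdd_2", "partition data B")
cache.get("rdd_1")                       # touch rdd_1; rdd_2 is now LRU
cache.put("rdd_3", "partition data C")   # exceeds capacity, evicts rdd_2
print("rdd_2" in cache.blocks)           # False: evicted as least recently used
print("rdd_1" in cache.blocks)           # True: kept because recently used
cache.unpersist("rdd_1")                 # explicit removal
print("rdd_1" in cache.blocks)           # False
```

The key point the sketch shows: automatic eviction picks the block that was touched least recently, while unpersist-style removal lets you drop a specific block immediately.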

If you are interested in learning Spark, you can enroll in this Spark Training by Intellipaat.

