
If I have an RDD that I no longer need, how do I delete it from memory? Would the following be enough to get this done:

del thisRDD

1 Answer


No, del thisRDD is not enough; it only removes the Python reference (the name) pointing at the RDD. You should call thisRDD.unpersist() to remove the cached data.
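As a minimal sketch of the difference (in the PySpark shell, assuming an existing SparkContext sc; backup is just an illustrative second reference, and is_cached is the flag PySpark keeps on each RDD):

>>> thisRDD = sc.parallelize(xrange(10), 2).cache()
>>> thisRDD.count()     # an action, so the partitions are now really cached
10
>>> backup = thisRDD    # a second reference, only to show what del does
>>> del thisRDD         # removes the local name, nothing else
>>> backup.is_cached    # the cached partitions are still held by the executors
True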

To give some more background: Spark uses a lazy computation model, which means that when you run this code:

>>> thisRDD = sc.parallelize(xrange(10),2).cache()

Actually, no data is cached at this point; the RDD is only marked as 'to be cached' in its execution plan.

But as soon as you call an action on this RDD at least once, it actually gets cached:

>>> thisRDD.count()

10

>>> print thisRDD.toDebugString()

(2) PythonRDD[6] at RDD at PythonRDD.scala:43 [Memory Serialized 1x Replicated]

 |       CachedPartitions: 2; MemorySize: 174.0 B; TachyonSize: 0.0 B; DiskSize: 0.0 B

 |  ParallelCollectionRDD[5] at parallelize at PythonRDD.scala:364 [Memory Serialized 1x Replicated]

You can easily check the persisted data and the level of persistence in the Spark UI at http://<driver_node>:4040/storage. There you would see that del thisRDD does not change the persistence of this RDD, whereas thisRDD.unpersist() removes it from the cache. You can still use thisRDD in your code afterwards, but it will no longer be kept in memory and will be recomputed each time it is queried.
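You can also confirm this from the shell rather than the UI. A small sketch, continuing the session above (is_cached and getStorageLevel() are standard PySpark RDD members; the exact storage-level description printed may vary slightly between Spark versions):

>>> _ = thisRDD.unpersist()          # returns the RDD itself; the cached blocks are dropped
>>> thisRDD.is_cached
False
>>> print thisRDD.getStorageLevel()  # back to the default NONE storage level
Serialized 1x Replicated
>>> thisRDD.count()                  # still usable, just recomputed from the lineage
10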
