
in Big Data Hadoop & Spark

Is there any way to get the current number of partitions of a DataFrame? I checked the DataFrame Javadoc (Spark 1.6) and didn't find a method for that, or did I just miss it?

1 Answer


The DataFrame API (as of Spark 1.6) does not expose a partition count directly, so you need to call getNumPartitions() on the DataFrame's underlying RDD, e.g., df.rdd.getNumPartitions().

In the case of Scala, this is a parameterless method: df.rdd.getNumPartitions.
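
For illustration, here is a minimal Scala sketch. It assumes a local run with the Spark 2.x+ SparkSession entry point and a sample DataFrame built from spark.range() (on Spark 1.6 you would build the DataFrame through SQLContext instead, but the getNumPartitions call on the RDD is the same); the app name and DataFrame are just placeholders, so replace them with your own:

import org.apache.spark.sql.SparkSession

// Placeholder session and DataFrame; substitute your own.
val spark = SparkSession.builder()
  .appName("PartitionCountExample")
  .master("local[4]")
  .getOrCreate()

val df = spark.range(0, 1000000).toDF("id")

// The DataFrame API has no partition-count method, so drop to the underlying RDD.
println(s"Current partitions: ${df.rdd.getNumPartitions}")

// After a repartition, the count on the resulting DataFrame reflects the new layout.
println(s"After repartition(8): ${df.repartition(8).rdd.getNumPartitions}")

spark.stop()

Note that getNumPartitions reports the current physical layout only; operations such as repartition or coalesce do not change the original DataFrame, they return a new one with a different partition count.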

