0 votes
1 view
in Big Data Hadoop & Spark by (11.5k points)

Is there any way to get the current number of partitions of a DataFrame? I checked the DataFrame javadoc (Spark 1.6) and didn't find a method for that — or did I just miss it?

1 Answer

0 votes
by (32.5k points)
edited by

You need to call getNumPartitions() on the DataFrame's underlying RDD, e.g., in PySpark: df.rdd.getNumPartitions().

In the case of Scala, this is a parameterless method: df.rdd.getNumPartitions.
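A minimal Scala sketch of the approach, written against the Spark 1.6 API mentioned in the question (the object name and app name are illustrative; `local[4]` is just an example master):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object PartitionCount {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("PartitionCount").setMaster("local[4]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Build a small DataFrame from a local collection
    val df = sc.parallelize(1 to 100).toDF("n")

    // The 1.6 DataFrame API has no partition-count method itself,
    // so drop down to the underlying RDD (parameterless in Scala):
    println(df.rdd.getNumPartitions)

    sc.stop()
  }
}
```

Note that accessing df.rdd is cheap here — it does not trigger a job; it only exposes the RDD lineage backing the DataFrame.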

