I'm using PySpark (Python 2.7.9 / Spark 1.3.1) and have a DataFrame GroupObject which I need to filter and sort in descending order. I'm trying to achieve this with the following piece of code:
group_by_dataframe.count().filter("`count` >= 10").sort('count', ascending=False)
But it throws the following error:
sort() got an unexpected keyword argument 'ascending'
In PySpark 1.3, the ascending parameter is not accepted by the sort method. You can use the desc method instead:
from pyspark.sql.functions import col
(group_by_dataframe
    .count()
    .filter("`count` >= 10")
    .sort(col("count").desc()))
or the desc function:
from pyspark.sql.functions import desc
(group_by_dataframe
    .count()
    .filter("`count` >= 10")
    .sort(desc("count")))
Both methods can be used with Spark >= 1.3, including Spark 2.x.
You can use orderBy:
group_by_dataframe.count().filter("`count` >= 10").orderBy('count', ascending=False)
For more information, refer to the PySpark SQL docs: http://spark.apache.org/docs/2.0.0/api/python/pyspark.sql.html
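As a quick end-to-end check, here is a minimal self-contained sketch; the SparkSession setup, the toy word data, and the app name are illustrative assumptions (Spark 2.x API, matching the linked docs), not part of the original question:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sort-desc-demo").getOrCreate()

# Toy data: 12 "spark" rows, 11 "hadoop" rows, 3 "flink" rows (assumed values).
words = ["spark"] * 12 + ["hadoop"] * 11 + ["flink"] * 3
df = spark.createDataFrame([(w,) for w in words], ["word"])

(df.groupBy("word")
    .count()                            # adds a `count` column per group
    .filter("`count` >= 10")            # drops the 3-row "flink" group
    .orderBy("count", ascending=False)  # descending by count
    .show())

# Expected output:
# +------+-----+
# |  word|count|
# +------+-----+
# | spark|   12|
# |hadoop|   11|
# +------+-----+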
As @chandra answered, you can combine groupBy with a descending sort as follows:
from pyspark.sql.functions import desc
dataFrameWay = df.groupBy("firstName").count().withColumnRenamed("count", "distinct_name").sort(desc("distinct_name"))
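For instance, a minimal sketch with invented sample data (the names and DataFrame contents are assumptions for illustration):

from pyspark.sql import SparkSession
from pyspark.sql.functions import desc

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("Alice",), ("Bob",), ("Alice",)], ["firstName"])

dataFrameWay = df.groupBy("firstName").count().withColumnRenamed("count", "distinct_name").sort(desc("distinct_name"))
dataFrameWay.show()

# +---------+-------------+
# |firstName|distinct_name|
# +---------+-------------+
# |    Alice|            2|
# |      Bob|            1|
# +---------+-------------+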