
I have a Spark DataFrame df. Is there a way to sub-select a few columns using a list of column names?

scala> df.columns
res0: Array[String] = Array("a", "b", "c", "d")

I know I can do something like df.select("b", "c"), but suppose I have a list containing a few column names, val cols = List("b", "c") — is there a way to pass this to df.select? df.select(cols) throws an error. I'm looking for something like df.select(*cols) in Python.

1 Answer


Use df.select(cols.head, cols.tail: _*).

The key is this overload of select:

select(col: String, cols: String*)

Here the cols: String* parameter accepts a variable number of arguments, and the : _* ascription unpacks a sequence so its elements fill that varargs slot. This is very similar to unpacking with *args in Python.
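To see the mechanism without needing a Spark session, here is a minimal sketch with a plain varargs method that mirrors select's (col: String, cols: String*) shape; the selectLike name is made up for illustration, and the cols.head / cols.tail: _* call works exactly the same way against a real DataFrame:

```scala
// A varargs method with the same shape as Spark's select(col: String, cols: String*).
// selectLike is a hypothetical stand-in that just returns the column names it received.
def selectLike(col: String, cols: String*): Seq[String] = col +: cols

val cols = List("b", "c")

// cols.head fills the required first parameter ("b");
// cols.tail: _* expands the remaining elements ("c") into the varargs parameter.
val picked = selectLike(cols.head, cols.tail: _*)
```

Alternatively, Spark's select also has a Column-based overload, select(cols: Column*), so you can splat the whole list at once with df.select(cols.map(col): _*) after importing org.apache.spark.sql.functions.col.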
