As far as I know, it is not possible to rename multiple columns with a single withColumnRenamed call.
I would suggest using the DataFrame.toDF method instead:
data.toDF('x3', 'x4')
or
new_names = ['x3', 'x4']
data.toDF(*new_names)
You can also rename columns using a simple select:
from pyspark.sql.functions import col
mapping = dict(zip(['x1', 'x2'], ['x3', 'x4']))
data.select([col(c).alias(mapping.get(c, c)) for c in data.columns])
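To see why `mapping.get(c, c)` handles a partial rename correctly, here is a plain-Python sketch of the same lookup-with-fallback logic (no Spark required; the column names are made up for illustration):

```python
# Build a rename mapping from old names to new names.
mapping = dict(zip(['x1', 'x2'], ['x3', 'x4']))

# For each existing column, use the mapped name if present;
# otherwise fall back to the original name unchanged.
columns = ['x1', 'x2', 'x5']  # 'x5' has no entry in the mapping
renamed = [mapping.get(c, c) for c in columns]
print(renamed)  # → ['x3', 'x4', 'x5']
```

Columns absent from the mapping pass through untouched, so this works even when you only want to rename a subset of columns.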
Similarly, in Scala:
To rename all columns do:
val newNames = Seq("x3", "x4")
data.toDF(newNames: _*)
To rename from a mapping with select:
val mapping = Map("x1" -> "x3", "x2" -> "x4")
data.select(
  data.columns.map(c => data(c).alias(mapping.getOrElse(c, c))): _*
)
or you can use foldLeft with withColumnRenamed:
mapping.foldLeft(data) {
  case (df, (oldName, newName)) => df.withColumnRenamed(oldName, newName)
}
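The foldLeft pattern, threading the DataFrame through one rename per mapping entry, can be modeled in plain Python with functools.reduce. In this sketch a list of names stands in for the DataFrame's columns, so it runs without Spark:

```python
from functools import reduce

mapping = {'x1': 'x3', 'x2': 'x4'}

def with_column_renamed(columns, pair):
    """Model of withColumnRenamed: replace one column name, keep the rest."""
    old_name, new_name = pair
    return [new_name if c == old_name else c for c in columns]

# Fold the mapping over the column list, applying one rename per step,
# just as foldLeft threads the DataFrame through each withColumnRenamed call.
result = reduce(with_column_renamed, mapping.items(), ['x1', 'x2'])
print(result)  # → ['x3', 'x4']
```

Each step returns a new value rather than mutating in place, which mirrors how each withColumnRenamed call returns a new DataFrame.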
Note: not to be confused with RDD.toDF, which is not a variadic function and takes column names as a list.