I have a DataFrame with a timestamp column that I need to convert to date format.

Are there any Spark SQL functions available for this?

1 Answer


You can simply cast the column to date using the code below:

Scala:

import org.apache.spark.sql.types.DateType

val newDF = df.withColumn("dateColumn", df("timestampColumn").cast(DateType))

PySpark:

df = df.withColumn('dateColumn', df['timestampColumn'].cast('date'))
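For reference, here is a minimal, self-contained PySpark sketch of the same idea. The SparkSession setup, the sample data, and the helper column name ts_str are placeholders added for illustration; only the cast itself comes from the answer above.

from pyspark.sql import SparkSession
from pyspark.sql.functions import to_timestamp

spark = SparkSession.builder.appName("timestamp-to-date").getOrCreate()

# Build a tiny DataFrame with a timestamp column (placeholder data)
df = spark.createDataFrame([("2019-03-01 12:34:56",)], ["ts_str"]) \
          .withColumn("timestampColumn", to_timestamp("ts_str"))

# Cast the timestamp to a date, as shown in the answer
df = df.withColumn("dateColumn", df["timestampColumn"].cast("date"))

df.printSchema()            # dateColumn should appear with type 'date'
df.show(truncate=False)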

Note: This solution uses functions that are part of the Spark SQL package, but it doesn't use the SQL language itself. Instead, it relies on the robust DataFrame API with SQL-like functions, rather than on less reliable strings containing actual SQL queries.
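If you prefer a named function over a cast, to_date from pyspark.sql.functions produces the same result. This is a sketch assuming the same placeholder column names as above; the temp view name events in the SQL-string variant is likewise just an illustration of the contrast the note describes.

from pyspark.sql.functions import to_date

# DataFrame API equivalent of the cast: yields a DateType column
df = df.withColumn("dateColumn", to_date(df["timestampColumn"]))

# For contrast, the same conversion written as an actual SQL string
df.createOrReplaceTempView("events")
spark.sql("SELECT *, to_date(timestampColumn) AS dateColumn FROM events").show()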
