0 votes
2 views
in Big Data Hadoop & Spark by (45k points)
Could someone walk me through the syllabus of Databricks Spark Certification?

1 Answer

0 votes
by (99k points)

The syllabus and prerequisites for the Databricks Spark Developer Certification are:

  • Sound knowledge of Spark fundamentals, including Spark architecture and Adaptive Query Execution (AQE); see the configuration sketch after this list.

  • Ability to apply the Spark DataFrame API to common data-manipulation tasks: selecting, renaming, and manipulating columns; filtering, sorting, dropping, and aggregating rows; reading, writing, joining, and partitioning DataFrames; and working with Spark SQL functions and UDFs. A short sketch of these operations follows this list.
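As a starting point for the AQE topic, here is a minimal PySpark sketch, assuming Spark 3.x, of toggling Adaptive Query Execution through session configuration. The app name is illustrative, not an exam requirement:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("aqe-demo")  # hypothetical app name
    .config("spark.sql.adaptive.enabled", "true")  # turn on Adaptive Query Execution
    .config("spark.sql.adaptive.coalescePartitions.enabled", "true")  # let AQE coalesce shuffle partitions
    .getOrCreate()
)

# Confirm the setting took effect
print(spark.conf.get("spark.sql.adaptive.enabled"))  # -> "true"
```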

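And here is a minimal sketch of the DataFrame API tasks listed above. The sample data, column names, and output path are all hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("df-api-demo").getOrCreate()

# Hypothetical sample data: (name, dept, salary)
df = spark.createDataFrame(
    [("Alice", "eng", 100), ("Bob", "eng", 80), ("Cara", "ops", 90)],
    ["name", "dept", "salary"],
)

# Selecting, renaming, filtering, and sorting
high = (
    df.select("name", "dept", F.col("salary").alias("pay"))  # select + rename a column
      .filter(F.col("pay") > 85)                             # filter rows
      .orderBy(F.col("pay").desc())                          # sort descending
)

# Aggregating rows
by_dept = df.groupBy("dept").agg(F.avg("salary").alias("avg_salary"))

# A simple UDF (built-in Spark SQL functions are preferred when one exists)
shout = F.udf(lambda s: s.upper(), StringType())
named = df.withColumn("name_upper", shout("name"))

# Joining and repartitioning, then writing out (path is hypothetical)
joined = high.join(by_dept, on="dept", how="inner").repartition(4, "dept")
# joined.write.mode("overwrite").parquet("/tmp/demo_output")

joined.show()
```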
Databricks recommends at least six months of hands-on experience with the Spark DataFrame API before attempting the certification exam. If you are looking for a course to help you clear this certification, check out the Databricks Spark Certification Training. If you are just starting out, I recommend watching an Apache Spark tutorial video on YouTube first.
