
How does Apache Spark work?

1 Answer


Spark uses a master/slave architecture consisting of a single central coordinator, the driver, and many distributed workers, the executors. When an application is submitted, the driver program (through its SparkContext) turns each action into a job and hands it to the DAG Scheduler, which builds the operator graph, splits it into stages of tasks, and passes those stages to the Task Scheduler. The Task Scheduler then launches the tasks on the executors via the cluster manager. To learn more about the working and components of Apache Spark, please go through this video - Introduction to Apache Spark.
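To make the driver/executor split concrete, here is a minimal sketch of a Spark driver program in Scala. It assumes Spark is on the classpath and uses a local master purely for illustration; the application name and sample data are hypothetical, not from the original answer.

// A minimal sketch of a Spark driver, assuming spark-core/spark-sql on the
// classpath; "local[*]" runs executor threads inside this JVM for the demo.
import org.apache.spark.sql.SparkSession

object WordCountSketch {
  def main(args: Array[String]): Unit = {
    // The driver creates the SparkSession, which wraps the SparkContext.
    val spark = SparkSession.builder()
      .appName("how-spark-works-sketch") // hypothetical name
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Transformations (flatMap, map, reduceByKey) only build the operator
    // graph; nothing executes yet.
    val counts = sc.parallelize(Seq("spark runs on a cluster", "spark is fast"))
      .flatMap(_.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    // collect() is an action: the driver creates a job, the DAG Scheduler
    // splits the graph into stages, and the Task Scheduler launches the
    // resulting tasks on executors via the cluster manager.
    counts.collect().foreach(println)

    spark.stop()
  }
}

Running this prints each (word, count) pair to the driver's console; the same code runs unchanged on a real cluster by replacing "local[*]" with the cluster manager's master URL.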
