in Big Data Hadoop & Spark

How does Apache Spark work?

1 Answer


Spark uses a master/slave architecture consisting of a single central coordinator, the driver, and many distributed workers, the executors. When code is submitted to Spark, the driver program (through its SparkContext) turns it into a job and hands it to the DAG Scheduler, which builds the operator graph and passes stages of tasks to the Task Scheduler; the Task Scheduler then launches those tasks on the executors via the cluster manager. To learn more about the working and components of Apache Spark, please go through this video - Introduction to Apache Spark.
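As a rough conceptual sketch (plain Python with a thread pool, not the actual Spark API), the flow above can be pictured as a "driver" splitting a job into per-partition tasks and a "scheduler" dispatching them to parallel "executors"; the function and data names here are made up for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def count_words(partition):
    # each "executor" processes its own partition independently,
    # like a Spark task running on a worker
    return sum(len(line.split()) for line in partition)

data = [
    "spark uses a driver",
    "and several executors",
    "to run tasks in parallel",
]

# the "driver" splits the input into partitions, one per task
partitions = [[line] for line in data]

with ThreadPoolExecutor(max_workers=3) as pool:
    # the "task scheduler" launches one task per partition
    partial_counts = list(pool.map(count_words, partitions))

# the driver collects and aggregates the partial results
total = sum(partial_counts)
print(total)  # 12
```

In real Spark the partitions live on different machines and the cluster manager (Standalone, YARN, Kubernetes, or Mesos) decides where each task runs, but the divide-dispatch-aggregate shape is the same.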

