A Spark driver (an application's driver process) is the JVM process that hosts the SparkContext for a Spark application. Within the application, it acts as the master node.
It is where the application is split into jobs, stages, and tasks, and where those tasks are scheduled onto executors (via the DAGScheduler and the TaskScheduler). It also hosts the web UI for the application.
The driver coordinates the workers and the overall execution of tasks.
More detail on the role of the driver:
The driver prepares the context and declares operations on the data using RDD transformations and actions.
The driver submits the serialized RDD graph to the master. The master creates tasks from it and submits them to the workers for execution, and coordinates the different job stages.
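To make the lazy-graph-then-tasks flow concrete, here is a minimal sketch in plain Python. This is not Spark code: the class and names (MiniRDD, collect, etc.) are hypothetical stand-ins that only illustrate the idea that transformations are recorded lazily as a lineage graph, and an action is what makes the "driver" turn that graph into one task per partition and run them.

```python
# Illustrative sketch only -- NOT the real Spark API.
# Transformations (map) are recorded lazily; the action (collect)
# builds one task per partition from the lineage graph and runs them,
# mimicking what the driver does when it schedules work on executors.

class MiniRDD:
    def __init__(self, partitions, parent=None, func=None):
        self.partitions = partitions   # list of lists of records
        self.parent = parent           # lineage: parent RDD in the graph
        self.func = func               # per-record transformation, if any

    def map(self, f):
        # Transformation: just extends the graph, no work is done yet.
        return MiniRDD(self.partitions, parent=self, func=f)

    def _make_task(self, i):
        # Walk the lineage back to the source to collect the functions
        # that must be applied to partition i, in order.
        funcs, rdd = [], self
        while rdd.parent is not None:
            funcs.append(rdd.func)
            rdd = rdd.parent
        data = rdd.partitions[i]

        def task():
            out = data
            for f in reversed(funcs):  # apply transformations oldest-first
                out = [f(x) for x in out]
            return out
        return task

    def collect(self):
        # Action: the "driver" creates one task per partition and runs
        # them all (a real scheduler would ship these to executors).
        tasks = [self._make_task(i) for i in range(len(self.partitions))]
        results = []
        for task in tasks:
            results.extend(task())
        return results

rdd = MiniRDD([[1, 2], [3, 4]])        # a source RDD with two partitions
squared = rdd.map(lambda x: x * x)     # lazy: the graph grows, nothing runs
print(squared.collect())               # action triggers execution: [1, 4, 9, 16]
```

In real Spark the same shape holds: transformations like map only extend the RDD lineage, and only an action such as collect makes the driver hand the serialized graph to the scheduler.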