
Execution of a Spark Application

Explore the detailed process of Spark application execution, including job initiation, DAG scheduling, task assignment, and executor operations. Understand how Spark optimizes task placement and manages failures to run big data jobs efficiently.



As discussed earlier, a Spark job is initiated when an action is performed. Internally, this invokes the SparkContext object's runJob(...) method. The call is then passed on to the scheduler. The scheduler runs as part of the driver and has two parts:

  • DAG Scheduler

  • Task Scheduler
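The hand-off described above can be sketched as a toy model in Python. This is not Spark's actual implementation; apart from runJob, every class and method name here is hypothetical, and the stage-splitting logic is a stand-in for the real cut at shuffle boundaries.

```python
# Toy model of the driver-side flow: an action invokes runJob(),
# which hands the job to a DAG scheduler and then a task scheduler.
# All names except runJob are hypothetical illustrations.

class DAGScheduler:
    def split_into_stages(self, job):
        # Real Spark cuts the lineage graph at shuffle boundaries;
        # here we simply pretend every job yields two stages.
        return [f"{job}-stage{i}" for i in range(2)]

class TaskScheduler:
    def submit(self, stages):
        # Real Spark launches one task per partition within each
        # stage; here each stage becomes a single "task" result.
        return [f"ran {s}" for s in stages]

class SparkContext:
    def __init__(self):
        self.dag_scheduler = DAGScheduler()
        self.task_scheduler = TaskScheduler()

    def runJob(self, job):
        # The scheduler runs inside the driver, in two parts.
        stages = self.dag_scheduler.split_into_stages(job)
        return self.task_scheduler.submit(stages)

# An action such as count() or collect() internally triggers runJob:
sc = SparkContext()
print(sc.runJob("count"))  # → ['ran count-stage0', 'ran count-stage1']
```

The point of the sketch is the division of labor: the DAG scheduler decides *what* stages exist, while the task scheduler decides *where and when* the resulting tasks run.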

The DAG scheduler breaks a job into a directed acyclic graph (DAG) of stages ...