Anatomy of a Spark Application
This lesson explains the constituents of a Spark application.
In this lesson, we'll formally look at the various components of a Spark application. A Spark application consists of one or more jobs. Unlike a MapReduce job, however, a Spark job is much broader in scope: each job is a directed acyclic graph (DAG) of stages. A stage is roughly equivalent to a map or reduce phase in MapReduce. The Spark runtime splits each stage into tasks, which are executed in parallel on the partitions of an RDD across the cluster. The relationship among these concepts is depicted below:
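To make the hierarchy concrete, here is a minimal sketch using Spark's Scala API, assuming a local installation; the object name, app name, and master URL are illustrative. A single action produces one job whose DAG contains two stages, separated by a shuffle:

```scala
import org.apache.spark.sql.SparkSession

object JobAnatomy {
  def main(args: Array[String]): Unit = {
    // Illustrative app name and master URL; "local[4]" runs Spark
    // locally with 4 worker threads.
    val spark = SparkSession.builder()
      .appName("job-anatomy")
      .master("local[4]")
      .getOrCreate()
    val sc = spark.sparkContext

    // An RDD with 4 partitions: each stage operating on it is
    // split into 4 parallel tasks, one per partition.
    val nums = sc.parallelize(1 to 100, numSlices = 4)

    // A narrow transformation (map) needs no data movement, so it
    // stays inside a single stage.
    val pairs = nums.map(n => (n % 10, 1))

    // reduceByKey is a wide transformation: it shuffles data across
    // partitions, which marks a stage boundary in the DAG.
    val counts = pairs.reduceByKey(_ + _)

    // The action (collect) submits one job: a DAG of two stages,
    // each executed as 4 tasks.
    counts.collect().foreach(println)

    spark.stop()
  }
}
```

Calling `collect` (an action) is what actually submits the job; the `map` calls are planned but not executed until then. While the application runs, the Spark web UI (by default at port 4040 of the driver) shows this exact breakdown of jobs, stages, and tasks.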