Refinements in Spark
Learn how Spark deals with the faults it faces.
The faults Spark must contend with include worker failures and limited memory. Driver failures can also occur, but Spark provides no tolerance for them.
Managing limited memory
Spark manages limited memory with a Least Recently Used (LRU) eviction policy applied at the level of whole RDDs. Whenever there is insufficient memory to cache a newly computed RDD partition, Spark evicts a partition belonging to the least recently used RDD. ...
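The RDD-level policy above can be sketched in a few lines. This is an illustrative toy, not Spark's actual `MemoryStore`: capacity is counted in partitions rather than bytes, and the class names and structure are our own. The key point it demonstrates is that recency is tracked per RDD, and eviction removes a partition of the least recently used RDD, not simply the least recently used partition.

```python
from collections import OrderedDict


class RDDCache:
    """Toy sketch of RDD-level LRU eviction (not Spark's real cache).

    Capacity is measured in number of partitions for simplicity.
    """

    def __init__(self, capacity):
        self.capacity = capacity         # max partitions held in memory
        self.partitions = {}             # (rdd_id, part_id) -> data
        self.rdd_order = OrderedDict()   # rdd_id -> None, oldest first

    def _touch(self, rdd_id):
        # Mark this RDD as the most recently used one.
        self.rdd_order.pop(rdd_id, None)
        self.rdd_order[rdd_id] = None

    def put(self, rdd_id, part_id, data):
        while len(self.partitions) >= self.capacity:
            # Evict one partition belonging to the least recently used RDD.
            lru_rdd = next(iter(self.rdd_order))
            victim = next(k for k in self.partitions if k[0] == lru_rdd)
            del self.partitions[victim]
            if not any(k[0] == lru_rdd for k in self.partitions):
                del self.rdd_order[lru_rdd]  # no partitions of it remain
        self.partitions[(rdd_id, part_id)] = data
        self._touch(rdd_id)

    def get(self, rdd_id, part_id):
        self._touch(rdd_id)
        return self.partitions.get((rdd_id, part_id))
```

For example, with a capacity of two partitions, caching two partitions of RDD 1 and then one partition of RDD 2 evicts a partition of RDD 1, since RDD 1 is the least recently used RDD at that point.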