
Implement LRU Cache

Explore how to implement an LRU cache that supports efficient get and set functions in constant time. Understand the combination of hashmaps and doubly linked lists to maintain cache order and perform eviction. Gain the skills to handle cache size limits and optimize for quick lookups and updates.

Statement

Implement an LRU cache class with the following functions:

  • LRUCache(size): Initializes an LRU cache with the capacity size.
  • get(key): Returns the value of the key, or -1 if the key does not exist.
  • set(key, value): Adds a new key-value pair or updates an existing key with a new value.

After adding a new key, if the number of keys exceeds the cache capacity, evict the least recently used key and add the new one.
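The standard way to get constant-time `get` and `set` is to combine a hashmap (for O(1) lookups) with a doubly linked list (for O(1) reordering and eviction). Below is a minimal sketch of that approach; the helper names `_remove` and `_add_to_front` and the use of sentinel head/tail nodes are implementation choices, not part of the stated interface.

```python
class Node:
    """Doubly linked list node holding one key-value pair."""
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    """LRU cache: hashmap for lookups, doubly linked list for order.

    The list runs head sentinel -> most recently used ... least
    recently used -> tail sentinel.
    """

    def __init__(self, size):
        self.capacity = size
        self.map = {}  # key -> Node
        # Sentinel nodes avoid edge cases when splicing at the ends.
        self.head = Node(None, None)
        self.tail = Node(None, None)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        # Unlink a node from the list in O(1).
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_to_front(self, node):
        # Insert a node right after the head sentinel in O(1).
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        # This key is now the most recently used: move it to the front.
        self._remove(node)
        self._add_to_front(node)
        return node.value

    def set(self, key, value):
        if key in self.map:
            # Update in place and mark as most recently used.
            node = self.map[key]
            node.value = value
            self._remove(node)
            self._add_to_front(node)
            return
        if len(self.map) == self.capacity:
            # Evict the least recently used entry (just before the tail).
            lru = self.tail.prev
            self._remove(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._add_to_front(node)
```

Both sentinels exist only to simplify splicing: `_remove` and `_add_to_front` never have to check whether a node is the first or last real element.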

A cache is usually not big enough to store the complete data set. Whenever the cache becomes full, we need to evict some data from it. Several caching algorithms implement a cache eviction policy; LRU is a simple and commonly used one. The core concept of the LRU algorithm is to evict the least recently used data from the cache to make room for new data.

Example

Let’s take an example of a cache with a capacity of four elements. The diagram below represents the cache state after accessing all four elements in the order 1, 2, 3, and 4. The elements are stored so that the head of the cache holds the most recently accessed element and the tail holds the least recently used one.

[Diagram: cache order, head → 4, 3, 2, 1 ← tail]
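This ordering can be reproduced with a quick sketch using Python's `collections.OrderedDict` as a stand-in for the cache (the front of the ordered dict plays the role of the head):

```python
from collections import OrderedDict

cache = OrderedDict()
for element in [1, 2, 3, 4]:
    cache[element] = element
    # Move the just-accessed element to the front (most recently used).
    cache.move_to_end(element, last=False)

# Head-to-tail order after accessing 1, 2, 3, 4:
print(list(cache))  # [4, 3, 2, 1]
```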

Now, we need to cache another element, 5 ...