The LRU Cache
Get introduced to the Least Recently Used (LRU) caching strategy and its benefits in distributed systems.
The LRU cache is a popular caching strategy that discards the least recently used items first. This eviction policy is based on the assumption that items that have not been accessed recently are less likely to be accessed in the near future. For example, if a cache with a capacity of two holds items A and B, reading A and then inserting C evicts B, because B is the entry that was used least recently. Let us look at how we can implement an LRU cache in Go.
Setup
First, we need to define a data structure that represents a cache entry. Each entry holds the key and value of the cached item, together with prev and next pointers, so the entries also serve as the nodes of a doubly linked list that tracks how recently each entry was used. We also define an LRUCache struct that combines this list with a map for fast lookups.
type LRUCache struct {
	capacity int                 // maximum number of entries the cache can hold
	size     int                 // current number of entries in the cache
	cache    map[int]*CacheEntry // maps keys to their entries
	head     *CacheEntry         // first (most recently used) entry
	tail     *CacheEntry         // last (least recently used) entry
}

type CacheEntry struct {
	key   int
	value int
	prev  *CacheEntry
	next  *CacheEntry
}
The LRUCache struct contains the following:
- A capacity field that represents the maximum number of entries the cache can hold.
- A size field that represents the current number of entries in the cache.
- A cache map that maps keys to entries.
- head and tail pointers that point to the first and last entries in the doubly linked list.
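Given these fields, a constructor could look like the following minimal sketch. The function name NewLRUCache and the choice to start with nil head and tail pointers are assumptions made for illustration, not part of the lesson's code:

// NewLRUCache is an assumed constructor name, shown here only as a sketch.
// It returns an empty cache that can hold at most capacity entries.
func NewLRUCache(capacity int) *LRUCache {
	return &LRUCache{
		capacity: capacity,
		size:     0,
		cache:    make(map[int]*CacheEntry),
		head:     nil, // the list starts out empty
		tail:     nil,
	}
}

Explicitly setting size, head, and tail to their zero values is optional in Go; it is done here only to make the initial state obvious.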
The CacheEntry struct contains the following:
- A unique key (key) for every entry. ...
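The rest of this list, and the full implementation, follow later in the lesson. As a rough preview of how the two structs typically work together, the sketch below keeps the most recently used entry at the head of the list and evicts from the tail. The method names Get and Put, the helpers remove and addToFront, the (value, ok) return shape, and the assumption that capacity is greater than zero are all choices made for this sketch rather than the lesson's API:

// remove unlinks an entry from the doubly linked list.
func (c *LRUCache) remove(e *CacheEntry) {
	if e.prev != nil {
		e.prev.next = e.next
	} else {
		c.head = e.next
	}
	if e.next != nil {
		e.next.prev = e.prev
	} else {
		c.tail = e.prev
	}
	e.prev, e.next = nil, nil
}

// addToFront links an entry at the head, marking it most recently used.
func (c *LRUCache) addToFront(e *CacheEntry) {
	e.prev = nil
	e.next = c.head
	if c.head != nil {
		c.head.prev = e
	}
	c.head = e
	if c.tail == nil {
		c.tail = e
	}
}

// Get looks up a key and, on a hit, moves its entry to the front of the list.
func (c *LRUCache) Get(key int) (int, bool) {
	e, ok := c.cache[key]
	if !ok {
		return 0, false
	}
	c.remove(e)
	c.addToFront(e)
	return e.value, true
}

// Put inserts or updates a key. When the cache is full, the tail entry
// (the least recently used one) is evicted first. Assumes capacity > 0.
func (c *LRUCache) Put(key, value int) {
	if e, ok := c.cache[key]; ok {
		e.value = value
		c.remove(e)
		c.addToFront(e)
		return
	}
	if c.size == c.capacity {
		evicted := c.tail
		c.remove(evicted)
		delete(c.cache, evicted.key)
		c.size--
	}
	e := &CacheEntry{key: key, value: value}
	c.cache[key] = e
	c.addToFront(e)
	c.size++
}

Every access moves the touched entry to the head of the list, so the tail always holds the least recently used entry, which is the one discarded when the cache is full.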