...

Motivation and Requirements for a Many-core Approach

Learn about the problems of scaling the Memcached key-value store on a many-core processor.

Why we use key-value stores

Memcached is a key-value store used by large platforms such as Facebook and LinkedIn to read and write data quickly. Applications that require repeated fast reads and writes are ideal candidates for a caching layer, which can be implemented using a key-value store.

Traditional architecture (left) vs. architecture with key-value store (right)

The image above shows how Memcached sits between the frontend and the database to provide faster response times. A web architecture that uses a memory-resident key-value store can be many times faster than one that relies on disk-based storage: a typical RAM access takes about 100 nanoseconds, while a typical hard disk access takes about four milliseconds, a difference of roughly 40,000×. Key-value stores can be used to store:

  • User preferences

  • Users' shopping carts

  • Real-time product recommendations

As these platforms' user bases grow, scaling key-value stores has become essential, and one of the hurdles to scaling them is financial cost, much of it the electricity needed to run them. According to one study, the cost of powering the servers in a modern data center can be up to 50% of a typical three-year total cost of ownership (TCO).

Problem statement

To reduce the financial cost of data centers hosting key-value stores, we need to reduce their power consumption (power is measured in watts, where one watt equals one joule per second). To do that, we want to maximize energy efficiency: increase the performance of our system while decreasing the power it consumes.
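The lesson does not pin down a metric at this point, but a common way to express this goal is performance per watt, for example, the number of operations served per second divided by the power the system draws:

\[
\text{energy efficiency} = \frac{\text{performance}}{\text{power}} = \frac{\text{operations per second}}{\text{watts}}
\]

Under this metric, a design improves if it raises throughput, lowers power draw, or both.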

The Memcached key-value store has a GET operation that can retrieve multiple keys in a single call (often called a batched GET), which reduces overall network traffic. However, this batch operation requires relatively more RAM than a simple GET ...
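As an illustration of the difference between a simple GET and a batched GET, here is a minimal sketch using the third-party pymemcache Python client against a local Memcached instance; the keys and values are made up for the example and are not part of the lesson:

```python
from pymemcache.client.base import Client

# Connect to a local Memcached instance (assumed to be running on the
# default port 11211).
client = Client(("localhost", 11211))

client.set("user:1", "alice")
client.set("user:2", "bob")

# Simple GET: one network round trip per key.
one = client.get("user:1")  # returns b'alice' (values come back as bytes by default)

# Batched GET: several keys fetched in a single request/response,
# which reduces network traffic compared to issuing many simple GETs.
many = client.get_many(["user:1", "user:2"])

print(one, many)
```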