Batching and Caching with Promises

Promises are a great tool for implementing asynchronous batching and caching of requests. Let’s see why.

There are two properties of promises that can be exploited to our advantage in this circumstance:

  • Multiple then() listeners can be attached to the same promise.
  • A then() listener is guaranteed to be invoked exactly once, even if it’s attached after the promise has already settled. Moreover, then() is guaranteed to always be invoked asynchronously.

In short, the first property is exactly what we need for batching requests, while the second means that a promise is already a cache for the resolved value and offers a natural mechanism for returning a cached value in a consistent, asynchronous way. Together, these properties make batching and caching extremely simple and concise to implement with promises.
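The two properties above are easy to observe directly. The following sketch (the value 42 and the timeouts are arbitrary, just for illustration) attaches one listener immediately and another one well after the promise has settled; both are invoked, and both asynchronously:

```javascript
// A promise that is already resolved when we attach listeners to it.
const promise = Promise.resolve(42)

// First listener: attached right away.
promise.then(value => console.log(`first listener: ${value}`))

// Second listener: attached after the promise has long been settled.
// It is still invoked, exactly once, and asynchronously.
setTimeout(() => {
  promise.then(value => console.log(`late listener: ${value}`))
}, 100)

console.log('listeners attached') // always printed before either listener runs
```

This is why a settled promise can act as a cache: any consumer that asks for the value later simply attaches a new then() listener and receives the cached result through the same asynchronous interface as the first consumer.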

Batching requests in the total sales web server

Let’s now add a batching layer on top of our totalSales API. The pattern we’re going to use is very simple: if there’s another identical request pending when the API is invoked, we’ll wait for that request to complete instead of launching a new one. As we’ll see, this can easily be implemented with promises. In fact, all we have to do is save the promise in a map, associating it with the specified request parameters (the product type, in our case), every time we launch a new request. Then, at every subsequent request, we check whether there’s already a promise for the specified product: if there is, we just return it; otherwise, we launch a new request.

Now, let’s see how this translates into code. Let’s create a new module named totalSalesBatch.js. Here, we’re going to implement a batching layer on top of the original totalSales() API.
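As a minimal sketch of the idea, the batching layer below keeps a map of in-flight promises keyed by product type. The totalSales() function here is a hypothetical stand-in that simulates the original slow query (its delay and return value are made up for illustration); the batching logic itself is what matters:

```javascript
// Hypothetical stand-in for the original totalSales() API: it simulates
// a slow query that eventually resolves with a total for the product type.
function totalSales (product) {
  return new Promise(resolve => {
    setTimeout(() => resolve(product.length * 10), 100) // fake total
  })
}

// Map of currently pending requests, keyed by product type.
const runningRequests = new Map()

function totalSalesBatch (product) {
  if (runningRequests.has(product)) {
    // An identical request is already pending: return its promise
    // instead of launching a new request.
    return runningRequests.get(product)
  }

  // No pending request for this product: launch a new one and save
  // its promise in the map, removing it once the request settles.
  const resultPromise = totalSales(product)
    .finally(() => runningRequests.delete(product))
  runningRequests.set(product, resultPromise)
  return resultPromise
}

// Two concurrent requests for the same product share a single
// underlying totalSales() invocation.
totalSalesBatch('item1').then(total => console.log(`total: ${total}`))
totalSalesBatch('item1').then(total => console.log(`total: ${total}`))
```

Note that the entry is removed from the map when the promise settles (via finally()), so this layer only batches concurrent identical requests; it does not cache results after completion. Caching is the natural next step: keep the settled promise in the map instead of deleting it.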
