Rate Limiting
Learn about rate limiting and how to implement it in Go.
What is rate limiting?
Rate limiting is a technique used to control the rate of traffic sent or received by a service. It is important for preventing a service from being overwhelmed by too many requests. Rate limiting is often used in APIs, where clients send requests to a server to retrieve data or perform some action. The physical resources underlying the service will be exhausted if it is overloaded beyond a point. At that point, the system will protect itself by either killing the process that is consuming too many resources, which in this case would be our service, or by restarting the entire virtual machine to begin again from a clean slate. Neither scenario is appealing. Another approach is to control the amount of incoming traffic that is accepted and processed, avoiding these unpleasant outcomes. This is the approach we'll discuss at length in this lesson.
What is a semaphore?
A semaphore is a generic name for a synchronization mechanism used in operating systems and computer programs to control access to a shared resource by multiple processes or threads. It is essentially a counter that limits the number of processes or threads that can simultaneously access a resource. We will use a similar approach to implement our rate limiter.
New rate limiter middleware
Let’s start by creating a new middleware. This middleware should dictate whether a new incoming request is to be processed.