Load Balancers

Learn about load balancers and the services they offer.

Overview

Millions of requests can arrive per second in a typical data center. To serve these requests, thousands (or even hundreds of thousands) of servers work together to share the load of incoming requests.

Note: Here, it’s important to consider how the incoming requests will be divided among all the available servers.

A load balancer (LB) is the answer to this problem. The job of the load balancer is to divide all client requests fairly among the pool of available servers. Load balancers perform this job to avoid overloading or crashing any individual server.
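As a rough sketch of this idea, the snippet below hands out requests to a pool of servers in round-robin order, one common way of dividing load fairly. The RoundRobinBalancer class and the server addresses are made up for illustration; a real load balancer also deals with connections, retries, health checks, and much more.

```python
from itertools import cycle

class RoundRobinBalancer:
    """Hands each incoming request to the next server in the pool, in turn."""

    def __init__(self, servers):
        self._servers = cycle(servers)  # endlessly iterate over the pool

    def next_server(self):
        """Return the server that should handle the next request."""
        return next(self._servers)

# Hypothetical pool of servers sitting behind the load balancer.
pool = ["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"]
balancer = RoundRobinBalancer(pool)

for request_id in range(6):
    print(f"request {request_id} -> {balancer.next_server()}")  # each server gets every third request
```

With three servers, requests 0 through 5 are spread as 1, 2, 3, 1, 2, 3, so no single server receives all the traffic.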

The load balancing layer is the first point of contact within a data center after the firewall. A load balancer may not be required if a service handles only a few hundred or even a few thousand requests per second. However, as the number of client requests grows, load balancers provide the following capabilities:

  • Scalability: By adding servers, the capacity of the application/service can be increased seamlessly. Load balancers make such upscaling or downscaling transparent to the end users.
  • Availability: Even if some servers go down or suffer a fault, the system still remains available. One of the jobs of load balancers is to hide the faults and failures of the servers.
  • Performance: Load balancers can forward requests to servers with a lower load so that users get quicker response times. This not only improves performance but also improves resource utilization (see the sketch after this list).
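To make the availability and performance points above concrete, here's a minimal sketch of health-aware, least-loaded server selection: the balancer skips servers marked unhealthy and, among the healthy ones, picks the one with the fewest active connections. The Server class, the addresses, and the load figures are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Server:
    address: str
    healthy: bool = True         # updated by the load balancer's health checks
    active_connections: int = 0  # rough measure of the server's current load

def pick_server(pool):
    """Choose the healthy server with the fewest active connections."""
    healthy = [s for s in pool if s.healthy]
    if not healthy:
        raise RuntimeError("no healthy servers available")
    return min(healthy, key=lambda s: s.active_connections)

pool = [
    Server("10.0.0.1:8080", active_connections=12),
    Server("10.0.0.2:8080", healthy=False),   # failed server is hidden from clients
    Server("10.0.0.3:8080", active_connections=3),
]

target = pick_server(pool)
print(f"forwarding request to {target.address}")  # -> 10.0.0.3:8080 (healthy and least loaded)
```

Because the failed server is never selected, clients still see an available system; because the least-loaded server is preferred, response times stay low and capacity is used more evenly.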

Here’s an abstract depiction of how load balancers work:

Placing load balancers

Generally, LBs sit between clients and servers: requests pass through the load balancing layer on their way to the servers and back to the clients. However, that isn’t the only point where LBs are used.

Let’s consider the three well-known groups of servers: web servers, application servers, and database servers. To divide the traffic load among the available servers, LBs can be placed between the instances of these three services in the following ways:

  • Place LBs between end users of the application and web servers/application gateway.
  • Place LBs between the web servers and application servers that run the business/application logic.
  • Place LBs between the application servers and database servers.

In reality, load balancers can potentially be used between any two services with multiple instances in a system’s design.
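As a sketch of that idea, the same selection logic can be reused at every tier: once between clients and web servers, once between web and application servers, and once between application servers and database replicas. The tier names, addresses, and the random-choice stand-in below are purely illustrative; an actual deployment would run a separate load balancer (with its own algorithm and health checks) at each of these points.

```python
import random

# Hypothetical instance pools for each tier.
WEB_SERVERS = ["web-1:80", "web-2:80"]
APP_SERVERS = ["app-1:9000", "app-2:9000", "app-3:9000"]
DB_REPLICAS = ["db-1:5432", "db-2:5432"]

def balance(pool):
    """Stand-in for a load balancing tier: pick one instance from the pool.

    Random choice keeps the sketch short; a real LB would use round robin,
    least connections, or a similar strategy combined with health checks.
    """
    return random.choice(pool)

def handle_client_request(request):
    web = balance(WEB_SERVERS)   # LB between end users and web servers
    app = balance(APP_SERVERS)   # LB between web and application servers
    db = balance(DB_REPLICAS)    # LB between application and database servers
    return f"{request}: client -> {web} -> {app} -> {db}"

print(handle_client_request("GET /home"))
```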

Services offered by load balancers

LBs not only enable services to be scalable, available, and highly performant, but also offer key services such as the following:

  • Health checking: LBs use the heartbeat protocol (a way of identifying failures in distributed systems, in which every node in a cluster periodically reports its health to a monitoring service) to monitor the health, and therefore the reliability, of end servers. Another advantage of health checking is an improved user experience.
  • TLS termination: LBs reduce the burden on end servers by handling TLS termination. TLS termination, also called TLS/SSL
...