Applications are central to computing. Without applications, there would be no computing. Over the years, applications have evolved significantly, driving the evolution of computing technology as well. A careful look at computing history reveals that it has gone through cycles of centralized and distributed styles.
In this blog, we'll take a look at the history of computing and observe how applications evolved to cause shifts in computing technology. We’ll notice something really interesting about how computing history has cycled between centralized and distributed deployment styles over the years. So let's dive right in.
Note: For the purpose of this blog, we'll limit the history to the electronic computer.
The early computers of the 1950s were large mainframe computers. The production quantities were small, physical sizes were large, and prices were high. Only large private organizations, governments and research institutions could afford these.
Typical applications that ran on these computers were large batch jobs such as payroll processing, census tabulation, and scientific computation.
Many people would share a mainframe. Computing was centralized; there'd be a central computer room housing the mainframe(s). Computational jobs would be submitted to run in either batch mode or time-sharing.
In batch mode, the submitted applications would be run one after the other. In time-sharing, multiple users would be connected to the mainframe through remote terminals at the same time, while their programs would use the computer in a time-sliced fashion.
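To make the difference concrete, here's a minimal Python sketch contrasting batch execution with round-robin time-sharing. The job names, durations, and time quantum are made-up illustrative values, not a model of any real mainframe operating system:

```python
from collections import deque

# Hypothetical jobs and their lengths in abstract work units (illustrative only).
jobs = {"payroll": 6, "census": 4, "simulation": 5}

def run_batch(jobs):
    """Batch mode: run each job to completion, one after the other."""
    clock = 0
    for name, work in jobs.items():
        clock += work
        print(f"[batch] {name} finished at t={clock}")

def run_time_shared(jobs, quantum=2):
    """Time-sharing: give each job a fixed time slice in turn until all are done."""
    clock = 0
    queue = deque(jobs.items())
    while queue:
        name, remaining = queue.popleft()
        ran = min(quantum, remaining)
        clock += ran
        if remaining - ran > 0:
            queue.append((name, remaining - ran))  # unfinished job goes to the back of the line
        else:
            print(f"[time-shared] {name} finished at t={clock}")

run_batch(jobs)
run_time_shared(jobs)
```

In batch mode, later jobs wait for everything ahead of them; with time-sharing, every connected user sees their job make steady progress, which is what made interactive terminals practical.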
The challenge, whether the mainframe was used on-site or remotely, was latency. Since these were mostly batch-mode applications which would take some time to run anyway, this wasn't much of an issue.
In the 1970s and 1980s, smaller personal computers were developed. Production quantities increased and prices dropped. Individuals could now afford to have a computer of their own. With a computer by their side, latency dropped significantly.
Applications of a more personal nature could be run on these computers, for example, word processing, spreadsheets, and games.
In this era, computing became predominantly distributed. We went from centralized mainframes to distributed personal computers. People could have one in their homes, and small businesses could keep them in their offices.
Personal computers were great, but applications kept getting more complex. At the same time, handheld devices like personal digital assistants (PDAs) and smartphones were invented. These devices had limited memory, computing power, and battery life. Complex applications couldn't be run on them with a reasonable user experience. Examples include finding optimal routes on a map, processing financial data, and playing complex multiplayer games.
Cloud computing emerged in the late 1990s. Large centralized data centers were built with lots of computing equipment. In the cloud computing model, this enormous computing power could be used remotely, on demand, over the Internet. This enabled users on handheld devices and personal computers to enjoy complex applications.
Examples of players in the cloud computing business include Salesforce, Amazon Web Services, Google Cloud Platform, Microsoft Azure, and Digital Ocean.
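As a rough illustration of the on-demand idea, here is a sketch using AWS's boto3 SDK: rent a virtual machine in a remote data center only when you need it, and release it when you're done. The AMI ID is a placeholder, and actually running this would require AWS credentials and incur charges:

```python
import boto3

# Connect to the EC2 service in a chosen region (region is an arbitrary example).
ec2 = boto3.client("ec2", region_name="us-east-1")

# Ask the data center for a small virtual machine on demand.
response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",   # placeholder machine image ID
    InstanceType="t3.micro",  # a small, inexpensive instance size
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Started on-demand instance {instance_id}")

# ...do the heavy computation remotely, then release the machine
# (and stop paying for it) as soon as the job is done.
ec2.terminate_instances(InstanceIds=[instance_id])
```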
With the emergence of cloud computing, computing power was predominantly concentrated in centralized cloud data centers. So computing, which was centralized in the beginning and later became distributed, became centralized once more.
Again, since the typical user was far from the cloud data centers, latency increased. For computationally complex applications with flexible latency requirements, this was still OK.
It wasn't long before real-time applications requiring low latency became hot. Examples include autonomous vehicles, virtual reality, augmented reality, remote robotic surgery, and the Internet of Things (IoT). For such applications, events arising on the client side had to be sent to the server, processed there, and the response returned to the client on a very tight timeline.
To be able to enjoy such computationally complex applications from handheld devices, we had to leverage the cloud infrastructure. However, the round trip time from a typical user to the nearest cloud data center would easily run into hundreds of milliseconds, which is unacceptable for latency-sensitive applications.
The centralized cloud computing model failed to meet the demands of latency-sensitive applications. To enable such complex yet latency-sensitive applications, it was proposed that cloudlets or micro data centers should be deployed close to the users. One possible deployment scenario is installing some servers, networking, and storage devices at the cellular base stations. That way, all users in the vicinity would be served through this cloudlet.
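Here's a toy Python sketch of the underlying idea: measure the round-trip time to each candidate endpoint and prefer whichever one fits the application's latency budget. The hostnames and the 20 ms budget are made-up placeholders, not real services or measurements:

```python
import socket
import time

# Hypothetical endpoints: a cloudlet at a nearby base station vs. a distant data center.
ENDPOINTS = {
    "nearby-cloudlet.example.net": 443,
    "central-cloud.example.net": 443,
}
LATENCY_BUDGET_MS = 20  # e.g., roughly what an AR/VR interaction might tolerate

def measure_rtt_ms(host, port, timeout=1.0):
    """Rough round-trip estimate: time taken to open a TCP connection."""
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.monotonic() - start) * 1000

for host, port in ENDPOINTS.items():
    try:
        rtt = measure_rtt_ms(host, port)
    except OSError:
        continue  # endpoint unreachable in this sketch
    if rtt <= LATENCY_BUDGET_MS:
        print(f"Use {host}: {rtt:.1f} ms fits the {LATENCY_BUDGET_MS} ms budget")
        break
else:
    print("No endpoint meets the latency budget")
```

A distant data center simply cannot beat the speed-of-light and network-hop delays, so the only way to fit a tight budget is to move the compute closer to the user.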
Notice how computing is becoming more distributed in this era. We're going from centralized data centers to distributed cloudlets.
Computer applications have evolved in nature and requirements over the years. Technology evolved to meet these requirements. As a result, we have seen computing go from centralized mainframes to distributed personal computers, then back to centralized in the form of the cloud, only to be distributed again in the form of edge computing. Will it take a centralized shift again? Who knows?