The tech industry is currently in a massive paradigm shift — and of course, this is not the first time.
Every decade or so, we see a major change that revolutionizes tech's status quo. Broadband internet, the cloud, and mobile have been powerful shifts since the 2000s. Before that, industry changes were driven by personal computing and the World Wide Web.
Cloud computing really took off in the mid-2000s. Amazon released AWS in 2006, Google and Microsoft followed suit, and today, millions of people use cloud-based applications for daily tasks. With built-in backups and recovery options, the cloud gives us greater data security and availability. Meanwhile, cloud technology has enhanced real-time collaboration, especially among remote teams, leading to increased productivity. By providing access to off-premise computing and storage, the cloud allowed organizations and individuals to transcend the limits of their own hardware.
The mobile shift boomed in parallel with the cloud. iPhones and Android phones swept the US market in the late '00s. In-person services became accessible from our very pockets. As smartphone-owners came to expect on-the-go options for handling professional and personal tasks, almost every app developed a mobile counterpart. The mobile era connected us more than ever and brought the internet to the fingertips of millions.
Fast forward to today: the 2020s are shaping up to be the decade of AI, and especially of Generative AI.
The Generative AI shift exploded when OpenAI unveiled ChatGPT in late 2022. Since then, companies big and small have launched new Generative AI products and Large Language Models (LLMs). Meanwhile, hyperscalers from AWS to Oracle have had to upgrade their infrastructure to run such demanding technologies. Amid the excitement, McKinsey estimated that Generative AI's productivity benefits could add $2.6 trillion to $4.4 trillion in value to the global economy.
Over the last year, AI has integrated into more of our everyday lives while companies have raced to expand AI services. AI-powered analytics are hyper-personalizing the user experience with relevant recommendations, from streaming services to online shopping. Meanwhile, virtual assistants and search engines are getting better at understanding and responding to human queries.
Like every other era, the Generative AI shift is changing our software development practices — and bringing a great deal of uncertainty.
But even in uncharted territories, tech history shows us that we can expect some things to remain constant.
During paradigm shifts, there's often a narrative that new technologies are "superior" and will overcome their predecessors. But in reality, emerging technologies are built upon their predecessors. For instance, both the mobile and cloud eras were built upon the backbone of broadband internet. And undoubtedly, the Generative AI boom is built upon the highly scalable cloud.
If anything, shifts just lead to technologies getting increasingly interconnected. We already live in a world where it's difficult to build an application without interacting with cloud services and AI.
Let's imagine we're creating a large-scale application, such as an online store. We may be running it in the cloud, and leveraging other apps that rely on the cloud, like:
Payment processing services (e.g., Stripe)
Messaging services (e.g., Twilio)
At this point, we're already leveraging the cloud both directly and indirectly. Now let's add APIs to the picture, like a chatbot API. Because the LLM behind that API also runs in the cloud, our direct and indirect usage of both cloud and AI multiplies.
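The fan-out described above can be sketched in code. The service classes below are hypothetical stand-ins for real SDKs (a payment service like Stripe, a messaging service like Twilio, an LLM-backed chatbot API); the point is how a single checkout request in our cloud-hosted app pulls in several more cloud and AI services behind the scenes.

```python
# Hypothetical stand-ins for third-party cloud SDKs. Each class represents
# a service that itself runs in the cloud, so calling it adds an "indirect"
# cloud dependency to our app.

class PaymentGateway:
    """Stands in for a cloud payment service (e.g., Stripe)."""
    def charge(self, user, amount_cents):
        return {"user": user, "amount": amount_cents, "status": "succeeded"}

class MessagingService:
    """Stands in for a cloud messaging service (e.g., Twilio)."""
    def send(self, user, text):
        return {"to": user, "body": text}

class ChatbotAPI:
    """Stands in for an LLM-backed API: cloud *and* AI usage at once."""
    def ask(self, question):
        return f"(LLM answer to: {question})"

def checkout(user, cart_total_cents):
    """One direct request from our app triggers three more cloud services."""
    payment = PaymentGateway().charge(user, cart_total_cents)
    receipt = MessagingService().send(user, f"Charged {cart_total_cents} cents")
    support = ChatbotAPI().ask("Where is my order?")
    return payment, receipt, support
```

In a real deployment, each stubbed method would be a network call to a managed service, which is exactly why it's now hard to build an application without touching the cloud and AI.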
Emerging technologies also empower each other in what could be considered symbiotic relationships.
For instance, the mobile shift empowered the cloud. Users expected their devices to do more and solve bigger problems. Take Google Maps. Maps wouldn't have had to scale its servers so dramatically if mobile hadn't existed. But mobile raised expectations: users demanded real-time directions and traffic updates, and suddenly a phone had to be as capable as a dedicated GPS device. Getting there required extensive server-side engineering (compared to desktop Maps, which was relatively simple and static).
Similarly, mobile and cloud technologies supported the advancement of AI. Both technologies enabled the mass of data we refer to as "Big Data." Millions of people interact with their smartphones every day, producing a huge variety and volume of data. Meanwhile, cloud infrastructure made it possible to store and process these large datasets. Without big data, we wouldn't have such powerful AI systems today.
As new technologies build upon each other, shifts don't replace all we've ever known — but they do require us to rebuild.
Every paradigm shift ushers in new challenges.
As we adapt, new development and IT practices arise in the industry. The cloud gave way to distributed systems (entire systems spread across networked devices in different locations), which in turn spawned the new discipline of system design. And in keeping with the constant need to scale and modify systems, microservices architecture became a staple of cloud applications.
As the masses adopted smartphones, more people were using these services simultaneously. This led to unprecedented traffic, and all the while, users still expected available, performant applications. Engineers suddenly had to mitigate new failure modes to keep services running, such as the thundering herd problem, in which a sudden surge of simultaneous requests overwhelms a system's capacity and can bring it down. (We recently discussed this in the context of Ticketmaster.)
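One common mitigation for thundering herds is exponential backoff with jitter: instead of every failed client retrying at the same instant and stampeding the server again, each client waits a random delay drawn from an exponentially growing window. Here's a minimal sketch; the function name and default values are illustrative, not a standard API.

```python
import random

def retry_delay(attempt, base=0.1, cap=30.0):
    """Return a randomized ("full jitter") backoff delay in seconds.

    attempt: how many retries have already failed (0 for the first retry).
    The retry window doubles with each attempt but is capped, and the
    actual delay is a uniform random draw from that window, which spreads
    clients out in time instead of letting them retry in lockstep.
    """
    window = min(cap, base * (2 ** attempt))
    return random.uniform(0.0, window)
```

A client would sleep for `retry_delay(attempt)` before each retry, so even if thousands of clients fail at the same moment, their retries arrive smeared across the window rather than as one spike.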
Now, the Generative AI era is raising unique issues around security and authentication, and new practices will have to be developed in response. On top of that, infrastructural constraints pose a great challenge for the AI boom: most cloud servers run on general-purpose CPUs that are not optimized for AI workloads. As a result, hyperscalers from Google to Amazon are scrambling to upgrade their infrastructure.
Paradigm shifts always face limitations — but luckily, constraints tend to foster creative solutions.
Our familiar example, Google Maps, is still a very present, very live assistant, but only because the app was rebuilt and re-engineered to scale. That required deep server-side cloud engineering. Likewise, Generative AI's limitations are paving the way for rethinking and rebuilding our infrastructure (and there's a lot of work ahead for developers with cloud skills).
We tend to have an exaggerated perception of how quickly new technologies will change the world. But in reality, change takes time.
Consider cloud computing. It now makes up a significant portion of today's computing market: according to Grand View Research, the global cloud computing market was valued at over $438 billion in 2022. Even so, it took nearly 20 years for cloud technology to reach this point. In its first ten years, adoption was relatively slow. And even now, not all computing happens in the cloud. While the cloud revolutionized a lot, it didn't completely replace the status quo.
Even Generative AI's history has been a long time in the making. After all, Generative AI is a breakthrough in Natural Language Processing (NLP) — a field that can be traced back to the 1950s. While Generative AI is a shiny new ML technology, we've had AI-powered voice assistants, transportation systems, and search engines for decades now.
We can't be fully certain what tech's future will look like with Generative AI. But these changes will come slowly, if for no other reason than that we simply don't have enough computing power available in the world. We can't build chips fast enough, and those constraints can't be engineered away overnight.
And regardless of the hype (or the fears), I can assure you that the need for developers is not going to diminish just because of Generative AI. You still have time to learn to leverage ML/AI, and frankly, taking up that opportunity will only help you stay competitive.
It is important to keep an eye on next year’s software development trends and prepare yourself for the change. Each paradigm shift opens a new dimension of how we work with technology, and new genres of problems we have to deal with. Similarly, they also create many opportunities for developers, so long as they choose to step up their game.
Over time, though, the more interesting jobs will require cloud engineering and ML skills. If you want to play a role in rebuilding tech's future, you can start fostering relevant skills:
Deep understanding of cloud services
Server-side cloud engineering skills
NLP skills
At Educative, we offer thousands of courses, Skill Paths, Projects, and CloudLabs to help you prepare for the changes ahead.
Here are a few that I think you might find interesting:
We also offer CloudLabs, a setup-free way to gain hands-on skills with AWS services from SageMaker to Lambda. Browse our growing catalog of 100+ CloudLabs; we're sure you'll find something that interests you.
Happy learning!