The Serverless Landscape
Learn about serverless applications and the various ways to run them.
Serverless computing is a deployment model that allows us to build and run applications without having to manage servers. It describes a paradigm in which a provider handles the routine work of provisioning, maintaining, and scaling the server infrastructure, while developers simply package and upload their code for deployment. Serverless applications may scale up and down automatically as needed with no additional setup required by the developer.
According to the CNCF Serverless Whitepaper published by the CNCF Serverless Working Group, there are two main serverless personas:
- Developers: They write code for the serverless platform, which gives them the impression that there are no servers and that the code is always up and running.
- Providers: They deploy the serverless platform.
The service provider still has to manage servers or containers, and the platform incurs some cost even when idle. A self-hosted system can also be considered serverless: in that case, one team typically acts as the provider while another acts as the developer.
Benefits of serverless computing
Deploying applications as serverless services has become a popular architectural style. By serverless, we aren't referring only to the Functions as a Service (FaaS) style but also to applications that run in containers, can scale down to zero, and require no server management. Kubernetes, by itself, is great. However, it can be intimidating for a developer who wants to focus solely on development rather than operations.
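To make the container style concrete, here is a minimal sketch of the kind of application such platforms run. It isn't taken from any particular project: the greeting and the default port are placeholder choices, and the only contract assumed is the common convention that the platform injects the listening port through the PORT environment variable.

```go
package main

import (
	"fmt"
	"log"
	"net/http"
	"os"
)

func main() {
	// Serverless container platforms typically tell the app which port
	// to listen on via the PORT environment variable; default to 8080
	// for local runs (e.g., with `go run` or `docker run`).
	port := os.Getenv("PORT")
	if port == "" {
		port = "8080"
	}

	// A stateless handler: the platform can run many replicas of it,
	// or none at all while no requests are coming in.
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "Hello from a serverless container!")
	})

	log.Fatal(http.ListenAndServe(":"+port, nil))
}
```

Packaged into an ordinary container image, an app like this needs no platform-specific SDK, which matters for the portability argument below.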
Various ways to run serverless computing
Serverless apps can be run in several ways. As seen in the figure below, various tools, frameworks, and platforms exist to build and run serverless applications.
We should see Knative listed in the figure above in the lower-right section labeled Installable Platforms. We have chosen the Knative project out of all the options in that section because it has a supportive community on Slack and continuously improving documentation. Most importantly, any image deployed on Knative can be deployed on any platform that can run an Open Container Initiative (OCI) image. This means that we aren't restricted to using Knative for the app and can deploy it to other platforms without modifying the code or image. Compare this to running containers on AWS Lambda, which requires a proprietary runtime that only works within that platform.
Another important reason for choosing Knative is that it's backed by big companies like Google, IBM, and VMware, and it powers products like Google Cloud Run and IBM Cloud Code Engine.
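As a sketch of what deploying to Knative looks like, here is a minimal Knative Service manifest. The service name, image reference, and scale bounds are hypothetical placeholders; the scale-to-zero behavior shown via the min-scale annotation is Knative Serving's default and is spelled out here only for clarity.

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello                              # hypothetical service name
spec:
  template:
    metadata:
      annotations:
        # Allow the service to scale down to zero replicas when idle
        # (Knative's default, made explicit here).
        autoscaling.knative.dev/min-scale: "0"
        autoscaling.knative.dev/max-scale: "5"
    spec:
      containers:
        - image: ghcr.io/example/hello:latest  # any plain OCI image (placeholder)
          ports:
            - containerPort: 8080
```

Applying this with kubectl apply -f service.yaml deploys the service to a Knative cluster. Note that the image field points at an ordinary OCI image: the same image could be run with docker run or handed to another container platform, which is exactly the portability described above.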
Note: The OCI is a Linux Foundation project started in June 2015 by Docker to design open standards for container image formats and runtimes. It currently contains two specifications: the Runtime Specification and the Image Specification.