Serverless Deployments in Kubernetes

This lesson outlines the benefits of Kubernetes-based serverless computing and introduces Knative.

Kubernetes-based serverless computing

At this point, I must make an assumption that you, dear reader, might disagree with: most companies will run at least some of their applications in Kubernetes. It is becoming a standard API that almost everyone will use.

Why is that assumption important? If I’m right, then almost everyone will have a Kubernetes cluster. Everyone will spend time maintaining it, and everyone will have some level of in-house knowledge of how it works. If that assumption is correct, it stands to reason that Kubernetes would be the best choice for a platform to run serverless applications as well. As an added bonus, that would avoid vendor lock-in since Kubernetes can run almost anywhere.

Kubernetes-based serverless computing provides quite a few other benefits.

  • We are free to write our applications in any language, instead of being limited to those supported by the function-as-a-service solutions offered by cloud vendors.
  • We aren’t limited to writing only functions.
  • A microservice, or even a monolith, can run as a serverless application; we just need a solution to make that happen. After all, proprietary, cloud-specific serverless solutions use containers as well, and the standard mechanism for running containers is Kubernetes.

There’s an increasing number of Kubernetes platforms that allow us to run serverless applications. We won’t go through all of them. Instead, we’ll fast-track the conversation by stating that Knative is likely to become the de facto standard for deploying serverless workloads to Kubernetes. It might already be the most widely accepted standard by the time you read this.
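To give a sense of what that looks like in practice, here is a minimal sketch of a Knative Service manifest. The resource kind and API group (`Service` in `serving.knative.dev/v1`) are real Knative Serving types; the service name and container image are hypothetical placeholders.

```yaml
# A minimal Knative Service: Knative handles routing, revisioning,
# and scaling (including scale-to-zero) for the container we provide.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello            # hypothetical service name
spec:
  template:
    spec:
      containers:
        - image: example.com/demo/hello:latest  # hypothetical image
          ports:
            - containerPort: 8080
```

Applied with `kubectl apply` to a cluster that has Knative Serving installed, this single resource is enough to get a URL-addressable, autoscaled application; the container itself can be a function, a microservice, or a monolith.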
