Custom ML Docker Images in AWS Lambda
Learn about the Serverless Framework and AWS constructs for running ML Docker images.
Introduction
By now, we should have a pretty good idea of the main components of the serverless.yml file and its most important configuration options. In this lesson, we'll extend that knowledge to run our own custom Docker images in AWS Lambda.
In this lesson, we'll deploy a service that consists of the following:
ML model: `model/train.py` is a small script that produces a tiny decision tree model, which we'll inject into our Docker image and whose predictions we'll serve from AWS Lambda.
Dockerfile: The basis of our custom AWS Lambda image in which we'll serve the ML model. SLS will use the Dockerfile included in the root directory to build that image and ship it to Amazon Elastic Container Registry (ECR).
AWS ECR repository: ECR hosts Docker images. We'll use it to host our custom AWS Lambda Docker image.
AWS Lambda function: This function is responsible for serving model predictions, but contrary to how we deployed services before, we won't point to it directly in our SLS configuration.
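To show how these components fit together, here is a minimal sketch of what the container-image portion of a serverless.yml might look like. The image name `mlimage`, the function name `predict`, and the service/region values are assumptions for illustration; the actual lesson configuration may differ.

```yaml
service: ml-docker-demo

provider:
  name: aws
  region: us-east-1
  # SLS builds the Dockerfile at `path` and pushes the image to an
  # auto-managed ECR repository under this logical name.
  ecr:
    images:
      mlimage:
        path: ./   # directory containing the Dockerfile

functions:
  predict:
    # Instead of a `handler:` entry, the function references the image.
    image:
      name: mlimage
```

Note the design shift: with a container-based function there is no `handler:` line pointing at a code file; the entry point is baked into the image itself, which is why the SLS configuration references the image rather than the code.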
ML model
The first component of our custom model ...
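As a rough sketch of what such a training script could contain, the following trains a tiny decision tree on toy data and serializes it so the Docker build can copy the artifact into the image. The dataset, the `max_depth` value, and the `model.pkl` filename are assumptions for illustration, not the lesson's actual code.

```python
# Hypothetical sketch of model/train.py (assumed details, not the lesson's script).
import pickle

from sklearn.tree import DecisionTreeClassifier

# Toy dataset: a single numeric feature mapped to a binary class.
X = [[0], [1], [2], [3]]
y = [0, 0, 1, 1]

# A deliberately tiny model: one or two splits are enough here.
model = DecisionTreeClassifier(max_depth=2)
model.fit(X, y)

# Serialize the trained model; the Dockerfile can COPY this file into the image.
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)
```

At serving time, the Lambda handler inside the image would load `model.pkl` once at import time and call `predict` per invocation, which keeps cold-start work out of the request path.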