Hands-On: Custom ML Docker Images in AWS Lambda
Implementing custom ML Docker images in AWS Lambda.
Deployment
We deploy the service the same way as before, both in development and in production. What the Serverless Framework (SLS) does differently this time is build our Docker image and push it to Amazon ECR alongside the other resources it deploys to AWS.
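As a rough sketch, the wiring in `serverless.yml` might look like the fragment below. The image name `mlimage`, the function name `serve`, and the Dockerfile location in the project root are all assumptions for illustration; your project's names will differ.

```yaml
service: sls-aws-python-starter-ml-serving-sf

provider:
  name: aws
  # SLS builds the image from the given path and pushes it to ECR on deploy.
  ecr:
    images:
      mlimage:
        path: ./

functions:
  serve:
    # Container-image functions reference the ECR image instead of a handler/runtime pair.
    image:
      name: mlimage
```

With this in place, `sls deploy` handles the `docker build`, the ECR push, and the CloudFormation update in one step.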
Test yourself
Replace the value of the `SERVERLESS_SERVICE` environment variable with a name of your choice.
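For example, before deploying you could set the variable in your shell like this (the service name `my-ml-serving-demo` is just a placeholder; substitute your own unique value):

```shell
# Hypothetical service name; pick any unique value for your deployment.
export SERVERLESS_SERVICE=my-ml-serving-demo
echo "$SERVERLESS_SERVICE"
```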
```python
import json

import joblib
import numpy as np

model_file = "/opt/ml/model.joblib"
model = joblib.load(model_file)


def handler(event, context):
    x = event["body"]
    prd = model.predict(x)
    body = {
        "service": "sls-aws-python-starter-ml-serving-sf",
        "model": "linear",
        "version": "1.0",
        "prd": prd.tolist(),
    }
    return {
        "statusCode": 200,
        "body": body,
    }
```
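To exercise the handler logic without building the image, one can stand in a stub model for the real one. The sketch below is an assumption-laden local harness: it uses a hand-written `StubLinearModel` and the standard-library `pickle` in place of the scikit-learn model and `joblib` that the image actually bakes in at `/opt/ml/model.joblib`.

```python
import os
import pickle
import tempfile


class StubLinearModel:
    """Stand-in for the lesson's trained linear model (hypothetical)."""

    def predict(self, x):
        # Mimics a fitted y = 2 * x model over single-feature rows.
        return [2.0 * row[0] for row in x]


# Persist and reload the stub, mirroring the handler's load-at-import step.
model_path = os.path.join(tempfile.mkdtemp(), "model.pkl")
with open(model_path, "wb") as f:
    pickle.dump(StubLinearModel(), f)
with open(model_path, "rb") as f:
    model = pickle.load(f)


def handler(event, context):
    # Same shape as the lesson's handler: features arrive in event["body"].
    x = event["body"]
    prd = model.predict(x)
    return {"statusCode": 200, "body": {"prd": prd}}


resp = handler({"body": [[4.0]]}, None)
print(resp["body"]["prd"])  # [8.0]
```

Calling the handler directly with a dict-shaped event like this keeps the test independent of Lambda, Docker, and ECR.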
As a result, ...