Deploying a TensorFlow Model

Learn to deploy TensorFlow models in Azure.

Create a batch endpoint

In this lesson, we discuss the details of deploying deep learning models. For deep learning workloads, batch deployments are well suited to scoring data in bulk, and we can leverage their built-in parallelism support.

We will start with batch endpoint creation. Using the YAML definition below, we create a batch endpoint called batch-endpt.

$schema: https://azuremlschemas.azureedge.net/latest/batchEndpoint.schema.json
name: batch-endpt
description: my sample batch endpoint
auth_mode: aad_token
Creating a batch endpoint
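
With the endpoint definition in place, the endpoint can be created with the Azure ML CLI (v2). A minimal sketch is shown below; the file name endpoint.yml and the resource group and workspace placeholders are assumptions, so substitute your own values.

az ml batch-endpoint create --file endpoint.yml --resource-group <resource-group> --workspace-name <workspace-name>

Creating the batch endpoint with the Azure CLI (illustrative)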

Create a batch deployment

Now that the batch endpoint is created, let's create a batch deployment. The process is similar to a standard ML batch deployment, with a few changes.
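
As a preview, here is a minimal sketch of what a batch deployment definition for a TensorFlow model typically looks like. The deployment name, model reference, scoring script, environment, and compute shown here are illustrative assumptions; the actual values are built up in the steps that follow. Note mini_batch_size and max_concurrency_per_instance, which control the parallelism mentioned earlier.

$schema: https://azuremlschemas.azureedge.net/latest/batchDeployment.schema.json
name: mnist-tf-dpl
endpoint_name: batch-endpt
model: azureml:tf-mnist-model:1
code_configuration:
  code: ./code
  scoring_script: batch_driver.py        # script that loads the model and scores each mini-batch
environment: azureml:tensorflow-env:1
compute: azureml:cpu-cluster
resources:
  instance_count: 2                      # number of compute nodes used for the job
max_concurrency_per_instance: 2          # parallel scoring processes per node
mini_batch_size: 10                      # number of input files handed to each scoring call
output_action: append_row
output_file_name: predictions.csv
retry_settings:
  max_retries: 3
  timeout: 300

A sketch of a batch deployment definition (illustrative values)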

We will use the TensorFlow MNIST model, which we have already downloaded to the /usercode/models folder. You will add its path in the command below.

The model contains a session graph.
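
If the downloaded model is stored in TensorFlow's SavedModel format, you can optionally inspect its graph signatures before registering it, for example with TensorFlow's saved_model_cli tool. The subfolder name below is a placeholder for wherever the model files sit under /usercode/models.

saved_model_cli show --dir /usercode/models/<model-folder> --all

Inspecting the model's signatures (optional)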

Register the model in the cloud using the following command. Replace <model_path> with the path where the model was downloaded. The model ...