Serving a TFLite Model

Learn to use a TFLite model for inference.

We can use the TFLite model for inference on our datasets.

The tensorflow package ships with TFLite support: the tf.lite.Interpreter class loads a converted .tflite model and runs inference on it.
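For context, here is a minimal sketch of how tf.lite.Interpreter loads a converted model and runs a forward pass. The model path ('model.tflite') and the randomly generated input are placeholders for whatever model and data you converted earlier, not part of the lesson code.

import numpy as np
import tensorflow as tf

# Load the converted model (placeholder path) and allocate its tensors.
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Build a dummy input that matches the model's expected shape and dtype.
input_shape = input_details[0]['shape']
dummy_input = np.random.random_sample(input_shape).astype(input_details[0]['dtype'])

# Run a single forward pass.
interpreter.set_tensor(input_details[0]['index'], dummy_input)
interpreter.invoke()

# The interpreter returns the model's raw output (logits).
logits = interpreter.get_tensor(output_details[0]['index'])
print('Raw model output:', logits)

The interpreter gives back raw scores rather than probabilities, which is why we need the softmax utility below.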

Utility functions

We’ll need a softmax utility function to convert the model’s raw output tensor into probabilities. Implement it by entering the following code:

import os
# Set the log level before importing TensorFlow so the setting takes effect
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

import numpy as np
import tensorflow as tf

def softmax(vec):
    # Exponentiate each element and normalize by the sum of exponentials
    exponential = np.exp(vec)
    probabilities = exponential / np.sum(exponential)
    return probabilities

# Test the softmax function on a dummy logit vector
dummy_vec = np.array([1.5630065, -0.24305986, -0.08382231, -0.4424621])
print('The output probabilities after softmax are:', softmax(dummy_vec))
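The same utility applies to real model output. As a hypothetical usage, assuming the logits variable from the interpreter sketch above and a (1, num_classes) output shape, softmax turns the raw scores into class probabilities:

# Hypothetical usage: logits is the interpreter output from the sketch above,
# assumed to have shape (1, num_classes).
probabilities = softmax(logits[0])
print('Predicted class index:', np.argmax(probabilities))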

Note: ...