
TF Lite Task Library

Explore the TF Lite Task Library to handle common preprocessing and postprocessing tasks when deploying image classification and other ML models on Android. Understand its APIs for vision, audio, and natural language tasks, and learn how to optimize model performance using hardware acceleration options such as the CPU, GPU, and NNAPI.

The TF Lite framework provides us with the tools to run ML and DL models on target hardware, such as mobile and edge devices. However, to use TF Lite models on these devices, we have to develop the preprocessing and postprocessing logic ourselves, including input data preparation and output result interpretation.
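
For instance, with the plain TF Lite Interpreter API, we have to convert the input image into the exact tensor layout the model expects and then map the raw output scores back to labels by hand. The sketch below illustrates this for a hypothetical 224×224 float image classifier; the model file, pixel array, and label count are illustrative placeholders, not part of any specific model.

```java
import java.io.File;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import org.tensorflow.lite.Interpreter;

public class ManualInference {
  // pixels: ARGB values from a Bitmap already resized to 224x224 (assumed model input shape)
  static float[] run(File modelFile, int[] pixels) {
    // Preprocessing: pack normalized RGB floats into the input buffer ourselves
    ByteBuffer input =
        ByteBuffer.allocateDirect(224 * 224 * 3 * 4).order(ByteOrder.nativeOrder());
    for (int pixel : pixels) {
      input.putFloat(((pixel >> 16) & 0xFF) / 255.0f); // R
      input.putFloat(((pixel >> 8) & 0xFF) / 255.0f);  // G
      input.putFloat((pixel & 0xFF) / 255.0f);         // B
    }

    float[][] output = new float[1][1001]; // raw scores for an assumed 1001-class model
    Interpreter interpreter = new Interpreter(modelFile);
    interpreter.run(input, output);
    interpreter.close();

    // Postprocessing: the caller still has to map these scores to label strings
    return output[0];
  }
}
```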

The TF Lite Task Library provides prebuilt components and utilities that handle common preprocessing and postprocessing tasks associated with ML and DL inference on mobile and edge devices. It makes it easier to integrate TF Lite models into apps without implementing the processing steps from scratch. Let’s explore the functionalities provided by the TF Lite Task Library.
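
As a rough sketch of how this looks in practice, the Task Library's ImageClassifier wraps image conversion, inference, and label mapping behind a few calls. The model path, bitmap, and GPU option below are illustrative assumptions rather than fixed requirements.

```java
import android.content.Context;
import android.graphics.Bitmap;
import java.io.IOException;
import java.util.List;
import org.tensorflow.lite.support.image.TensorImage;
import org.tensorflow.lite.task.core.BaseOptions;
import org.tensorflow.lite.task.vision.classifier.Classifications;
import org.tensorflow.lite.task.vision.classifier.ImageClassifier;
import org.tensorflow.lite.task.vision.classifier.ImageClassifier.ImageClassifierOptions;

public class TaskLibraryInference {
  static List<Classifications> classify(Context context, Bitmap bitmap) throws IOException {
    // Delegate selection is a builder option: useGpu() or useNnapi(); CPU is the default
    ImageClassifierOptions options =
        ImageClassifierOptions.builder()
            .setBaseOptions(BaseOptions.builder().useGpu().build())
            .setMaxResults(3)
            .build();

    // "model.tflite" is a placeholder for a classification model bundled in the app's assets
    ImageClassifier classifier =
        ImageClassifier.createFromFileAndOptions(context, "model.tflite", options);

    // The classifier resizes and normalizes the image and maps scores to labels internally
    return classifier.classify(TensorImage.fromBitmap(bitmap));
  }
}
```

Compared to the manual version above, the Task Library reads the expected input shape, normalization parameters, and label file from the model's metadata and returns scored labels directly.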

APIs supported by the Task Library

The TF Lite Task Library is a collection of pretrained models, a set of interfaces, and associated tools for common machine learning tasks such as vision, audio, and natural language processing. Some of ...