Producer and Consumer Applications Using Avro Data

Learn how to use Schema Registry in producer and consumer applications.

Let’s learn how to use Schema Registry to work with Avro data in our Kafka applications. We will use Confluent Schema Registry as an example, but the concepts apply to any other Schema Registry implementation.

Avro format

Apache Avro is a data serialization system that provides rich data structures and a compact, fast, binary data format. It uses JSON for defining data types and protocols, and it serializes data in a compact binary format.

Avro’s main attributes that make it suitable for data serialization are its schema evolution mechanism, backward compatibility, and language-independent schema definition. Because schemas are plain JSON and bindings exist for many languages, Avro integrates easily with different platforms and services.
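As a concrete illustration, here is a minimal Avro schema for a hypothetical `User` record (the record name, namespace, and fields are ours, purely for illustration):

```json
{
  "type": "record",
  "name": "User",
  "namespace": "com.example.avro",
  "fields": [
    {"name": "id", "type": "long"},
    {"name": "name", "type": "string"},
    {"name": "email", "type": ["null", "string"], "default": null}
  ]
}
```

The union type `["null", "string"]` makes `email` optional. Note that the binary encoding carries only the field values, not the field names or types; the schema supplies those at read time, which is what makes Avro payloads so compact.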

Why use Avro with Kafka?

Kafka alone does not enforce or provide a mechanism for ensuring that data adheres to a particular schema: it handles enormous streams of records as opaque byte arrays, without imposing any structure on them. This creates the need for a system like Avro, which can describe and validate the structure of that data.

Pairing Avro with Kafka provides multiple benefits:

  • Compactness of data: Avro’s efficient binary encoding makes the serialized data extremely compact. This directly reduces the size of the payload carried by Kafka messages, leading to more efficient storage and network usage.

  • Schema evolution support: A salient feature of Avro is its support for schema evolution, where fields can be added, modified, or removed without breaking compatibility with older data. This is especially useful in Kafka, where producers and consumers often don’t evolve at the same pace or are managed by different teams.

  • Language-independent schema: Unlike many other data serialization frameworks, Avro defines its schemas in JSON, not tied to any specific programming language. This makes Avro a universal solution for Kafka systems written in different programming languages, promoting smooth integration and interoperability.
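To make the schema-evolution point concrete, here is a sketch of adding a field to a hypothetical record schema (all names here are illustrative). As long as the new field carries a `default`, readers using this newer schema can still decode records that were written before the field existed; Avro fills in the default for them:

```json
{
  "type": "record",
  "name": "User",
  "namespace": "com.example.avro",
  "fields": [
    {"name": "id", "type": "long"},
    {"name": "name", "type": "string"},
    {"name": "country", "type": "string", "default": "unknown"}
  ]
}
```

Conversely, removing a field is safe for old data only if that field had a default, which is why defaults are central to Avro’s compatibility rules.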

Let’s see Avro and Kafka in action with a practical example.

Avro producer and consumer applications

In this section, we will run two Java applications, a producer and a consumer, that use Schema Registry together with the Confluent Avro serializer and deserializer libraries.
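Before looking at the full applications, the essential wiring can be sketched as plain client configuration. The broker address, registry URL, topic group id, and class name below are assumptions for a typical local setup, not taken from the applications themselves; the serializer and deserializer class names are Confluent’s:

```java
import java.util.Properties;

// Sketch: client configuration for Avro producer/consumer applications
// that talk to Confluent Schema Registry. Endpoints are assumed local defaults.
public class AvroClientConfig {

    public static Properties producerProps() {
        Properties props = new Properties();
        // Assumed local Kafka broker.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent's Avro serializer registers the record's schema with
        // Schema Registry (if needed) and embeds its schema ID in each message.
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // Assumed local Schema Registry endpoint.
        props.put("schema.registry.url", "http://localhost:8081");
        return props;
    }

    public static Properties consumerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "avro-demo"); // hypothetical consumer group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        // The Avro deserializer looks up each message's schema ID in the
        // registry and decodes the binary payload with that schema.
        props.put("value.deserializer",
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "http://localhost:8081");
        // Deserialize into generated SpecificRecord classes rather than GenericRecord.
        props.put("specific.avro.reader", "true");
        return props;
    }
}
```

With these properties, the applications would construct their clients in the usual way, e.g. `new KafkaProducer<>(AvroClientConfig.producerProps())`.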

To start the required components, click the “Run” button.
