
Producer and Consumer Applications Using Avro Data

Learn how to use Schema Registry in producer and consumer applications.

Let’s learn how to use Schema Registry to work with Avro data in our Kafka applications. We will use Confluent Schema Registry as an example, but the concepts apply to any other Schema Registry implementation.

Avro format

Apache Avro is a data serialization system that provides rich data structures and a compact, fast, binary data format. It uses JSON for defining data types and protocols, and it serializes data in a compact binary format.

The attributes that make Avro well suited for data serialization are its schema evolution mechanism, its backward compatibility guarantees, and its language-independent schemas. Because schemas are defined in JSON rather than tied to any particular language, Avro data can be produced and consumed by applications written in different languages and running on different platforms and services.
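
To make this concrete, below is a minimal sketch (in Java, using the Apache Avro library) that defines a schema in JSON, builds a record against it, and round-trips the record through Avro's binary encoding. The Order record, its namespace, and its field names are hypothetical, chosen only for illustration.

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

import java.io.ByteArrayOutputStream;

public class AvroRoundTrip {

    // Hypothetical schema, defined in JSON as Avro requires.
    private static final String SCHEMA_JSON = """
        {
          "type": "record",
          "name": "Order",
          "namespace": "com.example",
          "fields": [
            {"name": "id", "type": "string"},
            {"name": "amount", "type": "double"}
          ]
        }""";

    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);

        // Build a record that conforms to the schema.
        GenericRecord order = new GenericData.Record(schema);
        order.put("id", "order-42");
        order.put("amount", 19.99);

        // Serialize to Avro's compact binary encoding (no field names on the wire).
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(order, encoder);
        encoder.flush();
        byte[] payload = out.toByteArray();
        System.out.println("Serialized size: " + payload.length + " bytes");

        // Deserialize with the same schema to recover the record.
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(payload, null);
        GenericRecord decoded = new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
        System.out.println(decoded);
    }
}

Note that only the field values go on the wire, not the field names or types, which is what keeps the payload small; the schema itself has to travel separately, and managing that schema is exactly the gap Schema Registry fills for Kafka messages.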

Why use Avro with Kafka?

Kafka itself treats message keys and values as opaque bytes: it neither enforces nor provides a mechanism for ensuring that the data adheres to a particular schema. It can handle enormous streams of records without imposing any structure on them. This creates the need for a serialization system like Avro, which gives the data a well-defined, enforceable structure.

Pairing Avro with Kafka provides multiple benefits:

  • Compactness of data: Avro’s efficient binary encoding makes the serialized data extremely compact. This directly reduces the size of the payload carried by Kafka messages, leading to more efficient storage and network usage.

  • Schema evolution support: A salient feature of Avro is its ability to let schemas evolve over time. Fields can be added or removed, provided defaults are supplied where needed, and producers and consumers running different versions of the schema can continue to interoperate (see the sketch after this list).
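
As a rough sketch of how that evolution works at the Avro level, the example below writes a record with one version of a hypothetical Order schema and reads it back with a newer version that adds a currency field with a default value; the missing field is filled in from the default. When Schema Registry is in the picture, it is the component that checks whether a newly registered schema version is compatible with the previous ones.

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

import java.io.ByteArrayOutputStream;

public class SchemaEvolutionSketch {

    public static void main(String[] args) throws Exception {
        // Version 1 of the hypothetical Order schema.
        Schema v1 = new Schema.Parser().parse("""
            {"type": "record", "name": "Order", "namespace": "com.example",
             "fields": [{"name": "id", "type": "string"}]}""");

        // Version 2 adds a field with a default, keeping the change backward compatible.
        Schema v2 = new Schema.Parser().parse("""
            {"type": "record", "name": "Order", "namespace": "com.example",
             "fields": [{"name": "id", "type": "string"},
                        {"name": "currency", "type": "string", "default": "USD"}]}""");

        // Write a record using the old (v1) schema.
        GenericRecord oldOrder = new GenericData.Record(v1);
        oldOrder.put("id", "order-42");
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(v1).write(oldOrder, encoder);
        encoder.flush();

        // Read it back with the new (v2) schema: the absent field takes its default.
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(v1, v2);
        GenericRecord upgraded = reader.read(null, decoder);
        System.out.println(upgraded); // prints id plus currency = "USD"
    }
}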
