Reviewing Implementation Details in Infrastructure
Learn about topics, streams, and tables in Kafka, how they relate to the producer-consumer pattern, and their role in an event-driven system.
So far, we’ve focused primarily on the infrastructure provisioning of Kafka, although there’s far more to setting up Kafka than simply running a script. To understand the producer and consumer examples, we made some assumptions about how to interact with Kafka and, specifically, a topic.
Topics
Topics are not unique to Kafka but are a construct that producer-consumer patterns leverage to store and retrieve messages relevant to a specific domain or grouping of events within a domain. Normally, topics are scoped to a specific subset of data. For example, with the domain model of the MTAEDA application, we might expect to find a topic for equipment, stations, and scheduling, among others.
Events and messages (known as records in Kafka) are written in an append-only fashion. This means that each record is immutable upon being written to a topic. Any changes that are required have to be appended to the end of the topic. This allows for the sequential reading of records, if needed, by the consumers.
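The append-only behavior described above can be sketched in a few lines of Python. This is an illustrative model, not Kafka's actual storage implementation: the `Record` and `TopicLog` names are invented here, and a real Kafka log is persisted to disk and segmented.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Record:
    # frozen=True makes each record immutable once written,
    # mirroring the immutability of records in a topic.
    key: str
    value: str

@dataclass
class TopicLog:
    records: list = field(default_factory=list)

    def append(self, record: Record) -> int:
        # New records can only be added at the end of the log.
        # The returned offset identifies the record's position.
        self.records.append(record)
        return len(self.records) - 1

    def read_from(self, offset: int) -> list:
        # Consumers read sequentially, starting from a given offset.
        return self.records[offset:]

log = TopicLog()
log.append(Record("train-42", "departed"))
# A "change" is not an update in place; it is a new record at the end.
log.append(Record("train-42", "arrived"))
print([r.value for r in log.read_from(0)])  # → ['departed', 'arrived']
```

Note that the second record for `train-42` does not overwrite the first; a consumer replaying the log from offset 0 sees both events in order.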
Topics in Kafka are also composed of partitions. Much like other data persistence technologies, partitions can be used to store events or messages that all share a common key. This ensures all relevant information for a particular key will be found within a specific partition. Enabling multiple partitions allows for greater distribution of datasets and, potentially, faster search capabilities when using that common key as a search term.
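The key-to-partition mapping can be sketched as follows. Kafka's default partitioner hashes the key (using murmur2) modulo the partition count; here `zlib.crc32` is used as a simplified, deterministic stand-in, and the partition count and key names are made up for illustration.

```python
import zlib

NUM_PARTITIONS = 3  # hypothetical partition count for the topic

def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    # Hash the key and take it modulo the partition count.
    # Kafka's default partitioner uses murmur2; crc32 is a stand-in
    # that still shows the essential property: the same key always
    # lands in the same partition.
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Records with the same key are routed to the same partition, so all
# events for "station-7" can be read together from one partition.
assert partition_for("station-7") == partition_for("station-7")
print(partition_for("equipment-12"), partition_for("station-7"))
```

Because the mapping is deterministic, every event keyed by `station-7` is guaranteed to land in one partition, which is what makes key-scoped sequential reads possible.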