Setting up a Streaming Data Pipeline With Kafka

PROJECT


In this project, we’ll learn how to configure and start the Kafka server. To verify that the server works, we’ll begin by creating a topic on it. After creating the topic, we’ll create a producer and a consumer to demonstrate data transmission through Kafka.

You will learn to:

Configure Zookeeper and Kafka

Start Zookeeper and Kafka

Create a topic using a console

Carry out console-based streaming between producers and consumers

Skills

Data Pipeline Engineering

Live Streaming

Prerequisites

Basic knowledge of Python

Basic knowledge of streaming

Basic knowledge of message broker architecture

Basic knowledge of Kafka and Zookeeper

Technologies

Kafka

Python

Project Description

Apache Kafka is a stream-processing framework that implements a message bus. It is an open-source platform, written in Scala and Java and maintained by the Apache Software Foundation. Kafka’s goal is to provide a single high-throughput, low-latency platform for real-time data feeds. Apache Kafka uses Zookeeper for naming and registry services.
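
As a rough sketch of how a broker is pointed at Zookeeper, a minimal server.properties might contain entries like the following. The values shown are common illustrative defaults, not settings prescribed by this project.

    # config/server.properties (illustrative sketch)
    broker.id=0                              # unique ID of this broker in the cluster
    listeners=PLAINTEXT://localhost:9092     # address clients use to reach the broker
    log.dirs=/tmp/kafka-logs                 # where Kafka stores partition data on disk
    zookeeper.connect=localhost:2181         # Zookeeper connection used for cluster metadata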

Zookeeper is a naming registry used in distributed systems for service synchronization. In Kafka, Zookeeper is responsible for managing and tracking the status of the cluster’s nodes, topics, and messages.
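
For reference, the zookeeper.properties file shipped with Kafka is only a few lines; a sketch with its usual illustrative defaults looks like this:

    # config/zookeeper.properties (illustrative sketch)
    dataDir=/tmp/zookeeper       # where Zookeeper keeps its snapshot data
    clientPort=2181              # port that brokers and tools connect to
    maxClientCnxns=0             # 0 = no per-client connection limit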

In this project, we’ll get hands-on experience configuring Zookeeper and Kafka, and then learn how to start both services. We’ll create a topic from the terminal and, after that, create a console-based producer and consumer to confirm that data streams through the pipeline correctly.
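
The console workflow looks roughly like the sketch below. The topic name test-topic is illustrative, and exact flags vary by Kafka version; older releases use --zookeeper localhost:2181 instead of --bootstrap-server when creating topics.

    # Start Zookeeper, then Kafka, each in its own terminal
    bin/zookeeper-server-start.sh config/zookeeper.properties
    bin/kafka-server-start.sh config/server.properties

    # Create a topic (single broker, so one partition and replication factor 1)
    bin/kafka-topics.sh --create --topic test-topic \
        --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

    # Console producer: each line typed is sent to the topic as a message
    bin/kafka-console-producer.sh --topic test-topic --bootstrap-server localhost:9092

    # Console consumer: prints messages from the beginning of the topic
    bin/kafka-console-consumer.sh --topic test-topic --from-beginning \
        --bootstrap-server localhost:9092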

Project Tasks

1. Installation and Configuration

Task 0: Getting Started

Task 1: Configure Zookeeper on Localhost

2. Starting Zookeeper and Kafka

Task 2: Start Zookeeper

Task 3: Start Kafka

3. Creating a Topic, Producer, and Consumer

Task 4: Create a New Topic

Task 5: Things To Know

Task 6: Create a New Producer

Task 7: Create a New Consumer

Relevant Courses

Use the following content to review prerequisites or explore specific concepts in detail.