This project sets up a multi-broker Apache Kafka cluster with Zookeeper, a producer, and a consumer using Docker Compose, providing fault-tolerant message streaming across brokers.
✅ Multi-node Kafka cluster (2 brokers).
✅ Zookeeper for broker coordination.
✅ Kafka Producer for log generation.
✅ Kafka Consumer for log consumption.
✅ Discussion of load balancing and scaling strategies.
Ensure you have Docker and Docker Compose installed on your system.
# Install Docker
sudo apt update && sudo apt install docker.io -y
# Install Docker Compose
sudo apt install docker-compose -y
# Clone the repository
git clone https://github.com/your-username/kafka-multi-node-docker.git
cd kafka-multi-node-docker
docker-compose up -d --build
This will start:
- 1 Zookeeper instance
- 2 Kafka brokers
- 1 Producer
- 1 Consumer
Check if Kafka brokers are running:
docker ps
Check if topics are created:
docker exec -it kafka-1 kafka-topics --list --bootstrap-server kafka-1:9092
Follow the producer logs:
docker logs -f kafka_producer
Follow the consumer logs:
docker logs -f kafka_consumer
Below is the high-level architecture of the multi-node Kafka cluster deployed in this project.
- Zookeeper: Manages Kafka brokers and coordinates leader election.
- Kafka Brokers: Two brokers handle message storage and replication (see the topic-creation sketch after this list).
- Producer: Sends messages to Kafka topics.
- Consumer: Reads messages from Kafka topics.
- Docker Network: All components communicate within an isolated network.
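To exercise replication across the two brokers, a topic can be created with a replication factor of 2. Below is a minimal sketch using the kafka-python admin client; the broker addresses (`localhost:9092` and `localhost:9093`) and the topic name `logs` are assumptions and should be adjusted to match the ports published in your docker-compose.yml.

```python
# create_topic.py -- minimal sketch, assuming kafka-python is installed and the
# brokers are reachable on localhost:9092 / localhost:9093 (adjust to match
# the ports published in docker-compose.yml).
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(
    bootstrap_servers=["localhost:9092", "localhost:9093"],
    client_id="cluster-setup",
)

# One topic with 3 partitions; each partition is replicated on both brokers.
topic = NewTopic(name="logs", num_partitions=3, replication_factor=2)
admin.create_topics(new_topics=[topic])
admin.close()
```

With a replication factor of 2, every partition has a copy on each broker, so messages remain available if one broker goes down.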
multi-node-kafka-cluster/
│── docker-compose.yml # Defines all services (Kafka, Zookeeper, Producer, Consumer)
│── producer/
│ ├── Dockerfile
│ ├── producer.py # Python script to generate logs
│ ├── requirements.txt
│── consumer/
│ ├── Dockerfile
│ ├── consumer.py # Python script to consume logs
│ ├── requirements.txt
└── README.md
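For orientation, the `producer.py` and `consumer.py` listed above could look roughly like the sketches below. These are minimal kafka-python examples, not the exact scripts in this repository; the in-network broker addresses (`kafka-1:9092`, `kafka-2:9092`), the topic name `logs`, and the consumer group id are assumptions.

```python
# producer/producer.py -- minimal sketch, not the exact script in this repo.
# Assumes the brokers are reachable at kafka-1:9092 and kafka-2:9092, the
# hostnames used inside the Docker network.
import json
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["kafka-1:9092", "kafka-2:9092"],
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

while True:
    # Generate a simple log-like message once per second.
    message = {"level": "INFO", "msg": "heartbeat", "ts": time.time()}
    producer.send("logs", value=message)
    producer.flush()
    time.sleep(1)
```

The consumer mirrors the producer: it joins a consumer group, deserializes the JSON payloads, and prints each record with its partition and offset.

```python
# consumer/consumer.py -- minimal sketch, not the exact script in this repo.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "logs",
    bootstrap_servers=["kafka-1:9092", "kafka-2:9092"],
    group_id="log-consumers",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for record in consumer:
    # Print each consumed log message with its partition and offset.
    print(f"partition={record.partition} offset={record.offset} value={record.value}")
```

Using a JSON `value_serializer`/`value_deserializer` pair keeps the message format symmetric between the two scripts, and running multiple consumers with the same group id would spread the topic's partitions across them.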
- Kafka Documentation
- Kafka Python Client (kafka-python)
- A Simple Kafka and Python Walkthrough
- Kafka Tutorial for Beginners
Feel free to fork this repo, raise issues, or submit PRs! 😊
This project is licensed under the MIT License.
