Setup Kafka in Your Local Laptop

The Medium Post (Kafka As a Postbox)

Read "Kafka As a Postbox" article here (8 mins read) which can simplify your approach to "Data Stream Testing".

Step 1: Bring Up Kafka on localhost

Use the YAML file below to start the Kafka server.

version: '3.8'

services:
  zookeeper:
    image: confluentinc/cp-zookeeper:5.4.1
    container_name: zookeeper
    ports:
      - "2181:2181"
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  kafka:
    image: confluentinc/cp-kafka:5.4.1
    container_name: kafka
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  1. Copy the YAML above into a file, for example "docker-compose.yml".
  2. Run the following command from your terminal (command line); a note after this list shows how to run it in the background:
docker-compose up
  3. You shouldn't get any errors. If you do, check that Docker Desktop is installed and running/active. Also, leave a comment on my Medium post here with a short problem description.
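If you prefer to keep your terminal free, you can also start the stack in the background and tear it down later. This is a minimal sketch using standard docker-compose commands, run from the folder that contains docker-compose.yml:

# start both containers (zookeeper and kafka) in the background
docker-compose up -d

# show the status of the services defined in the compose file
docker-compose ps

# stop and remove both containers when you are done
docker-compose down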

Step 2: Check Kafka is Up and Running

To check that Kafka started successfully, use the command below:

docker ps

The output should look like this:

> docker ps
CONTAINER ID   IMAGE                             COMMAND                  CREATED        STATUS                  PORTS                                        NAMES
80ef07a8d209   confluentinc/cp-kafka:5.4.1       "/etc/confluent/dock…"   33 hours ago   Up 33 hours             0.0.0.0:9092->9092/tcp                       kafka
32bab7541458   confluentinc/cp-zookeeper:5.4.1   "/etc/confluent/dock…"   33 hours ago   Up 33 hours             2888/tcp, 0.0.0.0:2181->2181/tcp, 3888/tcp   zookeeper
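As an extra sanity check, you can ask the broker to list its topics from inside the kafka container. This is a sketch assuming the Confluent CLI tools (named without the .sh suffix) that ship inside the cp-kafka image:

# list all topics known to the broker; an empty list (or only internal topics) is fine on a fresh setup
docker exec -it kafka kafka-topics --bootstrap-server localhost:9092 --list

If the command hangs or cannot connect, inspect the broker logs with "docker logs kafka".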

Step 3: Connect a UI Tool or Use the CLI Commands Below to Produce or Consume

  • For a UI tool, you can use AKHQ, an open-source tool available here on GitHub (a quick-start sketch follows this list).
    • Follow its instructions to connect and explore the messages.
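If you want to try AKHQ against the compose setup from Step 1, the following is a minimal, hedged sketch based on AKHQ's Docker image. The network name (docker-compose creates one named <project-folder>_default), the connection name local-docker-kafka, and the internal listener kafka:29092 are assumptions taken from the YAML above, so adjust them to your environment:

# run AKHQ in the same Docker network as the kafka container
docker run -d -p 8080:8080 \
  --network <your-compose-network> \
  -e AKHQ_CONFIGURATION='
akhq:
  connections:
    local-docker-kafka:
      properties:
        bootstrap.servers: "kafka:29092"' \
  tchiotludo/akhq

Then open http://localhost:8080 in your browser to explore topics and messages.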

or

use the following CLI:

PRODUCE a "Hello World" message:

kafka-console-producer.sh --broker-list localhost:9092 --topic hello-world-topic

Next, type your message and hit "Enter" to PRODUCE.
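If the Kafka CLI scripts are not installed on your laptop, you can run the equivalent tool inside the kafka container instead. This is a sketch assuming the Confluent tool names (without the .sh suffix) that ship in the cp-kafka image:

# open an interactive producer session inside the kafka container
docker exec -it kafka kafka-console-producer --broker-list localhost:9092 --topic hello-world-topic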

CONSUME all the messages from the "hello-world-topic":

kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic hello-world-topic --from-beginning 
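The consumer has a matching in-container equivalent; press Ctrl+C to stop consuming:

# read every message in the topic from the beginning
docker exec -it kafka kafka-console-consumer --bootstrap-server localhost:9092 --topic hello-world-topic --from-beginning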

Issues

Note: If you run into any issues, drop a comment on the Medium Post here. I’ll make sure to respond or cover it in my next post.

The Kafka Core Concepts post is linked here.

Don't miss the video tutorial here if that helps.