Apache Kafka is an open-source distributed event streaming platform built on the publish/subscribe messaging model. That means a producer publishes messages to Kafka and a consumer reads messages from Kafka; in between, Kafka persists messages durably, much like a filesystem or a database commit log.
In this post, we will set up a local Kafka environment, create a topic, and publish and consume messages using the console clients.
Step 1: Download the latest version of Apache Kafka from the Apache Kafka website: https://kafka.apache.org/downloads.
Extract the archive locally and navigate to the extracted folder in a Terminal session (macOS/Linux) or command prompt (Windows):
$ tar -xzf kafka_2.13-3.1.0.tgz
$ cd kafka_2.13-3.1.0
Step 2: Run Kafka locally:
Start ZooKeeper with the command below in your first terminal/command line window:
# Start the ZooKeeper service
$ bin/zookeeper-server-start.sh config/zookeeper.properties
Start the Kafka broker with the command below in a second terminal or command line window:
# Start the Kafka broker service
$ bin/kafka-server-start.sh config/server.properties
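As a side note, recent Kafka 3.x releases can also run without ZooKeeper using KRaft mode. A sketch based on the official Kafka quickstart, assuming the same extracted directory (use this *instead of* the two commands above, not in addition to them):

```shell
# Generate a cluster ID and format the log directories for KRaft mode
$ KAFKA_CLUSTER_ID="$(bin/kafka-storage.sh random-uuid)"
$ bin/kafka-storage.sh format -t "$KAFKA_CLUSTER_ID" -c config/kraft/server.properties

# Start the broker with the KRaft configuration (no ZooKeeper needed)
$ bin/kafka-server-start.sh config/kraft/server.properties
```

Either way, the broker ends up listening on localhost:9092, so the rest of the steps are identical.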
Note: You must have Java 8 or later installed on your machine to run Kafka.
Once both services are running successfully, your local Kafka environment is ready.
Step 3: Create a topic in Kafka to produce and consume messages, using another terminal or command line window. In the example below, the topic name is ‘order-details’ and the Kafka broker is running on localhost port 9092.
$ bin/kafka-topics.sh --create --topic order-details --bootstrap-server localhost:9092
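Created this way, the topic takes the broker's defaults (typically one partition, replication factor 1). The same script accepts explicit flags if you want to control these; for example, to create the topic with three partitions (replication factor must stay at 1 on a single-broker setup):

```shell
# Create the topic with 3 partitions and replication factor 1
$ bin/kafka-topics.sh --create --topic order-details \
    --partitions 3 --replication-factor 1 \
    --bootstrap-server localhost:9092
```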
If needed, use the --describe option to see more details about the topic created above:
$ bin/kafka-topics.sh --describe --topic order-details --bootstrap-server localhost:9092
The output looks like this:
Topic: order-details PartitionCount: 1 ReplicationFactor: 1 Configs: segment.bytes=1073741824
Topic: order-details Partition: 0 Leader: 0 Replicas: 0 Isr: 0
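You can also list all topics on the broker to confirm the topic was created:

```shell
# List every topic known to the broker
$ bin/kafka-topics.sh --list --bootstrap-server localhost:9092
```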
Step 4: Write events to the topic
Run the console producer client to write a few events to your topic. By default, each line you enter is written to the topic as a separate event.
$ bin/kafka-console-producer.sh --topic order-details --bootstrap-server localhost:9092
Order 1 details
Order 2 details
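If you want keyed events (so that all events with the same key, e.g. the same order ID, land on the same partition), the console producer can parse a key from each input line. A sketch, assuming ‘:’ as the key separator:

```shell
# Produce keyed events: everything before ':' is the key, the rest is the value
$ bin/kafka-console-producer.sh --topic order-details \
    --bootstrap-server localhost:9092 \
    --property parse.key=true \
    --property key.separator=:
```

You would then type lines like `order-1:Order 1 details`.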
Step 5: Read events from Kafka
Open another terminal session or command line window and run the console consumer client to read the events you just wrote:
$ bin/kafka-console-consumer.sh --topic order-details --from-beginning --bootstrap-server localhost:9092
Order 1 details
Order 2 details
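The console consumer can also print keys alongside values, which is useful if you produced keyed events as sketched earlier:

```shell
# Consume from the beginning and print each event's key before its value
$ bin/kafka-console-consumer.sh --topic order-details \
    --from-beginning \
    --bootstrap-server localhost:9092 \
    --property print.key=true \
    --property key.separator=:
```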
Conclusion:
By completing the steps above, you have set up a local Kafka environment, created a topic, produced messages with the console producer, and consumed them with the console consumer.