About Kafka:
- An open-source, distributed streaming platform that helps in developing event-driven applications.
- Relies on the producer-consumer (or publisher-subscriber) approach to deal with streams of data records.
- Stores data records durably and reliably.
- Also preserves the order in which data records occur (within a partition).
- Can be used as a messaging service.
- Can be used for location tracking.
- Can be used in data analytics as a data-gathering mechanism.
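To make the producer-consumer idea concrete, here is a toy, plain-Java sketch (no Kafka involved): a producer thread publishes records to an in-memory queue standing in for a topic, and a consumer drains them, preserving publication order. `PubSubSketch` and its `run` helper are illustrative names, not part of any Kafka API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Toy in-JVM sketch of the producer-consumer idea (NOT Kafka itself):
// a producer thread publishes records to a queue ("topic") and a
// consumer drains them, preserving publication order.
public class PubSubSketch {

    static List<String> run(String... records) throws InterruptedException {
        BlockingQueue<String> topic = new LinkedBlockingQueue<>();

        // Producer: publish each record in order.
        Thread producer = new Thread(() -> {
            for (String record : records) {
                topic.add(record);
            }
        });
        producer.start();
        producer.join(); // wait until everything is published

        // Consumer: drain the queue; arrival order is preserved.
        List<String> consumed = new ArrayList<>();
        while (!topic.isEmpty()) {
            consumed.add(topic.take());
        }
        return consumed;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run("r1", "r2", "r3")); // [r1, r2, r3]
    }
}
```

Kafka works the same way at a much larger scale: producers append records to partitioned, replicated topics, and consumers read them back in order per partition.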
1. Download the Kafka binary from this link - https://kafka.apache.org/downloads
2. After downloading, extract the archive (e.g., tar -xzf <DOWNLOADED_FILE>.tgz).
3. Start the ZooKeeper service
<KAFKA_FOLDER>/bin/zookeeper-server-start.sh <KAFKA_FOLDER>/config/zookeeper.properties
ZooKeeper acts as a centralized service: it maintains naming and configuration data and keeps track of Kafka cluster nodes, topics, partitions, etc.
4. Start the Broker Service
<KAFKA_FOLDER>/bin/kafka-server-start.sh <KAFKA_FOLDER>/config/server.properties
5. Create a Kafka topic
<KAFKA_FOLDER>/bin/kafka-topics.sh --create --topic KafkaTopic --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
6. Create a new Maven project with the following dependencies (the spring-kafka version is omitted here; it is managed for you if the project uses the spring-boot-starter-parent, otherwise add an explicit version):
<dependencies>
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<version>2.5.5</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<version>2.5.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.22</version>
<scope>provided</scope>
</dependency>
</dependencies>
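The steps below also assume some Spring Boot configuration: the curl command in a later step targets port 9909, and the application needs to know where the broker is. A minimal application.properties sketch, assuming a single local broker (adjust to your setup):

```properties
# Hypothetical minimal configuration for this tutorial's setup
server.port=9909
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=group_id
# read the topic from the start when no committed offset exists
spring.kafka.consumer.auto-offset-reset=earliest
```

Spring Boot's Kafka auto-configuration defaults to String serializers/deserializers, which matches the KafkaTemplate<String, String> used below.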
7. To publish the message, we need a KafkaTemplate. Let us autowire it and use it inside our API:
@Autowired
KafkaTemplate<String, String> kafkaTemplate;
private final String topicName = "KafkaTopic";
@PostMapping("/publish_message_v1/{message}")
public String publishMessageV1(@PathVariable String message){
kafkaTemplate.send(topicName, message);
return "Published V1 Message Successfully!";
}
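Note that send() is asynchronous: the method above returns before the broker has acknowledged the record. With the spring-kafka version pulled in by the Spring Boot 2.5 line, send() returns a ListenableFuture, so a callback can confirm delivery. A hedged sketch (the V2 endpoint name and log messages are illustrative, not from the tutorial):

```java
@PostMapping("/publish_message_v2/{message}")
public String publishMessageV2(@PathVariable String message) {
    // send() is async; register callbacks to observe success/failure
    kafkaTemplate.send(topicName, message)
        .addCallback(
            result -> System.out.println("Delivered to partition "
                    + result.getRecordMetadata().partition()),
            ex -> System.err.println("Publish failed: " + ex.getMessage()));
    return "Published V2 Message Asynchronously!";
}
```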
8. To consume the message, we need to listen to the topic. We can achieve this with the help of @KafkaListener:
@KafkaListener(topics = "KafkaTopic", groupId = "group_id")
public void consumeMessageV1(String message) {
System.out.println("Consumed the message : " + message);
}
9. Annotate the consumer class with @Service, and add @EnableKafka to a @Configuration class (e.g., the main application class) to keep the subscriber/consumer listening for published messages.
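Putting step 9 together, the two classes might look like this (class names are illustrative):

```java
// KafkaConsumerService.java
@Service
public class KafkaConsumerService {

    @KafkaListener(topics = "KafkaTopic", groupId = "group_id")
    public void consumeMessageV1(String message) {
        System.out.println("Consumed the message : " + message);
    }
}

// KafkaExampleApplication.java
@EnableKafka
@SpringBootApplication
public class KafkaExampleApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaExampleApplication.class, args);
    }
}
```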
10. Now we can start the Spring Boot application with the following command:
mvn clean install spring-boot:run
11. Now call the following API and check the logs.
curl --location --request POST 'http://localhost:9909/api/publish_message_v1/PUBLISH_MESSAGE_V1'
12. Now we can see the message getting published and consumed. To cross-check, we can run the following scripts in separate terminals and watch the messages flow through the topic:
* To publish a message manually, execute the following command and type the message:
<KAFKA_FOLDER>/bin/kafka-console-producer.sh --topic KafkaTopic --bootstrap-server localhost:9092
> PUBLISH_MESSAGE_V1
* To consume the published messages manually, execute the following command and call the API in step 11:
<KAFKA_FOLDER>/bin/kafka-console-consumer.sh --topic KafkaTopic --from-beginning --bootstrap-server localhost:9092