Implementing Request Response in Java Apache Kafka

Last Updated : 16 Jul, 2024

Apache Kafka is a powerful distributed event streaming platform used for building real-time pipelines and streaming applications. It is highly scalable, fault-tolerant, and provides high throughput. One common pattern in Kafka applications is the request-response pattern, which is slightly unconventional because Kafka is designed for asynchronous messaging. Even so, we can implement request-response communication between services using Kafka.

In the request-response pattern, a requesting service publishes a request message, and a responding service processes the request and publishes a response message back. In Kafka, this can be achieved by using separate topics for requests and responses, along with correlation IDs to match each response to its corresponding request.
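The correlation-ID bookkeeping can be sketched without any Kafka dependency. The class below (CorrelationRegistry is an illustrative name, not part of the article's code or any library) keeps a map from correlation ID to a CompletableFuture; in a real application, complete(...) would be called from the consumer's poll loop when a response record with that ID arrives.

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Minimal sketch of matching responses to outstanding requests
 * by correlation ID. Kafka wiring is intentionally omitted.
 */
class CorrelationRegistry {

    private final Map<String, CompletableFuture<String>> pending = new ConcurrentHashMap<>();

    /** Registers a new request and returns a future for its response. */
    CompletableFuture<String> register(String correlationId) {
        CompletableFuture<String> future = new CompletableFuture<>();
        pending.put(correlationId, future);
        return future;
    }

    /** Completes the pending future for the given correlation ID, if any. */
    void complete(String correlationId, String response) {
        CompletableFuture<String> future = pending.remove(correlationId);
        if (future != null) {
            future.complete(response);
        }
    }

    /** Generates a fresh correlation ID, as the article does with UUIDs. */
    static String newCorrelationId() {
        return UUID.randomUUID().toString();
    }
}
```

A requester would call register(id) before sending, then block or attach a callback on the returned future.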

Prerequisites:

  • Basic understanding of Java and Apache Kafka.
  • JDK and IntelliJ IDEA installed on your local system.
  • Maven for build and dependency management.
  • The Kafka client library.

Implementing Request Response in Java Apache Kafka

Step 1: Create a Maven Project

Create a new Java Maven project using IntelliJ IDEA, then add the following dependencies to the project.

Note: To install and set up Apache Kafka, refer to this link.

Required Dependencies:

<!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>3.7.0</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-api -->
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>2.0.13</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-simple -->
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-simple</artifactId>
    <version>2.0.13</version>
    <scope>test</scope>
</dependency>


After creating the project, the folder structure in the IDE will look like the below image:

Folder Structure


Step 2: Add the Dependencies into the pom.xml file

Open the pom.xml and add the required dependencies to the project.

XML
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.gfg</groupId>
    <artifactId>kafka-request-response</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>17</maven.compiler.source>
        <maven.compiler.target>17</maven.compiler.target>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>

        <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>3.7.0</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-api -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>2.0.13</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-simple -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>2.0.13</version>
            <scope>test</scope>
        </dependency>

    </dependencies>
</project>


Step 3: Create the KafkaProducerService Class

Java
package com.gfg;

import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

/**
 * Service class for producing messages to Kafka topics.
 */
public class KafkaProducerService {

    private final KafkaProducer<String, String> producer;

    /**
     * Constructor to initialize the Kafka producer with the provided bootstrap servers.
     *
     * @param bootstrapServers The list of Kafka bootstrap servers.
     */
    public KafkaProducerService(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        this.producer = new KafkaProducer<>(props);
    }

    /**
     * Sends a message to the specified Kafka topic.
     *
     * @param topic The topic to which the message will be sent.
     * @param key   The key for the message.
     * @param value The value of the message.
     */
    public void sendMessage(String topic, String key, String value) {
        ProducerRecord<String, String> record = new ProducerRecord<>(topic, key, value);
        producer.send(record, new Callback() {
            public void onCompletion(RecordMetadata metadata, Exception exception) {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.println("Sent message to topic: " + metadata.topic()
                            + " partition: " + metadata.partition()
                            + " offset: " + metadata.offset());
                }
            }
        });
    }

    /**
     * Closes the Kafka producer.
     */
    public void close() {
        producer.close();
    }
}

The KafkaProducerService class manages a Kafka Producer instance configured with bootstrap servers and serializers for keys and values. It sends messages asynchronously to specified topics, handles delivery callbacks, and provides a method to close the producer cleanly.
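Because producer.send(...) reports its result through a callback, callers that want to wait for the broker acknowledgment need an adapter. The sketch below (an assumption for illustration: AsyncSender and FutureSendAdapter are stand-in names, not part of the article or the Kafka API) shows the general technique of wrapping a callback-style send in a CompletableFuture, without depending on kafka-clients.

```java
import java.util.concurrent.CompletableFuture;
import java.util.function.BiConsumer;

/**
 * Sketch: adapting a callback-style send (shaped like KafkaProducer's
 * Callback, which receives metadata-or-exception) into a CompletableFuture.
 */
class FutureSendAdapter {

    /** Stand-in for an API that reports completion via a callback. */
    interface AsyncSender {
        void send(String value, BiConsumer<String, Exception> callback);
    }

    /** Returns a future that completes when the callback fires. */
    static CompletableFuture<String> sendAsync(AsyncSender sender, String value) {
        CompletableFuture<String> result = new CompletableFuture<>();
        sender.send(value, (metadata, exception) -> {
            if (exception != null) {
                result.completeExceptionally(exception);
            } else {
                result.complete(metadata);
            }
        });
        return result;
    }
}
```

With the real producer, the same shape lets a requester call sendAsync(...).join() to confirm delivery before waiting for the response.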

Step 4: Create the KafkaConsumerService Class

Java
package com.gfg;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

/**
 * Service class to manage a Kafka consumer instance.
 */
public class KafkaConsumerService {

    private final KafkaConsumer<String, String> consumer;

    /**
     * Constructor to initialize the Kafka consumer with configuration.
     *
     * @param bootstrapServers Kafka server addresses.
     * @param groupId          Consumer group ID.
     * @param topic            Topic to subscribe to.
     */
    public KafkaConsumerService(String bootstrapServers, String groupId, String topic) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        this.consumer = new KafkaConsumer<>(props);
        this.consumer.subscribe(Collections.singletonList(topic));
    }

    /**
     * Polls messages from the Kafka topic indefinitely.
     */
    public void pollMessages() {
        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println("Received message: " + record.value() + " from topic: " + record.topic());
            }
        }
    }

    /**
     * Closes the Kafka consumer.
     */
    public void close() {
        consumer.close();
    }
}
  • KafkaConsumerService initializes a Kafka consumer with specified configuration (bootstrap servers, group ID, topic).
  • It continuously polls messages from the subscribed Kafka topic using a loop.
  • The class provides a method to close the Kafka consumer instance when no longer needed.
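Note that pollMessages() above never returns, so close() is unreachable from the same thread. A common fix (a sketch, not the article's code) is a "running" flag checked on each iteration; with a real KafkaConsumer you would additionally call consumer.wakeup() from another thread to interrupt a blocked poll. Here a BlockingQueue stands in for the consumer so the shutdown logic itself can be shown and tested.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

/**
 * Sketch of a stoppable poll loop: the loop exits when stop() is called,
 * because each poll waits at most 100 ms before rechecking the flag.
 */
class StoppablePoller {

    private final AtomicBoolean running = new AtomicBoolean(true);
    private final List<String> received = new ArrayList<>();

    /** Polls until stop() is called. */
    void pollMessages(BlockingQueue<String> source) throws InterruptedException {
        while (running.get()) {
            String record = source.poll(100, TimeUnit.MILLISECONDS);
            if (record != null) {
                received.add(record);
            }
        }
    }

    /** Signals the poll loop to exit after its current iteration. */
    void stop() {
        running.set(false);
    }

    List<String> received() {
        return received;
    }
}
```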

Step 5: Create the RequestResponseService Class

Java
package com.gfg;

import java.util.UUID;

/**
 * Service class for handling the request-response pattern using Kafka.
 */
public class RequestResponseService {

    private final KafkaProducerService producerService;
    private final KafkaConsumerService consumerService;
    private final String requestTopic;
    private final String responseTopic;

    /**
     * Constructs a RequestResponseService with the specified Kafka bootstrap servers and topics.
     *
     * @param bootstrapServers Kafka bootstrap servers to connect to.
     * @param requestTopic     Topic for sending requests.
     * @param responseTopic    Topic for receiving responses.
     */
    public RequestResponseService(String bootstrapServers, String requestTopic, String responseTopic) {
        this.producerService = new KafkaProducerService(bootstrapServers);
        this.consumerService = new KafkaConsumerService(bootstrapServers, "response-group", responseTopic);
        this.requestTopic = requestTopic;
        this.responseTopic = responseTopic;
    }

    /**
     * Sends a request message to the Kafka topic specified by requestTopic.
     *
     * @param message Message to send as a request.
     */
    public void sendRequest(String message) {
        String correlationId = UUID.randomUUID().toString();
        producerService.sendMessage(requestTopic, correlationId, message);
        System.out.println("Sent request with correlationId: " + correlationId);
        // In a real-world scenario, implement a more robust correlation ID handling mechanism.
    }

    /**
     * Starts listening for responses from the Kafka response topic.
     */
    public void receiveResponse() {
        consumerService.pollMessages();
    }

    /**
     * Closes the producer and consumer services.
     */
    public void close() {
        producerService.close();
        consumerService.close();
    }
}
  • It manages the Kafka producer and consumer services for implementing the request-response pattern.
  • Sends requests with unique correlation IDs to a specified topic and prints the ID.
  • Receives and prints responses from a designated Kafka topic indefinitely.
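The classes above implement only the requesting side. The responding side, which the article does not show, would poll request-topic, apply some handler, and produce the result to response-topic using the request's key (the correlation ID) as the response key. Below is a Kafka-free sketch of that core logic (Responder is an illustrative name; in-memory queues of key/value pairs stand in for the two topics).

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;
import java.util.Queue;
import java.util.function.Function;

/**
 * Sketch of the responder: drains pending requests, applies a handler,
 * and enqueues each response under the same correlation-ID key.
 */
class Responder {

    private final Function<String, String> handler;

    Responder(Function<String, String> handler) {
        this.handler = handler;
    }

    /** Processes all pending requests into responses, preserving keys. */
    void process(Queue<Map.Entry<String, String>> requestTopic,
                 Queue<Map.Entry<String, String>> responseTopic) {
        Map.Entry<String, String> record;
        while ((record = requestTopic.poll()) != null) {
            String correlationId = record.getKey();
            String response = handler.apply(record.getValue());
            responseTopic.add(new SimpleEntry<>(correlationId, response));
        }
    }
}
```

Reusing the request key as the response key is what lets the requester's consumer match each response back to its original request.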

Step 6: Main Class

Java
package com.gfg;

public class MainApplication {

    public static void main(String[] args) {
        String bootstrapServers = "localhost:9092";
        String requestTopic = "request-topic";
        String responseTopic = "response-topic";

        RequestResponseService service = new RequestResponseService(bootstrapServers, requestTopic, responseTopic);

        // Close services on shutdown. Register the hook before polling,
        // because receiveResponse() blocks indefinitely.
        Runtime.getRuntime().addShutdownHook(new Thread(service::close));

        // Send a request
        service.sendRequest("Hello, Kafka!");

        // Start receiving responses
        service.receiveResponse();
    }
}

Step 7: Run the application

Output:

Console Output

This example project demonstrates the basic setup and implementation of the request and response pattern in Apache Kafka.

Author: maheshdaxt
