Publish Message to Kafka Using Java

Last Updated : 08 Jul, 2024

Apache Kafka is among the most popular message brokers used in the industry. Kafka follows a publish-subscribe model and can be used to establish asynchronous communication between microservices. In this article, we will discuss how to publish messages to a Kafka topic using a Java application.

Some key terms used in the context of Kafka:

  • Topic: The named channel in Kafka to which messages are published and from which they are consumed.
  • Producer: The application that publishes messages to a topic. In our case, the Java application acts as the producer and is responsible for publishing messages to the topic.
  • Consumer: The application that reads/subscribes to messages from a Kafka topic. We will use the Kafka-provided console consumer to read messages from the topic.

Creating Java Application to Publish Messages in Kafka

Requirements:

  • JDK 17 or above
  • IntelliJ IDEA or another IDE of your choice
  • Apache Kafka

Step 1: Spring Boot Application Initialization

Create a Spring Boot application from Spring Initializr with the configuration described below.

Add two dependencies by clicking the Add Dependencies button:

  • Spring Web
  • Spring for Apache Kafka
[Screenshot: Spring Initializr configuration]

pom.xml file

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.3.0</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.kafkaDemo</groupId>
    <artifactId>KafkaDemo</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>KafkaDemo</name>
    <description>Publish Message to Kafka using Java</description>
    <properties>
        <java.version>17</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>

Step 2: Create User.java file

We will create a POJO class named User and publish objects of this class to the Kafka topic. We will accept the object from the client side by exposing an API endpoint.

User.java:

package com.kafkaDemo.KafkaDemo.models;

import java.io.Serializable;

public class User implements Serializable {

    private String name;
    private String email;
    private Integer age;

    public User(String name, String email, Integer age) {
        this.name = name;
        this.email = email;
        this.age = age;
    }

    public User() {
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public String getEmail() {
        return email;
    }

    public void setEmail(String email) {
        this.email = email;
    }

    public Integer getAge() {
        return age;
    }

    public void setAge(Integer age) {
        this.age = age;
    }
}
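Since the producer configured in the next step uses a JSON serializer for the value, each User object is written to the topic as a JSON document. As a rough illustration of the payload shape (field order may differ depending on the serializer), the hypothetical helper below hand-builds the equivalent JSON:

```java
public class UserJsonSketch {
    // Hypothetical helper: hand-builds the JSON that a JSON serializer
    // would roughly produce for a User(name, email, age).
    static String toJson(String name, String email, int age) {
        return String.format("{\"name\":\"%s\",\"email\":\"%s\",\"age\":%d}",
                name, email, age);
    }

    public static void main(String[] args) {
        System.out.println(toJson("Alice", "alice@example.com", 30));
        // prints {"name":"Alice","email":"alice@example.com","age":30}
    }
}
```

This is only a sketch of the wire format; in the application itself, serialization is handled entirely by the serializer configured in Step 3.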

Step 3: Configuring KafkaTemplate Bean

The Spring Kafka dependency provides the KafkaTemplate bean to publish messages to Kafka. It comes with a pre-configured KafkaTemplate<String, String> bean, and it also lets us define our own typed KafkaTemplate beans. We will configure a KafkaTemplate<String, User> bean. To do so, we will:

  1. Create a KafkaProducerConfig.java file.
  2. Configure the ProducerFactory and KafkaTemplate beans in it.

Add the code given below to KafkaProducerConfig.java.

KafkaProducerConfig.java:

import com.kafkaDemo.KafkaDemo.models.User;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

import java.util.HashMap;
import java.util.List;
import java.util.Map;

@Configuration
public class KafkaProducerConfig {

    @Value("${spring.kafka.bootstrap-servers}")
    private List<String> bootstrapServers;

    @Bean("userProducerFactory")
    public ProducerFactory<String, User> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean("userKafkaTemplate")
    public KafkaTemplate<String, User> userKafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

Step 4: Adding properties in application.properties

Add the properties below to the application.properties file. You can provide a list of Kafka broker addresses in the spring.kafka.bootstrap-servers property in comma-separated format.

spring.application.name=KafkaDemo
spring.kafka.bootstrap-servers=localhost:9092
server.port=7070
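For example, if you run a multi-broker cluster, the bootstrap servers can be listed comma-separated (the host names below are illustrative, not part of this setup):

```properties
# Hypothetical three-broker cluster
spring.kafka.bootstrap-servers=broker1:9092,broker2:9092,broker3:9092
```

The client only needs to reach one of these brokers to discover the rest of the cluster.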

Step 5: Creating a REST Controller

Create a KafkaController.java file and add the code below to it.

KafkaController.java:

import com.kafkaDemo.KafkaDemo.models.User;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/kafka")
public class KafkaController {

    private final KafkaTemplate<String, User> kafkaTemplate;

    @Autowired
    public KafkaController(@Qualifier("userKafkaTemplate") KafkaTemplate<String, User> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping("/publish")
    public ResponseEntity<String> publish(@RequestBody User user) {
        if (user == null) {
            return ResponseEntity.badRequest().build();
        }
        kafkaTemplate.send("userTopic", user.getEmail(), user);
        return ResponseEntity.accepted().body("Message sent to be published");
    }
}
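Once the application is running on port 7070, the endpoint can be exercised with Postman or any HTTP client. A request body matching the User model looks like this (the values are illustrative):

```json
{
  "name": "Alice",
  "email": "alice@example.com",
  "age": 30
}
```

For example, with curl: `curl -X POST http://localhost:7070/api/kafka/publish -H "Content-Type: application/json" -d '{"name":"Alice","email":"alice@example.com","age":30}'`. Note that the controller uses the user's email as the message key, so all messages for the same user land on the same partition.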

Step 6: Creating a topic named userTopic in Kafka

Open a terminal in the bin folder of your Kafka installation and run the command given below. On Linux or macOS, run the shell script (.sh) instead of the Windows batch file.

.\kafka-topics.bat --create --topic userTopic --bootstrap-server localhost:9092

On Linux/macOS:

./kafka-topics.sh --create --topic userTopic --bootstrap-server localhost:9092

Step 7: Starting Apache ZooKeeper, Kafka Broker and Consumer

  • First, start ZooKeeper with the command below (on Linux/macOS, use the corresponding .sh scripts):
.\zookeeper-server-start.bat ..\..\config\zookeeper.properties
  • Next, start the Kafka broker. By default, the broker listens on port 9092:
.\kafka-server-start.bat ..\..\config\server.properties
  • Then, start the console-based consumer provided by Kafka:
.\kafka-console-consumer.bat --topic userTopic --bootstrap-server localhost:9092

Step 8: Output Check

Finally, start the Java application, hit the "/publish" endpoint, and check the output in the consumer.

[Screenshot: Application started on localhost:7070]
[Screenshot: Hitting the /publish endpoint from Postman]
[Screenshot: Message being consumed by the console consumer]

Conclusion

In conclusion, we've walked through the steps of publishing messages to a Kafka topic using a Java application, specifically with the Spring Boot framework. We started by setting up our development environment and dependencies, then created a User model class and configured a KafkaTemplate bean to handle message publishing. We also added necessary properties to the application configuration, built a REST controller to expose an API endpoint for publishing messages, and finally, created and tested a Kafka topic. This comprehensive guide demonstrates how to effectively use Kafka for asynchronous communication between microservices, leveraging Java's robust ecosystem.


Author: jhaashish03
Article Tags: DevOps, Apache Kafka