
Apache Kafka - Create Producer with Keys using Java

Last Updated : 24 Apr, 2025

Apache Kafka is a publish-subscribe messaging system. A messaging system lets you send messages between processes, applications, and servers. In Kafka, topics (a topic is roughly a category) can be defined and processed. Read more on Kafka here: What is Apache Kafka and How Does it Work. Kafka producers write data to topics, and topics are made of partitions. Producers automatically determine which broker and partition to write to based on the message, and if a Kafka broker in your cluster fails, producers automatically recover from it. This resilience is a large part of what makes Kafka so widely used today. In this article, we discuss the step-by-step implementation of how to create an Apache Kafka producer with keys using Java.

Step-by-Step Implementation

Step 1: Create a New Apache Kafka Project in IntelliJ

To create a new Apache Kafka Project in IntelliJ using Java and Maven please refer to How to Create an Apache Kafka Project in IntelliJ using Java and Maven.
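Whichever way you set the project up, the Kafka client library must be on the classpath. A minimal Maven dependency sketch (the version shown is an assumption taken from the log output later in this article; use the version matching your broker):

```xml
<!-- pom.xml: Kafka Java client (version is an assumption; pick yours) -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.8.0</version>
</dependency>
```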

Step 2: Install and Run Apache Kafka

To Install and Run Apache Kafka in your local system please refer to How to Install and Run Apache Kafka.

Step 3: Create Producer with Keys

First, we have to create the producer properties. Refer to the code snippet below.

Create Producer Properties:

Properties properties = new Properties();
properties.setProperty(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServer);
properties.setProperty(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
properties.setProperty(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

Create the Producer:

KafkaProducer<String, String> producer = new KafkaProducer<>(properties);

Create a Producer Record with Key:

String topic = "gfg_topic";
String value = "hello_geeksforgeeks " + i;
String key = "id_" + i;

ProducerRecord<String, String> record =
        new ProducerRecord<>(topic, key, value);
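The point of the key is that records with the same key always land on the same partition, which preserves per-key ordering. Kafka's default partitioner actually computes a murmur2 hash over the serialized key bytes modulo the partition count; the sketch below illustrates the same idea with plain Java's `String.hashCode()` as a stand-in (the class name and hash choice are illustrative assumptions, not Kafka's real implementation):

```java
import java.util.Arrays;

public class KeyPartitionSketch {
    // Illustrative stand-in for Kafka's default partitioner: Kafka really
    // uses murmur2 over the serialized key bytes, not String.hashCode().
    static int partitionFor(String key, int numPartitions) {
        // Mask off the sign bit so the result is always non-negative
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int numPartitions = 3;
        for (String key : Arrays.asList("id_0", "id_1", "id_0", "id_1")) {
            // The same key always maps to the same partition
            System.out.println(key + " -> partition " + partitionFor(key, numPartitions));
        }
    }
}
```

Note that this sticky key-to-partition mapping only holds while the partition count stays the same; adding partitions later changes where keys land.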

Java Producer with Callback:

producer.send(record, (recordMetadata, e) -> {
    // Executes every time a record is successfully sent
    // or an exception is thrown
    if (e == null) {
        logger.info("Received new metadata. \n" +
                "Topic: " + recordMetadata.topic() + "\n" +
                "Partition: " + recordMetadata.partition() + "\n" +
                "Offset: " + recordMetadata.offset() + "\n");
    } else {
        logger.error("Error while producing ", e);
    }
}).get(); // Block on .send() to make it synchronous
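The pattern here is worth noting: `send()` is asynchronous and returns a `Future`, the callback fires later on the producer's I/O thread, and calling `.get()` on the future blocks until the send completes. A self-contained sketch of that same callback-plus-future pattern using plain `CompletableFuture` (the `send` method and "metadata" string below are illustrative stand-ins, not the Kafka API):

```java
import java.util.concurrent.CompletableFuture;
import java.util.function.BiConsumer;

public class AsyncSendSketch {
    // Illustrative stand-in for producer.send(record, callback): the work
    // completes on another thread, then the callback is invoked with either
    // a result or an exception (never both).
    static CompletableFuture<String> send(String record,
                                          BiConsumer<String, Throwable> callback) {
        CompletableFuture<String> future =
                CompletableFuture.supplyAsync(() -> "metadata-for:" + record);
        future.whenComplete(callback);
        return future;
    }

    public static void main(String[] args) {
        // Asynchronous use: the callback logs success or failure
        CompletableFuture<String> f = send("hello", (metadata, e) -> {
            if (e == null) {
                System.out.println("Received " + metadata);
            } else {
                System.err.println("Error while producing: " + e);
            }
        });
        // Blocking on the future makes the send effectively synchronous,
        // just like producer.send(...).get() in the Kafka snippet above
        System.out.println("Synchronous result: " + f.join());
    }
}
```

In production code you would normally not block on every send, since doing so removes Kafka's ability to batch records; the tutorial blocks only to make the log output deterministic.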

Flush and Close the Producer:

producer.flush();
producer.close();

Below is the complete code, with comments explaining each step.

Java
package org.kafkademo.basics;

import org.apache.kafka.clients.producer.*;
import org.apache.kafka.common.serialization.StringSerializer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.util.Properties;
import java.util.concurrent.ExecutionException;

public class KafkaProducerWithKeyDemo {
    public static void main(String[] args) throws ExecutionException, InterruptedException {

        Logger logger = LoggerFactory.getLogger(KafkaProducerWithKeyDemo.class);

        String bootstrapServer = "127.0.0.1:9092";

        // Create Producer Properties
        Properties properties = new Properties();
        properties.setProperty(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServer);
        properties.setProperty(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        properties.setProperty(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Create the Producer
        KafkaProducer<String, String> producer = new KafkaProducer<>(properties);

        for (int i = 0; i < 10; i++) {

            String topic = "gfg_topic";
            String value = "hello_geeksforgeeks " + i;
            String key = "id_" + i;

            // Log the Key
            logger.info("Key: " + key);

            // Create a Producer Record with Key
            ProducerRecord<String, String> record =
                    new ProducerRecord<>(topic, key, value);

            // Send with Callback
            producer.send(record, (recordMetadata, e) -> {
                // Executes every time a record is successfully sent
                // or an exception is thrown
                if (e == null) {
                    logger.info("Received new metadata. \n" +
                            "Topic: " + recordMetadata.topic() + "\n" +
                            "Partition: " + recordMetadata.partition() + "\n" +
                            "Offset: " + recordMetadata.offset() + "\n");
                } else {
                    logger.error("Error while producing ", e);
                }
            }).get(); // Block on .send() to make it synchronous
        }

        // Flush and Close the Producer
        producer.flush();
        producer.close();
    }
}

Step 4: Run the Application

Now run the application. The output is shown below.

"C:\Users\Amiya Rout\.jdks\corretto-11.0.15\bin\java.exe" "-javaagent:
[main] INFO org.apache.kafka.clients.producer.ProducerConfig - ProducerConfig values:
    acks = 1
    batch.size = 16384
    bootstrap.servers = [127.0.0.1:9092]
    buffer.memory = 33554432
    client.dns.lookup = use_all_dns_ips
    client.id = producer-1
    compression.type = none
    connections.max.idle.ms = 540000
    delivery.timeout.ms = 120000
    enable.idempotence = false
    interceptor.classes = []
    internal.auto.downgrade.txn.commit = false
    key.serializer = class org.apache.kafka.common.serialization.StringSerializer
    linger.ms = 0
    max.block.ms = 60000
    max.in.flight.requests.per.connection = 5
    max.request.size = 1048576
    metadata.max.age.ms = 300000
    metadata.max.idle.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retries = 2147483647
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    security.providers = null
    send.buffer.bytes = 131072
    socket.connection.setup.timeout.max.ms = 30000
    socket.connection.setup.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
    ssl.endpoint.identification.algorithm = https
    ssl.engine.factory.class = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.certificate.chain = null
    ssl.keystore.key = null
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLSv1.3
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.certificates = null
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    transaction.timeout.ms = 60000
    transactional.id = null
    value.serializer = class org.apache.kafka.common.serialization.StringSerializer

[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version: 2.8.0
[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId: ebb1d6e21cc92130
[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka startTimeMs: 1674837796602
[main] INFO org.kafkademo.basics.KafkaProducerWithKeyDemo - Key: id_0
[kafka-producer-network-thread | producer-1] INFO org.apache.kafka.clients.Metadata - [Producer clientId=producer-1] Cluster ID: orhF-HNsR465cORhmU3pTg
[kafka-producer-network-thread | producer-1] INFO org.kafkademo.basics.KafkaProducerWithKeyDemo - Received new metadata.
Topic: gfg_topic
Partition: 0
Offset: 0

[main] INFO org.kafkademo.basics.KafkaProducerWithKeyDemo - Key: id_1
[kafka-producer-network-thread | producer-1] INFO org.kafkademo.basics.KafkaProducerWithKeyDemo - Received new metadata.
Topic: gfg_topic
Partition: 0
Offset: 0

[similar "Key: id_N" / "Received new metadata." log lines follow for keys id_2 through id_9]

[main] INFO org.apache.kafka.clients.producer.KafkaProducer - [Producer clientId=producer-1] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
[main] INFO org.apache.kafka.common.metrics.Metrics - Metrics scheduler closed
[main] INFO org.apache.kafka.common.metrics.Metrics - Closing reporter org.apache.kafka.common.metrics.JmxReporter
[main] INFO org.apache.kafka.common.metrics.Metrics - Metrics reporters closed
[main] INFO org.apache.kafka.common.utils.AppInfoParser - App info kafka.producer for producer-1 unregistered

Process finished with exit code 0

So in the console output, you can see that the key is now logged for each record.

You can also see the messages in the Kafka console consumer. Run this command:

kafka-console-consumer --bootstrap-server 127.0.0.1:9092 --topic gfg_topic --group gfg-group
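To verify the keys on the consumer side as well, the console consumer can print them alongside the values via its standard `--property` options (the command name may be `kafka-console-consumer.sh` depending on your installation; the separator string is an arbitrary choice):

```shell
kafka-console-consumer --bootstrap-server 127.0.0.1:9092 \
  --topic gfg_topic \
  --group gfg-group \
  --from-beginning \
  --property print.key=true \
  --property key.separator=": "
```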


AmiyaRanjanRout
