Kafka Consumer: Send Message to a Dead Letter Topic

In this tutorial, I’m going to show you how to set up a Kafka Consumer in a Spring Boot microservice so that it sends messages to a dead letter topic when something goes wrong. If you’re new to Kafka or Spring Boot, don’t worry – I’ll walk you through every step and explain everything clearly.

A dead letter topic is a special Kafka topic where messages that can’t be processed are sent. This is really useful because it lets you separate out problematic messages and deal with them later, without stopping or breaking your main message processing flow.

By the end of this tutorial, you’ll understand how to configure this setup in your own projects. I’ll provide code examples and explain what each part does, so you can easily follow along and apply this knowledge to your work. Ready to get started? Let’s move on to setting up our Kafka Consumer.

If you are interested in video lessons, then check out my video course Apache Kafka for Event-Driven Spring Boot Microservices.

What is a Dead Letter Topic?

A dead letter topic is a Kafka topic where messages that couldn’t be processed due to errors are sent. It’s a standard practice for handling message failures in Kafka, allowing you to isolate and investigate problematic messages without impacting the main message flow.

Dead Letter Topic Naming

You can create a separate topic to be used as a Dead Letter Topic (DLT), or one can be created for you automatically if it does not exist yet (provided your Kafka broker allows automatic topic creation). By default, the DLT name is the original topic name with a “.DLT” suffix appended. For example, if the topic you are working with is called “product-created-events-topic”, then the automatically created DLT for it will be “product-created-events-topic.DLT”.
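
If the default naming doesn’t suit you, the DeadLetterPublishingRecoverer accepts a destination resolver that lets you choose the target topic yourself. Below is a minimal sketch, assuming you want a “-dlt” suffix instead of “.DLT” (the suffix is just an example); kafkaTemplate is the same KafkaTemplate bean we will configure later in this tutorial, and TopicPartition comes from org.apache.kafka.common:

// Route failed records to "<original-topic>-dlt" on the same partition,
// instead of the default "<original-topic>.DLT"
DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(kafkaTemplate,
        (record, exception) -> new TopicPartition(record.topic() + "-dlt", record.partition()));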

Alright, let’s continue and configure our application to send failed messages to a dead letter topic.

Setting up Kafka Consumer

In a Kafka system, the Consumer is responsible for reading messages from one or more Kafka topics. In our case, the Kafka Consumer will be part of a Spring Boot application. Its job is to consume messages and, if any issues occur during processing, to send those messages to a dead letter topic.

To continue with this tutorial, you will need to have a Kafka Consumer created. If you’re not familiar with how to create a Kafka Consumer in Spring Boot, please read this detailed tutorial first: Kafka Consumer in Spring Boot. It covers everything you need to know to get a Kafka Consumer up and running.

Once you have a basic Kafka Consumer set up, come back here to continue with the next steps.
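
For reference, here is a minimal sketch of what such a consumer might look like. The topic name, group id, and ProductCreatedEvent class are placeholders, so substitute your own:

@Component
public class ProductCreatedEventHandler {

    // Any exception thrown here will be picked up by the error handler
    // we configure next, and the failed record will be routed to the DLT
    @KafkaListener(topics = "product-created-events-topic", groupId = "product-created-events")
    public void handle(ProductCreatedEvent event) {
        if (event.getTitle() == null) {
            throw new RuntimeException("Product title is missing");
        }
        // normal processing goes here
    }
}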

Configuring the Kafka Consumer for Error Handling

To set up this configuration, you’ll work within the Kafka Consumer configuration class. Specifically, you’ll modify the kafkaListenerContainerFactory method to include instances of DefaultErrorHandler and DeadLetterPublishingRecoverer. Here’s how it’s done:

@Bean
ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory(
        ConsumerFactory<String, Object> consumerFactory, KafkaTemplate<String, Object> kafkaTemplate) {

    // Publish records that fail processing to the dead letter topic
    DefaultErrorHandler errorHandler = new DefaultErrorHandler(
        new DeadLetterPublishingRecoverer(kafkaTemplate));

    ConcurrentKafkaListenerContainerFactory<String, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    factory.setCommonErrorHandler(errorHandler);

    return factory;
}

Explaining the code:

  • DefaultErrorHandler: This class provides error handling capabilities for Kafka consumers. It’s used to manage exceptions that occur during message consumption.
  • DeadLetterPublishingRecoverer: This component is essential for directing failed messages to a dead letter topic. It requires a KafkaTemplate object, which is responsible for sending messages to Kafka topics. I will create the KafkaTemplate object in the next section of this tutorial.
  • new DeadLetterPublishingRecoverer(kafkaTemplate): Here, the DeadLetterPublishingRecoverer is instantiated with the kafkaTemplate, enabling it to send failed messages to the dead letter topic.
  • factory.setCommonErrorHandler(errorHandler): This line registers the errorHandler with the listener container factory, ensuring that it’s used for handling errors during message consumption.

How it works:

When an error occurs during message consumption, the DefaultErrorHandler uses the DeadLetterPublishingRecoverer to send the problematic message to a dead letter topic. This happens automatically, without manual intervention.
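
Out of the box, the DefaultErrorHandler makes a few quick retry attempts before handing the record to the recoverer. If you want explicit control over retry behavior, you can pass a BackOff as a second constructor argument. Here is a minimal sketch, assuming two retries spaced one second apart (the values are just examples); FixedBackOff comes from org.springframework.util.backoff:

// Retry a failed record twice, one second apart, before publishing it to the DLT
DefaultErrorHandler errorHandler = new DefaultErrorHandler(
    new DeadLetterPublishingRecoverer(kafkaTemplate),
    new FixedBackOff(1000L, 2));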

Kafka Template Configuration in Error Handling

To complete our setup for handling errors in Kafka Consumers, you need to create a KafkaTemplate object.

First, you need to create a KafkaTemplate bean and a ProducerFactory bean. The ProducerFactory is responsible for creating Kafka Producers, which are needed by the KafkaTemplate to send messages. Here’s the code for these configurations:

@Bean
KafkaTemplate<String, Object> kafkaTemplate(ProducerFactory<String, Object> producerFactory) {
    return new KafkaTemplate<>(producerFactory);
}

@Bean
ProducerFactory<String, Object> producerFactory() {
    Map<String, Object> config = new HashMap<>();
    // "environment" is the injected Spring Environment; we reuse the consumer's bootstrap servers
    config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, environment.getProperty("spring.kafka.consumer.bootstrap-servers"));
    config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

    return new DefaultKafkaProducerFactory<>(config);
}

Explaining the code:

  • kafkaTemplate Bean: This method creates a KafkaTemplate using the provided ProducerFactory. The KafkaTemplate is crucial for sending messages, including those that need to be sent to the dead letter topic.
  • producerFactory Bean: This method sets up the ProducerFactory with necessary configurations. We’re specifying the bootstrap servers and serializers for key and value. The JsonSerializer is used for message values, and the StringSerializer is for keys.
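
With this configuration in place, failed messages will land in the DLT, and you can read them back like any other topic. Here is a minimal sketch of a listener that logs dead-lettered records, assuming the default “.DLT” suffix and a placeholder group id:

@Component
public class DeadLetterTopicHandler {

    // Consume failed records from the DLT so they can be logged,
    // alerted on, or replayed later
    @KafkaListener(topics = "product-created-events-topic.DLT", groupId = "dlt-monitor")
    public void handle(ConsumerRecord<String, Object> record) {
        System.out.println("Dead-lettered record: " + record.value());
    }
}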

Conclusion

I hope this tutorial was helpful to you.

To learn more about how Apache Kafka can be used to build Event-Driven Spring Boot Microservices, please check my Kafka tutorials for beginners page.

If you are interested in video lessons, then check out my video course Apache Kafka for Event-Driven Spring Boot Microservices.