
I have an application that uses Spring Boot 2.1.0.RELEASE and Kafka 2.1.0. The application has a simple producer and consumer: the sender uses KafkaTemplate and the consumer uses @KafkaListener. What I am trying to achieve is to be able to start the Spring Boot application even if the Kafka server is not running. Currently, without a running Kafka broker, the application fails to start with this error message:

org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is org.apache.kafka.common.errors.TimeoutException 

Is there a way to achieve this and if yes could anybody give me hint or a keyword how to manage this?

  • I think you can't do it unless you make a separate profile that omits the Kafka-dependent configuration. Commented Jan 25, 2019 at 12:38
  • @Kamil Thank you so far. Do you mean different properties files per profile? I need this feature in every environment (dev, test, prod), so I conclude that to omit the Kafka-dependent configuration there must be a property setting. Right? Commented Jan 25, 2019 at 14:42
  • Yes, you can tag your beans with @Profile("profileName"); if you start your Spring Boot app with a different profile than the one given in the annotation, that bean will not be initialized. Commented Jan 25, 2019 at 16:23
  • Thank you. Sorry, I have an additional question. Does a pattern exist in Spring that activates beans only after the application has started? The scenario is that the application should start even if the Kafka broker is not available, but the beans responsible for sending messages to and consuming messages from Kafka have to be initialized in a second phase, and if the Kafka broker is not available at that point they should log error messages. Commented Jan 25, 2019 at 20:31
  • 1
    I added the concrete question to a new thread because I think the solution could be relevant for other users. Dzienkuje Kamil ;-) Commented Jan 28, 2019 at 17:00
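The "second phase" idea discussed in the comments above can be sketched as follows. This is a minimal sketch, not the asker's actual code: it assumes the listener is declared with autoStartup = "false" and a hypothetical id of "fooListener", and it uses Spring's ApplicationReadyEvent, which fires after the context has fully started.

```java
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.event.EventListener;
import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
import org.springframework.stereotype.Component;

@Component
public class LateKafkaStarter {

    private final KafkaListenerEndpointRegistry registry;

    public LateKafkaStarter(KafkaListenerEndpointRegistry registry) {
        this.registry = registry;
    }

    // Runs once, after the application context has started successfully.
    @EventListener(ApplicationReadyEvent.class)
    public void startListeners() {
        try {
            // Start the listener that was declared with
            // @KafkaListener(id = "fooListener", autoStartup = "false", ...).
            registry.getListenerContainer("fooListener").start();
        } catch (Exception e) {
            // Broker still unavailable: log instead of crashing the app.
            System.err.println("Could not start Kafka listener: " + e.getMessage());
        }
    }
}
```

This keeps the application startup independent of the broker while still activating the Kafka beans as soon as the context is ready.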

1 Answer


When running a Spring Boot application with a @KafkaListener, the listener will by default try to connect to Kafka on startup. If the Kafka broker is unreachable or misconfigured, you will get an org.apache.kafka.common.KafkaException.

You can change this default behaviour by setting the container's autoStartup property to false. One way to do this is to add the autoStartup = "false" element to your @KafkaListener annotation:

@KafkaListener(topics = "some_topic", autoStartup = "false")
public void fooEventListener(String message) {
    // handle the message
}
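Alternatively, the same default can be set once on the container factory so that every listener it builds starts stopped. This is a sketch under the assumption that you define your own factory bean (the bean and type parameters here are illustrative):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        // All listener containers built by this factory stay stopped on startup.
        factory.setAutoStartup(false);
        return factory;
    }
}
```

Setting it on the factory avoids repeating autoStartup = "false" on every @KafkaListener annotation.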

Now your Spring Boot application will start. You will still get an error when trying to use the listener while the broker is down or misconfigured, but you will be able to handle that error in your Java code instead of having the Spring Boot application fail to start.
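On the producer side the same principle applies: a failed send can be handled in code rather than crashing anything. A minimal sketch, assuming a String/String KafkaTemplate and an illustrative topic name; in this Kafka version, send() returns a ListenableFuture whose failure callback fires when the broker cannot be reached before the delivery timeout:

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class FooSender {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public FooSender(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(String message) {
        kafkaTemplate.send("some_topic", message)
                .addCallback(
                        result -> { /* delivered successfully */ },
                        ex -> System.err.println(
                                "Send failed (broker unavailable?): " + ex.getMessage()));
    }
}
```

The failure callback is where you would log the error, as discussed in the question's comments.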

Documentation about the KafkaListener autoStartup element.

It has to be mentioned that the error you are receiving (TimeoutException) is not only thrown when the broker is down; it is also what Kafka throws when the producer's buffer is full. The batched records are then removed from the queue and not delivered to the broker. This error on its own is not the reason your application fails to start.
