
With the latest version of Kafka (0.11.0.0), released on 28 June 2017, the Kafka team introduced new features to support exactly-once delivery. After downloading it, I tried configuring the producer (run through the kafka-console-producer.sh script) as described in the Producer configs documentation: I set enable.idempotence=true and transactional.id=0A0A.
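For reference, the relevant lines of the producer.properties file I pass to the script:

    enable.idempotence=true
    transactional.id=0A0A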

The problem is that when I start the producer I get a ConfigException saying that acks must be set to all or -1 (even though I set it that way in the producer.properties file that I pass as an argument to the console script).

Could the root cause be that idempotence cannot be set through the console script? Moreover, is there a way to produce messages as an atomic transaction through the provided console script?

Details:

In summary, the adopted solution is based on two main concepts:

  • an idempotent producer, able to write a specific message only once, thanks to the introduction of transactional ids in the producer config, which guarantee atomicity even when a single topic has multiple partitions;
  • on the consumer side, the isolation.level=read_committed property, which lets us read messages only after their transaction has been committed (a minimal sketch of both sides follows this list).
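
For completeness, here is a minimal sketch of what this looks like with the Java client API, since the console scripts do not appear to expose transactions. The broker address, topic name, and serializer choices are placeholder assumptions; the transactional id is the one from the question:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class TransactionalSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
            props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
            props.put("enable.idempotence", "true"); // forces acks=all internally
            props.put("transactional.id", "0A0A");   // the id from the question

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.initTransactions();
                producer.beginTransaction();
                producer.send(new ProducerRecord<>("test", "key1", "value1"));
                producer.send(new ProducerRecord<>("test", "key2", "value2"));
                producer.commitTransaction(); // both records become visible atomically
            }
        }
    }

A consumer only sees the two records after commitTransaction() succeeds, provided it sets isolation.level=read_committed in its own configuration.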
  • Console producer defaults to acks=1, so double-check that you are actually changing this to all or -1. – Commented Jul 8, 2017 at 19:00

1 Answer


The console producer sets its own defaults. Try adding --request-required-acks "all" or --request-required-acks -1 to set acks to all in place of the default of 1.
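For example, assuming a broker listening on localhost:9092 and a topic named test (both placeholders), and reusing the producer.properties file from the question:

    bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test \
        --producer.config producer.properties --request-required-acks -1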


2 Comments

Thanks! I don't get the exception anymore. The problem now is: if I produce the same key-value pair multiple times, it also appears multiple times in the log. I was expecting to see the pair only once. Is it possible to achieve this and to test it with the console scripts? The documentation for enable.idempotence says: "When set to 'true', the producer will ensure that exactly one copy of each message is written in the stream."
If you publish the same contents twice, that is still two separate messages with unique message IDs, so EOS does not apply. There are lots of applications where you want to be able to publish similar messages multiple times. EOS only deduplicates multiple copies of the same message, which are usually created by failures and retries.
