
I have two Docker machines and I want to create a Kafka cluster inside Docker Swarm. My docker-compose.yml looks like this:

```yaml
version: '3.2'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka:latest
    ports:
      - "9092:9092"
      - "29092:29092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_LISTENERS: PLAINTEXT://:9092,PLAINTEXT_HOST://:29092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
```

I followed this question: Unable to connect to Kafka run in container from Spring Boot app run outside container, and I am trying to access Kafka from outside the swarm using localhost:29092.

I have already created the topic mytesttopic inside Kafka. The Python code below:

```python
from kafka import KafkaConsumer

def consume_from_topic():
    try:
        consumer = KafkaConsumer(
            'mytesttopic',
            group_id=None,
            bootstrap_servers=['localhost:29092'],
            auto_offset_reset='earliest',
        )
        for message in consumer:
            # consumer.commit()
            print("%s:%d:%d: key=%s value=%s" % (
                message.topic, message.partition,
                message.offset, message.key, message.value))
    except Exception as e:
        print(e)

if __name__ == '__main__':
    consume_from_topic()
```

returns:

NoBrokersAvailable

Does anyone know what I am missing here?

  • You mentioned you are running Docker Swarm over two machines. From where are you running the Python application: one of the Docker nodes or your local machine? Commented Feb 18, 2020 at 12:41
  • On my local machine. Commented Feb 18, 2020 at 12:49
  • Which command do you execute to bring up the stack? Commented Feb 18, 2020 at 14:39

2 Answers


Given that you are running Docker Swarm on two other machines, you won't be able to connect on localhost:29092, because Kafka is exposed on port 29092 on the swarm nodes, not on your local machine. Try connecting to Kafka using the hostname of one of your nodes plus port 29092. You should be able to reach Kafka this way.

Please note that this only works if you are running Docker Swarm with the routing mesh. The routing mesh makes every node accept incoming requests on a published port for any service, whether or not the container is running on that node, and routes the traffic to the host where your container actually runs.
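For completeness, swarm also lets you opt out of the routing mesh per port. A sketch of how the kafka service's ports section could look using the long port syntax (available in compose file format 3.2+), if you deliberately want host-mode publishing:

```yaml
ports:
  - target: 29092      # port inside the container
    published: 29092   # port opened on the node itself
    protocol: tcp
    mode: host         # bypasses the routing mesh; clients must hit the node actually running the container
```

With `mode: ingress` (the default) the routing mesh is used instead, and any node accepts connections on the published port.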

If you have not yet set up the routing mesh, try connecting to the actual hostname of the node where a Kafka container is running (not recommended, but it works for testing purposes).
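Before pointing the consumer at a node, it can help to verify that the node actually accepts TCP connections on port 29092 at all. A minimal sketch using only the standard library; the IP address is the one mentioned in the comment thread and is only an example, substitute your own node's address:

```python
import socket

def can_reach(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example node address (from the discussion); replace with your node's IP or hostname.
if can_reach("192.168.99.100", 29092):
    print("port 29092 is reachable on the node")
else:
    print("cannot reach the node on port 29092, check published ports / firewall")
```

If this prints the failure branch, the problem is at the network or port-publishing level, not in the Kafka client configuration.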

I hope this helps you!


9 Comments

The hostname is manager. You mean bootstrap_servers=['manager:29092']? If so, this does not work.
@Telperinquar, can you reach manager on port 29092? And is that one of your Docker nodes, reachable under this hostname?
No, I cannot reach it. I found the hostname with docker node ls, right?
@Telperinquar, that is the hostname of the node within the swarm itself. You need the IP address of the node, or the hostname you would also use to SSH into it.
The IP is 192.168.99.100 and I cannot ping it.

Your listeners are bound the same way.

You need to set PLAINTEXT_HOST://0.0.0.0:29092 to bind that listener explicitly to all interfaces.
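Applied to the compose file from the question, this suggestion would change only the KAFKA_LISTENERS line of the kafka service; a sketch of the resulting environment block:

```yaml
environment:
  KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
  KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
  # Bind both listeners explicitly to all interfaces:
  KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092,PLAINTEXT_HOST://0.0.0.0:29092
  KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
```

Note that PLAINTEXT_HOST://localhost:29092 in the advertised listeners is only returned to clients; it still requires the client to reach port 29092 on the machine it calls localhost.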

