I have two Docker machines and I want to create a Kafka cluster inside Docker Swarm. My docker-compose.yml looks like this:
```yaml
version: '3.2'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka:latest
    ports:
      - "9092:9092"
      - "29092:29092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_LISTENERS: PLAINTEXT://:9092,PLAINTEXT_HOST://:29092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
```

I followed this question: Unable to connect to Kafka run in container from Spring Boot app run outside container, and I am trying to access Kafka from outside the containers using localhost:29092.
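For what it's worth, a plain-socket check like this (no Kafka client involved; the host and port are just the values from the PLAINTEXT_HOST listener above, and `port_open` is just a throwaway helper) can confirm whether the published port is reachable at all:

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS failures
        return False

print(port_open('localhost', 29092))
```

If this prints False, the problem is at the Docker/Swarm port-publishing level rather than in the Kafka listener configuration.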
I have already created the topic mytesttopic inside Kafka. The Python code below:
```python
from kafka import KafkaConsumer

def consume_from_topic():
    try:
        consumer = KafkaConsumer('mytesttopic',
                                 group_id=None,
                                 bootstrap_servers=['localhost:29092'],
                                 auto_offset_reset='earliest')
        for message in consumer:
            # consumer.commit()
            print("%s:%d:%d: key=%s value=%s" % (message.topic,
                                                 message.partition,
                                                 message.offset,
                                                 message.key,
                                                 message.value))
    except Exception as e:
        print(e)

if __name__ == '__main__':
    consume_from_topic()
```

returns:
```
NoBrokersAvailable
```
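In case it helps with debugging, kafka-python logs every bootstrap attempt, so turning on DEBUG logging before creating the consumer shows which address the client actually dials and why the connection fails. A sketch (just the standard library logging module, nothing Kafka-specific):

```python
import logging

# Enable verbose output from the kafka-python client so each
# bootstrap/connection attempt is visible on stderr.
logging.basicConfig(
    format='%(asctime)s %(name)s %(levelname)s %(message)s',
    level=logging.DEBUG,
)
logging.getLogger('kafka').setLevel(logging.DEBUG)
```

Running the consumer with this in place shows whether the client is failing on the initial bootstrap to localhost:29092 or on a later re-connection to an advertised address.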
Does anyone know what I am missing here?