
I have set up a Django project using the Django cookiecutter. The project scaffolding is excellent. I also opted to use Docker along with it. Now I am struggling to get Celery v4.0.x working in the whole setup.

This is my docker-compose.yml:

```yaml
version: '2'

volumes:
  postgres_data_dev: {}
  postgres_backup_dev: {}

services:
  postgres:
    build: ./compose/postgres
    volumes:
      - postgres_data_dev:/var/lib/postgresql/data
      - postgres_backup_dev:/backups
    environment:
      - POSTGRES_USER=application

  django:
    build:
      context: .
      dockerfile: ./compose/django/development/Dockerfile
    depends_on:
      - postgres
    environment:
      - POSTGRES_USER=application
      - USE_DOCKER=yes
    volumes:
      - .:/app
      - /tmp/
    links:
      - postgres
      - redis
    expose:
      - "8000"
    env_file:
      - ./dev.env
    restart: "on-failure"

  nginx:
    build:
      context: .
      dockerfile: ./compose/nginx/development/Dockerfile
    depends_on:
      - django
    ports:
      - "0.0.0.0:80:80"
    links:
      - django
    volumes_from:
      - django

  redis:
    image: redis:latest
    hostname: redis

  celeryworker:
    build:
      context: .
      dockerfile: ./compose/django/development/Dockerfile
    env_file: ./dev.env
    depends_on:
      - postgres
      - redis
    command: celery -A application.taskapp worker -l INFO
    restart: "on-failure"

  celerybeat:
    build:
      context: .
      dockerfile: ./compose/django/development/Dockerfile
    env_file: ./dev.env
    depends_on:
      - postgres
      - redis
    command: celery -A application.taskapp beat -l INFO
```

Quite honestly, I feel there is some small issue with the config for the celerybeat/celeryworker services. It would be nice if someone could point it out.

Update:

When I execute the command to run the containers, I get an error saying that "application" could not be found.
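For context, `celery -A application.taskapp` simply imports that dotted module path, so this error usually means the module is not importable from the container's working directory (here, the `/app` volume mount). A minimal sketch to reproduce the lookup from a shell inside the container (my own debugging aid, not part of the original setup; `application.taskapp` is the path from the compose commands above):

```python
# Reproduce the import that `celery -A application.taskapp` performs.
# Run this from /app inside the celeryworker container; if it prints
# False, the celery command will fail with the same "not found" error.
import importlib


def can_import(dotted_path):
    """Return True if the dotted module path resolves, else False."""
    try:
        importlib.import_module(dotted_path)
        return True
    except ImportError:
        return False


print(can_import("json"))                 # stdlib module: importable anywhere
print(can_import("application.taskapp"))  # False means celery -A will fail too
```

If the module fails to import, the usual suspects are the working directory of the container and the `PYTHONPATH` seen by the celery process.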

Update:

This is the new compose file, which ironed out a few errors in my original. Somewhere along the way of getting it all working, I also came across a thread where someone mentioned that the ordering of the services mattered as well. So in the new version, django is placed first.

```yaml
version: '2'

volumes:
  postgres_data_dev: {}
  postgres_backup_dev: {}

services:
  django: &django
    build:
      context: .
      dockerfile: ./compose/django/development/Dockerfile
    depends_on:
      - postgres
    volumes:
      - .:/app
      - /tmp/
    links:
      - postgres
      - redis
    environment:
      - POSTGRES_USER=application
      - USE_DOCKER=yes
    expose:
      - "8000"
    env_file:
      - ./dev.env

  postgres:
    build: ./compose/postgres
    volumes:
      - postgres_data_dev:/var/lib/postgresql/data
      - postgres_backup_dev:/backups
    environment:
      - POSTGRES_USER=application
    ports:
      - "5432:5432"

  redis:
    image: redis:latest
    hostname: redis
    ports:
      - "0.0.0.0:6379:6379"
    env_file:
      - ./dev.env

  nginx:
    build:
      context: .
      dockerfile: ./compose/nginx/development/Dockerfile
    depends_on:
      - django
    ports:
      - "0.0.0.0:80:80"
    links:
      - django
    volumes_from:
      - django

  celeryworker:
    <<: *django
    depends_on:
      - redis
      - postgres
    command: "celery -A application.taskapp worker --loglevel INFO --uid taskmaster"
```
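The `&django` / `<<: *django` pair is a YAML anchor merge: the worker inherits every key from the anchored django service, and any key it redefines replaces the inherited value wholesale. A minimal illustration of that behavior (generic names, not from the original thread):

```yaml
# YAML merge-key semantics, reduced to the essentials.
base: &base
  image: example
  depends_on: [a, b]
  command: run-server

derived:
  <<: *base          # inherits image, depends_on, command
  depends_on: [c]    # replaces [a, b] entirely; lists are not appended
```

One consequence worth knowing: because lists replace rather than merge, the worker's `depends_on` above fully overrides the django service's, so any entry you still want must be repeated.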
  • What is the actual issue you're running into? Why can't you get it working, are you getting any errors? Commented Jun 26, 2017 at 20:07
  • @Bono, my bad, just check the update. Commented Jun 27, 2017 at 3:54
  • Could you post the error itself? :) Commented Jun 27, 2017 at 9:24
  • @Bono I got it fixed. I will update my docker-compose.yml with the changes I made. Commented Jun 27, 2017 at 9:47
  • @Bono: how did you fix it? Commented Sep 14, 2017 at 11:54

1 Answer


I am using the same tech stack. This docker-compose.yml works fine for me:

```yaml
redis:
  image: redis
  container_name: redis
  command: ["redis-server", "--port", "${REDIS_PORT}", "--appendonly", "yes", "--maxmemory", "1gb", "--maxmemory-policy", "allkeys-lru"]
  ports:
    - "${REDIS_PORT}:${REDIS_PORT}"
  volumes:
    - .:/redis.conf
  networks:
    - pipeline-net

celery-worker:
  build:
    context: ./app
  container_name: celery-worker
  entrypoint: celery
  command: -A celery_app.celery worker --loglevel=info
  volumes:
    - .:/var/www/app/worker
  links:
    - redis
  depends_on:
    - redis
  networks:
    - pipeline-net

celery-beat:
  build:
    context: ./app
  container_name: celery-beat
  entrypoint: celery
  command: -A celery_app.celery beat --loglevel=info
  volumes:
    - .:/var/www/app/beat
  links:
    - celery-worker
    - redis
  depends_on:
    - celery-worker
    - redis
  networks:
    - pipeline-net

flower:
  image: mher/flower
  container_name: flower
  environment:
    - CELERY_BROKER_URL=redis://redis:6379
    - FLOWER_PORT=8888
  ports:
    - "8888:8888"
  links:
    - redis
    - celery-worker
    - celery-beat
  depends_on:
    - redis
    - celery-worker
    - celery-beat
  networks:
    - pipeline-net
```
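Two details are implicit in this file. First, `entrypoint` and `command` are concatenated by Compose, so the worker container effectively runs `celery -A celery_app.celery worker --loglevel=info`. Second, `${REDIS_PORT}` is not defined in the file itself; Compose substitutes it from the shell environment or from a `.env` file next to docker-compose.yml. A sketch of the assumed `.env` (my assumption; the answer does not show this file):

```shell
# .env placed beside docker-compose.yml (value is an assumption;
# 6379 is Redis's default port)
REDIS_PORT=6379
```

If the variable is unset, Compose substitutes an empty string and the redis-server command becomes invalid, so defining it explicitly is worthwhile.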