
I'm building an application which involves a task queue from django-background-tasks. The issue that I'm facing is that I can't seem to start the queue automatically and have to manually run process_tasks in my application container.

I've already tried running the process_tasks command within my Dockerfile, however this seems to not do anything as open tasks are not resolved.

```dockerfile
CMD ["bash", "-c", "python manage.py runserver 0.0.0.0:8000 && python manage.py process_tasks"]
```

Has anyone experienced this issue, and if so how did you handle it?

I tried running the process_tasks command in my Dockerfile after runserver. I expected this to start the task queue automatically, but it did not.

  • The command after && will never run. Either run the two commands in separate containers via your docker compose file, or start the task processor from your app's ready() method. If possible, use Celery or APScheduler instead. Commented Oct 2, 2024 at 7:56

2 Answers


Likely problem: the commands in your Dockerfile's CMD run sequentially. Since runserver runs indefinitely, process_tasks never gets a chance to start.

django-background-tasks is not starting automatically in your Docker container because the background worker never launches. Your current CMD approach is not optimal; some alternatives:

In your position, I would choose Supervisor or Celery to run the background task alongside the server.

1. Supervisor: Install a process manager like Supervisor in your Docker container. Supervisord will ensure both runserver and process_tasks run simultaneously.

```dockerfile
RUN pip install supervisor

# Supervisor configuration
COPY supervisord.conf /etc/supervisor/conf.d/

CMD ["supervisord", "-n"]
```

N.B.: You have to create a supervisord.conf file with program sections for both runserver and process_tasks; see the references below for examples.
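A minimal supervisord.conf might look like the following. This is a sketch, not a tested configuration: the working directory /app and the program names are assumptions you should adapt to your project layout.

```ini
; supervisord.conf -- hypothetical sketch; adjust paths to your image
[supervisord]
nodaemon=true            ; keep supervisord in the foreground so the container stays alive

[program:runserver]
command=python manage.py runserver 0.0.0.0:8000
directory=/app           ; assumed project directory
autorestart=true

[program:process_tasks]
command=python manage.py process_tasks
directory=/app
autorestart=true
```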

2. Celery: Django background tasks can be integrated with Celery, a powerful distributed task queue. It offers more features and control over background processing but requires additional configurations.

I hope this gives you some ideas for solving the problem, and that you enjoy the Python/Django ecosystem :)

Ref:

  1. https://imamhossainroni.me/supercharge-your-process-management-using-supervisor
  2. http://supervisord.org/configuration.html
  3. Python/django application not running with supervisord



The second command after the && is only executed when the first one has ended. As python manage.py runserver runs until it is interrupted, python manage.py process_tasks never gets executed.
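You can see the difference with two toy commands, using `sleep` as a stand-in for the real manage.py processes:

```shell
# With '&&', the second command starts only after the first exits,
# so a never-ending first command blocks the second forever:
sleep 2 && echo "printed only after sleep finishes"

# With '&', the first command goes to the background and the shell
# continues immediately:
sleep 60 &
echo "printed right away while sleep is still running"
kill $!   # stop the background sleep
```

Because `runserver` never exits on its own, it behaves like the `sleep 60` here, except nothing ever kills it, so anything after `&&` is unreachable.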

You have the following options:

  1. Run python manage.py process_tasks in a different container.
  2. Write a shell script that runs the commands in parallel, see https://stackoverflow.com/a/58861363/92106.
  3. Use a process manager such as circus or encab.
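For option 1, a docker-compose sketch could look like this. The service names and the build context are assumptions; the point is that the same image runs twice with different commands:

```yaml
# docker-compose.yml -- hypothetical sketch, adapt to your setup
services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"
  worker:
    build: .
    command: python manage.py process_tasks
```

This also keeps the two processes independently restartable, which a single `&&`-joined CMD cannot do.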

