How to Configure Celery and Celery Beat with Django deployed on Elastic Beanstalk Amazon Linux 2?


In our Django web app, there are background tasks that we want to run every midnight. For this, I tried Celery with Celery Beat.

I was able to successfully implement the background task scheduler with Celery Beat and a worker, using Redis as the Celery broker, by following these tutorials:

  1. https://realpython.com/asynchronous-tasks-with-django-and-celery/
  2. https://docs.celeryq.dev/en/stable/django/first-steps-with-django.html#using-celery-with-django
  3. https://docs.celeryq.dev/en/latest/userguide/periodic-tasks.html

The feature works locally by running the server, scheduler, and worker in separate terminals:

Django Server: python manage.py runserver

Redis Server: redis-server

Celery Worker: celery -A django_project worker -l info

Celery Beat Scheduler: celery -A django_project beat -l info

My question is: how do I configure this for deployment on Elastic Beanstalk?

What is the correct way to set this up properly with ElastiCache as the Redis server?

Current Stack:

Django 3.1 deployed on AWS Elastic Beanstalk

Python 3.8 running on 64bit Amazon Linux 2/3.3.9 with ElastiCache endpoint

redis==4.3.4 # https://pypi.org/project/redis/

celery==5.2.7 # https://pypi.org/project/celery/

asked 2 years ago · 3,532 views
3 Answers

Hello,

Thank you for reaching out to us. I understand that you want to know how to deploy a Django application with Celery and Celery Beat to Elastic Beanstalk and use an ElastiCache cluster for Redis.

You can deploy a Django application to Elastic Beanstalk by following the steps in the documentation below [1]. While creating the Django project in the second step, you can define an instance of the Celery library and create the required files, such as celery.py and __init__.py, within the project by following the steps in the official Celery documentation you are using [2].

[1] Deploying a Django application to Elastic Beanstalk - https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/create-deploy-python-django.html

[2] First steps with Django - https://docs.celeryq.dev/en/stable/django/first-steps-with-django.html#using-celery-with-django
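
As a quick sketch of what [2] describes, assuming the project package is named django_project as in your commands (adjust the name to your own project):

# django_project/celery.py -- minimal Celery app, following [2]
import os

from celery import Celery

# Tell Celery where Django's settings live before the app is created.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "django_project.settings")

app = Celery("django_project")

# Read every CELERY_-prefixed setting from Django's settings.py.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Find tasks.py modules in all installed Django apps.
app.autodiscover_tasks()

# django_project/__init__.py -- ensure the app is loaded when Django starts
from .celery import app as celery_app

__all__ = ("celery_app",)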

Following this, you should be able to implement Celery and Celery Beat with Django deployed on Elastic Beanstalk.

To use an ElastiCache cluster as the Redis server, set your ElastiCache endpoint instead of localhost in your project settings file.

  • From the documentation you shared:

# django_celery/settings.py
# ...
# Celery settings
CELERY_BROKER_URL = "redis://localhost:6379"
CELERY_RESULT_BACKEND = "redis://localhost:6379"

  • You can configure the ElastiCache endpoint as below:

# Celery settings
CELERY_BROKER_URL = "redis://<elasticache_endpoint>:6379"
CELERY_RESULT_BACKEND = "redis://<elasticache_endpoint>:6379"
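
If the endpoint differs between environments, one option (an assumption on my part, not something the docs require) is to pass it in as an Elastic Beanstalk environment property, here a hypothetical REDIS_ENDPOINT:

# settings.py -- sketch; REDIS_ENDPOINT is a hypothetical environment
# property set in the Elastic Beanstalk console or via .ebextensions
import os

REDIS_ENDPOINT = os.environ.get("REDIS_ENDPOINT", "localhost")
CELERY_BROKER_URL = f"redis://{REDIS_ENDPOINT}:6379"
CELERY_RESULT_BACKEND = f"redis://{REDIS_ENDPOINT}:6379"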

Please make sure that traffic from the instance security group used by the Elastic Beanstalk application is allowed in the ElastiCache Redis cluster's security group.

[3] Authorize access to the cluster - https://docs.aws.amazon.com/AmazonElastiCache/latest/red-ug/GettingStarted.AuthorizeAccess.html
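
As a sketch, that ingress rule can also be added from the AWS CLI; the two security group IDs below are placeholders for your cluster's and your instances' groups:

# Allow the Beanstalk instances' security group to reach Redis on 6379.
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123redis \
    --protocol tcp \
    --port 6379 \
    --source-group sg-0456beanstalk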

I sincerely hope this helps.

Tarun_S, AWS Support Engineer
answered 2 years ago
  • Hi @Tarun_S,

    Thank you for your answer.

    How do I set up Celery and Celery Beat to run in daemon mode?

    As you can see from my local development setup, I have to run them separately in terminals:

    Celery Worker: celery -A django_project worker -l info

    Celery Beat Scheduler: celery -A django_project beat -l info

    Along with Django and Redis:

    Django Server: python manage.py runserver

    Redis Server: redis-server

    How do I set this up as a config in Elastic Beanstalk?


I have the same problem :( Did you solve it?

devsurf
answered a year ago

I tried to do something similar. The best approach would be a smooth integration with the AWS Elastic Beanstalk worker environment: the documentation says it handles the SQS queue and the dead-letter queue by itself.

When you create a worker environment with Beanstalk, an SQS queue is created for you. The idea is to configure Celery to post its tasks to this SQS queue, as sketched below.
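
A sketch of that configuration, reusing the CELERY_ settings namespace from the tutorials above; the queue name, URL, account ID, and region are all placeholders:

# settings.py -- sketch: point Celery at the worker environment's SQS queue
CELERY_BROKER_URL = "sqs://"  # no credentials: boto3 falls back to the instance role
CELERY_TASK_DEFAULT_QUEUE = "awseb-my-worker-queue"  # placeholder name
CELERY_BROKER_TRANSPORT_OPTIONS = {
    "region": "us-east-1",  # placeholder region
    "predefined_queues": {
        "awseb-my-worker-queue": {
            "url": "https://sqs.us-east-1.amazonaws.com/123456789012/awseb-my-worker-queue",
        },
    },
}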

Now, on the worker side, we should have (from my understanding) two things:

  • celery beat to dispatch the workload
  • a celery worker to run the tasks

So the first question would be: do we still have to run celery beat, or does the worker environment's daemon replace the need for the Celery scheduler?

Since Celery is supposedly designed for small tasks, could it be more advantageous to deploy Django using the Daphne ASGI server and run small tasks asynchronously? In that case, if you have middleware that is incompatible with ASGI, it may not work at all, so be careful there...
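
For what it's worth, a sketch of that ASGI idea using a plain Django 3.1 async view; send_report here is a stand-in coroutine, not anything from your project:

# views.py -- sketch: fire-and-forget a small job from an async view
import asyncio

from django.http import JsonResponse


async def send_report():
    # Stand-in for a small background job.
    await asyncio.sleep(1)


async def trigger_report(request):
    # Schedule the coroutine on the running event loop and return at once.
    # Note: the task dies if the server process is recycled mid-run.
    asyncio.create_task(send_report())
    return JsonResponse({"status": "scheduled"})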

Julien
answered 6 months ago
