Hello,
Thank you for reaching out to us. I understand that you want to know how to deploy a Django application with Celery and Celery Beat to Elastic Beanstalk and use an ElastiCache cluster for Redis.
You can deploy a Django application to Elastic Beanstalk by following the steps in the documentation below [1]. While creating the Django project in the second step, you can define an instance of the Celery library and create the required files, such as celery.py and __init__.py, within the project location by following the official Celery documentation [2].
[1] Deploying a Django application to Elastic Beanstalk - https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/create-deploy-python-django.html
[2] First steps with Django - https://docs.celeryq.dev/en/stable/django/first-steps-with-django.html#using-celery-with-django
Following this, you should be able to implement Celery and Celery Beat with Django deployed on Elastic Beanstalk.
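As a rough sketch of that bootstrap step (the `django_project` module name is a placeholder from the Celery docs example; adapt it to your project name):

```python
# django_project/celery.py -- minimal Celery app, per the Celery/Django first-steps guide
import os

from celery import Celery

# Point Celery at the Django settings module before creating the app.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "django_project.settings")

app = Celery("django_project")
# Read all CELERY_-prefixed settings from Django's settings.py.
app.config_from_object("django.conf:settings", namespace="CELERY")
# Auto-discover tasks.py modules in each installed app.
app.autodiscover_tasks()

# django_project/__init__.py -- ensure the app loads when Django starts
# from .celery import app as celery_app
# __all__ = ("celery_app",)
```

The `__init__.py` lines are shown as comments here because they belong in a separate file, as described in the Celery guide.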
To use an ElastiCache cluster as the Redis backend, set your ElastiCache endpoint in place of localhost in your project's settings file.
From the documentation you shared:
# django_celery/settings.py
# ...
# Celery settings
CELERY_BROKER_URL = "redis://localhost:6379"
CELERY_RESULT_BACKEND = "redis://localhost:6379"
You can configure the ElastiCache endpoint as below:
# Celery settings
CELERY_BROKER_URL = "redis://<elasticache_endpoint>:6379"
CELERY_RESULT_BACKEND = "redis://<elasticache_endpoint>:6379"
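One way to avoid hard-coding the endpoint is to read it from an environment variable; `REDIS_ENDPOINT` below is a made-up variable name that you could set as an Elastic Beanstalk environment property:

```python
import os

# REDIS_ENDPOINT is an example variable name, not an AWS-defined one;
# set it in the Elastic Beanstalk console (environment properties).
redis_endpoint = os.environ.get("REDIS_ENDPOINT", "localhost")

CELERY_BROKER_URL = f"redis://{redis_endpoint}:6379"
CELERY_RESULT_BACKEND = f"redis://{redis_endpoint}:6379"
```

This way the same settings file works locally (falling back to localhost) and in the deployed environment.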
Please make sure that traffic from the security group of the Elastic Beanstalk instances is allowed in the ElastiCache Redis cluster's security group [3].
[3] Authorize access to the cluster - https://docs.aws.amazon.com/AmazonElastiCache/latest/red-ug/GettingStarted.AuthorizeAccess.html
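If you manage the security groups yourself, an ingress rule along these lines allows the Beanstalk instances to reach Redis on port 6379 (both group IDs below are placeholders):

```shell
# sg-...cache  = the ElastiCache cluster's security group (placeholder ID)
# sg-...beanstalk = the group attached to the Beanstalk EC2 instances (placeholder ID)
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789cache \
    --protocol tcp \
    --port 6379 \
    --source-group sg-0123456789beanstalk
```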
I sincerely hope this helps.
I tried to do something similar. The best approach would be a smooth integration with the AWS Elastic Beanstalk Worker environment; the documentation says it manages the SQS queue and the dead-letter queue itself.
When you create a Worker environment with Beanstalk, an SQS queue is created for you. The idea is to configure Celery to post to this queue.
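A sketch of what that Celery-to-SQS wiring could look like in settings.py (the region, queue name, and queue URL are placeholders for the queue Beanstalk created; `predefined_queues` is a Kombu SQS transport option):

```python
# settings.py -- point Celery at an existing SQS queue instead of Redis.
CELERY_BROKER_URL = "sqs://"  # credentials come from the EC2 instance role

CELERY_BROKER_TRANSPORT_OPTIONS = {
    "region": "us-east-1",  # placeholder region
    "predefined_queues": {
        "my-worker-queue": {  # placeholder: the queue Beanstalk created
            "url": "https://sqs.us-east-1.amazonaws.com/123456789012/my-worker-queue",
        },
    },
}
# Route tasks to that queue by default.
CELERY_TASK_DEFAULT_QUEUE = "my-worker-queue"
```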
Now, on the worker side, we should have (from my understanding) 2 things:
- celery-beat to dispatch workload
- celery-task to run the task
So the first question would be: do we have to take care of celery beat ourselves, or does the Worker daemon replace the need for the Celery scheduler?
Since Celery is supposedly designed for small tasks, could it be more advantageous to deploy Django with the Daphne ASGI server and run small tasks asynchronously? In that case, if you have middleware that is incompatible with ASGI, it may not work at all, so be careful there...
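As a sketch of that async alternative (plain asyncio here; under Daphne the same pattern would live inside an async Django view; the function names are illustrative only):

```python
import asyncio

async def send_welcome_email(user_id: int) -> str:
    """Stand-in for a small I/O-bound job you might otherwise hand to Celery."""
    await asyncio.sleep(0)  # placeholder for a real awaitable I/O call
    return f"sent to user {user_id}"

async def handle_request() -> str:
    # Kick off the small task without blocking the request path.
    task = asyncio.create_task(send_welcome_email(42))
    # ... other request work could happen here ...
    return await task

result = asyncio.run(handle_request())
```

This only suits short, I/O-bound work: anything long-running or CPU-heavy still belongs in a worker process, since it would otherwise tie up the ASGI event loop.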
Hi @Tauron_S,
Thank you for your answer.
How do I set up Celery and Celery Beat to run in daemon mode?
In my local development, I have to run them separately in terminals:
Celery Worker:
celery -A django_project worker -l info
Celery Beat Scheduler:
celery -A django_project beat -l info
Along with Django and Redis:
Django Server:
python manage.py runserver
Redis Server:
redis-server
How do I set this up as a config in Elastic Beanstalk?