I have a Django project that runs Celery with RabbitMQ as the broker on a remote server. I wonder if I can run the worker on a remote server too. I do not want to run the broker and the worker on the same remote server, but on separate remote servers. Is this possible?
Yes, this is completely possible. Just make sure that both the Celery worker and the Django web server point to the same broker.
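For example (a minimal sketch; the project name, hostname, and credentials below are placeholders, not taken from the question), the Celery app inside the Django project can be configured like this:

# myproject/celery.py
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery("myproject")
# Reads all CELERY_* settings from Django settings, including the broker URL.
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()

# myproject/settings.py
# Both the Django web server and the remote worker must use this same URL,
# pointing at the machine where RabbitMQ is running.
CELERY_BROKER_URL = "amqp://user:password@rabbitmq.example.com:5672//"

On the worker machine you then start celery -A myproject worker with the same settings, and it will consume the tasks queued by the web server through that broker.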
In my Django web app I am running a calculation that takes 2-10 minutes, and the AWS server times out with a 504 error.
It seems that the best way to fix this is to implement a worker and offload the calculation. From some research, it seems that Celery with a Redis server (or maybe AWS SQS instead?) is the best fit for Django.
However, I only see tutorials for setups that run locally. My Redis server is hosted by Railway, and Celery runs in a separate terminal from Django (on the same local machine). I'm curious if/where Celery and Redis should be running on AWS.
This answer says you should try to run Celery as a daemon in the background: "AWS Elastic Beanstalk uses supervisord already to run some daemon processes. So you can leverage that to run celeryd and avoid creating a custom AMI for this. It works nicely for me." But don't the Celery and Redis servers still need to run somehow?
Where does the Celery server run?
How do I start it?
How can I leverage supervisord to run daemon processes? The documentation isn't helping me very much with AWS integration.
You can configure a Procfile to run multiple processes, such as the main Django app, the Celery worker, and Celery beat, in parallel, as documented here:
web: <command to start your django app>
celery: celery -A <path_to_celery_app> worker
celery_beat: celery -A <path_to_celery_app> beat
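For illustration (the project name and command options here are assumptions, not taken from the question), a filled-in Procfile could look like:

web: gunicorn mysite.wsgi --log-file -
celery: celery -A mysite worker --loglevel=info
celery_beat: celery -A mysite beat --loglevel=info

Elastic Beanstalk then starts the listed processes from the same application bundle, so the worker runs alongside the web app without a custom AMI. Redis itself is still a separate service (e.g. ElastiCache or a hosted instance) that both processes reach through the broker URL.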
So I have a Django project for which I installed Celery and Heroku Redis, and I use Google Cloud for deployment. Everything works fine locally, but I need my Celery workers to run for the site 24/7. I searched for Supervisor and installed it too. I start Supervisor from my command line, and the Celery workers run while Supervisor is running. But there is a problem: I cannot keep my PC open all the time, and when I close it, Supervisor stops too. I have not figured out Cloud Tasks either. Lastly, I read some information about Kubernetes and Celery. Is it possible to use Celery with Kubernetes, and how can I set up a Kubernetes-Celery-Django deployment?
You need to be running your Django server with Gunicorn, your Redis service as a separate service, and your Celery worker as a third service.
Alternatively, if you want one single container instance (a Pod in k8s), you can set up a supervisor to run Gunicorn and your Celery worker inside the same pod.
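A minimal sketch of that layout (the hostname redis and the variable name CELERY_BROKER_URL are assumptions): both the Gunicorn pod and the worker pod read the same broker URL, typically injected as an environment variable pointing at the Redis service, so they stay connected to the same queue no matter which node they land on.

# myproject/celery.py
import os
from celery import Celery

# The same value is injected into the web Deployment and the worker Deployment,
# e.g. redis://redis:6379/0 where "redis" is the k8s Service name.
broker_url = os.environ.get("CELERY_BROKER_URL", "redis://redis:6379/0")

app = Celery("myproject", broker=broker_url)

@app.task
def long_running_job():
    # Placeholder for the work that must keep running 24/7.
    ...

The same container image can then be deployed twice with different start commands, gunicorn myproject.wsgi for the web Deployment and celery -A myproject worker for the worker Deployment, so the worker keeps running even when your own PC is off.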
I have recently learned to integrate Celery + RabbitMQ on my local machine; all three of them (Django, Celery, and RabbitMQ) were running as separate servers.
In production, should I run Celery & RabbitMQ on separate servers?
I have searched for Celery hosting options but couldn't find useful links.
I usually use PythonAnywhere to host my Django app.
I found this https://stackoverflow.com/a/46105723/10515390 answer useful but didn't get the clarification I needed.
I have a Flask/Django web app running on multiple web servers; in particular, the Celery task will run and update a single server at random. I'm in a situation where I need the task to run once on every server, so that each server is updated by the task.
How can I run a Celery task and ensure that it runs once on all web servers?
I'm building a Django project on Heroku.
I understand that Gunicorn is recommended as the web server, so I need an event-loop type of worker, and I use gevent for that.
It seems that gevent's monkey patching does most of the work for me so that I can have concurrency, but how am I supposed to connect to RabbitMQ without real threads or jamming the whole event loop?
I am baffled by this, since Heroku themselves recommend Gunicorn, Celery, and RabbitMQ, but I don't see how all of these work together.
Do you understand that Celery and Gunicorn are used for different purposes?
Gunicorn is the web server responding to requests made by users, serving them web pages or JSON data.
Celery is an asynchronous task manager, i.e. it lets you run arbitrary Python code irrespective of web requests to your server.
Do you understand this distinction?
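To make the distinction concrete, here is a minimal sketch (the task and view names are made up): Gunicorn, running with gevent, handles the HTTP request and returns immediately, while .delay() only publishes a message to RabbitMQ for a Celery worker process to pick up later, so nothing blocks the event loop.

# tasks.py
from celery import shared_task

@shared_task
def send_welcome_email(user_id):
    # Executed inside the Celery worker process, not the Gunicorn worker.
    ...

# views.py
from django.http import JsonResponse
from .tasks import send_welcome_email

def signup(request):
    # Queues the task on RabbitMQ and responds right away.
    send_welcome_email.delay(user_id=42)
    return JsonResponse({"status": "queued"})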