Where to run Celery on AWS - django

In my django web app I am running a calculation which takes 2-10 minutes, and the AWS server times out with a 504 server error.
It seems that the best way to fix this is to offload the calculation to a worker. From some research, Celery with a Redis server (or maybe AWS SQS instead?) appears to be the usual choice for Django.
However, I only see tutorials for setups that run locally: the Redis server is hosted by Railway and Celery runs in a separate terminal from Django (on the same local machine). I'm curious whether and where Celery and Redis should be running on AWS.
This answer says you should try to run Celery as a daemon in the background. AWS Elastic Beanstalk already uses supervisord to run some daemon processes, so you can leverage that to run celeryd and avoid creating a custom AMI for this. It works nicely for me. But don't the Celery and Redis servers still need to run somehow?
Where does the Celery server run?
How do I start it?
How can I leverage supervisord to run daemon processes? The documentation isn't helping me very much with AWS integration.

You can configure a Procfile to run multiple processes, such as the main Django app, the Celery worker and celery beat, in parallel, as documented here:
web: <command to start your django app>
celery: celery -A <path_to_celery_app> worker
celery_beat: celery -A <path_to_celery_app> beat
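
For example, a concrete Procfile might look like the following (gunicorn and the project module name myproject are assumptions, not part of the original answer):

web: gunicorn myproject.wsgi --bind :8000
celery: celery -A myproject worker --loglevel=INFO
celery_beat: celery -A myproject beat --loglevel=INFO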

Related

How to deploy a django project to google cloud with celery workers?

So I have a Django project for which I installed Celery and Heroku Redis, and I use Google Cloud for deployment. Everything works fine locally, but I need my Celery workers to run on the website 24/7. I looked into Supervisor and installed it too. I start Supervisor from my command line, and the Celery workers run while Supervisor is running. But there is a problem: I can't keep my PC open all the time, and when I close it, Supervisor stops too. I haven't figured out Cloud Tasks either. Lastly, I read some info about Kubernetes and Celery. Is it possible to use Celery with Kubernetes, and how can I set up a Kubernetes-Celery-Django deployment?
You need to be running your Django server with Gunicorn, your Redis instance as a separate service, and your Celery worker as a third service.
Alternatively, if you want a single container instance (a Pod in k8s), you can set up supervisord to run Gunicorn and your Celery worker inside the same pod.
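A rough sketch of such a supervisord config for the single-container setup (program names, module path and port are assumptions):

[supervisord]
nodaemon=true

[program:gunicorn]
command=gunicorn myproject.wsgi --bind 0.0.0.0:8000
autorestart=true

[program:celery_worker]
command=celery -A myproject worker --loglevel=INFO
autorestart=true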

Where can I execute celery commands after deploying a Django application on the Google App Engine standard environment

I am a Django developer. One of my clients asked me to deploy the application on the Google App Engine standard environment. I am new to this approach.
Redis is used as the task broker for Celery in my application. As I have learned, there is no CLI for GAE, so where can I run commands such as
celery -A project worker --loglevel=info
celery -A project beat
Or am I making some kind of mistake here?

Supervisord can't control celery workers, how to do the same using Monit

Supervisord can't control my Celery services (workers, flower, Django and celerybeat).
At system startup I can see all processes running successfully, but if one of the processes fails and Supervisor does not restart it, then when I reload or restart supervisord, everything under supervisord fails and can't be relaunched.
I have tried to move my Celery services under Monit.
I created Celery services in /etc/init.d/ and the services run perfectly; I can start/stop them without problems. Then I set up Monit scripts for all of them.
The problem is that Monit is unable to start my celery-worker services. It can start/stop "django", "flower" and "celerybeat" without problems, but the worker services under init.d cannot be started by Monit.
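For reference, the Monit block for a worker looks roughly like this (paths and service names are simplified, not the exact ones from my setup):

# /etc/monit/conf.d/celery-worker - simplified example
check process celery-worker with pidfile /var/run/celery/worker.pid
  start program = "/etc/init.d/celery-worker start"
  stop program = "/etc/init.d/celery-worker stop"
  if does not exist then restart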
Could you please suggest any ideas for how to fix this?
Why can the other services be managed by Monit without problems, but not the workers?
Thank you in advance.

Celery: ensure a task is run on each server

I have a Flask/Django web app running on multiple web servers; currently, the Celery task runs and updates one server at random. I'm in a situation where I need the task to run once on every server, so that each server is updated by the task.
How can I run a celery task and ensure the task is run once on all web servers?

Heroku, Django and celery on RabbitMQ

I'm building a Django project on Heroku.
I understand that gunicorn is recommended as the web server, so I need an event-loop type of worker, and I use gevent for that.
It seems that gevent's monkey patching does most of the work for me so I can have concurrency, but how am I supposed to connect to RabbitMQ without real threads or jamming the whole event loop?
I am baffled by this, since Heroku themselves recommend gunicorn, celery and RabbitMQ, but I don't see how all of these work together.
Do you understand that celery and gunicorn are used for different purposes?
Gunicorn is the webserver responding to requests made by users, serving them web pages or JSON data.
Celery is an asynchronous task manager, i.e. it lets you run arbitrary Python code irrespective of web requests to your server.
Do you understand this distinction?
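To make the distinction concrete, here is a minimal sketch (the project name, broker URL and task body are illustrative, not taken from the question):

# tasks.py - a Celery app pointed at a RabbitMQ broker (URL is a placeholder)
from celery import Celery

app = Celery("myproject", broker="amqp://guest@localhost//")

@app.task
def generate_report(user_id):
    # the long-running work happens here, in the Celery worker process,
    # completely outside the gunicorn request/response cycle
    ...

# views.py - the gunicorn-served view only enqueues the job and returns immediately
from django.http import JsonResponse
from myproject.tasks import generate_report

def start_report(request):
    result = generate_report.delay(request.user.id)
    return JsonResponse({"task_id": result.id})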