Executing two celery workers from one line - django

I am working on a university project within a team where I mostly handled the frontend and some basic Django models, so I am not very familiar with django-celery and I did not set it up. At first we were using one celery worker, but I had to add a second one so that I could finish a user story.
I am currently running two workers, each in its own terminal, like this:
exec celery -A my_proj --concurrency=1 worker
exec celery -A my_proj --concurrency=1 worker -B -Q notification
While I run those two, my project works, but I need them to start from one line. So:
How do I get those two into one line for a script?
So far I've tried something like this:
exec celery multi start celery -A my_proj --concurrency=1 notification -A my_proj --concurrency=1 -B -Q notification
But it stops my project from functioning.
Any help is appreciated, thank you!

Solution
celery multi start 2 -A my_proj -c=1 -B:2 -Q:2 notification
The above starts 2 workers, with the 2nd worker processing the notification queue and embedding celery beat into it.
Explanation
You can run the following to see the commands that result from this:
celery multi show 2 -A my_proj -c=1 -B:2 -Q:2 notification
Output:
celery worker -A my_proj -c=1
celery worker -A my_proj -c=1 -B -Q notification
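As a rough sketch (not part of the original answer), the same command can be given explicit pidfiles and logfiles so the pair is easy to stop again later; the /var/run and /var/log paths are assumptions:
# start both workers in the background, one pidfile and logfile per node
celery multi start 2 -A my_proj -c=1 -B:2 -Q:2 notification --pidfile=/var/run/celery/%n.pid --logfile=/var/log/celery/%n.log
# later, stop them again using the same pidfiles
celery multi stop 2 --pidfile=/var/run/celery/%n.pid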

Try running the first worker in the background so both start from one line:
celery -A my_proj --concurrency=1 worker & exec celery -A my_proj --concurrency=1 worker -B -Q notification
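If this needs to live in a shell script (for example a container entrypoint), a minimal sketch under the same assumptions as the question (project my_proj, queue notification) would be:
#!/bin/sh
# first worker runs in the background
celery -A my_proj --concurrency=1 worker &
# second worker (embedded beat, notification queue) runs in the foreground,
# replacing the shell so the script lives as long as this worker does
exec celery -A my_proj --concurrency=1 worker -B -Q notification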

Related

How to deploy a Django application with celery and celery beat in a DigitalOcean droplet?

Hello developers, I need to deploy a Django application with celery and celery beat in a DigitalOcean droplet,
where the celery worker and celery beat will execute simultaneously.
For the celery worker:
celery -A <project_name>.celery worker --pool=solo -l INFO
For celery beat:
celery -A <project_name> beat -l INFO
Afterwards it will run.
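One simple way to get both running at the same time (a sketch only; the log paths are assumptions and <project_name> is a placeholder as in the question) is to background each process:
# worker in the background
nohup celery -A <project_name>.celery worker --pool=solo -l INFO > /var/log/celery-worker.log 2>&1 &
# beat in the background
nohup celery -A <project_name> beat -l INFO > /var/log/celery-beat.log 2>&1 &
For a real deployment a process manager such as systemd or supervisord is usually preferred, since it restarts the processes if they crash.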

Why does celery work with "--concurrency 1 -P solo" and not without?

When I use:
celery -A FAM worker -l info --concurrency 1 -P solo
I can run my tasks from celery.
When I use:
celery -A FAM worker -l info
It doesn't work, but I cannot understand why. I see there is a difference: "16 (prefork)" vs "1 (solo)". What is the difference? And why does the solo pool work while the other does not?
The issue was that Celery has not supported Windows since version 4, so running it on Windows can cause problems, as it does with concurrency.
Based on this article I could solve the issue by using Eventlet.
pip install eventlet
And when starting the celery worker:
celery -A FAM worker -l info --concurrency 4 -P eventlet
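gevent is a comparable pool that is often used the same way on Windows; this is an alternative added for completeness, not part of the original answer:
pip install gevent
celery -A FAM worker -l info --concurrency 4 -P gevent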

Celery worker exits with the shell

I use Celery with Django on a Debian server.
I connect to the server with PuTTY via SSH and start the celery worker with the following command:
celery -A django-project worker
Now I would like to close PuTTY, but then the celery worker apparently exits.
Why, and what can I do to keep the celery worker running all the time?
Start celery daemonized:
celery multi start worker1 -A django-project
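If you only need the worker to outlive the PuTTY session, nohup (or a terminal multiplexer such as tmux or screen) is another common option; the log path here is an assumption:
nohup celery -A django-project worker > /var/log/celery-worker.log 2>&1 &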

How to start a worker in djcelery

How can I start a worker in djcelery using Django? I am new to Django and djcelery. I installed Django and djcelery, but I don't know how to start the worker or how to add the worker and a task. Sorry for my bad English.
You generally start a celery worker using the following command:
$ celery -A project worker -l info
Where -A is your application / project.
Since you are using djcelery, you can run it using manage.py too:
$ python manage.py celery worker
Read the celery docs.
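To check that the worker actually consumes tasks, you can send one to it from the command line; tasks.add is just a hypothetical task name here:
$ celery -A project call tasks.add --args='[2, 2]'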

django-celery process goes to 100% and tasks are not executed

I am using the latest version of django-celery with a RabbitMQ worker on Ubuntu 12.04 Server. I started having problems with celery tasks about a month ago and I cannot figure out what the problem is. I run celery in production with supervisord. I don't know why, but there are moments when some of the processes that run celery go to 100% CPU usage and stay there until I restart celery and kill the existing processes. When this happens, the worker does not get any more tasks and they are not executed until I restart celery.
The command with which I run celery from supervisord is:
django-admin.py celeryd -v 2 -B -s celery -E -l INFO
Thank you for your help.
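No fix is shown in this excerpt, but a generic first diagnostic step (not from the thread) is to ask the workers what they are busy with while the CPU is pegged; with django-celery this is usually available through manage.py:
python manage.py celery inspect active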