Celery worker exits with shell - Django

I use Celery with Django on a Debian server.
I connect to the server with PuTTY via SSH and start the Celery worker with the following command:
celery -A django-project worker
Now I would like to close PuTTY, but then the Celery worker apparently exits.
Why does this happen, and what can I do to keep the Celery worker running?

Start celery daemonized:
celery multi start worker1 -A django-project
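If you also want to control where the pid and log files go, and to stop the daemonized worker later, a slightly fuller sketch looks like this (the paths are just examples; %n expands to the node name):
celery multi start worker1 -A django-project --pidfile=/var/run/celery/%n.pid --logfile=/var/log/celery/%n.log
celery multi stop worker1 -A django-project --pidfile=/var/run/celery/%n.pid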

Related

How to deploy a Django application with Celery and Celery Beat on a DigitalOcean droplet?

Hello developers, I need to deploy a Django application with Celery and Celery Beat on a DigitalOcean droplet,
where the Celery worker and Celery Beat will run simultaneously.
For the Celery worker:
celery -A <project_name>.celery worker --pool=solo -l INFO
For Celery Beat:
celery -A <project_name> beat -l INFO
Afterwards, both should keep running.
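One minimal way to keep both running in the background on the droplet is to detach them from the terminal, e.g. with nohup (a sketch only; for production a process manager such as supervisord or a systemd service is more robust):
nohup celery -A <project_name>.celery worker --pool=solo -l INFO > worker.log 2>&1 &
nohup celery -A <project_name> beat -l INFO > beat.log 2>&1 &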

Django: Do I have to restart celery beat, celery worker and Django gunicorn when new changes are uploaded to the production server?

I have a production server running a Django application.
The Django server is run using gunicorn and nginx:
pipenv run gunicorn --workers=1 --threads=50 --bind 0.0.0.0:8888 boiler.wsgi:application
The celery worker is run using:
pipenv run celery -A boiler worker
Celery beat is run using:
pipenv run celery -A boiler beat
Now I have updated my model and a few views on my production server (i.e. pulled some changes from GitHub).
In order to reflect the changes, should I restart all of celery beat, the celery worker and the Django gunicorn server,
or is restarting only the celery worker and gunicorn sufficient,
or is restarting only gunicorn sufficient?
If you have made changes to any code that in one way or another affects the celery tasks, then yes, you should restart the celery worker. If you are not sure, a safe bet is to restart. And since celery beat tracks the scheduling of periodic tasks, you should also restart it if you restart the workers. Of course, you should ensure there are no tasks currently running, or properly kill them, before restarting. You can monitor the tasks using Flower.
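A rough sketch of what such a restart could look like if the processes were started by hand as above (the pkill patterns are assumptions; with supervisord or systemd you would restart the corresponding services instead):
pkill -TERM -f 'celery -A boiler beat'
pkill -TERM -f 'celery -A boiler worker'
pkill -TERM -f 'boiler.wsgi:application'
pipenv run gunicorn --workers=1 --threads=50 --bind 0.0.0.0:8888 boiler.wsgi:application &
pipenv run celery -A boiler worker --detach
pipenv run celery -A boiler beat --detach
TERM asks the worker for a warm shutdown, so running tasks are allowed to finish before the process exits.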

Executing two celery workers from one line

I am working on a project for my university within a team where I was mostly working on the frontend and some basic Django models, so I am not very familiar with django-celery and I also did not set it up. At first we were using one celery worker, but I had to add one more so that I could finish a user story.
I am currently running two workers with one in a terminal each like this:
exec celery -A my_proj --concurrency=1 worker
exec celery -A my_proj --concurrency=1 worker -B -Q notification
While I run those two, my project works, but I need to start them from one line. So:
How do I get those two into one line for a script?
So far I've tried something like this:
exec celery multi start celery -A my_proj --concurrency=1 notification -A my_proj --concurrency=1 -B -Q notification
But it stops my project from functioning.
Any help is appreciated, thank you!
Solution
celery multi start 2 -A my_proj -c=1 -B:2 -Q:2 notification
The above starts 2 workers, with the 2nd worker processing the notification queue and running the embedded celery beat.
Explanation
You can run the following to see the commands resulting from this:
celery multi show 2 -A my_proj -c=1 -B:2 -Q:2 notification
Output:
celery worker -A my_proj -c=1
celery worker -A my_proj -c=1 -B -Q notification
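If you later need to shut these down, celery multi has a matching stop command; for example:
celery multi stopwait 2 -A my_proj
stopwait performs a warm shutdown, waiting for currently executing tasks to finish.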
Alternatively, try putting the first worker in the background so both start from one line:
celery -A my_proj --concurrency=1 worker & exec celery -A my_proj --concurrency=1 worker -B -Q notification

How to start a worker in djcelery

How can I start a worker in djcelery using Django? I am new to Django and djcelery. I installed Django and djcelery, but I don't know how to start the worker or how to add the worker and a task. Sorry for my bad English.
You generally start a celery worker using the following command:
$ celery -A project worker -l info
Where -A is your application / project.
Since you are using djcelery, you can run it using manage.py too:
$ python manage.py celery worker
Read the Celery docs.
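To give the worker something to execute, you also define tasks in one of your apps. A minimal sketch (the module and function names are just examples):

# myapp/tasks.py
from celery import shared_task

@shared_task
def add(x, y):
    # trivial example task executed by the worker
    return x + y

You can then call add.delay(2, 3) from a view or the Django shell, and the running worker will pick up and execute the task.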

django-celery process goes to 100% CPU and tasks are not executed

I am using the latest version of django-celery with a RabbitMQ worker on Ubuntu 12.04 Server. I started having problems with celery tasks about a month ago and I cannot figure out what the problem is. I run celery in production with supervisord. I don't know why, but there are moments when some of the processes running celery go to 100% CPU usage and stay there until I restart celery and kill the existing processes. When this happens the worker does not get any more tasks and they are not executed until I restart celery.
The command with which I run celery from supervisord is:
django-admin.py celeryd -v 2 -B -s celery -E -l INFO
Thank you for your help.