How to start a worker in djcelery - Django

How can I start a worker in djcelery using Django? I am new to Django and djcelery. I have installed both, but I don't know how to start the worker or how to add a task. Sorry for my bad English.

You generally start a Celery worker using the following command:
$ celery -A project worker -l info
where -A points at your application / project module.
Since you are using djcelery, you can also run it through manage.py:
$ python manage.py celery worker
See the Celery docs for more options.
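As for the "how to add a task" part of the question: a task is just a decorated Python function that the worker discovers and runs. A minimal sketch; the app name myapp and the function add are placeholders, not names from the question:

# myapp/tasks.py
from celery import shared_task

@shared_task
def add(x, y):
    # runs inside the worker process, not the web process
    return x + y

You would then queue it from your Django code with add.delay(2, 3) and see the worker pick it up in its log output.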

Related

How to deploy a Django application with Celery and Celery Beat on a DigitalOcean droplet?

Hello developers, I need to deploy a Django application with Celery and Celery Beat on a DigitalOcean droplet, where the Celery worker and Celery Beat will execute simultaneously.
For the Celery worker:
celery -A <project_name>.celery worker --pool=solo -l INFO
For Celery Beat:
celery -A <project_name> beat -l INFO
Afterward it will run.
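For the -A <project_name>.celery argument to resolve, the project needs a Celery application module next to settings.py. A typical sketch of that module, assuming the standard Django project layout (myproject is a placeholder for your project name):

# myproject/celery.py
import os
from celery import Celery

# point Celery at the Django settings before creating the app
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery("myproject")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()

With this in place, the worker and beat commands above share the same app and broker configuration.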

Django: Do I have to restart celery beat, the celery worker and Django gunicorn when new changes are uploaded to the production server?

I have a production server running a Django application.
The Django server is run using gunicorn and nginx:
pipenv run gunicorn --workers=1 --threads=50 --bind 0.0.0.0:8888 boiler.wsgi:application
The celery worker is run using:
pipenv run celery -A boiler worker
celery beat is run using:
pipenv run celery -A boiler beat
Now I have updated my model and a few views on my production server (i.e. pulled some changes using GitHub).
In order to reflect the changes, should I restart all of celery beat, the celery worker and the Django gunicorn server,
or are only the celery worker and gunicorn sufficient,
or is gunicorn alone sufficient?
If you have made changes to any code that in one way or another affects the celery tasks, then yes, you should restart the celery worker; if you are not sure, restarting is the safe bet. Since celery beat tracks the scheduling of periodic tasks, you should also restart it whenever you restart the workers. Gunicorn needs a restart in any case, because your model and view code changed. Before restarting, ensure there are no tasks currently running, or kill them properly; you can monitor the tasks using Flower.
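If you prefer a scripted check over Flower before restarting, Celery's inspect API can list in-flight tasks. A quick sketch, assuming the project has the usual boiler/celery.py app module (that import path is an assumption):

# check_active.py
from boiler.celery import app

inspector = app.control.inspect()
# active() maps each worker name to its list of currently running tasks
active = inspector.active() or {}
for worker, tasks in active.items():
    print(worker, "has", len(tasks), "active task(s)")

An empty result means it is safe to restart the workers.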

Executing two celery workers from one line

I am working on a university project within a team where I mostly worked on the frontend and some basic Django models, so I am not very familiar with django-celery and I also did not set it up. At first we were using one celery worker, but I had to add one more so I could finish a user story.
I am currently running two workers with one in a terminal each like this:
exec celery -A my_proj --concurrency=1 worker
exec celery -A my_proj --concurrency=1 worker -B -Q notification
While I run those two, my project works, but I need them to start from one line. So:
How do I get those two into one line for a script?
So far I've tried something like this:
exec celery multi start celery -A my_proj --concurrency=1 notification -A my_proj --concurrency=1 -B -Q notification
But it stops my project from functioning.
Any help is appreciated, thank you!
Solution
celery multi start 2 -A my_proj --concurrency=1 -B:2 -Q:2 notification
The above starts 2 workers, telling the 2nd worker to process the notification queue and embedding celery beat into it.
Explanation
You can run the following to see the commands this expands to:
celery multi show 2 -A my_proj --concurrency=1 -B:2 -Q:2 notification
Output:
celery worker -A my_proj --concurrency=1
celery worker -A my_proj --concurrency=1 -B -Q notification
try
celery -A my_proj --concurrency=1 worker & exec celery -A my_proj --concurrency=1 worker -B -Q notification
(The & backgrounds the first worker; with && the second worker would only start after the first one exits.)
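For context on the -Q notification flag: a task only ends up on the notification queue if something routes it there. A minimal sketch of such routing; notify_user is a placeholder name, not a task from the question:

# tasks.py
from celery import shared_task

@shared_task
def notify_user(user_id):
    # placeholder body; imagine sending an email or push notification
    print("notifying user", user_id)

# queued explicitly onto the notification queue, so only the worker
# started with -Q notification will consume it
notify_user.apply_async(args=[42], queue="notification")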

Celery worker exits with shell

I use Celery with Django on a Debian server.
I connect to the server with PuTTY via SSH and start the celery worker with the following command:
celery -A django-project worker
Now I would like to close PuTTY, but then the celery worker apparently exits.
Why, and what can I do to keep the celery worker running?
The worker exits because closing the SSH session sends a hangup signal to the processes running in your shell's foreground. Start celery daemonized instead, so it detaches from the terminal:
celery multi start worker1 -A django-project

Django on Heroku - how can I get a celery worker to run correctly?

I am trying to deploy the simplest possible "hello world" Celery configuration on Heroku for my Django app. My Procfile is as follows:
web: gunicorn myapp.wsgi
worker: celery -A myapp worker -l info -B -b amqp://XXXXX:XXXXX@red-thistle-3.bigwig.lshift.net:PPPP/XXXXX
This is the RABBITMQ_BIGWIG_RX_URL that I'm giving to the celery worker. I have the corresponding RABBITMQ_BIGWIG_TX_URL in my settings file as the BROKER_URL.
If I use these broker URLs in my local dev environment, everything works fine and I can actually use the Heroku RabbitMQ system. However, when I deploy my app to Heroku, it isn't working.
This Procfile seems to work (although Celery is giving me memory-leak issues):
web: gunicorn my_app.wsgi
celery: celery worker -A my_app -l info --beat -b amqp://XXXXXXXX:XXXXXXXXXXXXXXXXXXXX@red-thistle-3.bigwig.lshift.net:PPPP/XXXXXXXXX
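Rather than hardcoding the broker credentials in the Procfile, a common pattern is to read them from the environment in settings.py, since Heroku add-ons expose their URLs as config vars. A sketch under that assumption, reusing the Bigwig variable name mentioned above:

# settings.py
import os

# Heroku injects the add-on's URL as an environment variable;
# the local fallback broker is an assumption for development use
BROKER_URL = os.environ.get("RABBITMQ_BIGWIG_TX_URL", "amqp://localhost")

The Procfile worker line should then no longer need the -b flag, since the worker picks up BROKER_URL from the Django settings.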