Django-Celery in production? - django

So I've been trying to figure out how to make scheduled tasks. I've found Celery and been able to make simple scheduled tasks. To do this I need to open up a command line and run celery -A proj beat for the tasks to happen. This works fine in a development environment, but it will be an issue when putting this into production.
So how can I get celery to work without command line use? When my production server is online, how can I make sure my scheduler goes up with it? Can Celery do this or do I need to take another approach?

We use Celery in our production environment, which happens to be on Heroku. We are in the process of moving to AWS. In both environments, Celery hums along nicely.
It would be helpful to understand what your production environment will look like. I'm slightly confused as to why you would be worried about turning off your computer, as using Django implies that you are serving up a website... Are you serving your website from your laptop??
Anyway, assuming that you are going to run your production server from a cloud platform, all you have to do is send whatever command lines you need to run Django AND the command lines for Celery (as you have already noted in your question).
In terms of configuration, you say that you have 'scheduled' tasks, so that implies you have set up a beat schedule in your config.py file. If not, it should look something like this (this assumes you have a module called tasks.py which holds your celery task definitions):
from celery.schedules import crontab

beat_schedule = {
    'task1': {
        'task': 'tasks.task_one',
        'schedule': 3600
    },
    'task2': {
        'task': 'tibController.tasks.update_old_retail',
        'schedule': crontab(hour=12, minute=0, day_of_week='mon-fri')
    }
}
Then in your tasks.py, to pull in that config file you just do this:
from celery import Celery
import config
app = Celery('tasks')
app.config_from_object(config)
You can find more on crontab in the docs. You can also checkout this repo for a simple Celery example.
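For completeness, a rough sketch of what the matching tasks.py could contain; the function name mirrors the 'task1' entry in the schedule above and the body is a placeholder:

from celery import Celery
import config

app = Celery('tasks')
app.config_from_object(config)

@app.task
def task_one():
    # defined in tasks.py, so its registered name defaults to 'tasks.task_one',
    # which is what the beat schedule above refers to
    pass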
In summary:
Create a config file that identifies which tasks to run when
Load the config file into your Celery app
Get a cloud platform to run your code on.
Run celery exactly like you have already identified
Hope that helps.

Related

Celery Beat Task Hangs Without Any Errors

I have a Django app, and I'm using Celery Beat to run a task periodically. If I call the task when running Celery, it runs without errors:
app/tasks.py
...
@task(name='task1')
def func():
    # Some code

func.run()
...
If I then start Celery with celery -A project worker -l info, the task runs without errors.
The issue comes when I try to run that same task with Celery Beat. Imagine I have this schedule:
app.conf.beat_schedule = {
    'some_task': {
        'task': 'task1',
        'schedule': crontab(minute=30, hour='22')
    }
}
This task should run every day at 22:30, and it does: the task starts but then hangs without logging anything. I cannot figure out the root of the issue; this is not a memory error, I have already checked that, and the task runs fine on my local machine using Celery Beat.
I have also tried to use the Celery Beat daemon, but the task keeps hanging whenever it starts. I can't figure out what is happening; any suggestions?
Use the app.task or shared_task decorator for your task. Without an app instance, celery beat will not be calling the task with the correct task signature for the celery app to recognize. You can find the documentation on how to write a basic task here.
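As a rough sketch (reusing the task name from the question; the shared_task form is shown, the app.task form from your own Celery app works the same way):

from celery import shared_task

@shared_task(name='task1')
def func():
    # Some code
    pass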

Celery app ignoring beat_schedule option specified in settings.py file

I am trying to extend my django app with celery crontab functionality. For this purpose I created a celery.py file where I put the code as mentioned in the official documentation.
Here is the code from project/project/celery.py:
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('project')
app.config_from_object('django.conf:settings', namespace='CELERY')
Then inside my project/settings.py file I specify the celery-related configs as follows:
CELERY_TIMEZONE = "Europe/Moscow"
CELERYBEAT_SHEDULE = {
    'test_beat_tasks': {
        'task': 'webhooks.tasks.adding',
        'schedule': crontab(minute='*/1'),
    },
}
Then I run the worker and celery beat in the same terminal with
celery -A project worker -B
But nothing happened; I mean I didn't see my celery beat task printing any output, while I expected that my task webhooks.tasks.adding would execute.
Then I decided to check that the celery configs are applied. For this purpose, in a python manage.py shell I checked the celery.app.conf object:
# I imported app from the project.celery module
from project import celery
# then examined the app configs
celery.app.conf
And inside the huge output of celery configs I saw that the timezone is set to None.
As I understand it, my problem is that the app initiated in project/celery.py is ignoring my project/settings.py CELERY_TIMEZONE and CELERY_BEAT_SCHEDULE configs, but why is that? What am I doing wrong? Please guide me.
After I spent so much time researching this problem, I found that my mistake was in how I ran the worker and celery beat. Running the worker as I did, it wouldn't show the task executing in the terminal. To see whether the task is executing, I should run it as follows: celery -A project worker -B -l INFO, or, if you want more detailed output, DEBUG can be used instead of INFO. Hope it will help anyone.
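For reference, with namespace='CELERY' the app only reads Django settings prefixed with CELERY_; a minimal sketch of what the beat schedule in settings.py would then look like (the task path is taken from the question):

from celery.schedules import crontab

CELERY_TIMEZONE = "Europe/Moscow"
CELERY_BEAT_SCHEDULE = {
    'test_beat_tasks': {
        'task': 'webhooks.tasks.adding',
        'schedule': crontab(minute='*/1'),
    },
}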

how manage.py can start the iteration of flask

I am making a price tracker. My project structure is this:
Myapp-folder
    manage.py - from flask_script module
    subApp-folder
        __init__.py
        form.py
        models.py
        views.py
        pricemonitor-folder
            main.py
            __init__.py
            send_email.py
            price_compare_sendemail.py - with class Compare_sendemail and start_monitor function
In main.py, I have an iteration to compare the prices every 60s and send email if needed.
from app.PriceMonitor.price_compare_sendmail import Compare_sendemail
break_time = 60 # set waiting time for one crawl round
monitor = Compare_sendemail()
monitor.start_monitor(break_time)
The manage.py is as below:
from flask_script import Manager, Server
from app import app, db
manager = Manager(app)
manager.add_command("runserver",Server(host='127.0.0.1', port=5000, use_debugger=True))
if __name__ == '__main__':
    manager.run()
But the iteration doesn't work when I run python manage.py runserver, although running main.py directly works. How can I set up the code to run the flask server with the compare_sendemail iteration running in the background? Thanks.
I think you are looking for Celery.
You can use a Celery background task. If your application has a long-running task, such as processing some uploaded data or sending email, you don't want to wait for it to finish during a request. Instead, use a task queue to send the necessary data to another process that will run the task in the background while the request returns immediately.
Here you can find the documentation for Celery:
https://flask.palletsprojects.com/en/1.1.x/patterns/celery/
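As a minimal sketch of that pattern (the broker URL, route, and function names here are placeholders, not part of your project):

from flask import Flask
from celery import Celery

flask_app = Flask(__name__)
# assumes a Redis broker running locally; use whatever broker you have
celery_app = Celery(__name__, broker='redis://localhost:6379/0')

@celery_app.task
def compare_and_email():
    # the long-running crawl/compare/send-email work goes here,
    # outside the request/response cycle
    pass

@flask_app.route('/trigger')
def trigger():
    compare_and_email.delay()  # queue the work and return immediately
    return 'queued', 202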
And if you want to wait for a task to complete, you can use Coroutines and Tasks:
https://docs.python.org/3/library/asyncio-task.html
There are other options for Flask background tasks, like RQ:
https://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-xxii-background-jobs
Some other alternatives:
https://smirnov-am.github.io/background-jobs-with-flask/
Threads
uWSGI thread
uWSGI spooler
The uWSGI spooler is great for simple tasks, like sending an OTP SMS or email.
I'll answer part of my own question.
In main.py, I used a while loop and the time module to iterate price_compare_sendemail.py every 60s. While this is not an ideal background task handler, this project is currently just for my own usage, so it is OK for me. My original thought was to use the flask_script manager to handle all the python commands; I don't know if that is the right approach though, because I just started to learn Flask.
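A rough sketch of that loop (compare_prices_once is a hypothetical stand-in for one crawl-and-compare round, not code from the project):

import time

def compare_prices_once():
    # placeholder for one crawl, compare, and send-email round
    pass

break_time = 60  # seconds between rounds
while True:
    compare_prices_once()
    time.sleep(break_time)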
After some Google searching, I found a way to use the manager:
from subapp.pricemonitor.main import Start_monitor

Monitor = Start_monitor()

@manager.command
def monitor_start():
    break_time = 10
    Monitor.start_monitoring(break_time)
Then use the command 'python manage.py monitor_start' to start the background task. I don't know if it is useful to others, but at least it fits my original thought.

Decouple and Dockerize Django and Celery

I am wondering what is the best way to decouple Celery from Django in order to dockerize the two parts and use docker swarm services? Typically one starts their celery workers and celery beat using a command that references their Django application:
celery worker -A my_app
celery beat -A my_app
From this I believe celery picks up config info from the settings file and a celery.py file, which is easy to move to a microservice. What I don't totally understand is how the tasks would leverage the Django ORM. Or is that not really the microservices mantra, and should Celery be designed to make GET/POST calls to a Django REST Framework API for the data it needs to complete the task?
I use a setup where the code for both the django app and its celery workers is the same (as in a single repository).
When deploying I make sure to have the same code release everywhere, to avoid any surprises with the ORM, etc...
Celery starts with a reference to the django app, so that it has access to the models, etc...
Communication between the workers and the main app happens either through the messaging queue (rabbitmq or redis...) or via the database (as in, the celery worker works directly in the db, since it knows the models, etc...).
I'm not sure if that follows the microservices mantra, but it does work :)
Celery's .send_task or .signature might be helpful:
https://www.distributedpython.com/2018/06/19/call-celery-task-outside-codebase/
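For example, a decoupled service can queue a task by its registered name without importing any Django code; a minimal sketch (the broker URL and task name are assumptions, not from the question):

from celery import Celery

# point at the same broker the Django-side workers use
app = Celery(broker='redis://localhost:6379/0')

# 'my_app.tasks.process_data' stands in for a task name registered on the
# Django side; the arguments are serialized and sent over the broker
result = app.send_task('my_app.tasks.process_data', args=[42])
print(result.id)  # an AsyncResult; .get() would block until the task finishes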

Does django's runserver option provide a hook for running other restart scripts?

I've recently been playing around with django and celery. One annoying thing during development is the fact that I have to restart the celery daemon each time I modify a task. When I'm developing, I usually like to use 'manage.py runserver' which automatically reloads the django framework on modifications to my apps.
Is there a way to add a hook to the reloading process that runserver does so that it automatically restarts the celery daemon I have running?
Alternatively, does celery have a similar monitor-and-reload-on-change mode that I should be using for development?
Django-supervisor works very well for this purpose. You can have it start the Django server, Celery, and anything else you need, and have different configurations for development and production servers. It also knows to reload the celery daemon when your code changes.
https://github.com/rfk/django-supervisor
I believe you can set CELERY_ALWAYS_EAGER to true.
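For reference, that is a settings flag (newer Celery versions also accept the lowercase task_always_eager); a one-line sketch of what this answer refers to:

# settings.py: run tasks synchronously in the web process, so runserver's
# auto-reload also covers task code and no separate worker needs restarting
CELERY_ALWAYS_EAGER = True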
Yes. Django provides an autoreload hook, which can be used to restart other scripts.
Here is a simple management command which prints a message on reload:
import subprocess
from django.core.management.base import BaseCommand
from django.utils import autoreload
def reload():
    print('Code changed. Auto reloading...')

class Command(BaseCommand):
    def handle(self, *args, **options):
        autoreload.main(reload)
Now you can save this to reload.py and run it with python manage.py reload. A management command to reload celery workers is available here.
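A rough sketch of how the same hook could restart the worker, assuming the worker runs under supervisord with a program named "celery" (both are assumptions, not part of the original answer); this function would be passed to autoreload.main() in place of reload above:

import subprocess

def restart_celery():
    # assumption: the worker is managed by supervisord as program "celery";
    # swap this for however you actually run the worker
    print('Code changed. Restarting celery worker...')
    subprocess.call(['supervisorctl', 'restart', 'celery'])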
Celery doesn't have any feature for reloading code or auto-restarting when the code changes, so you have to restart it manually.
There isn't a way to add such a hook, and I think it is not worthwhile to edit the source code of django just to perform a restart.
Personally, while I'm developing I prefer to see the shell output of celery, which is decorated with color, instead of tailing the logs; it is more readable.
Celery 2.5 has an experimental runtime option --autoreload that could be used for this purpose, too. Here's more detail in the release notes. That being said, I think django-supervisor (via @Lee Semel) looks like the better way of doing things. I thought I would post this alternative here in case other readers do not want to have to configure another app for asynchronous processing.