I have a function that fetches data and needs to be run periodically.
All I care about is running it every 30 seconds.
I searched and found the following options:
Celery
django-apscheduler
APScheduler
I have tried APScheduler with BackgroundScheduler, and it has the problem that it starts a new scheduler for each process.
I am completely new to scheduling functions and have no idea which one I should use, or if there is a better way.
I experienced a similar issue and solved it by creating a custom management command and scheduling it on the web server.
Within the root of the app, create a management/commands directory:
some_app/
    __init__.py
    models.py
    management/
        commands/
            dosomething.py
    tests.py
    views.py
# dosomething.py
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = 'Description of the action here'

    def handle(self, *args, **options):
        # Put the periodic work here.
        print('Doing something')
To confirm that it works, run python manage.py dosomething
The scheduling itself depends on which web server you are using. In my case it was Heroku and I used their Scheduler add-on.
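If the scheduler you end up with can only trigger at minute (or coarser) granularity, as cron and Heroku Scheduler do, one option is to let the command loop on its own. Here is a minimal sketch, assuming a plain time.sleep loop is acceptable for your use case:

# dosomething.py -- a looping variant for sub-minute intervals (sketch;
# the write call stands in for the real fetch function).
import time

from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = 'Fetches data every 30 seconds until stopped'

    def handle(self, *args, **options):
        while True:
            self.stdout.write('Doing something')  # your fetch goes here
            time.sleep(30)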
I am making a price tracker. My project structure is this:
Myapp-folder/
    manage.py  (uses the flask_script module)
    subApp-folder/
        __init__.py
        form.py
        models.py
        views.py
        pricemonitor-folder/
            main.py
            __init__.py
            send_email.py
            price_compare_sendemail.py  (with the Compare_sendemail class and a start_monitor function)
In main.py, I have an iteration that compares the prices every 60 seconds and sends an email if needed.
from app.PriceMonitor.price_compare_sendmail import Compare_sendemail
break_time = 60 # set waiting time for one crawl round
monitor = Compare_sendemail()
monitor.start_monitor(break_time)
The manage.py is as below:
from flask_script import Manager, Server
from app import app, db

manager = Manager(app)
manager.add_command("runserver", Server(host='127.0.0.1', port=5000, use_debugger=True))

if __name__ == '__main__':
    manager.run()
But the iteration doesn't work when I run python manage.py runserver, while running main.py directly works fine. How can I arrange the code so that the Flask server runs with the compare_sendemail iteration in the background? Thanks.
I think you are looking for Celery.
You can use a Celery background task. If your application has a long-running task, such as processing some uploaded data or sending email, you don't want to wait for it to finish during a request. Instead, use a task queue to send the necessary data to another process that will run the task in the background while the request returns immediately.
Here you can find the documentation for using Celery with Flask:
https://flask.palletsprojects.com/en/1.1.x/patterns/celery/
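For the 60-second price check described above, a minimal Celery sketch could look like this (it assumes a local Redis broker at redis://localhost:6379/0, and check_prices is a placeholder name for the real compare-and-email logic):

# tasks.py -- a periodic Celery task (sketch)
from celery import Celery

celery_app = Celery('tasks', broker='redis://localhost:6379/0')

@celery_app.task
def check_prices():
    print('crawling prices and emailing if needed...')

# run check_prices every 60 seconds via the beat scheduler
celery_app.conf.beat_schedule = {
    'check-prices-every-60s': {
        'task': 'tasks.check_prices',
        'schedule': 60.0,
    },
}

A beat process has to be running for the schedule to fire; the worker alone only executes tasks that are sent to it.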
And if you want to wait for a task to complete, you can use coroutines and tasks:
https://docs.python.org/3/library/asyncio-task.html
There are other options for Flask background tasks, like:
RQ
https://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-xxii-background-jobs
Some other alternatives:
https://smirnov-am.github.io/background-jobs-with-flask/
Threads
uWSGI thread
uWSGI spooler
The uWSGI spooler is great for simple tasks, like sending an OTP SMS or email.
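For instance, a spooler task can be as small as this (a sketch; it only runs under uWSGI started with a spooler directory, e.g. uwsgi --spooler ./myspool, and send_otp is a made-up example name):

# A uWSGI spooler task using the uwsgidecorators module (sketch)
from uwsgidecorators import spool

@spool
def send_otp(arguments):
    # arguments is a dict of the string values passed to .spool()
    print('sending OTP to', arguments.get('phone'))

# enqueue from a request handler:
# send_otp.spool(phone='15550100')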
I am answering part of my own question.
In main.py, I used a while loop and the time module to run the price comparison from price_compare_sendemail.py every 60 seconds, as sketched below. While this is not an ideal background task handler, this project is currently just for my own use, so it is OK for me.
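The loop itself is simple (a sketch; the print call stands in for one crawl-and-compare pass):

# main.py -- poll every break_time seconds (sketch)
import time

def start_monitor(break_time):
    while True:
        print('crawling prices and emailing if needed...')
        time.sleep(break_time)  # wait before the next round

start_monitor(60)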
My original thought was to use the Flask-Script manager to handle all the Python commands; I don't know whether that was the right idea, though, because I have only just started to learn Flask. After some Google searching, I found a way to use the manager:
from subapp.pricemonitor.main import Start_monitor

Monitor = Start_monitor()

@manager.command
def monitor_start():
    break_time = 10
    Monitor.start_monitoring(break_time)
Then use the command python manage.py monitor_start to start the background task. I don't know how useful it is, but at least it fits my original idea.
I am developing a Django app running on Heroku.
In order to update my database with some data coming from a certain API service, I need to periodically run a certain script (let's say myscript).
How can I use Heroku Scheduler to do it?
As already explained here, a quick and simple way to answer this question is to ask yourself how you would run that script periodically if you were the scheduler yourself.
Now, the best way to run a script in your Django app at any moment is to create a custom management command and run it from your command prompt when you need it, like this:
python manage.py some_custom_command
Then, if you were the scheduler, you would run that command from your command prompt at every time written in the schedule.
So, a good idea would be to make Heroku Scheduler behave the same. Thus, the aim here is to have Heroku Scheduler run python manage.py some_custom_command at scheduled times.
Here is how you can do it:
In the your_app directory, create a folder management, inside it create another folder commands, and finally, inside that, create a file some_custom_command.py (depending on your Django version, the management and commands folders may also each need an empty __init__.py so they are importable).
So, just to be clear:
your_app/management/commands/some_custom_command.py
Then, inside some_custom_command.py insert:
from django.core.management.base import BaseCommand
from your_app.path_to_myscript_file import myscript

class Command(BaseCommand):
    def handle(self, *args, **options):
        # Put here some script to get the data from the API service
        # and store it in your models.
        myscript()
Then go to Heroku > your_app > Resources.
In the add-ons section, select Heroku Scheduler and click on it so that its window opens; then click on Add Job, select the time you want, insert the command python manage.py some_custom_command, and save.
In my Django project I have to check whether a database/table exists before the application starts, and I don't know where it is best to put that check.
I tried adding a try/except block to the login function in views.py, but I am looking for a more elegant and effective solution.
Thanks in advance
To check before the app starts, you can use the AppConfig.ready() function. This gets called before the application starts, but after your project has started. If you want to check before the project starts, then you will have to hook into the method you use to start your project, e.g. within wsgi.py or even manage.py.
See the AppConfig.ready() docs. Note that the docs specifically say to
avoid interacting with the database in your ready() implementation.
But your use case may justify doing this.
The ready function is called when you run commands from manage.py, e.g.
manage.py shell, manage.py migrate, etc.
It won't get called when your site is visited, of course. If you want to run a DB check in response to a visitor action, then that code should go into your view.
You put the ready() function in your apps.py:
from django.apps import AppConfig
from django.db import connection

class MyAppConfig(AppConfig):
    name = 'MyApp'

    def ready(self):
        print("i am the ready function and the database test code goes here")
        # Put your test code here; e.g. you could read all the table names from SQLite:
        with connection.cursor() as cursor:
            cursor.execute("SELECT name FROM sqlite_master;")
            rows = cursor.fetchall()
            print(rows)
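Worth noting: ready() only runs if Django uses this config class. Recent Django versions pick up a lone AppConfig subclass in apps.py automatically; on older versions you may need to point INSTALLED_APPS at it explicitly (a sketch, reusing the MyApp name from above):

# settings.py (sketch) -- reference the custom AppConfig so ready() runs
INSTALLED_APPS = [
    # ... your other apps ...
    'MyApp.apps.MyAppConfig',
]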
You can write your own custom Django admin commands and run one with python manage.py your_command right before calling python manage.py runserver. There are detailed examples in the official documentation.
One of the advantages of using commands is that testing a command is fairly easy. Django has a function for calling commands, django.core.management.call_command, which enables you to functionally test your command.
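For example, a functional test could look like this (a sketch; your_command and the expected output string are placeholders):

# tests.py -- functionally testing a management command (sketch)
from io import StringIO

from django.core.management import call_command
from django.test import TestCase

class YourCommandTests(TestCase):
    def test_command_output(self):
        out = StringIO()
        call_command('your_command', stdout=out)  # runs the command in-process
        self.assertIn('expected output', out.getvalue())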
So I've been trying to figure out how to make scheduled tasks. I've found Celery and have been able to make simple scheduled tasks. To do this I need to open up a command line and run celery -A proj beat for the tasks to happen. This works fine in a development environment, but when putting this into production it will be an issue.
So how can I get Celery to work without manual command-line use? When my production server is online, how can I make sure my scheduler goes up with it? Can Celery do this, or do I need to go down another route?
We use Celery in our production environment, which happens to be on Heroku. We are in the process of moving to AWS. In both environments, Celery hums along nicely.
It would be helpful to understand what your production environment will look like. I'm slightly confused as to why you would be worried about turning off your computer, as using Django implies that you are running a server to serve up a website... Are you serving your website from your laptop?
Anyway, assuming that you are going to run your production server from a cloud platform, all you have to do is send whatever command lines you need to run Django AND the command lines for Celery (as you have already noted in your question).
In terms of configuration, you say that you have 'scheduled' tasks, which implies you have set up a beat schedule in your config.py file. If not, it should look something like this (this assumes you have a module called tasks.py which holds your Celery task definitions):
from celery.schedules import crontab

beat_schedule = {
    'task1': {
        'task': 'tasks.task_one',
        'schedule': 3600
    },
    'task2': {
        'task': 'tibController.tasks.update_old_retail',
        'schedule': crontab(hour=12, minute=0, day_of_week='mon-fri')
    }
}
Then in your tasks.py, load the config file like this:
from celery import Celery
import config
app = Celery('tasks')
app.config_from_object(config)
You can find more on crontab in the docs. You can also check out this repo for a simple Celery example.
In summary:
Create a config file that identifies which tasks to run and when
Load the config file into your Celery app
Get a cloud platform to run your code on
Run Celery exactly as you have already identified
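For development convenience, you can run the worker and the beat scheduler in one process with celery -A tasks worker --beat --loglevel=info; in production it is more common to run celery -A tasks beat as its own process alongside the workers (the -A tasks part matches the tasks.py module from the sketch above).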
Hope that helps.
I've recently been playing around with django and celery. One annoying thing during development is the fact that I have to restart the celery daemon each time I modify a task. When I'm developing, I usually like to use 'manage.py runserver' which automatically reloads the django framework on modifications to my apps.
Is there a way to add a hook to the reloading process that runserver does so that it automatically restarts the celery daemon I have running?
Alternatively, does celery have a similar monitor-and-reload-on-change mode that I should be using for development?
Django-supervisor works very well for this purpose. You can have it start the Django server, Celery, and anything else you need, and have different configurations for development and production servers. It also knows to reload the celery daemon when your code changes.
https://github.com/rfk/django-supervisor
I believe you can set CELERY_ALWAYS_EAGER to True. With that setting, tasks run locally and synchronously instead of being sent to a worker, so there is no separate daemon to restart while developing.
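In your Django settings that is a single line (note: in Celery 4+ the lowercase form task_always_eager is the current spelling):

# settings.py (development only) -- execute tasks in-process
CELERY_ALWAYS_EAGER = True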
Yes. Django provides an auto-reload hook, which can be used to restart other scripts.
Here is a simple management command which prints a message on reload:
from django.core.management.base import BaseCommand
from django.utils import autoreload

def reload():
    print('Code changed. Auto reloading...')

class Command(BaseCommand):
    def handle(self, *args, **options):
        # autoreload.main() is the pre-Django-2.2 API; on newer versions
        # use autoreload.run_with_reloader(reload) instead.
        autoreload.main(reload)
Now you can save it as reload.py in a management/commands directory and run it with python manage.py reload. A management command to reload celery workers is available here.
Celery doesn't have any feature to reload code or to auto-restart when the code changes, so you have to restart it manually.
There isn't a way to add a hook, and I don't think it is worthwhile to edit Django's source code just to perform a restart.
Personally, while I'm developing I prefer to watch Celery's shell output, which is decorated with color, rather than tailing the logs; it's more readable.
Celery 2.5 has an experimental runtime option --autoreload that could be used for this purpose, too. There is more detail in the release notes. That being said, I think django-supervisor (via @Lee Semel) looks like the better way of doing things. I thought I would post this alternative here in case other readers do not want to configure another app for asynchronous processing.