Automate database update - Django

I have an application for tracking daily work hours. I'm looking for a way to run a script at a specific hour that checks the database and, if some values are None, adds a value to the database.
So my question is: do I need to do this with a separate script and a REST API, or would django-celery be okay for that?
Example:
start: 8:00
end: 12:00
daily: 4:00
But sometimes, if someone forgets to close their work, you need to change the value for some workers in PA, so I want to create a script that does this automatically every day whenever a value is None.

It seems that you need some kind of background mechanism. I suggest two approaches.
Django Celery. You can run periodic tasks with Celery.
A manually set up crontab. This seems like the easier option for your task: create a management command check_db.py and run it via crontab like this: 0 8,12,16 * * * python3 /path/to/your/app/manage.py check_db
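A minimal sketch of such a check_db command, assuming a hypothetical WorkDay model where start/end are DateTimeFields, daily is a DurationField, and forgotten shifts should be closed as a standard 4-hour day (the model, field names and the 4-hour policy are illustrative, not taken from the question):

# myapp/management/commands/check_db.py
from datetime import timedelta
from django.core.management.base import BaseCommand
from myapp.models import WorkDay  # hypothetical model

class Command(BaseCommand):
    help = "Fill in missing end/daily values for unfinished work days."

    def handle(self, *args, **options):
        # Assumed policy: a shift that was never closed gets the default 4 hours.
        for day in WorkDay.objects.filter(end__isnull=True):
            day.end = day.start + timedelta(hours=4)
            day.daily = timedelta(hours=4)
            day.save()
            self.stdout.write(f"Closed work day {day.pk}")

The crontab line above would then run this check three times a day.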

Related

Can I edit cronjobs using my Django application?

I am creating a Django application for a school project. I want to schedule jobs (every (work)day at 9:00 and 17:00).
I am trying to do it with Celery right now, but I got very stuck on it, and since the deadline is in sight, I want to use an alternative option: just a cron job. I think a plain cron job would work fine, but the user should be able to edit the times of the cron jobs through the Django web application (so without logging in over SSH and editing the crontab manually).
Is this possible? I can't find anything about it on the internet.
You need the django-celery-beat plugin. It adds new models to the Django admin, under "Periodic tasks", where you can manage the cron schedules for your tasks.
As an alternative, if you really do not want to run a background worker, you can create Django management commands and use a library like python-crontab to add/modify/remove cron jobs on the system.
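A rough sketch of the python-crontab approach, which a Django view could call whenever the user saves new times (the run_jobs management command, the paths and the comment label are placeholders, not from the answer above):

from crontab import CronTab

def set_schedule(hour_spec):
    # hour_spec is a cron hour field such as "9,17".
    cron = CronTab(user=True)                      # the current user's crontab
    cron.remove_all(comment='school-project-job')  # drop the previous entry, if any
    job = cron.new(
        command='/path/to/env/bin/python /path/to/project/manage.py run_jobs',
        comment='school-project-job',
    )
    job.setall(f'0 {hour_spec} * * 1-5')           # weekdays only
    cron.write()

Note that the user the web server runs as must own a crontab for this to work, which is worth double-checking on shared hosting.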

Django: how to automate functions inside the views.py file

Well, this question will surely delight the downvote brigade and may be tagged as "too broad" etc., but it is not! Precisely because it requires "general" knowledge of how things work, I cannot find an answer in the books I have, so I have to ask it.
With my Django application, yes, I can make it interactive by means of the MVC flow. The issue I have is with methods that are not connected to an HTML page (that the user sees) but are supposed to be running constantly in the background. For example, just to illustrate, imagine a code snippet that queries a DB and sends an email with news every 2 hours. It just isn't doing anything, because I don't know how to "wake that code snippet up".
I don't have that problem when I am writing a desktop application in plain Python without Django. If I right-click and say "run this file", the code will run in the background just fine.
Yes, naturally I have heard of cron jobs etc., but so far I see that you can crontab a file; how do I crontab a method inside views.py? Though I suppose that is not the way to go either. I am fine with you downvoting this, as long as I get an answer.
Thank you.
I've been using a combination of commands and cron jobs for that purpose.
Write your command, set up your cronjob:
30 3 * * * /home/ubuntu/project/env/bin/python /home/ubuntu/project/manage.py command_name
Profit.
If you want to execute periodic tasks in Django, then there are multiple options.
Using Crontab.
There are multiple Django crontab apps available (e.g. django-crontab).
You just need to add the cron entry to your settings file:
CRONJOBS = [
    ('*/5 * * * *', 'myapp.cron.my_scheduler')
]
cron.py
from myapp.views import send_email

def my_scheduler():
    # add your logic here
    send_email()
Using Celery Beat
I personally prefer Celery over crontab. You can check it here.
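For completeness, a minimal Celery Beat sketch; the project name proj and the task path myapp.tasks.send_email are placeholders, not something defined above:

# proj/celery.py
import os

from celery import Celery
from celery.schedules import crontab

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

# Run the hypothetical send_email task every two hours.
app.conf.beat_schedule = {
    'send-news-email': {
        'task': 'myapp.tasks.send_email',
        'schedule': crontab(minute=0, hour='*/2'),
    },
}

Start a worker with the beat scheduler (celery -A proj worker -B during development, a separate beat process in production) and the task runs on schedule without touching crontab at all.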

Django and services

I'm building a simple website with Django that requires constant monitoring of text-based data from another website; that's the way it has to be.
How could I run this service on my web host alongside Django? Would I have to start a separate app and run it via SSH so that it updates the database used by Django, or is there an easier/better way?
You could use celery to schedule a job that would read data from that other website and do whatever you want with it.
As an alternative to celery, you could also create a cron job that executes a custom django-admin command. That would give you full access to your django install and ORM. The downside is that cron's smallest time resolution is 1 minute, so if you need it to be real-time, you're not going to be able to do that.
If you do need realtime, then creating a python daemon might be a better option.
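To illustrate the Celery route from the first answer, here is a sketch of a task that polls the other site and stores the result through the ORM (the URL, the Snapshot model and the use of the requests library are assumptions for the example):

# myapp/tasks.py
import requests
from celery import shared_task

from myapp.models import Snapshot  # hypothetical model that stores the fetched text

@shared_task
def poll_remote_site():
    # Fetch the text-based data and persist it via the Django ORM.
    response = requests.get('https://example.com/data.txt', timeout=10)
    response.raise_for_status()
    Snapshot.objects.create(body=response.text)

The same function body could just as well live in a custom django-admin command if you go with the cron option instead.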

Rather than using crontab, can Django execute something automatically at a predefined time

How can I make Django execute something automatically at a particular time?
For example, my Django application has to upload files over FTP to remote servers at predefined times. The FTP server addresses, usernames, passwords, times, days and frequencies are defined in a Django model.
I want to run the file uploads automatically based on the values stored in the model.
One way to do this is to write a Python script and add it to the crontab. This script would run every minute and keep an eye on the time values defined in the model.
The other thing I can roughly think of is Django signals. I'm not sure whether they can handle this issue. Is there a way to generate signals at predefined times? (I haven't read in depth about them yet.)
Just for the record: there is also Celery, which allows you to schedule messages for future dispatch. It is, however, a different beast than cron, as it requires/uses a broker such as RabbitMQ and is meant for message queues.
I have been thinking about this recently and have found django-cron, which seems as though it would do what you want.
Edit: also, if you are not specifically looking for a Django-based solution, I have recently used scheduler.py, which is a small single-file script that works well and is simple to use.
I've had really good experiences with django-chronograph.
You only need to set up one crontab task: calling the chronograph Python management command, which then runs your other custom management commands based on an admin-tweakable schedule.
The problem you're describing is best solved using cron, not Django directly. Since it seems that you need to store data about your FTP uploads in your database (using Django to access it for logs, graphs or whatever), you can write a Python script that uses Django and is run via cron.
James Bennett wrote a great article on how to do this which you can read in full here: http://www.b-list.org/weblog/2007/sep/22/standalone-django-scripts/
The main gist of it is that you can write standalone Django scripts that cron can launch and run periodically, and these scripts can fully use your Django database, models, and anything else they need. This gives you the flexibility to run whatever code you want and populate your database, while not trying to make Django do something it wasn't meant to do (Django is a web framework and is event-driven, not time-driven).
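The article predates django.setup(), but on a modern Django the standalone-script idea boils down to something like this (the project name mysite, the uploads app and the FtpJob model with its methods are placeholders for whatever your model actually looks like):

# ftp_upload.py -- launched from cron, e.g. * * * * * /path/to/env/bin/python /path/to/ftp_upload.py
import os

import django

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')
django.setup()  # must run before any models are imported

from uploads.models import FtpJob  # hypothetical model holding server, credentials and schedule

def main():
    for job in FtpJob.objects.all():
        if job.is_due():      # hypothetical method comparing the stored time to now
            job.run_upload()  # hypothetical method doing the actual FTP transfer

if __name__ == '__main__':
    main()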
Best of luck!

How do you deploy cron jobs to production?

How do people deploy/version-control cron jobs to production? I'm more curious about the conventions/standards people use than about any particular solution, but I happen to be using git for revision control, and the cron job runs a Python/Django script.
If you are using Fabric for deployment, you could add a function that edits your crontab.
from fabric.api import run  # Fabric 1.x

def add_cronjob():
    run('crontab -l > /tmp/crondump')
    run('echo "@daily /path/to/dostuff.sh 2> /dev/null" >> /tmp/crondump')
    run('crontab /tmp/crondump')
This would append a job to your crontab (disclaimer: totally untested and not very idempotent).
Save the crontab to a tempfile.
Append a line to the tmpfile.
Write the crontab back.
This is probably not exactly what you want to do, but along those lines you could think about checking the crontab into git and overwriting it on the server with every deploy (if there's a dedicated user for your project).
Using Fabric, I prefer to keep a pristine version of my crontab locally; that way I know exactly what is on production and can easily edit entries in addition to adding them.
The Fabric script I use looks something like this (some code redacted, e.g. taking care of backups):
def deploy_crontab():
    put('crontab', '/tmp/crontab')
    sudo('crontab < /tmp/crontab')
You can also take a look at:
http://django-fab-deploy.readthedocs.org/en/0.7.5/_modules/fab_deploy/crontab.html#crontab_update
The django-fab-deploy module has a number of convenient scripts, including crontab_set and crontab_update.
You can probably use something like CFEngine/Chef for deployment (they can deploy everything, including cron jobs).
However, if you are asking this question, it could be that you have many production servers, each running a large number of scheduled jobs.
If this is the case, you probably want a tool that can not only deploy jobs, but also track success/failure, let you easily look at logs from the last run, collect run statistics, and let you easily change the schedule for many jobs and servers at once (due to planned maintenance...), etc.
I use a commercial tool called "UC4". I don't really recommend it, so I hope you can find a better program that can solve the same problem. I'm just saying that administration of jobs doesn't end when you deploy them.
There are really three options for manually deploying a crontab if you cannot connect your system to a configuration management system like CFEngine/Puppet.
You could simply use crontab -u user -e, but you run the risk of someone making an error in their copy/paste.
You could also copy the file into the cron directory, but there is no syntax checking of the file, and on Linux you must run touch /var/spool/cron for crond to pick up the changes.
Note: everyone will forget the touch command at some point.
In my experience, this is my favorite manual way of deploying a crontab:
diff /var/spool/cron/<user> /var/tmp/<user>.new
crontab -u <user> /var/tmp/<user>.new
I think the method above is the best because you don't run the risk of copy/paste errors, which helps you maintain consistency with your version-controlled file. It performs syntax checking of the cron tasks inside the file, and you won't need to run the touch command as you would if you simply copied the file.
Having your project under version control, including your crontab.txt, is what I prefer. Then, with Fabric, it is as simple as this:
from fabric.api import task, run  # Fabric 1.x

@task
def crontab():
    run('crontab deployment/crontab.txt')
This will install the contents of deployment/crontab.txt as the crontab of the user you connect to the server as. If you don't have your complete project on the server, you'd want to put the crontab file there first.
If you're using Django, take a look at the jobs system from django-command-extensions.
The benefits are that you can keep your jobs inside your project structure, with version control, write everything in Python and configure crontab only once.
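To make that concrete, a job in that system looks roughly like this; the jobs/daily/ module layout, the Job class name and the runjobs command are from my reading of the django-extensions jobs convention, so treat the details as an assumption and check the package docs:

# myapp/jobs/daily/send_news.py
from django_extensions.management.jobs import DailyJob

class Job(DailyJob):
    help = "Send the daily news email."

    def execute(self):
        # Full access to the ORM and the rest of the project lives here.
        pass

A single crontab entry such as 0 1 * * * /path/to/manage.py runjobs daily then drives every daily job in the project, which is the "configure crontab only once" part.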
I use Buildout to manage my Django projects. With Buildout, I use z3c.recipe.usercrontab to install cron jobs in deploy or update.
You said:
I'm more curious about conventions/standards people use than any particular solution
But, to be fair, the particular solution will depend on your environment, and there is no universal, elegant silver bullet. Given that you happen to be using Python/Django, I recommend Celery. It is an asynchronous task queue for Python which integrates nicely with Django. And, on top of the features it offers as an asynchronous task queue, it also has specific features for periodic tasks.
I have personally used the django-celery-beat integration; it integrates perfectly with Django settings and behaves correctly in distributed environments. If your periodic tasks are related to Django stuff, I strongly recommend taking a look at Celery. I started using it only for some asynchronous mailing and ended up using it for a lot of asynchronous tasks plus periodic sanity checks and other web application maintenance work.
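To give an idea of what the django-celery-beat route looks like, here is a sketch that stores a cron schedule in the database; the task path myapp.tasks.sanity_check and the schedule itself are placeholders:

# Assumes django-celery-beat is installed and its migrations have been applied.
from django_celery_beat.models import CrontabSchedule, PeriodicTask

schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='0', hour='9,17', day_of_week='1-5',
    day_of_month='*', month_of_year='*',
)
PeriodicTask.objects.update_or_create(
    name='twice-daily sanity check',  # the label shown in the admin
    defaults={'crontab': schedule, 'task': 'myapp.tasks.sanity_check'},
)

Because the schedules live in the database, they can also be edited later through the admin's "Periodic tasks" section mentioned in an earlier answer, with no redeploy needed.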