Show management commands in admin site - django

Maybe I am just naïve in my expectations, but I cannot find any simple configuration or app that would allow running a Django project's management commands from the admin interface.
Surely allowing commands to be executed remotely without having to shell to the machine is a pretty common thing?
Do you always have to implement it yourself? If so, how do you add to the existing admin site without replacing it entirely?

Management commands are fundamentally different things than admin views. Management commands take arguments on the command line, can ask for user input interactively, etc. They can also run in situations where the web app can't run successfully (for example, running migrations). They can be long, expensive tasks. Or they can be things that you want to run outside of normal user/admin interaction, such as scheduled tasks from a cron job.
In a way, management commands operate at a lower level than the admin app.
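That said, if you do want to trigger a short-running command from the admin, Django's call_command utility can invoke one in-process, and a ModelAdmin can add a view to the existing admin site without replacing it. A minimal sketch, where the Report model and the cleanup_unused_media command are purely illustrative assumptions:

# admin.py -- minimal sketch: trigger a management command from the existing
# admin site by adding a view to a ModelAdmin. The model (Report) and the
# command name are illustrative, not part of any stock app.
from django.contrib import admin
from django.core.management import call_command
from django.http import HttpResponse
from django.urls import path

from myapp.models import Report


@admin.register(Report)
class ReportAdmin(admin.ModelAdmin):
    def get_urls(self):
        # Prepend our URL so the default catch-all patterns don't shadow it.
        custom = [path("run-cleanup/", self.admin_site.admin_view(self.run_cleanup))]
        return custom + super().get_urls()

    def run_cleanup(self, request):
        # call_command runs in-process and blocks the request, so this only
        # suits short-running commands.
        call_command("cleanup_unused_media", "--noinput")
        return HttpResponse("cleanup_unused_media finished")

Note this inherits all the caveats from the answer above: anything long-running, interactive, or migration-like still belongs on the command line or in a task queue.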

Related

What is the use case of Django management scripts?

I have heard about custom Django management scripts in a lot of communities, and I don't get why I should write a custom management script when Django's manage.py already provides a lot of useful commands. I am very new to Django...
Can anyone tell me some use cases for custom management scripts?
Thanks
The documentation on Writing custom django-admin commands mentions:
Applications can register their own actions with manage.py. For example, you might want to add a manage.py action for a Django app that you’re distributing
Usually those commands are used for "maintenance" activities. Or to give administrators more convenient tooling.
For example, you can use django-unused-media [GitHub]. This adds an extra management command that you can use like:
./manage.py cleanup_unused_media --noinput
This will then remove media files that are no longer referenced. You can then set up a cron job that, for example, calls that command each week to clean up media files. It can thus help you reduce disk space usage without manual intervention.
Another use case is to make management of your server more convenient. In fact, makemigrations [GitHub], migrate [GitHub], makemessages [GitHub] and runserver [GitHub] are management commands as well. So you probably use these commands all the time.
You put a certain amount of useful logic into these commands that you want to repeat several times. So instead of writing migration files by hand each time you change your models, the logic is implemented once in such a command, and the developer/administrator can then run a convenient script to do the work.
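To give an idea of how little code such a command takes, here is a minimal sketch; the app name myapp and the command's behavior are illustrative:

# myapp/management/commands/clearcache.py -- a minimal custom command sketch.
# Placing this file in an installed app makes "./manage.py clearcache" work.
from django.core.cache import cache
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Clears the default cache backend."

    def handle(self, *args, **options):
        cache.clear()
        self.stdout.write(self.style.SUCCESS("Cache cleared."))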

Deploying Django as a standalone internal app?

I'm developing a tool using Django for internal use at my organization. It's used to search and tag documents (using Haystack and Solr), and will be employed on different projects. My team currently has a working prototype and we want to deploy it 'in the wild.'
Our security environment is strict. Project documents are located in subfolders on a network drive, and access to these folders is restricted based on users' Windows credentials (we also have an MS SQL server that uses the same credentials). A user can only access the projects they are involved in. Since we're an exclusively Microsoft shop, if we want to deploy our app on the company intranet, we'll need to use an IIS server to deal with these permissions. No one on the team has the requisite knowledge to work with IIS or Active Directory, and our IT department is already over-extended. In short, we're not web developers and we don't have immediate access to anybody experienced.
My hacky solution is to forgo IIS entirely and have each end user run a lightweight server locally (namely, CherryPy) while each retaining access to a common project-specific database (e.g. a SQLite DB living on the network drive or a DB on the MS SQL server). In order to use the tool, they would just launch an all-in-one batch script and point their browser to 127.0.0.1:8000. I recognize how ugly this is, but I feel like it leverages the security measures already in place (note that we never expect more than 10 simultaneous users on a given project). Is this a terrible idea, and if so, what's a better solution?
I've dealt with a similar situation (primary development was geared toward a normal deployment situation, but some users have a requirement to use the application on a standalone workstation). Rather than deploy web and db servers on a standalone workstation, I just run the app with the Django internal development server and a SQLite DB. I didn't use CherryPy, but hopefully this is somewhat useful to you.
My current solution makes a nice executable for users not familiar with the command line (who also have trouble remembering the URL to put in their browser), while also keeping development relatively easy:
Use PyInstaller to package the Django app into a single executable. Once you figure this out, don't continue to do it by hand; add it to your continuous integration system (or at least write a script).
Modify manage.py to:
Detect whether the app is frozen by PyInstaller and was started with no arguments (i.e., the user executed it by double-clicking), and if so, run execute_from_command_line(..) with arguments that start the Django development server.
Right before running execute_from_command_line(..), spin off a thread that does a time.sleep(2) (to let the development server come up fully) and then calls webbrowser.open_new("http://127.0.0.1:8000"). (A sketch of this manage.py follows after these steps.)
Modify the app's settings.py to detect if frozen and change things around such as the path to the DB server, enabling the development server, etc.
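A minimal sketch of such a modified manage.py, assuming PyInstaller's sys.frozen attribute, a settings module named myproject.settings, and port 8000; all of these names are illustrative:

# manage.py -- sketch of the frozen-app detection described in the steps above.
import os
import sys
import threading
import time
import webbrowser


def open_browser():
    # Give the development server a moment to come up, then open a tab.
    time.sleep(2)
    webbrowser.open_new("http://127.0.0.1:8000")


if __name__ == "__main__":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
    from django.core.management import execute_from_command_line

    # PyInstaller sets sys.frozen on the bundled executable.
    if getattr(sys, "frozen", False) and len(sys.argv) == 1:
        # Frozen and double-clicked: start the dev server and open a browser.
        threading.Thread(target=open_browser).start()
        execute_from_command_line([sys.argv[0], "runserver", "--noreload", "127.0.0.1:8000"])
    else:
        execute_from_command_line(sys.argv)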
A couple of additional notes:
If you go with SQLite, Windows file locking on network shares may not be adequate if you have concurrent writes to the DB; concurrent readers should be fine. Additionally, since you'll have different DB files for different projects, you'll have to figure out a way for the user to indicate which file to use. Maybe prompt in the app, or build the same app multiple times with different settings.py files. There are a variety of ways to hit this nail...
If you go with MSSQL (or any client/server DB), the app will have to know the DB credentials (which means they could be extracted by a knowledgeable user). This presents a security risk that may not be acceptable. Basically, don't make the app that the user is executing the only layer of security: the DB credentials used by that app should grant only the access that the user is allowed.

Django custom management commands from admin interface

I asked a previous question about getting a Django command to run on a schedule. I got a solution for that question, but I still want my commands to run from the admin interface. The obstacle I'm hitting is that my custom management commands aren't being recognized once I get to the admin interface.
I traced this back to the __init__.py file of the django/core/management utility. There seems to be some strange behavior going on. When the server first comes up, a dictionary variable _commands is populated with the core commands (from django/core/management/commands). Custom management commands from all of the installed apps are also pushed into the _commands variable for an overall dictionary of all management commands.
Somehow, though, between when the server starts and when django-chronograph goes to run the job from the admin interface, the _commands variable loses the custom commands; the only commands left in the dictionary are the core commands. I'm not sure why this is. Could it be a path issue? Am I missing some setting? Is it a django-chronograph-specific problem? So forget scheduling: how might I run a custom management command from the Django admin graphical interface to prove that it can indeed be done? Or rather, how can I make sure that custom management commands are available from said interface?
I'm also using django-chronograph, and for me it works fine. I also once ran into the problem that my custom commands were not recognized by the auto-discovery feature. I think the first reason was that the custom command had an error in it, so it might be an idea to check whether your custom commands run without problems from the command line.
The second reason was indeed some strange path issue. I might check back with my hosting provider to provide you with a solution. Will post back to you in a few days...
I am the "unix guy" mentioned above by tom tom.
As far as I remember, there were some issues in the chronograph code itself, so it would be a good idea to use the code tom tom posted in the comments.
Where on the filesystem is django-chronograph stored (in your app folder, in an extra lib folder, or in your site-packages)?
When you have it in site-packages or another folder that is on your global PYTHONPATH, pathing should be no issue.
The cron process itself DOES NOT USE THE SAME PYTHONPATH as your Django app. Remember: you start the cron process via your crontab, right? So there are two different processes that do not "know" each other: the cron process AND the Django process (initialized by the webserver). So I would suggest calling the following script via crontab and exporting PYTHONPATH again:
#!/bin/bash
# give the cron-started process the same PYTHONPATH as the web app
PYTHONPATH=/path/to/libs:/path/to/project_root:/path/to/other/libs/used/in/project
export PYTHONPATH
python /path/to/project/manage.py cron
This way the cron-started process has the same PYTHONPATH information as your project.
Greetings from Vienna/Austria,
berni

Rather than using crontab, can Django execute something automatically at a predefined time?

How can I make Django execute something automatically at a particular time?
For example, my Django application has to FTP-upload files to remote servers at predefined times. The FTP server addresses, usernames, passwords, time, day and frequency have been defined in a Django model.
I want to run a file upload automatically based on the values stored in the model.
One way to do this is to write a Python script and add it to the crontab. This script would run every minute and keep an eye on the time values defined in the model.
The other thing I can roughly think of is Django signals. I'm not sure if they can handle this issue. Is there a way to generate signals at predefined times? (I haven't read in depth about them yet.)
Just for the record - there is also celery which allows to schedule messages for the future dispatch. It's, however, a different beast than cron, as it requires/uses RabbitMQ and is meant for message queues.
I have been thinking about this recently and have found django-cron which seems as though it would do what you want.
Edit: Also, if you are not specifically looking for a Django-based solution, I have recently used scheduler.py, a small single-file script which works well and is simple to use.
I've had really good experiences with django-chronograph.
You need to set up only one crontab task: calling the chronograph Python management command, which then runs other custom management commands based on an admin-tweakable schedule.
The problem you're describing is best solved using cron, not Django directly. Since it seems that you need to store data about your FTP uploads in your database (using Django to access it for logs, graphs, or whatever), you can write a Python script that uses Django and runs via cron.
James Bennett wrote a great article on how to do this which you can read in full here: http://www.b-list.org/weblog/2007/sep/22/standalone-django-scripts/
The main gist of it is that you can write standalone Django scripts that cron can launch and run periodically, and these scripts can fully utilize your Django database, models, and anything else they want. This gives you the flexibility to run whatever code you need and populate your database, while not trying to make Django do something it wasn't meant to do (Django is a web framework and is event-driven, not time-driven).
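The article predates it, but on current Django versions such a standalone script boils down to something like the following sketch; the FtpUpload model and its methods are hypothetical stand-ins for the model described in the question:

# run_uploads.py -- sketch of a standalone Django script meant to be cron'd.
import os

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

import django
django.setup()  # wire up the app registry before importing models

from myapp.models import FtpUpload  # hypothetical model from the question

for job in FtpUpload.objects.filter(enabled=True):
    job.upload()  # perform the transfer; details live on the model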
Best of luck!

How do you deploy cron jobs to production?

How do people deploy/version-control cron jobs to production? I'm more curious about the conventions/standards people use than any particular solution, but I happen to be using git for revision control, and the cron job runs a Python/Django script.
If you are using Fabric for deployment, you could add a function that edits your crontab.
def add_cronjob():
    # dump the current crontab, append a job, and load the result back
    run('crontab -l > /tmp/crondump')
    run('echo "@daily /path/to/dostuff.sh 2> /dev/null" >> /tmp/crondump')
    run('crontab /tmp/crondump')
This would append a job to your crontab (disclaimer: totally untested and not very idempotent):
Save the crontab to a tempfile.
Append a line to the tempfile.
Write the crontab back.
This is probably not exactly what you want to do, but along those lines you could think about checking the crontab into git and overwriting it on the server with every deploy (if there's a dedicated user for your project).
Using Fabric, I prefer to keep a pristine version of my crontab locally, that way I know exactly what is on production and can easily edit entries in addition to adding them.
The fabric script I use looks something like this (some code redacted e.g. taking care of backups):
def deploy_crontab():
    put('crontab', '/tmp/crontab')
    sudo('crontab < /tmp/crontab')
You can also take a look at:
http://django-fab-deploy.readthedocs.org/en/0.7.5/_modules/fab_deploy/crontab.html#crontab_update
The django-fab-deploy module has a number of convenient scripts, including crontab_set and crontab_update.
You can probably use something like CFEngine/Chef for deployment (it can deploy everything, including cron jobs).
However, if you are asking this question, it could be that you have many production servers, each running a large number of scheduled jobs.
If this is the case, you probably want a tool that can not only deploy jobs, but also track success and failure, let you easily look at logs from the last run and run statistics, and let you easily change the schedule for many jobs and servers at once (due to planned maintenance...), etc.
I use a commercial tool called "UC4". I don't really recommend it, so I hope you can find a better program that can solve the same problem. I'm just saying that administration of jobs doesn't end when you deploy them.
There are really three options for manually deploying a crontab if you cannot connect your system to a configuration management system like cfengine/puppet.
You could simply use crontab -u user -e, but you run the risk of someone making an error in their copy/paste.
You could also copy the file into the cron directory, but there is no syntax checking for the file, and on Linux you must run touch /var/spool/cron in order for crond to pick up the changes.
Note: everyone will forget the touch command at some point.
In my experience, the following method is my favorite manual way of deploying a crontab:
diff /var/spool/cron/<user> /var/tmp/<user>.new
crontab -u <user> /var/tmp/<user>.new
I think the method I mentioned above is the best because you don't run the risk of copy/paste errors, which helps you maintain consistency with your version-controlled file. It performs syntax checking of the cron tasks inside the file, and you won't need to run the touch command as you would if you simply copied the file.
Having your project under version control, including your crontab.txt, is what I prefer. Then, with Fabric, it is as simple as this:
@task
def crontab():
    run('crontab deployment/crontab.txt')
This will install the contents of deployment/crontab.txt into the crontab of the user you connect to the server as. If you don't have your complete project on the server, you'd want to put the crontab file there first.
If you're using Django, take a look at the jobs system from django-command-extensions.
The benefits are that you can keep your jobs inside your project structure, with version control, write everything in Python and configure crontab only once.
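For a taste, here is what a job looks like under that system. This is a hedged sketch based on the layout django-extensions documents; the app name, file name, and job body are illustrative. You cron a single "./manage.py runjobs daily" call and add jobs as files:

# myapp/jobs/daily/purge_sessions.py -- sketch of a daily job for
# django-extensions; all daily jobs run via "./manage.py runjobs daily".
from django_extensions.management.jobs import DailyJob


class Job(DailyJob):
    help = "Purge expired sessions."

    def execute(self):
        # Delegate to Django's built-in clearsessions command.
        from django.core.management import call_command
        call_command("clearsessions")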
I use Buildout to manage my Django projects. With Buildout, I use z3c.recipe.usercrontab to install cron jobs on deploy or update.
You said:
I'm more curious about conventions/standards people use than any particular solution
But, to be fair, the particular solution will depend on your environment, and there is no universal, elegant silver bullet. Given that you happen to be using Python/Django, I recommend Celery. It is an asynchronous task queue for Python which integrates nicely with Django. And on top of the features it offers as an asynchronous task queue, it also has specific features for periodic tasks.
I have personally used the django-celery-beat integration; it integrates perfectly with Django settings and behaves correctly in distributed environments. If your periodic tasks are related to Django stuff, I strongly recommend taking a look at Celery. I started using it only for certain asynchronous mailing and ended up using it for a lot of asynchronous tasks plus periodic sanity checks and other web application maintenance stuff.
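As a hedged sketch of what the periodic side looks like (the task name and schedule here are illustrative, not from a real project): define a task, then register a database-backed schedule with django-celery-beat's models:

# tasks.py -- an illustrative Celery task for django-celery-beat.
from celery import shared_task


@shared_task
def nightly_sanity_check():
    # Placeholder body: whatever periodic maintenance you need.
    pass


# At setup time (e.g. in a data migration), register a DB-backed schedule.
from django_celery_beat.models import CrontabSchedule, PeriodicTask

schedule, _ = CrontabSchedule.objects.get_or_create(
    minute="0", hour="3",
    day_of_week="*", day_of_month="*", month_of_year="*",
)
PeriodicTask.objects.get_or_create(
    crontab=schedule,
    name="Nightly sanity check",
    task="myapp.tasks.nightly_sanity_check",
)

Because the schedule lives in the database, admins can tweak it from the Django admin without touching crontab at all, which loops back nicely to the original question in this thread.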