Django define procedure

I would like to define a few PostgreSQL stored procedures in Django for later use, but I cannot find any support in Django for this. I could do it with a separate Python script or a plain SQL file, but I would prefer to use Django's migration feature. Has anyone done this before?

I have never used stored procedures with Django migrations.
However, I am wondering whether it makes sense: in most cases a stored procedure does not change your database schema, so from my point of view you don't need a migration for it.
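That said, if you do want to keep the procedure definitions under version control alongside your schema, current Django lets a migration carry raw SQL via the RunSQL operation. A minimal sketch (the app name, function name and body below are made up for illustration):

# myapp/migrations/0002_account_balance_function.py -- illustrative sketch
from django.db import migrations

CREATE_SQL = """
CREATE OR REPLACE FUNCTION account_balance(acct integer)
RETURNS numeric AS $$
    SELECT COALESCE(SUM(amount), 0) FROM myapp_ledger WHERE account_id = acct;
$$ LANGUAGE sql;
"""

DROP_SQL = "DROP FUNCTION IF EXISTS account_balance(integer);"

class Migration(migrations.Migration):

    dependencies = [
        ('myapp', '0001_initial'),  # point at your app's latest migration
    ]

    operations = [
        # reverse_sql makes the migration cleanly reversible
        migrations.RunSQL(CREATE_SQL, reverse_sql=DROP_SQL),
    ]

Running manage.py migrate then installs the function like any other schema change, and migrating the app back removes it again.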

Related

Will I lose my data if I update Django version?

So I am developing a django project in an AWS virtual env. To use a package, I need a newer version of Django, but I already have a lot of important data stored in Django's database.
My question is: Will updating the Django version mid-development compromise the data I already have in the database?
I apologize if the question seems stupid, I just really don't want to mess anything up.
Thanks in advance
The database Django uses (e.g. PostgreSQL, MySQL...) is a separate piece of software, independent of Django itself; Django only interacts with it to write and read data.
Updating Django to a new version might break something in your code if it uses old Django features that have been removed, but it won't affect your database.
Nevertheless, it's always a good idea to backup everything before crucial updates.
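If you don't already have database-level backups (pg_dump, mysqldump, etc.), Django's own dumpdata/loaddata commands give you a quick application-level snapshot as an extra safety net; the file name here is just an example:

python manage.py dumpdata > backup.json    # serialize all app data to JSON
python manage.py loaddata backup.json      # restore it later if needed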

How to change values in database at a specific time

I'm trying to change values of a column in a table in a database at a specified time without having to do it manually. Is there a way to achieve this? If yes, wouldn't mind an example or something.
Thanks!! :)
P.S. I'm using Django with SQLite (only because it comes with Django as the default and I'm still learning Django).
There are two ways to do this:
Install a plugin for Django scheduled tasks: django-chronograph. It should be really simple to create your schedule.
You will have to specify the Linux command ./manage.py dbshell, then use your query as a parameter.
Command reference: django-admin
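Another common approach, not mentioned in the answer above: put the update in a custom management command and schedule that command with cron (or django-chronograph). A minimal sketch, assuming a hypothetical Item model with an on_sale flag:

# myapp/management/commands/start_sale.py -- hypothetical sketch
from django.core.management.base import BaseCommand

from myapp.models import Item  # hypothetical model


class Command(BaseCommand):
    help = "Flip the on_sale flag on all items."

    def handle(self, *args, **options):
        updated = Item.objects.filter(on_sale=False).update(on_sale=True)
        self.stdout.write("Updated %d rows" % updated)

A crontab entry such as 0 2 * * * /path/to/python /path/to/manage.py start_sale would then run it at 02:00 every day (the management/ and management/commands/ directories each need an __init__.py), and django-chronograph can schedule the same command from the admin.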

What is a good way to manage PL/pgSQL functions in a Django project?

In a Django project, I am using several custom PL/pgSQL functions. So far I have used migrations to add these to the database. However, I think this is not the best way to do it, especially if you need to make changes now and then. What do you consider the best way to organize your database functions in a Django environment?
Maybe as fixtures? Maybe as a custom handler for the post_syncdb signal?
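To make the signal idea concrete, here is a sketch (my own, not from the original thread): keep each function's source in a .sql file inside the app and reinstall them after every migrate run, so they also end up in freshly created test databases. In current Django the signal is post_migrate; the paths and app name are illustrative.

# myapp/apps.py -- illustrative sketch of the signal-handler approach
import os

from django.apps import AppConfig
from django.db import connection
from django.db.models.signals import post_migrate

SQL_DIR = os.path.join(os.path.dirname(__file__), "sql")  # e.g. myapp/sql/*.sql


def install_functions(sender, **kwargs):
    # (Re)create every PL/pgSQL function shipped with the app.
    with connection.cursor() as cursor:
        for name in sorted(os.listdir(SQL_DIR)):
            if name.endswith(".sql"):
                with open(os.path.join(SQL_DIR, name)) as handle:
                    cursor.execute(handle.read())


class MyAppConfig(AppConfig):
    name = "myapp"

    def ready(self):
        post_migrate.connect(install_functions, sender=self)

Because the files use CREATE OR REPLACE FUNCTION, re-running the handler is harmless, which keeps the functions easy to change now and then without writing a new migration each time.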

Is there a way to update the database with the changes in my models? [duplicate]

Possible Duplicate:
update django database to reflect changes in existing models
I've used Django in the past, and one of the frustrations I've had with it as an ORM tool is the inability to update an existing database with changes to the model. (Hibernate does this very well and makes it really easy to update and heavily modify a model and apply this to an existing database.) Is there a way to do this without wiping the database every time? It gets really old having to regenerate admin users and sites after every change to a model I'd like to play with.
You will want to look into South. It provides a migrations system to migrate both schema changes as well as data from one version to the next.
It's quite powerful, and the vast majority of changes can be handled simply by running
manage.py schemamigration --auto
manage.py migrate
The auto functionality does have its limits, and especially if the change is eventually going to be run on a production system, you should check the code --auto generated to be sure it's doing what you expect.
South has a great guide to getting started and is well documented. You can find it at http://south.aeracode.org
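For orientation, a hand-trimmed sketch of roughly what a South schema migration looks like (app, table and field names are invented, and a real auto-generated file also carries a frozen models dictionary):

# blog/migrations/0002_add_subtitle.py -- illustrative sketch only
from south.db import db
from south.v2 import SchemaMigration
from django.db import models


class Migration(SchemaMigration):

    def forwards(self, orm):
        # Add the new column to the existing table.
        db.add_column('blog_entry', 'subtitle',
                      models.CharField(max_length=200, default=''),
                      keep_default=False)

    def backwards(self, orm):
        # Drop the column again when rolling back.
        db.delete_column('blog_entry', 'subtitle')

schemamigration --auto writes files like this for you; the point of reviewing them is to check that forwards() and backwards() really do what you expect before they run against production.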
No.
As the documentation of syncdb command states:
Syncdb will not alter existing tables. syncdb will only create tables for models which have not yet been installed. It will never issue ALTER TABLE statements to match changes made to a model class after installation. Changes to model classes and database schemas often involve some form of ambiguity and, in those cases, Django would have to guess at the correct changes to make. There is a risk that critical data would be lost in the process.
If you have made changes to a model and wish to alter the database tables to match, use the sql command to display the new SQL structure and compare that to your existing table schema to work out the changes.
South seems to be how most people solve this problem, but a really quick and easy way is to change the database directly through its interactive shell. Just launch the shell (usually ./manage.py dbshell) and manually alter, add, or drop the fields and tables you need changed, using your database's syntax.
You may want to run manage.py sqlall appname to see the sql statements Django would run if it was creating the updated table, and then use those to alter the database tables and fields as required.
The Making Changes to a Database Schema section of the Django book has a few examples of how to do this: http://www.djangobook.com/en/1.0/chapter05/
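For example (app, model and column names are invented): suppose you add a nullable field to an existing model:

from django.db import models

class Article(models.Model):
    title = models.CharField(max_length=200)
    summary = models.TextField(null=True, blank=True)  # added after the table already exists

manage.py sqlall myapp will now show a summary column in its CREATE TABLE output, so the matching manual change in dbshell would be something like ALTER TABLE myapp_article ADD COLUMN summary text; for other engines, adjust the column type to whatever sqlall prints.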
I manually go into the database - whatever that may be for you: MySQL, PostgreSQL, etc. - to change database info, and then I adjust the models.py accordingly for reference. I know there is Django South, but I didn't want to bother with using another 3rd party application.

How do I test a Django site that uses UDFs in the database?

I have a Django project that uses a PostgreSQL database with a number of UDFs. The UDFs are written in plpythonu. Since plpythonu is an untrusted language, only database superusers can use it to create UDFs. This poses a serious problem in that I have no way of programmatically creating them within the test suite. Right now I see two options:
Modify django.db.backends.creation._create_test_db to create the test db from a template, which has my UDFs already loaded. This seems hacky and laborious to maintain.
Create a superuser with MD5 authentication and load the UDFs in the test suite with psycopg2. This seems insecure.
Are there less terrible ways I can do this?
Thanks.
I don't know the PG UDF model, only the MySQL and SQLite ones. A few other work-arounds might be:
Create a PG instance which you use just for testing, isolated so that potential security problems won't be a concern. Databases are cheap after all.
If the UDFs are simple (or the test data size makes them simple) then recreate them in SQLite and use that database for testing. This assumes that you don't need to test the plpython functionality as part of the project.
If the UDF functionality is much more stable than the rest of the project, then split the UDF code into its own subproject, and for the rest of the code assume the UDFs will be present. At the very least this will make it so most of the code can be automated, and only the UDF code needs manual intervention to specify the password.
I went with #1. It's not ideal but it works ok.
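For anyone hitting this later: newer Django versions expose option 1 without touching _create_test_db. The PostgreSQL backend can create the test database from an existing template via the TEST settings dictionary; the database and template names below are placeholders, and the template is assumed to already contain the plpythonu functions:

# settings.py -- sketch using the PostgreSQL-only TEST TEMPLATE option
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'myproject',
        'USER': 'myproject_user',
        'TEST': {
            'TEMPLATE': 'udf_template',  # template DB with the UDFs preloaded
        },
    },
}

The template still has to be created and kept up to date by a superuser, but the test suite itself no longer needs superuser rights.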