Migrate after removing ForeignKey to third-party model - django

I'm in the process of removing some dead code in a project and I have the opportunity to remove a dependency on a third-party app that we've been using since we started the project. One of our models has a ForeignKey to a model in the third-party app and I'm running into trouble when trying to apply migrations on a fresh instance of the project.
An example model:
from django.db import models
from thirdparty.models import ThirdPartyModel
class MyModel(models.Model):
    fk = models.ForeignKey(ThirdPartyModel)
Removing MyModel.fk is detected by South and the migration is created successfully. Applying the migration and rolling it back works too. I can now remove thirdparty from INSTALLED_APPS and commit the changes (the new migration and settings.py).
The trouble starts when I clone the repository on another machine. ./manage.py syncdb runs as expected and creates all the tables not managed by South, but ./manage.py migrate myapp fails when creating the table for (an early version of) MyModel: the foreign key to thirdparty_thirdpartymodel cannot be created, which makes sense because thirdparty is no longer in INSTALLED_APPS, so its tables are never created.
Are there standard ways to handle removing external dependencies? Is this an appropriate time to reset my migrations?

This is an old question, but it is still relevant; the issue is independent of South and would also arise with Django's built-in migrations.
You should take care that the migration files are split in such a way that you can fake the migrations that depend on the now-missing app (the one removed from INSTALLED_APPS). New installations are then created by faking those migrations, while existing installations actually run them.
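For illustration, a minimal sketch of that workflow (the migration name 0003_remove_fk is hypothetical); both South and Django's built-in migrations support a --fake flag for exactly this:

# existing installations: run the new migration for real
./manage.py migrate myapp

# fresh installations: mark the migrations that depend on the removed app
# as applied without executing them, then run whatever follows normally
./manage.py migrate myapp 0003_remove_fk --fake
./manage.py migrate myapp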
Of course, if you have the option to start over (for example, at a complete relaunch), you could wipe your database, remove all existing migration files and simply create completely new migrations. All other developers would have to drop their databases as well.
If you have existing production data and still want to start from scratch, there are several ways to transfer the data.
Which way is best depends on how much data there is, how much the structure has changed, and so on:
plain SQL (after running the new migrations, change the database by hand: copy the data from the old tables into the new ones, then drop the old tables, foreign keys etc.)
fixtures (dump the data via Django from the old system and edit the JSON to fit the new structure; see the sketch after this list)
two parallel installations of the old and new system and transfer via Django/Python scripts (slower than plain SQL but you can make use of the Django Model logic, apply validation checks, transformations etc. in a more comfortable way).
Of course, don't do this in production but somewhere else and simply apply the result.
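For the fixtures option above, a rough sketch of the round trip (app and file names are only placeholders): dump the data from the old installation, edit the JSON to match the new structure, then load it into the freshly migrated database.

# on the old installation
./manage.py dumpdata myapp --indent 2 > myapp.json

# edit myapp.json to fit the new models, then on the new installation
./manage.py loaddata myapp.json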

Related

Best way to update Doctrine migrations tables

Overview
I have an application that uses Doctrine migrations. The application is a single code base used to manage multiple businesses, each of which runs the same code in its own environment, i.e. each deployment is identical apart from configuration.
Mistakes we made
One of the mistakes the dev team made was to include core migrations in the customer folders, i.e. with each new application we have to drop in the migration files or run a migrations:diff to get going, which I feel is inefficient and can lead to a mess.
What I would prefer is to have core migrations as part of the core code since it rarely changes and custom migrations on the client app side.
What I want to do
I want to move all core structure migrations to our code files
I want to have a single custom migration to drop in customised data in the client migration folder.
The problem
The problem I face is pretty much how to reorganize the migrations without breaking databases on existing client applications.
I have thought of two solutions:
Solution 1:
Add blank migrations as a placeholder for the new migrations I want.
Commit these to the repo and deploy to our environments.
They will be run, nothing will be changed, and the migrations table will record them as having been executed.
Next, update the blank migrations with the actual code I want and empty all other migration files. Commit this to the environments.
Finally, remove the unwanted migration files and the unwanted database migration records.
Solution 2:
Change the migration location in the db to a new location
Remove all migration files and add blank migrations for the new ones I want
Commit this to the repo, let it run, and let the migrations be recorded as executed in the new table.
Add migration code.
Now all new applications will have the updated migration files and the old apps will have the new migration files...
Question:
Am I re-inventing the wheel? Is there a standard way to do this? I am certain I am not the first to bump into this problem.
So for anyone who finds themselves in a similar position and needs to tidy up a mess of Doctrine migrations, this should serve as a simple pattern to follow.
In our development environment we use continuous integration, git, Kubernetes, etc., and the following process works well for us.
The steps:
Update the migrations table name; you can do this quite easily in the configuration:
'table_storage' => [
    'table_name' => 'migration_version',
    'version_column_name' => 'version_timestamp',
],
Next, delete your old migration files and run migrations:diff to generate a single new migration that combines all of your changes.
Now comment out the code in the new file so that it's essentially an empty migration file.
Locally, delete the old migrations table and run your build process, which will record the new migration in the new table.
Commit to develop/staging/live etc. and repeat the process.
Now that the database in every environment has the new migration recorded, you can uncomment the code; it will not be executed when you deploy the file, since the migration already exists in your migrations table.
Hope this helps someone!

Techniques to Avoid Problems with Django Migrations?

I'm building an e-commerce website with Django 1.8 and PostgreSQL 9.4. I'm interested in learning what techniques I can use when I change my database to avoid having problems with Django migrations, particularly in the event that I can't get migrations to run and I have to delete my migrations, rebuild my database, and restore from backups.
In development, I've found that when I change my database schema and re-run migrations, they only run successfully about 50% of the time. Clearly I'm doing some things wrong. What's worse is that when migrations don't work, it's not always easy to understand exactly why they failed and how to modify my migration files so that they will run. In those situations, I always have to delete the database and start over. This is acceptable in development but it's not a good strategy when I go into production.
What are some "best practices" or "do's and don'ts" you follow when you modify your model classes/database schema so as to increase the probability that your Django migrations will run? And are there any steps you take to ensure that you can restore your database in the event that your migrations won't run and you have to rebuild the database from scratch? I should add that I'm a one-person startup so I don't have the conflict issues that a team working from the same code base would have.
These are the techniques I'm using:
Work locally in the same environment as the server: the same version of Django and of the database server. Commit the migration files themselves (don't ignore them in version control) and migrate on the server using those same migrations.
Another approach I have used once is migrating manually: I created the tables, indices and relations with SQL commands by hand, and that worked properly too.
I prefer the first approach.

Recreate the tables for a single Django 1.7 app

Many moons ago I used commands like ./manage.py reset appname to DROP and then recreate the database tables for a single App. This was handy for when other developers had inadvertently but manually broken something in the database and you wanted to reset things back without affecting other apps (or needing to go through a lengthy dump/load process).
The advent of Django 1.7 and its builtin migrations support seems to have removed and renamed a lot of these commands and I'm going crosseyed with all the shared prefixes in the documentation. Can somebody spell this out for me?
How do I reset the tables for a single application (one with migrations)?
If your Django migration subsystem is not broken in itself, the normal way to reset an app is to run manage.py migrate <app> zero.
This will run all of the app's migrations backwards, so a few things are noteworthy:
if some of the app's migrations are not reversible, the process will fail. This should not normally happen, as Django only generates reversible migrations, but you can build irreversible ones yourself - usually when you write data migrations (a reversible data migration is sketched at the end of this answer).
if some other app has a dependency on this app, it will also be migrated backwards up to the last migration that did not depend on it.
You can then run migrate again, so it is run forwards.
In any case, remember that migrations introduce a risk for your data, so back up your database before touching anything.
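On the irreversibility point mentioned above: a data migration only runs backwards if its RunPython operation is given a reverse function. A minimal sketch, with hypothetical app, model and field names:

from django.db import migrations


def forwards(apps, schema_editor):
    # Use the historical model, not a direct import of the current one.
    Book = apps.get_model('library', 'Book')
    Book.objects.filter(subtitle__isnull=True).update(subtitle='')


def backwards(apps, schema_editor):
    # Undo the change so "migrate library zero" can run all the way back.
    Book = apps.get_model('library', 'Book')
    Book.objects.filter(subtitle='').update(subtitle=None)


class Migration(migrations.Migration):

    dependencies = [('library', '0002_book_subtitle')]

    operations = [migrations.RunPython(forwards, backwards)]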

shared DB across django projects

Our product has a RESTful API and a server-rendered app (the CMS). Both share the same database, and both are written in Django.
The fields and models needed by the two are not mutually exclusive: some are specific to the API, some to the CMS, and some are common to both.
My question: if I run migrations in one of the repos, will they try to drop the fields that aren't present in that repo's models but are needed by the other? Will running the migrations individually in both repos keep the database up to date without causing problems?
The only other valid option IMHO (besides merging the projects) is turning off automatic Django migrations for the common models (Meta.managed = False) and taking table creation and versioning into your own hands. You can still write migration scripts using django.db.migrations, but the makemigrations command won't generate schema changes for these tables.
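A minimal sketch of such an unmanaged shared model (the model and table names are made up); migrate will neither create nor alter this table, so its schema stays entirely in your own hands:

from django.db import models


class SharedCustomer(models.Model):
    """Shared table whose schema is versioned outside Django migrations."""
    email = models.EmailField()
    created_at = models.DateTimeField()

    class Meta:
        managed = False               # Django never creates or alters this table
        db_table = 'shared_customer'  # both projects point at the same table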
This was solved by using a schema migration tool external to Django's own. We use yoyo migrations to migrate our schema now.
Will running the migrations individually in both repos keep the database up to date and not pose a problem.
Unfortunately, no. As you suspected, changes in one will attempt to override the other.
The easiest thing to do is merge the two projects into one so this problem goes away entirely.
If this isn't an option, can the code be organised in such a way that both projects share the same models.py files? You could do this by perhaps having the models.py files and migrations folders only exist in one project. The second project could have a symlink across to each models.py file it uses. The trick (and the difficult part) will be to make sure you never create migrations for the app which uses the symlinks.
I think the best thing to do would be to have one repo that contains all the fields. That project will be responsible for applying the migrations.
In the other projects, you'll need a database router with an allow_migrate method that returns False for those models.
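A minimal router sketch for the secondary projects (the module path and app labels are placeholders, and the allow_migrate signature shown is the Django 1.8+ one):

# settings.py of the secondary project
DATABASE_ROUTERS = ['myproject.routers.NoMigrationsRouter']

# myproject/routers.py
class NoMigrationsRouter(object):
    """Read and write the shared tables, but never migrate them from here."""

    SHARED_APPS = {'catalog', 'accounts'}  # apps whose models are owned elsewhere

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label in self.SHARED_APPS:
            return False   # block migrate for the shared models in this project
        return None        # no opinion; fall back to the default behaviour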
Using different database users with different permissions can also prevent the other projects from altering the tables.

Is there a way to update the database with the changes in my models? [duplicate]

Possible Duplicate:
update django database to reflect changes in existing models
I've used Django in the past, and one of the frustrations I've had with it as an ORM tool is the inability to update an existing database with changes to the models. (Hibernate does this very well and makes it really easy to update and heavily modify a model and apply the changes to an existing database.) Is there a way to do this without wiping the database every time? It gets really old having to regenerate admin users and sites after every change to a model I'd like to play with.
You will want to look into South. It provides a migrations system to migrate both schema changes as well as data from one version to the next.
It's quite powerful, and the vast majority of changes can be handled simply by running:
manage.py schemamigration appname --auto
manage.py migrate appname
The auto functionality does have its limits, and especially if the change is eventually going to be run on a production system, you should check the code that --auto generated to be sure it does what you expect.
South has a great guide to getting started and is well documented. You can find it at http://south.aeracode.org
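If the app already has tables that syncdb created before South was introduced, South first needs an initial migration recorded without being executed; it ships a shortcut command for that (the app name is again only a placeholder):

manage.py convert_to_south appname
# which is roughly equivalent to:
manage.py schemamigration appname --initial
manage.py migrate appname 0001 --fake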
No.
As the documentation of the syncdb command states:
Syncdb will not alter existing tables

syncdb will only create tables for models which have not yet been installed. It will never issue ALTER TABLE statements to match changes made to a model class after installation. Changes to model classes and database schemas often involve some form of ambiguity and, in those cases, Django would have to guess at the correct changes to make. There is a risk that critical data would be lost in the process.

If you have made changes to a model and wish to alter the database tables to match, use the sql command to display the new SQL structure and compare that to your existing table schema to work out the changes.
South seems to be how most people solve this problem, but a really quick and easy way to do it is to change the database directly through your database's interactive shell. Just launch your db shell (usually just manage.py dbshell) and manually alter, add, or drop the fields and tables you need to change, using your database's syntax.
You may want to run manage.py sqlall appname to see the SQL statements Django would run if it were creating the updated tables, and then use those to alter the database tables and fields as required.
The Making Changes to a Database Schema section of the Django book has a few examples of how to do this: http://www.djangobook.com/en/1.0/chapter05/
I manually go into the database - whatever that may be for you: MySQL, PostgreSQL, etc. - to make the changes, and then I adjust models.py accordingly for reference. I know there is Django South, but I didn't want to bother with using another third-party application.