How to (intentionally) skip an app with Django syncdb

I have several Django applications:
INSTALLED_APPS = (
    'geonode.exposure',
    'geonode.isc_viewer',
    'geonode.geodetic',
    'geonode.observations',
    'geonode.ged4gem',
)
I need to manage all of them except one with syncdb.
How can I get syncdb to intentionally skip the geonode.exposure application?
Update:
I did not describe the full configuration; please allow me to go into more detail.
I am using South to manage db migrations and fixtures for all the apps except exposure.
The exposure app is accessing an external database and is using a router to do so (this is why I want it to be skipped by syncdb).
My router settings look like this:
class GedRouter(object):
    def db_for_read(self, model, **hints):
        "Point all operations on ged models to 'geddb'"
        if model._meta.app_label == 'exposure':
            return 'geddb'
        return 'default'

    def allow_syncdb(self, db, model):
        if db == 'geddb' or model._meta.app_label == "ged":
            return False  # we're not using syncdb on our legacy database
        else:  # but all other models/databases are fine
            return True
Is South not respecting the allow_syncdb method? Is South running syncdb on the exposure app because I do not have a migration for it?

You can use managed = False in the model's Meta class. This way, syncdb won't create the app's tables. More information in the documentation.
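For example, a minimal sketch (the model, field, and table names are illustrative):

from django.db import models

class Exposure(models.Model):
    name = models.CharField(max_length=100)

    class Meta:
        managed = False  # syncdb will neither create nor drop this table
        db_table = 'exposure'  # point at the existing external table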

There is a model Meta option "managed"; for more info check the Django documentation:
https://docs.djangoproject.com/en/dev/ref/models/options/#managed

OK, this is not what you're asking directly, but please consider using South: http://south.aeracode.org
You can decide which apps to include, which version of the model to migrate, etc. It sounds like the solution you need here.

Related

What does Django do when allow_migrate() on a database router returns False?

I'm writing a DB router to control what should or shouldn't be applied on each DB, so I implemented allow_migrate().
Here is what I did:
def allow_migrate(self, db, app_label, model_name=None, **hints):
    if app_label == 'django_q':
        return db == 'tasks'
    elif app_label != 'django_q':
        return db == 'default'
I was expecting that when a router returns False, the app wouldn't be applied to that database at all. That did happen, in a way: no tables were created, but the app's entire migration list was still added to the django_migrations table in that DB.
I was expecting that when allow_migrate() returns False, Django wouldn't add anything about this app to the current DB. If this behavior is the expected one, then isn't it the same as migrate --fake? If so, I'd prefer to do it manually for the other DB.
I have done some research to see what is actually done when allow_migrate() returns False, but had no luck.
Thanks in advance.

Django migrations with multiple databases and multiple apps

I have a Django project that has 2 apps, and each app runs on a different DB (let's call them default and DB2).
I have created a router to manage the data and added the router to my settings.py
When I run migrate I get Applying analytics.0001_initial... OK, but in the default DB the only thing that gets updated is the django_migrations table, showing that the app analytics has been migrated with 0001; in DB2 the table itself isn't created at all.
Going to the Django shell and trying obj.objects.all() gives me: table DB2.table_name does not exist.
I also verified in SQL that DB2 exists but doesn't have any of the tables I created.
My router:
class DB2Router(object):
    """
    A router for apps that connect directly to the DB2 database.

    This router allows apps to directly update the DB2 database rather
    than using raw SQL as in the past. The router forces the use of the DB2
    database for any app in the "apps" list.
    """
    db = "DB2"
    apps = ["analytics"]

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.apps:
            return self.db
        return None

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.apps:
            return self.db
        return None

    def allow_relation(self, obj1, obj2, **hints):
        # Allow any relation between two models that are both in the same app.
        if (obj1._meta.app_label in self.apps) and (obj2._meta.app_label in self.apps):
            return True
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label in self.apps:
            return db == self.db
        return None
My model:
class PartnerImpression(models.Model):
    """ used to track impressions on widgets through our partners """
    partner = models.CharField(max_length=1024)
    referer = models.CharField(default="N/A", max_length=2048)
    created = models.DateTimeField(auto_now_add=True, blank=True)
The migration:
class Migration(migrations.Migration):
    initial = True

    dependencies = []

    operations = [
        migrations.CreateModel(
            name='PartnerImpression',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('partner', models.CharField(max_length=1024)),
                ('referer', models.CharField(default='N/A', max_length=2048)),
                ('created', models.DateTimeField(auto_now_add=True)),
            ],
        ),
    ]
DB2 exists and has an empty django_migrations table.
Defining the database in the command (manage.py migrate --database=DB2) is not an option, since I run a CI process on dozens of servers and can't run this manually on all of them, so the command needs to stay without arguments.
I don't want any raw SQL in my migrations.
I have also found this: Django migrations with multiple databases. But I don't know if anything has changed since 2016, and I'm also hoping for a way to get it running without the --database option.
By default, migrate takes the default alias (DEFAULT_DB_ALIAS) as its --database value if none is provided, and hence applies all the migrations for that database. If you want to use what Django provides out of the box, you have to use --database, mentioning which database to operate on. If for some reason you can't use that (like you mentioned), you can create a custom management command, e.g. migrate_all, to apply all your migrations to all databases at once.
An example approach below, only showing what the handle method can look like; you can take inspiration from migrate:
migrate_all.py:
from importlib import import_module

from django.apps import apps
from django.core.management.base import BaseCommand
from django.db import connections
from django.utils.module_loading import module_has_submodule

class Command(BaseCommand):
    help = "Updates all database schemas."

    def add_arguments(self, parser):
        # Add arguments if you want
        pass

    def handle(self, *args, **options):
        # Import the 'management' module within each installed app, to register
        # dispatcher events.
        for app_config in apps.get_app_configs():
            if module_has_submodule(app_config.module, "management"):
                import_module('.management', app_config.name)
        # Iterate over and operate on each of the databases
        for alias in connections:
            connection = connections[alias]
            # https://github.com/django/django/blob/master/django/core/management/commands/migrate.py#L84
            connection.prepare_database()
            ...
After getting each connection, follow the operations from the migrate command for each.
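If all you need is to re-run the built-in migrate against every configured alias, a simpler sketch of such a command could just loop and delegate (the command name migrate_all is illustrative); your router's allow_migrate() still decides which apps land on which database:

from django.core.management import call_command
from django.core.management.base import BaseCommand
from django.db import connections

class Command(BaseCommand):
    help = "Applies migrations to every configured database."

    def handle(self, *args, **options):
        for alias in connections:
            self.stdout.write("Migrating database %r..." % alias)
            # allow_migrate() is consulted per app/model, so each alias
            # only receives the apps your router sends to it.
            call_command("migrate", database=alias, interactive=False)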

How can I ensure only 1 model in my Django app creates a migration?

Django 1.8. I wrote a new model and I want to create a migration for it. I don't want the other models in the app to be created by my migration, since they're proxies.
I tried making sure all the other models have class Meta: managed = False, but this didn't stop them from showing up in my migration file.
In my DB router, I tried to make use of allow_migrate, but again, all the models showed up as "Created" in my migration file.
def allow_migrate(self, db, app_label, model_name=None, **hints):
    if db == 'a123admin_rw' and app_label == 'article' and model_name == 'articlestat':
        return True
    elif db == 'a123admin_rw':
        return False
    return None
What should I be doing to ensure only my model gets a migration when I run makemigrations?
Thanks to the comment that was posted here, I worked through it. Here's my final report: http://learnedandhacked.blogspot.ca/2016/02/a-story-of-migrating-new-model-in-app.html

Migrating existing auth.User data to new Django 1.5 custom user model?

I'd prefer not to destroy all the users on my site. But I want to take advantage of Django 1.5's custom pluggable user model. Here's my new user model:
class SiteUser(AbstractUser):
    site = models.ForeignKey(Site, null=True)
Everything works with my new model on a new install (I've got other code, along with a good reason for doing this, all of which is irrelevant here). But if I put this on my live site and run syncdb & migrate, I'll lose all my users, or at least they'll end up in a different, orphaned table than the new table created for my new model.
I'm familiar with South, but based on this post and some trials on my part, it seems its data migrations are not currently a fit for this specific migration. So I'm looking for some way to either make South work for this, or for some non-South migration (raw SQL, dumpdata/loaddata, or otherwise) that I can run on each of my servers (Postgres 9.2) to migrate the users once the new table has been created, while the old auth.User table is still in the database.
South is more than able to do this migration for you, but you need to be smart and do it in stages. Here's the step-by-step guide (this guide presupposes you subclass AbstractUser, not AbstractBaseUser):
Before making the switch, make sure that South support is enabled in the application that contains your custom user model (for the sake of the guide, we'll call it accounts and the model User).
At this point you should not yet have a custom user model.
$ ./manage.py schemamigration accounts --initial
Creating migrations directory at 'accounts/migrations'...
Creating __init__.py in 'accounts/migrations'...
Created 0001_initial.py.
$ ./manage.py migrate accounts [--fake if you've already syncdb'd this app]
Running migrations for accounts:
- Migrating forwards to 0001_initial.
> accounts:0001_initial
- Loading initial data for accounts.
Create a new, blank user migration in the accounts app.
$ ./manage.py schemamigration accounts --empty switch_to_custom_user
Created 0002_switch_to_custom_user.py.
Create your custom User model in the accounts app, but make sure it is defined as:
class SiteUser(AbstractUser): pass
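Spelled out with its import, that models.py would look something like this sketch (at this stage the model intentionally adds nothing):

# accounts/models.py
from django.contrib.auth.models import AbstractUser

class SiteUser(AbstractUser):
    pass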
Fill in the blank migration with the following code.
# encoding: utf-8
from south.db import db
from south.v2 import SchemaMigration

class Migration(SchemaMigration):

    def forwards(self, orm):
        # Fill in the destination name with the table name of your model
        db.rename_table('auth_user', 'accounts_user')
        db.rename_table('auth_user_groups', 'accounts_user_groups')
        db.rename_table('auth_user_user_permissions', 'accounts_user_user_permissions')

    def backwards(self, orm):
        db.rename_table('accounts_user', 'auth_user')
        db.rename_table('accounts_user_groups', 'auth_user_groups')
        db.rename_table('accounts_user_user_permissions', 'auth_user_user_permissions')

    models = { ....... }  # Leave this alone
Run the migration
$ ./manage.py migrate accounts
- Migrating forwards to 0002_switch_to_custom_user.
> accounts:0002_switch_to_custom_user
- Loading initial data for accounts.
Make any changes to your user model now.
# settings.py
AUTH_USER_MODEL = 'accounts.User'

# accounts/models.py
class SiteUser(AbstractUser):
    site = models.ForeignKey(Site, null=True)
Create and run migrations for this change:
$ ./manage.py schemamigration accounts --auto
+ Added field site on accounts.User
Created 0003_auto__add_field_user_site.py.
$ ./manage.py migrate accounts
- Migrating forwards to 0003_auto__add_field_user_site.
> accounts:0003_auto__add_field_user_site
- Loading initial data for accounts.
Honestly, if you already have good knowledge of your setup and already use South, it should be as simple as adding the following migration to your accounts module.
# encoding: utf-8
from south.db import db
from south.v2 import SchemaMigration
from django.db import models

class Migration(SchemaMigration):

    def forwards(self, orm):
        # Fill in the destination name with the table name of your model
        db.rename_table('auth_user', 'accounts_user')
        db.rename_table('auth_user_groups', 'accounts_user_groups')
        db.rename_table('auth_user_user_permissions', 'accounts_user_user_permissions')
        # == YOUR CUSTOM COLUMNS ==
        db.add_column('accounts_user', 'site_id',
                      models.ForeignKey(orm['sites.Site'], null=True, blank=False))

    def backwards(self, orm):
        db.rename_table('accounts_user', 'auth_user')
        db.rename_table('accounts_user_groups', 'auth_user_groups')
        db.rename_table('accounts_user_user_permissions', 'auth_user_user_permissions')
        # == YOUR CUSTOM COLUMNS ==
        db.remove_column('accounts_user', 'site_id')

    models = { ....... }  # Leave this alone
EDIT 2/5/13: added rename for the auth_user_groups table. FKs will auto-update to point at the correct table due to db constraints, but M2M fields' table names are generated from the names of the 2 end tables and will need manual updating in this manner.
EDIT 2: Thanks to @Tuttle & @pix0r for the corrections.
My incredibly lazy way of doing this:
Create a new model (User), extending AbstractUser. Within the new model, in its Meta, override db_table and set it to 'auth_user' (see the sketch after this list).
Create an initial migration using South.
Migrate, but fake the migration, using --fake when running migrate.
Add new fields, create migration, run it normally.
This is beyond lazy, but it works. You now have a 1.5-compliant User model, which just uses the old table of users. You also have a proper migration history.
You can fix this later on with manual migrations to rename the table.
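A minimal sketch of the first step (the app and class names are illustrative):

# accounts/models.py
from django.contrib.auth.models import AbstractUser

class User(AbstractUser):
    class Meta:
        db_table = 'auth_user'  # keep reading and writing the existing table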
I think you've correctly identified that a migration framework like South is the right way to go here. Assuming you're using South, you should be able to use the Data Migrations functionality to port the old users to your new model.
Specifically, I would add a forwards method to copy all rows in your user table to the new table. Something along the lines of:
def forwards(self, orm):
    for user in orm.User.objects.all():
        new_user = SiteUser(<initialize your properties here>)
        new_user.save()
You could also use the bulk_create method to speed things up.
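A sketch of that variant, assuming both models are frozen in the migration and showing only a few of auth.User's fields:

def forwards(self, orm):
    orm['accounts.SiteUser'].objects.bulk_create([
        orm['accounts.SiteUser'](
            username=user.username,
            email=user.email,
            password=user.password,  # already hashed, copy verbatim
            # ...copy the remaining auth.User fields as needed
        )
        for user in orm['auth.User'].objects.all()
    ])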
I got tired of struggling with South, so I actually ended up doing this differently, and it worked out nicely for my particular situation:
First, I made it work with ./manage.py dumpdata, fixing up the dump, and then ./manage.py loaddata, which worked. Then I realized I could do basically the same thing with a single, self-contained script that only loads the necessary Django settings and does the serialization/deserialization directly.
Self-contained Python script
## userconverter.py ##
import json

from django.conf import settings

settings.configure(
    DATABASES={
        # copy DATABASES configuration from your settings file here, or
        # import it directly from your settings file (but not from
        # django.conf.settings) or use dj_database_url
    },
    # because my custom user implicates contrib.sites (which is why it's
    # in INSTALLED_APPS too)
    SITE_ID=1,
    INSTALLED_APPS=['django.contrib.sites', 'django.contrib.auth', 'myapp'],
)

# some things you have to import after you configure the settings
from django.core import serializers
from django.contrib.auth.models import User

# this isn't optimized for huge amounts of data -- use streaming
# techniques rather than loads/dumps if that is your case
old_users = json.loads(serializers.serialize('json', User.objects.all()))

for user in old_users:
    user['pk'] = None
    user['model'] = "myapp.siteuser"
    user['fields']["site"] = settings.SITE_ID

for new_user in serializers.deserialize('json', json.dumps(old_users)):
    new_user.save()
With dumpdata/loaddata
I did the following:
1) ./manage.py dumpdata auth.User
2) Script to convert the auth.user data to the new user model (or just manually search and replace in your favorite text editor, or grep). Mine looked something like:
def convert_user_dump(filename, site_id):
    file = open(filename, 'r')
    contents = file.read()
    file.close()
    user_list = json.loads(contents)
    for user in user_list:
        user['pk'] = None  # it will auto-increment
        user['model'] = "myapp.siteuser"
        user['fields']["site"] = site_id
    contents = json.dumps(user_list)
    file = open(filename, 'w')
    file.write(contents)
    file.close()
3) ./manage.py loaddata filename
4) set AUTH_USER_MODEL
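For step 4, assuming the model name used in the script above, that would be:

# settings.py
AUTH_USER_MODEL = 'myapp.SiteUser'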
*Side Note: One critical part of doing this type of migration, regardless of which technique you use (South, serialization/modification/deserialization, or otherwise), is that as soon as you set AUTH_USER_MODEL to your custom model in the current settings, Django cuts you off from auth.User, even if the table still exists.*
We decided to switch to a custom user model in our Django 1.6/Django-CMS 3 project, perhaps a little bit late, because we had data in our database that we didn't want to lose (some CMS pages, etc.).
After we switched AUTH_USER_MODEL to our custom model, we had a lot of problems that we hadn't anticipated, because a lot of other tables had foreign keys to the old auth_user table, which wasn't deleted. So although things appeared to work on the surface, a lot of things broke underneath: publishing pages, adding images to pages, adding users, etc., because they all tried to create an entry in a table that still had a foreign key to auth_user, without actually inserting a matching record into auth_user.
We found a quick and dirty way to rebuild all the tables and relations, and copy our old data across (except for users):
do a full backup of your database with mysqldump
do another backup with no CREATE TABLE statements, and excluding a few tables that won't exist after the rebuild, or will be populated by syncdb --migrate on a fresh database:
south_migrationhistory
auth_user
auth_user_groups
auth_user_user_permissions
auth_permission
django_content_types
django_site
any other tables that belong to apps that you removed from your project (you might only find this out by experimenting)
drop the database
recreate the database (e.g. manage.py syncdb --migrate)
create a dump of the empty database (to make it faster to go round this loop again)
attempt to load the data dump that you created above
if it fails to load because of a duplicate primary key or a missing table, then:
edit the dump with a text editor
remove the statements that lock, dump and unlock that table
reload the empty database dump
try to load the data dump again
repeat until the data dump loads without errors
The commands that we ran (for MySQL) were:
mysqldump <database> > ~/full-backup.sql
mysqldump <database> \
--no-create-info \
--ignore-table=<database>.south_migrationhistory \
--ignore-table=<database>.auth_user \
--ignore-table=<database>.auth_user_groups \
--ignore-table=<database>.auth_user_user_permissions \
--ignore-table=<database>.auth_permission \
--ignore-table=<database>.django_content_types \
--ignore-table=<database>.django_site \
> ~/data-backup.sql
./manage.py sqlclear
./manage.py syncdb --migrate
mysqldump <database> > ~/empty-database.sql
./manage.py dbshell < ~/data-backup.sql
(edit ~/data-backup.sql to remove data dumped from a table that no longer exists)
./manage.py dbshell < ~/empty-database.sql
./manage.py dbshell < ~/data-backup.sql
(repeat until clean)

How to load fixtures in Django south migrations properly?

I am using Django 1.5b1 with South migrations, and life has generally been great. I have some schema updates which create my database, with a User table among others. I then load a fixture for ff.User (my custom user model):
def forwards(self, orm):
    from django.core.management import call_command
    fixture_path = "/absolute/path/to/my/fixture/load_initial_users.json"
    call_command("loaddata", fixture_path)
All was working great until I added another field to my ff.User model, much further down the migration line. My fixture load now breaks:
DatabaseError: Problem installing fixture 'C:\<redacted>create_users.json':
Could not load ff.User(pk=1): (1054, "Unknown column 'timezone_id' in 'field list'")
Timezone is the field (a ForeignKey) which I added to my user model.
The ff.User model differs from what is in the database, so the Django ORM gives up with a DB error. Unfortunately, I cannot specify my model in my fixture as orm['ff.User'], which seems to be the South way of doing things.
How should I load fixtures properly using South so that they do not break once the models they are for get modified?
I found a Django snippet that does the job!
https://djangosnippets.org/snippets/2897/
It loads the data according to the models frozen in the fixture rather than the actual model definition in your app's code! Works perfectly for me.
I proposed a solution that might interest you too:
https://stackoverflow.com/a/21631815/797941
Basically, this is how I load my fixture:
from south.v2 import DataMigration
import json

class Migration(DataMigration):

    def forwards(self, orm):
        json_data = open("path/to/your/fixture.json")
        items = json.load(json_data)
        for item in items:
            # Be careful, this lazy line won't resolve foreign keys
            obj = orm[item["model"]](**item["fields"])
            obj.save()
        json_data.close()
This was a frustrating part of using fixtures for me as well. My solution was to make a few helper tools. One creates fixtures by sampling data from a database and includes the South migration history in the fixtures.
There's also a tool to add South migration history to existing fixtures.
The third tool checks out the commit when this fixture was modified, loads the fixture, then checks out the most recent commit and does a south migration and dumps the migrated db back to the fixture. This is done in a separate database so your default db doesn't get stomped on.
The first two can be considered beta code; please treat the third as usable alpha. But they're already being quite helpful to me.
Would love to get some feedback from others:
git@github.com:JivanAmara/django_fixture_tools.git
Currently, it only supports projects using git as the RCS.
The most elegant solution I've found is here, whereby your app's get_model function is switched out to instead supply the model from the supplied orm. It's then set back after the fixture is applied.
from django.db import models
from django.core.management import call_command

def load_fixture(file_name, orm):
    original_get_model = models.get_model

    def get_model_southern_style(*args):
        try:
            return orm['.'.join(args)]
        except:
            return original_get_model(*args)

    models.get_model = get_model_southern_style
    call_command('loaddata', file_name)
    models.get_model = original_get_model
You call it with load_fixture('my_fixture.json', orm) from within your forwards definition.
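Putting it together, a minimal sketch of a data migration using that helper (the fixture name is illustrative):

from south.v2 import DataMigration

class Migration(DataMigration):

    def forwards(self, orm):
        load_fixture('my_fixture.json', orm)

    def backwards(self, orm):
        pass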
Generally, South handles migrations using forwards() and backwards() functions. In your case you should either:
alter the fixtures to contain proper data, or
import the fixture before the migration that breaks it (or within the same migration, but before altering the schema).
In the second case, before the migration that adds (or, as in your case, removes) the column, you should perform a migration that explicitly loads the fixtures, similarly to this (docs):
def forwards(self, orm):
    from django.core.management import call_command
    call_command("loaddata", "create_users.json")
I believe this is the easiest way to accomplish what you need. Also make sure you don't make simple mistakes, like trying to import data with the new structure before applying the older migrations.
Reading the following two posts has helped me come up with a solution:
http://andrewingram.net/2012/dec/common-pitfalls-django-south/#be-careful-with-fixtures
http://news.ycombinator.com/item?id=4872596
Specifically, I rewrote my data migrations to use the output from 'dumpscript'.
I needed to modify the resulting script a bit to work with South. Instead of doing
from ff.models import User
I do
User = orm['ff.User']
This works exactly like I wanted it to. Additionally, it has the benefit of not hard-coding IDs, like fixtures require.
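As a sketch of the resulting shape (the generated object below is illustrative, not real dumpscript output):

def forwards(self, orm):
    # dumpscript emits plain Python that recreates objects; swapping the
    # import for South's frozen orm[] keeps the script valid across
    # schema changes.
    User = orm['ff.User']
    user = User(username='alice', email='alice@example.com')
    user.save()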