How to create per-project initial_data fixtures in Django 1.7+ - django

Before Django 1.7 I used to define a per-project fixtures directory in the settings:
FIXTURE_DIRS = ('myproject/fixtures',)
and use that to place my initial_data.json fixture storing the default groups essential for the whole project. This has been working well for me as I could keep the design clean by separating per-project data from app-specific data.
Now with Django 1.7, initial_data fixtures have been deprecated; the suggested replacement is to include data migrations alongside an app's schema migrations, leaving no obvious choice for global per-project initial data.
Moreover, the new migrations framework installs all legacy initial_data fixtures before executing migrations for the migration-enabled apps (including the django.contrib.auth app). This behavior causes my fixture containing default groups to fail to install, since the auth_group table is not present in the database yet.
Any suggestions on how to (elegantly) make fixtures run after all the migrations, or at least after the auth app migrations? Or any other ideas to solve this problem?
I find fixtures a great way of providing initial data and would like to have a simple and clean way of declaring them for automatic installation. The new RunPython is just too cumbersome, and I consider it overkill for most purposes; it also seems to be available only for per-app migrations.

If you absolutely want to use fixtures, just use RunPython and call_command in your data migrations.
from django.db import migrations
from django.core.management import call_command


def add_data(apps, schema_editor):
    # Note: loaddata operates on the current models, not the historical ones.
    call_command('loaddata', 'thefixture.json')


def remove_data(apps, schema_editor):
    # Caution: flush empties the entire database, not just this fixture's rows.
    call_command('flush')


class Migration(migrations.Migration):

    dependencies = [
        ('roundtable', '0001_initial'),
    ]

    operations = [
        migrations.RunPython(
            add_data,
            reverse_code=remove_data),
    ]
However, it is recommended to load data using Python code and the Django ORM instead, as that way you won't have to face integrity issues.
Source.

I recommend using factories instead of fixtures; fixtures are a mess and difficult to maintain. Better to use FactoryBoy with Django.

Related

Is it possible to dockerize Django flatpages?

I need to handle Django pages that don't have their own app, and the Django documentation recommends I use flatpages for one-off pages. However, these pages are stored in the database attached to Django, not the filesystem.
This is causing issues because I can't transfer data in the database the same way I can transfer files in the filesystem.
When I run the website through docker-compose, it starts with a fresh postgres image with no data. As a result, the flatpages are left behind anytime I re-run the docker container without migrating all of the data.
The only solution I can think of is simply copying over the flatpages manually every time I bring the website to a new machine. Unfortunately, this seems like a terrible solution because it adds more deployment steps and goes against the ideology of dockerizing websites.
Is there a better solution available?
The (semi-tested) solution I've found is to use a Django data migration, and run the command python manage.py migrate --noinput before starting the server via my docker-compose file.
The data migration itself makes use of the RunPython operation, which allows the creation of hard-coded flatpages through Python.
The rough draft of my migration looks like this:
from django.db import migrations

FLATPAGES = [
    {
        "title": "Home",
        "url": "/home/",
        "content": "",
        "enable_comments": False,
        "template_name": "home.html",
        "registration_required": False,
    },
]


def add_flatpages_data(apps, schema_editor):
    # Use the historical models instead of importing FlatPage and Site
    # directly, so the migration stays valid if those models change.
    Site = apps.get_model('sites', 'Site')
    FlatPage = apps.get_model('flatpages', 'FlatPage')
    site = Site.objects.create(domain='domain.here', name='name.here')
    for page_dict in FLATPAGES:
        new_page = FlatPage.objects.create(
            title=page_dict['title'],
            url=page_dict['url'],
            content=page_dict['content'],
            enable_comments=page_dict['enable_comments'],
            template_name=page_dict['template_name'],
            registration_required=page_dict['registration_required'],
        )
        new_page.sites.add(site)


class Migration(migrations.Migration):

    dependencies = [
        # The flatpages and sites tables must exist before this runs.
        ('flatpages', '0001_initial'),
        ('sites', '0001_initial'),
    ]

    operations = [
        migrations.RunPython(
            add_flatpages_data,
        ),
    ]
This isn't pretty, and I don't think it's good whatsoever, but it's the only solution I have found thus far.
As it turns out, Django's fixtures will not work for this application, as fixtures can only be used for models defined in an app created by the developer. The reason is that Django tries to read the fixture's model as "app.model", and thus will not accept the model "django.contrib.flatpages.FlatPage", despite it being just as valid.
I would still like improvements and alternate solutions from others, but I thought I should document my findings.

How to store third party apps migrations in django

I'm fairly new to python and django, and trying to build a simple calendar based on django-scheduler package.
According to django-scheduler docs, a custom base class can be used to add additional fields, managers and such.
So, I used an abstract model to add a new field:
# myproject/customer_calendar/models.py
from django.db import models
from main.models import Customer


class CalendarAbstract(models.Model):
    customer = models.OneToOneField(to=Customer, null=True, blank=True, related_name='calendar')

    class Meta:
        abstract = True
And added this to settings.py
SCHEDULER_BASE_CLASSES = {
    'Calendar': ['customer_calendar.models.CalendarAbstract'],
}
Now, if I use makemigrations command, a new migration is created inside scheduler app (which is located in site-packages of the current virtual env), which doesn't allow me to keep track of migrations via VCS.
I've found a couple of solutions:
1) Keep the whole scheduler app inside my project. According to SO it's considered bad practice, and third-party apps should always be retrieved via pip.
2) Use django setting to store all django-scheduler migrations inside my calendar app
MIGRATION_MODULES = {
    'schedule': 'customer_calendar.migrations',
}
The second one looks good to me, but I don't know if it's considered to be a valid solution to this problem.
Is there any other ways to store third-party apps migrations?
As also stated in this answer, FeinCMS docs recommend the use of MIGRATION_MODULES to monitor the migrations of FeinCMS as a third-party app.
FeinCMS itself does not come with any migrations. It is recommended that you add migrations for FeinCMS models yourself inside your project.
...
Create a new folder named migrate in your app with an empty __init__.py inside.
Add the following configuration to your settings.py:
MIGRATION_MODULES = {
    'page': 'yourapp.migrate.page',
    'medialibrary': 'yourapp.migrate.medialibrary',
}
You must not use migrations as the folder name for the FeinCMS migrations, otherwise Django will get confused.

Test Django models without running migration or syncdb

EDIT: The issue described below was caused not by bad workflow but by an apparent bug when loading fixtures. One of my apps had the fixture initial_data.json. The testing framework loads the fixture before performing the necessary migrations. (FWIW, I'm using Django 1.7 + python3.4) This issue is described here. (My workaround: rename the fixture to data.json.)
I'll leave the rest of the post intact in case this helps someone else in the future.
I'm trying to use Django's built-in tests to rapidly test my Django models during development. Unfortunately, when I try this, I get the error:
psycopg2.ProgrammingError: relation "app_relation" does not exist
The workflow I was imagining was
Define a few model fields (possibly across apps)
Test logic using Django tests
Fix logic errors, modify fields, and iterate this process.
This way, I can build my models incrementally without creating a large number of migrations. Migrations create headaches for me because I frequently add, remove, and rename fields or models as I'm validating my logic.
For example, my model has demographic fields, and I'm not sure whether I should keep an male_under_18 field or split it up into male_under_5, male_5_to_9, male_10_to_15, and male_16_to_18 granularity.
It sure would be nice to verify the decision using tests.py before making my migrations.
My understanding was that Django's manage.py test myapp created a database independent of the development database, and so doesn't require that my development database match the current schema defined by my models.
If the workflow above is impossible (or downright silly), I'm open to other ways to solving my problem.
Related question: django unit tests without a db. (Doesn't work because I want to test the DB!)

Django 1.7 - makemigrations creating migration for unmanaged model

I am creating some dynamic Django models in my application and everything seems to be working as expected except for the migration system.
If I create a dynamic Django model and set managed = False, Django's makemigrations command still generates a migration for that new model. The migration looks something like this:
class Migration(migrations.Migration):

    dependencies = [
        ('atom', '0001_initial'),
    ]

    operations = [
        migrations.CreateModel(
            name='books',
            fields=[
            ],
            options={
                'db_table': 'books',
                'managed': False,
            },
            bases=(models.Model,),
        ),
    ]
If I don't create the migration, when I run python manage.py migrate, I see the following message (in big scary red letters):
Your models have changes that are not yet reflected in a migration, and so won't be applied.
Run 'manage.py makemigrations' to make new migrations, and then re-run 'manage.py migrate' to apply them.
Is there a way to tell the migrations system in Django 1.7 to ignore unmanaged models altogether? Or perhaps a migrations = False setting in the Meta class of the models?
UPDATE: for clarification, I am using a method to create my dynamic models similar to the ones described in the following places:
http://dynamic-models.readthedocs.org/en/latest/topics/model.html#topics-model
https://code.djangoproject.com/wiki/DynamicModels
This method is great for generating my dynamic models based on information stored in my Configuration models (https://code.djangoproject.com/wiki/DynamicModels#Adatabase-drivenapproach). I did have to register a signal to clear the django model cache to catch changes to the models when a Configuration instance is changed, but everything seems to be working great, except for the fact that migrations are generated for these models. If I delete one of the configurations and the model is deleted from Django's cache, the migration would need to be updated again, removing the model that it shouldn't care about.
These dynamic models are not used in the application specifically. No where in the code do I refer to a books model (from the example above). They are generated at runtime and used to read information from the legacy tables they provide access to.
The short answer is that Django is not built for this. Making your model "unmanaged" only means Django will not create or delete the table for it -- nothing else.
That said, if you have no regular models alongside these dynamic models in the same app, you can conditionally add the app to INSTALLED_APPS in settings.py:
import sys

if not ('makemigrations' in sys.argv or 'migrate' in sys.argv):
    INSTALLED_APPS += (
        'app_with_dynamic_models',
        'another_app_with_dynamic_models',
    )
This should make Django ignore the app when creating and running migrations. However, you will eventually have to make and run migrations for the models if you want to use them, since the ability to have apps which do not use migrations is meant to go away in Django 1.9. Could your dynamic models be refactored to use the contenttypes framework?
I suggest you replace the generated migrations.CreateModel operation with one of your own that always reflects the actual model state. This way no state changes should ever be detected.
from django.db import migrations


class CreateDynamicModel(migrations.CreateModel):
    def __init__(self):
        # ... dynamically generate the name, fields, options and bases
        super(CreateDynamicModel, self).__init__(
            name=name, fields=fields, options=options, bases=bases
        )
You can probably write a custom database router with the allow_migrate method returning False for your dynamic models. The migrate command will disallow them in that case.
As long as you don't load these dynamic models in any models.py module, makemigrations shouldn't pick them up either.
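The router idea above can be sketched as follows (the router class name and the DYNAMIC_APPS set are placeholder assumptions; this uses the keyword-style allow_migrate signature introduced in Django 1.8, while in 1.7 the method receives (db, model) instead):

```python
# Hypothetical router; register it via DATABASE_ROUTERS in settings.py, e.g.
# DATABASE_ROUTERS = ['myproject.routers.DynamicModelRouter']
class DynamicModelRouter(object):
    # App labels whose models are generated at runtime (example value).
    DYNAMIC_APPS = {'atom'}

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label in self.DYNAMIC_APPS:
            # The migrate command will skip every model in these apps.
            return False
        # None means "no opinion", deferring to other routers.
        return None
```

Returning None rather than True for everything else keeps this router composable with any other routers in the project.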

Run south default data migrations for built-in applications such as auth.group or flatpages?

The issue at hand is that installing default data into the database for built-in Django applications such as flatpages or auth.group requires specifying SOUTH_MIGRATION_MODULES, although the South documentation isn't very clear about the directory structure and the way to generate such data migrations.
Any ideas how to do this?
It's not necessary to use South for this; you can use an initial fixture which is loaded automatically when you run syncdb:
https://docs.djangoproject.com/en/dev/howto/initial-data/#automatically-loading-initial-data-fixtures
The docs do note that from Django 1.7 onwards this is deprecated (in fact syncdb itself is deprecated) and you should make a data migration using the new built-in migrations functionality, which supersedes South:
https://docs.djangoproject.com/en/dev/topics/migrations/#data-migrations
If you want to use South migrations for say django.contrib.auth.models.Group then I think you can do it like this:
/projectroot/
/projectroot/myapp
/projectroot/myapp/migrations
/projectroot/myapp/migrations/__init__.py
/projectroot/myapp/migrations/auth/__init__.py
/projectroot/myapp/migrations/auth/0001_initial_groups.py # arbitrary name
(any structure that results in a sane Python dotted import path to your actual migration from your project code)
Then in settings:
SOUTH_MIGRATION_MODULES = {
    'auth': 'myapp.migrations.auth',
}