I am using Django models to create a PostgreSQL database. I have a DateTimeField where I would like to set the current timestamp as the default value. I am aware of multiple sources suggesting how to do this. However, when I inspect my DB outside of Django, the default timestamp doesn't show up.
Approaches I have tried:
1.
created_at = models.DateTimeField(auto_now_add=True)
2.
from django.db.models.functions import Now
...
created_at = models.DateTimeField(default=Now())
3.
from django.utils.timezone import now
...
created_at = models.DateTimeField(default=now)
What I would expect is for the PostgreSQL database to show:
TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP
What it shows instead is timestamp with time zone not null, but no default value. Any ideas on how to achieve this would be greatly appreciated.
However, when I inspect my DB outside of Django, the default timestamp doesn't show up.
That is correct: Django itself manages default values, not the database. Django likewise handles ON DELETE behavior in Python, not with database triggers. This gives more flexibility. For example, you can pass a callable as the default, like you did with default=now. This callable can perform sophisticated work to determine the default value, such as making extra queries, API calls, or file I/O. None of that is possible with a database-side default.
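For illustration, a minimal sketch of such a callable default (the function and model names here are my own, not from the question):

from django.db import models
from django.utils import timezone

def default_created_at():
    # Django calls this on each insert that omits the field;
    # arbitrary Python can run here.
    return timezone.now()

class Article(models.Model):  # hypothetical model
    created_at = models.DateTimeField(default=default_created_at)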
You can make a migration file and manually alter the table. You can initialize an empty migration with:
python manage.py makemigrations --empty app_name
Next, you can edit the file it has generated and specify a default with:
# Generated by Django 3.1 on 2020-12-16 17:14
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('app_name', 'migration_name'),
    ]

    operations = [
        migrations.RunSQL(
            'ALTER TABLE table_name ALTER COLUMN created_at SET DEFAULT CURRENT_TIMESTAMP;'
        ),
    ]
The advantage of doing this is that Django manages the migration: any database that has not yet been migrated will pick up the change and thus gain the default value.
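If you want the migration to be reversible, RunSQL also accepts a reverse_sql argument; a minimal sketch, with the table name assumed as above:

migrations.RunSQL(
    sql='ALTER TABLE table_name ALTER COLUMN created_at SET DEFAULT CURRENT_TIMESTAMP;',
    reverse_sql='ALTER TABLE table_name ALTER COLUMN created_at DROP DEFAULT;',
)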
Related
I'm trying to rename a model, and I would like to write the migration in such a way that it doesn't depend on the old name still being present while it is applied. Can I somehow get data from a database table that no longer has a model in my migration code?
Details:
I have a Region model that I want to move into a more generic GeoObject model and remove from models.py. If I write migration code that creates GeoObjects from existing Regions with from models import Region, I'll have to keep the Region model until my main database has migrated. But I'd like to write the migration so that it doesn't depend on the Region model being present; it should just check that the database table exists and use it. Is that possible using Django's own tools, and without depending on a specific database type if possible?
Yes, you can.
But first of all, you really shouldn't import any models inside a migration.
Take a look at the RunPython operation, which allows you to run arbitrary Python code inside your migration. RunPython passes two parameters to your function: apps and schema_editor. The first parameter contains the structure of your models as of that stage in the migration history, so if the actual removal of the model happens later than that migration, you can still access the model through the apps registry passed into the function.
Let's say your model looked like this:
class SomeModel(models.Model):
    some_field = models.CharField(max_length=32)
Now when you delete that model, the automatically created migration will contain:
class Migration(migrations.Migration):

    dependencies = [
        ('yourapp', '0001_initial'),  # or any other dependencies
    ]

    operations = [
        migrations.DeleteModel(
            name='SomeModel',
        ),
    ]
You can modify that migration by injecting RunPython just above DeleteModel operation:
operations = [
    migrations.RunPython(
        move_data_to_other_model,
        move_data_back,  # for the backwards migration - if you won't ever want to undo this migration, just don't pass that function at all
    ),
    migrations.DeleteModel(
        name='SomeModel',
    ),
]
and by creating two functions above the Migration class:
def move_data_to_other_model(apps, schema_editor):
    SomeModel = apps.get_model('yourapp', 'SomeModel')
    OtherModel = apps.get_model('yourapp', 'OtherModel')
    for something in SomeModel.objects.all():
        # do your data migration here
        o = OtherModel.objects.get(condition=True)
        o.other_field = something.some_field
        o.save()

def move_data_back(apps, schema_editor):
    SomeModel = apps.get_model('yourapp', 'SomeModel')
    OtherModel = apps.get_model('yourapp', 'OtherModel')
    for something in OtherModel.objects.all():
        # move back your data here
        SomeModel(
            some_field=something.other_field,
        ).save()
It doesn't matter that your model is no longer defined in models.py; Django can rebuild that model from the migration history. But remember: the save method from your models (and other customized methods) won't be called in migrations, and no pre_save or post_save signals will be triggered either.
I am trying to create a simple blog using Django.
At first, I created the database with the command
python manage.py syncdb
When I try to save a blog post, I get the following error:
DatabaseError: table blog_app_post has no column named body
models.py code :
from django.db import models
from taggit.managers import TaggableManager
class Post(models.Model):
    title = models.CharField(max_length=255)
    body = models.TextField()
    created = models.DateTimeField()
    tags = TaggableManager()

    def __unicode__(self):
        return self.title
But the column named body is actually created in the DB:
BEGIN;
CREATE TABLE "blog_app_post" (
"id" integer NOT NULL PRIMARY KEY,
"title" varchar(255) NOT NULL,
"body" text NOT NULL,
"created" datetime NOT NULL
)
What does this error mean, and can anyone propose a solution for it?
This is probably because you changed the structure of your Post model after the table was created. What you need to do now is drop the old table and recreate it with the new schema.
You can avoid problems like this by using migration managers like south.
So, in order to solve this, run manage.py sql <app_name>, then copy the latest SQL table definition from the output (the first one printed). Then run manage.py dbshell and simply paste and run the SQL (sketched below).
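For example, with the app name taken from the error message (note that dropping the table discards any existing posts):

$ python manage.py sql blog_app
BEGIN;
CREATE TABLE "blog_app_post" (...)
$ python manage.py dbshell
sqlite> DROP TABLE blog_app_post;
sqlite> -- paste the CREATE TABLE statement printed above and run it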
How do you know that it's created? Are you checking it using python manage.py sqlall?
Did you add the body field after running syncdb initially? In that case you will have to use a migration, as sketched below.
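For example, with South installed and added to INSTALLED_APPS, the usual sequence is something like this (the convert_to_south step is only needed the first time you bring an existing app under migration control):

$ python manage.py convert_to_south blog_app
$ python manage.py schemamigration blog_app --auto
$ python manage.py migrate blog_app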
I use Django and South for my database. Now I want to add a new Model and a field in an existing model, referencing the new model. For example:
class NewModel(models.Model):
# a new model
# ...
class ExistingModel(models.Model):
# ... existing fields
new_field = models.ForeignKey(NewModel) # adding this now
Now South obviously complains that I added a non-null field and asks me to enter a one-off value. But what I really want is to create a new NewModel instance for every existing ExistingModel instance, thus fulfilling the database requirements. Is that possible somehow?
The easiest way to do this is to write a schema migration that makes the column change, and then write a data migration to correctly fill in the value. Depending on the database you're using, you'll have to do this in slightly different ways.
SQLite
For SQLite, you can add a sentinel value for the relation and use a data migration to fill it in without any issue:
0001_schema_migration_add_foreign_key_to_new_model_from_existing_model.py
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
class Migration(SchemaMigration):

    def forwards(self, orm):
        db.add_column('existing_model_table', 'new_model',
                      self.gf('django.db.models.fields.related.ForeignKey')(default=0, to=orm['appname.new_model']),
                      keep_default=False)
0002_data_migration_for_new_model.py:
from south.v2 import DataMigration

class Migration(DataMigration):

    def forwards(self, orm):
        for m in orm['appname.existing_model'].objects.all():
            m.new_model = ...  # custom criteria here
            m.save()
This will work just fine, with no issues.
Postgres and MySQL
With MySQL, you have to give it a valid default. If 0 isn't actually a valid foreign key, you'll get errors telling you so.
You could default to 1, but there are instances where that isn't a valid foreign key either (it happened to me because we have different environments, and some environments publish to other databases, so the IDs rarely match up; we use UUIDs for cross-database identification, as God intended).
The second issue you get is that South and MySQL don't play well together. Partially because MySQL doesn't have the concept of DDL transactions.
In order to get around some issues you will inevitably face (including the error I mentioned above, and South asking you to mark orm items in a SchemaMigration as no-dry-run), you need to change the 0001 script above as follows:
0001_schema_migration_add_foreign_key_to_new_model_from_existing_model.py
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
class Migration(SchemaMigration):

    def forwards(self, orm):
        id = 0
        if not db.dry_run:
            new_model = orm['appname.new_model'].objects.all()[0]
            if new_model:
                id = new_model.id
        db.add_column('existing_model_table', 'new_model',
                      self.gf('django.db.models.fields.related.ForeignKey')(default=id, to=orm['appname.new_model']),
                      keep_default=False)
And then you can run the 0002_data_migration_for_new_model.py file as normal.
I advise using the second example above for both Postgres and MySQL. I don't remember any issues offhand with Postgres and the first example, but I'm certain the second example works for both (tested).
You want a data migration to supplement your schema migration in this scenario.
South has a nice step-by-step tutorial on how to achieve this in its documentation.
It's not uncommon in South to have the desired outcome spread over two or three schema/data migrations, as it's not always possible to do it in one big hit (it sometimes depends on whether the underlying db will tolerate adding a non-null column with no default). So in this case you might add a schema migration that has a default, then a data migration with your object manipulation, then a final schema migration - as sketched below.
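A sketch of that three-step sequence as South commands (app and migration names are placeholders):

$ ./manage.py schemamigration appname --auto                 # 0002: adds the column with a temporary default
$ ./manage.py datamigration appname populate_new_field      # 0003: fill in the real values
$ ./manage.py schemamigration appname --auto                 # 0004: after adjusting the model, drops the temporary default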
I'd prefer not to destroy all the users on my site. But I want to take advantage of Django 1.5's custom pluggable user model. Here's my new user model:
class SiteUser(AbstractUser):
    site = models.ForeignKey(Site, null=True)
Everything works with my new model on a new install (I've got other code, along with a good reason for doing this--all of which are irrelevant here). But if I put this on my live site and syncdb & migrate, I'll lose all my users or at least they'll be in a different, orphaned table than the new table created for my new model.
I'm familiar with South, but based on this post and some trials on my part, it seems its data migrations are not currently a fit for this specific migration. So I'm looking for some way to either make South work for this or for some non-South migration (raw SQL, dumpdata/loaddata, or otherwise) that I can run on each of my servers (Postgres 9.2) to migrate the users once the new table has been created while the old auth.User table is still in the database.
South is more than able to do this migration for you, but you need to be smart and do it in stages. Here's the step-by-step guide. (This guide presupposes that you subclass AbstractUser, not AbstractBaseUser.)
Before making the switch, make sure that south support is enabled in the application
that contains your custom user model (for the sake of the guide, we'll call it accounts and the model User).
At this point you should not yet have a custom user model.
$ ./manage.py schemamigration accounts --initial
Creating migrations directory at 'accounts/migrations'...
Creating __init__.py in 'accounts/migrations'...
Created 0001_initial.py.
$ ./manage.py migrate accounts [--fake if you've already syncdb'd this app]
Running migrations for accounts:
- Migrating forwards to 0001_initial.
> accounts:0001_initial
- Loading initial data for accounts.
Create a new, blank user migration in the accounts app.
$ ./manage.py schemamigration accounts --empty switch_to_custom_user
Created 0002_switch_to_custom_user.py.
Create your custom User model in the accounts app, but make sure it is defined as:
class User(AbstractUser): pass
Fill in the blank migration with the following code.
# encoding: utf-8
from south.db import db
from south.v2 import SchemaMigration
class Migration(SchemaMigration):

    def forwards(self, orm):
        # Fill in the destination name with the table name of your model
        db.rename_table('auth_user', 'accounts_user')
        db.rename_table('auth_user_groups', 'accounts_user_groups')
        db.rename_table('auth_user_user_permissions', 'accounts_user_user_permissions')

    def backwards(self, orm):
        db.rename_table('accounts_user', 'auth_user')
        db.rename_table('accounts_user_groups', 'auth_user_groups')
        db.rename_table('accounts_user_user_permissions', 'auth_user_user_permissions')

    models = { ....... }  # Leave this alone
Run the migration
$ ./manage.py migrate accounts
- Migrating forwards to 0002_switch_to_custom_user.
> accounts:0002_switch_to_custom_user
- Loading initial data for accounts.
Make any changes to your user model now.
# settings.py
AUTH_USER_MODEL = 'accounts.User'
# accounts/models.py
class User(AbstractUser):
    site = models.ForeignKey(Site, null=True)
Create and run migrations for this change:
$ ./manage.py schemamigration accounts --auto
+ Added field site on accounts.User
Created 0003_auto__add_field_user_site.py.
$ ./manage.py migrate accounts
- Migrating forwards to 0003_auto__add_field_user_site.
> accounts:0003_auto__add_field_user_site
- Loading initial data for accounts.
Honestly, if you already have good knowledge of your setup and already use South, it should be as simple as adding the following migration to your accounts module.
# encoding: utf-8
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
class Migration(SchemaMigration):

    def forwards(self, orm):
        # Fill in the destination name with the table name of your model
        db.rename_table('auth_user', 'accounts_user')
        db.rename_table('auth_user_groups', 'accounts_user_groups')
        db.rename_table('auth_user_user_permissions', 'accounts_user_user_permissions')
        # == YOUR CUSTOM COLUMNS ==
        db.add_column('accounts_user', 'site_id',
                      models.ForeignKey(orm['sites.Site'], null=True, blank=False))

    def backwards(self, orm):
        db.rename_table('accounts_user', 'auth_user')
        db.rename_table('accounts_user_groups', 'auth_user_groups')
        db.rename_table('accounts_user_user_permissions', 'auth_user_user_permissions')
        # == YOUR CUSTOM COLUMNS ==
        db.remove_column('accounts_user', 'site_id')

    models = { ....... }  # Leave this alone
EDIT 2/5/13: added rename for auth_user_group table. FKs will auto update to point at the correct table due to db constraints, but M2M fields' table names are generated from the names of the 2 end tables and will need manual updating in this manner.
EDIT 2: Thanks to @Tuttle & @pix0r for the corrections.
My incredibly lazy way of doing this:
Create a new model (User), extending AbstractUser. Within the new model, in its Meta, override db_table and set it to 'auth_user' (see the sketch after this list).
Create an initial migration using South.
Migrate, but fake the migration, using --fake when running migrate.
Add new fields, create migration, run it normally.
This is beyond lazy, but works. You now have a 1.5 compliant User model, which just uses the old table of users. You also have a proper migration history.
You can fix this later on with manual migrations to rename the table.
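A minimal sketch of that first step, assuming the model lives in an accounts app:

from django.contrib.auth.models import AbstractUser

class User(AbstractUser):
    class Meta:
        db_table = 'auth_user'  # reuse the existing table instead of creating a new one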
I think you've correctly identified that a migration framework like South is the right way to go here. Assuming you're using South, you should be able to use the Data Migrations functionality to port the old users to your new model.
Specifically, I would add a forwards method to copy all rows in your user table to the new table. Something along the lines of:
def forwards(self, orm):
    # both models must be frozen in this migration; the old user model
    # lives in the auth app, the new one in your own app
    SiteUser = orm['accounts.SiteUser']
    for user in orm['auth.User'].objects.all():
        new_user = SiteUser(<initialize your properties here>)
        new_user.save()
You could also use the bulk_create method to speed things up.
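For example, a hedged sketch of the same forwards method using bulk_create (it assumes both models are frozen in the migration, e.g. created with South's --freeze auth flag; the copied fields are illustrative):

def forwards(self, orm):
    SiteUser = orm['accounts.SiteUser']
    SiteUser.objects.bulk_create([
        SiteUser(
            username=u.username,
            email=u.email,
            password=u.password,  # password hashes copy over unchanged
            site_id=1,            # illustrative default site
        )
        for u in orm['auth.User'].objects.all()
    ])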
I got tired of struggling with South so I actually ended up doing this differently and it worked out nicely for my particular situation:
First, I made it work with ./manage.py dumpdata, fixing up the dump, and then ./manage.py loaddata, which worked. Then I realized I could do basically the same thing with a single, self-contained script that loads only the necessary Django settings and does the serialization/deserialization directly.
Self-contained python script
## userconverter.py ##
import json
from django.conf import settings
settings.configure(
    DATABASES={
        # copy the DATABASES configuration from your settings file here, or import it
        # directly from your settings file (but not from django.conf.settings), or use dj_database_url
    },
    SITE_ID=1,  # because my custom user implicates contrib.sites (which is why it's in INSTALLED_APPS too)
    INSTALLED_APPS=['django.contrib.sites', 'django.contrib.auth', 'myapp'],
)
# some things you have to import after you configure the settings
from django.core import serializers
from django.contrib.auth.models import User
# this isn't optimized for huge amounts of data -- use streaming techniques rather than loads/dumps if that is your case
old_users = json.loads(serializers.serialize('json', User.objects.all()))
for user in old_users:
    user['pk'] = None
    user['model'] = "myapp.siteuser"
    user['fields']["site"] = settings.SITE_ID

for new_user in serializers.deserialize('json', json.dumps(old_users)):
    new_user.save()
With dumpdata/loaddata
I did the following:
1) ./manage.py dumpdata auth.User
2) A script to convert the auth.user data to the new user model (or just manually search and replace in your favorite text editor, or use grep). Mine looked something like:
def convert_user_dump(filename, site_id):
    file = open(filename, 'r')
    contents = file.read()
    file.close()
    user_list = json.loads(contents)
    for user in user_list:
        user['pk'] = None  # it will auto-increment
        user['model'] = "myapp.siteuser"
        user['fields']["site"] = site_id
    contents = json.dumps(user_list)
    file = open(filename, 'w')
    file.write(contents)
    file.close()
3) ./manage.py loaddata filename
4) set AUTH_USER_MODEL
Side note: one critical part of doing this type of migration, regardless of which technique you use (South, serialization/modification/deserialization, or otherwise), is that as soon as you set AUTH_USER_MODEL to your custom model in the current settings, Django cuts you off from auth.User, even if the table still exists.
We decided to switch to a custom user model in our Django 1.6/Django-CMS 3 project, perhaps a little bit late because we had data in our database that we didn't want to lose (some CMS pages, etc).
After we switched AUTH_USER_MODEL to our custom model, we had a lot of problems that we hadn't anticipated, because a lot of other tables had foreign keys to the old auth_user table, which wasn't deleted. So although things appeared to work on the surface, a lot of things broke underneath: publishing pages, adding images to pages, adding users, etc. because they tried to create an entry in a table that still had a foreign key to auth_user, without actually inserting a matching record into auth_user.
We found a quick and dirty way to rebuild all the tables and relations, and copy our old data across (except for users):
do a full backup of your database with mysqldump
do another backup with no CREATE TABLE statements, and excluding a few tables that won't exist after the rebuild, or will be populated by syncdb --migrate on a fresh database:
south_migrationhistory
auth_user
auth_user_groups
auth_user_user_permissions
auth_permission
django_content_types
django_site
any other tables that belong to apps that you removed from your project (you might only find this out by experimenting)
drop the database
recreate the database (e.g. manage.py syncdb --migrate)
create a dump of the empty database (to make it faster to go round this loop again)
attempt to load the data dump that you created above
if it fails to load because of a duplicate primary key or a missing table, then:
edit the dump with a text editor
remove the statements that lock, dump and unlock that table
reload the empty database dump
try to load the data dump again
repeat until the data dump loads without errors
The commands that we ran (for MySQL) were:
mysqldump <database> > ~/full-backup.sql
mysqldump <database> \
--no-create-info \
--ignore-table=<database>.south_migrationhistory \
--ignore-table=<database>.auth_user \
--ignore-table=<database>.auth_user_groups \
--ignore-table=<database>.auth_user_user_permissions \
--ignore-table=<database>.auth_permission \
--ignore-table=<database>.django_content_types \
--ignore-table=<database>.django_site \
> ~/data-backup.sql
./manage.py sqlclear
./manage.py syncdb --migrate
mysqldump <database> > ~/empty-database.sql
./manage.py dbshell < ~/data-backup.sql
(edit ~/data-backup.sql to remove data dumped from a table that no longer exists)
./manage.py dbshell < ~/empty-database.sql
./manage.py dbshell < ~/data-backup.sql
(repeat until clean)
In Django models we have an option named managed, which can be set to True or False.
According to the documentation, the only difference this option makes is whether the table will be managed by Django or not. Does it make any difference whether the table is managed by Django or by us?
Are there any pros and cons of using one option rather than the other?
I mean, why would we opt for managed=False? Will it give some extra control or power that affects my code?
The main reason for using managed=False is if your model is backed by something like a database view, instead of a table - so you don't want Django to issue CREATE TABLE commands when you run syncdb.
Right from Django docs:
managed=False is useful if the model represents an existing table or a database view that has been created by some other means. This is the only difference when managed=False. All other aspects of model handling are exactly the same as normal
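For example, a minimal sketch of a model backed by an existing table or view (all names here are assumed):

from django.db import models

class LegacyInvoice(models.Model):
    number = models.CharField(max_length=32)
    total = models.DecimalField(max_digits=10, decimal_places=2)

    class Meta:
        managed = False               # Django will never CREATE/ALTER/DROP this table
        db_table = 'legacy_invoices'  # existing table or view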
Whenever we create a Django model, managed=True is implicitly set by default. As we know, when we run python manage.py makemigrations, a migration file is created in the app's migrations folder, and applying it creates or updates the table schema in the database.
So with managed=False, we stop Django from creating or altering the table (or the schema of its fields) for that model as specified in the migration file.
Why do we use it?
Case 1: Sometimes we use two databases in a project, say db1 (the default) and db2, and we don't want a particular model's table to be created in db1, or we want to manage the database view ourselves.
Case 2: In the Django ORM, a database table is normally tied to a model; managed=False helps bind a database view to a Django ORM model.
We can add our raw SQL for the db view in a migration file. The raw SQL in a migration looks like this, in 0001_initial.py (the FROM clause below is illustrative, since the original snippet is abbreviated):
from __future__ import unicode_literals
from django.db import migrations, models


class Migration(migrations.Migration):

    initial = True

    dependencies = [
    ]

    operations = [
        migrations.RunSQL(
            """
            CREATE OR REPLACE VIEW app_test AS
            SELECT row_number() OVER () AS id,
                   ci.user_id,
                   ci.company_id
            FROM company_info ci  -- source table name is illustrative
            """
        ),
    ]
The above code is just an overview of what such a migration file looks like.