I'm creating a data migration using the RunPython method. However when I try to run a method on the object none are defined. Is it possible to call a method defined on a model using RunPython?
Model methods are not available in migrations, including data migrations.
However, there is a workaround that comes close to calling model methods: you can define functions inside the migration that mimic the model methods you want to use.
If you had this method:
class Order(models.Model):
    '''
    order model def goes here
    '''
    def get_foo_as_bar(self):
        new_attr = 'bar: %s' % self.foo
        return new_attr
You can write a function inside the migration script like:
def get_foo_as_bar(obj):
    new_attr = 'bar: %s' % obj.foo
    return new_attr

def save_foo_as_bar(apps, schema_editor):
    old_model = apps.get_model("order", "Order")
    for obj in old_model.objects.all():
        obj.new_bar_field = get_foo_as_bar(obj)
        obj.save()
Then use it in migrations:
class Migration(migrations.Migration):

    dependencies = [
        ('order', '0001_initial'),
    ]

    operations = [
        migrations.RunPython(save_foo_as_bar),
    ]
This way the migration will work. There will be a bit of code repetition, but that doesn't matter: data migrations are supposed to be one-time operations for a particular state of an application.
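The pattern above can be sketched without Django at all; the Order class below is just a stand-in for the historical model, which carries fields but no custom methods:

```python
# Plain-Python sketch of the workaround: the model method's logic is
# duplicated as a standalone function that takes the object explicitly.

class Order:
    # Stand-in for the attribute-only "historical" model used in migrations.
    def __init__(self, foo):
        self.foo = foo

def get_foo_as_bar(obj):
    # Mirrors the model method, but works on any object with a .foo attribute.
    return 'bar: %s' % obj.foo

order = Order('baz')
print(get_foo_as_bar(order))  # bar: baz
```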
Did you load your model as described in the documentation?
def combine_names(apps, schema_editor):
    # We can't import the Person model directly as it may be a newer
    # version than this migration expects. We use the historical version.
    Person = apps.get_model("yourappname", "Person")
    for person in Person.objects.all():
        person.name = "%s %s" % (person.first_name, person.last_name)
        person.save()
Data-Migration
Because at this point, you can't import your model directly:
from yourappname.models import Person
Update
The internal Django code is in this file django/db/migrations/state.py
django.db.migrations.state.ModelState#construct_fields
def construct_fields(self):
    "Deep-clone the fields using deconstruction"
    for name, field in self.fields:
        _, path, args, kwargs = field.deconstruct()
        field_class = import_string(path)
        yield name, field_class(*args, **kwargs)
Only the fields are cloned into a "fake" model instance:
MyModel.__module__ = '__fake__'
Github Django
The fine print is laid in Historical Models
Because it’s impossible to serialize arbitrary Python code, these historical models will not have any custom methods that you have defined.
It was quite a surprise when I first encountered this during a migration; I hadn't read the fine print, and it seems to contradict Django's design philosophy of adding functions around models.
As of Django 1.8, you can make model managers available to migrations by setting use_in_migrations = True on the model manager. See the migrations documentation.
This does not answer the OP, but might still be of use to someone.
Not only are custom model methods unavailable in migrations, but the same holds for other model attributes, such as class "constants" used for model field choices. See examples in the docs.
In this specific edge case, we cannot access the historical values of the choices directly during the migration, but we can get them from the model field using the _meta API, because those values are contained in the migrations.
Given Django's Student example:
class Student(models.Model):
    FRESHMAN = 'FR'
    ...
    YEAR_IN_SCHOOL_CHOICES = [(FRESHMAN, 'Freshman'), ...]
    year_in_school = models.CharField(
        max_length=2,
        choices=YEAR_IN_SCHOOL_CHOICES,
        default=FRESHMAN,
    )
We can get the historic value of Student.FRESHMAN inside a migration as follows:
...
Student = apps.get_model('my_app', 'Student')
YEAR_IN_SCHOOL_CHOICES = Student._meta.get_field('year_in_school').choices
...
Something that worked for me when you have many complex methods calling each other and you need them available on your objects:
First, copy those model methods into your migration file:
def A(self):
    return self.B() + self.C()

def B(self):
    return self.name

def C(self):
    return self.description
Then in your migration function:
def do_something_to_your_objects(apps, schema_editor):
    MyModel = apps.get_model("my_app", "MyModel")
    MyModel.A = A
    MyModel.B = B
    MyModel.C = C

    for my_object in MyModel.objects.all():
        my_object.name_and_description = my_object.A()
        my_object.save()
class Migration(migrations.Migration):

    dependencies = [
        ('initial', '0001_initial'),
    ]

    operations = [
        migrations.RunPython(do_something_to_your_objects),
    ]
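As a plain-Python illustration (no Django involved), attaching module-level functions to a class at runtime makes them behave like ordinary bound methods; HistoricalModel below is a hypothetical stand-in for the method-less model returned by apps.get_model():

```python
# Functions defined at module level can be attached to a class at runtime;
# instances then call them exactly like methods defined in the class body.

def A(self):
    return self.B() + self.C()

def B(self):
    return self.name

def C(self):
    return self.description

class HistoricalModel:
    # Stand-in for the method-less historical model in a migration.
    def __init__(self, name, description):
        self.name = name
        self.description = description

HistoricalModel.A = A
HistoricalModel.B = B
HistoricalModel.C = C

obj = HistoricalModel('name', ' desc')
print(obj.A())  # name desc
```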
If you are like me and came here because you got the error ValueError: RunPython must be supplied with a callable:
It's because you put "()" at the end of the function you are assigning to code in migrations.RunPython.
Wrong: migrations.RunPython(code=do_something(), reverse_code=noop)
It should be, without the parentheses:
migrations.RunPython(code=do_something, reverse_code=noop)
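The distinction is easy to check in plain Python: RunPython requires that its code argument be callable, and calling the function first passes its return value instead (do_something here is a hypothetical placeholder):

```python
# A function object is callable; its return value usually is not.
# RunPython(code=...) raises ValueError when code is not callable.

def do_something():
    return None  # a data-migration function normally returns None

print(callable(do_something))    # True  -- pass the function itself
print(callable(do_something()))  # False -- its return value fails the check
```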
Related
I am trying to construct a MultipleChoiceFilter where the choices are the set of possible dates that exist on a related model (DatedResource).
Here is what I am working with so far...
resource_date = filters.MultipleChoiceFilter(
    field_name='dated_resource__date',
    choices=[
        (d, d.strftime('%Y-%m-%d')) for d in
        sorted(resource_models.DatedResource.objects.all().values_list('date', flat=True).distinct())
    ],
    label="Resource Date"
)
When this is displayed in an HTML view, it works fine at first. However, if I create new DatedResource objects with new distinct date values, I need to restart my webserver for them to be picked up as valid choices in this filter. I believe this is because the choices list is evaluated once when the webserver starts, not every time my page loads.
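That evaluation-timing behavior can be reproduced in plain Python, with a list standing in for the DatedResource queryset:

```python
# A list comprehension in a class body runs once, at import time,
# while a callable is re-evaluated on every call.

source = [1, 2]  # stand-in for DatedResource.objects.values_list(...)

class StaticFilter:
    choices = [(d, str(d)) for d in source]  # frozen at class definition

def dynamic_choices():
    return [(d, str(d)) for d in source]     # recomputed on each call

source.append(3)  # simulate creating a new DatedResource

print(len(StaticFilter.choices))  # 2 -- stale
print(len(dynamic_choices()))     # 3 -- fresh
```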
Is there any way to get around this? Maybe through some creative use of a ModelMultipleChoiceFilter?
Thanks!
Edit:
I tried some simple ModelMultipleChoice usage, but hitting some issues.
resource_date = filters.ModelMultipleChoiceFilter(
    field_name='dated_resource__date',
    queryset=resource_models.DatedResource.objects.all().values_list('date', flat=True).order_by('date').distinct(),
    label="Resource Date"
)
The HTML form shows up just fine; however, the choices are not accepted by the filter. I get "2019-04-03" is not a valid value. validation errors, I assume because this filter expects datetime.date objects. I thought about using the coerce parameter, but it is not accepted by ModelMultipleChoiceFilter.
Per dirkgroten's comment, I tried to use what was suggested in the linked question. This ends up being something like
resource_date = filters.ModelMultipleChoiceFilter(
    field_name='dated_resource__date',
    to_field_name='date',
    queryset=resource_models.DatedResource.objects.all(),
    label="Resource Date"
)
This also isn't what I want: the HTML form is now a) displaying the str representation of each DatedResource instead of the DatedResource.date field, and b) the entries are not unique (e.g., if I have two DatedResource objects with the same date, both of their str representations appear in the list). It also isn't sustainable, because I have 200k+ DatedResources and the page hangs when attempting to load them all (as compared to the values_list filter, which pulls all distinct dates out in seconds).
One easy solution is to override the __init__() method of the filterset class.
from django_filters import filters, filterset

class FooFilter(filterset.FilterSet):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        try:
            self.filters['resource_date'].extra['choices'] = [
                (d, d.strftime('%Y-%m-%d')) for d in
                sorted(resource_models.DatedResource.objects.all().values_list('date', flat=True).distinct())
            ]
        except (KeyError, AttributeError):
            pass

    resource_date = filters.MultipleChoiceFilter(field_name='dated_resource__date', choices=[], label="Resource Date")
NOTE: provide choices=[] in the field definition of your filterset class.
Results
I tested and verified this solution with the following dependencies:
1. Python 3.6
2. Django 2.1
3. DRF 3.8.2
4. django-filter 2.0.0
I used the following code to reproduce the behaviour:
# models.py
from django.db import models

class Musician(models.Model):
    name = models.CharField(max_length=50)

    def __str__(self):
        return f'{self.name}'

class Album(models.Model):
    artist = models.ForeignKey(Musician, on_delete=models.CASCADE)
    name = models.CharField(max_length=100)
    release_date = models.DateField()

    def __str__(self):
        return f'{self.name} : {self.artist}'
# serializers.py
from rest_framework import serializers

from .models import Album

class AlbumSerializer(serializers.ModelSerializer):
    artist = serializers.StringRelatedField()

    class Meta:
        fields = '__all__'
        model = Album
# filters.py
from django_filters import rest_framework as filters

from .models import Album

class AlbumFilter(filters.FilterSet):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.filters['release_date'].extra['choices'] = self.get_album_filter_choices()

    def get_album_filter_choices(self):
        release_date_list = Album.objects.values_list('release_date', flat=True).distinct()
        return [(date, date) for date in release_date_list]

    release_date = filters.MultipleChoiceFilter(choices=[])

    class Meta:
        model = Album
        fields = ('release_date',)
# views.py
from rest_framework.viewsets import ModelViewSet
from django_filters import rest_framework as filters

from .filters import AlbumFilter
from .models import Album
from .serializers import AlbumSerializer

class AlbumViewset(ModelViewSet):
    serializer_class = AlbumSerializer
    queryset = Album.objects.all()
    filter_backends = (filters.DjangoFilterBackend,)
    filter_class = AlbumFilter
Here I've used django-filter with DRF.
Now, I populated some data through the Django admin console and checked the possible choices for release_date in the album API. Then I added a new entry through the Django admin, refreshed the DRF API endpoint, and the new date appeared among the possible choices.
I have looked into your problem and I have the following suggestions.
The Problem
You have got the problem right. Choices for your MultipleChoiceFilter are calculated statically when you start the server. That's why they don't get updated dynamically when you insert a new DatedResource instance.
To get it working correctly, you have to provide the choices to MultipleChoiceFilter dynamically. I searched the documentation but did not find anything about this, so here is my solution.
The solution
You have to extend MultipleChoiceFilter and create your own filter class. Here it is:
from typing import Callable

from django_filters.conf import settings
import django_filters

class LazyMultipleChoiceFilter(django_filters.MultipleChoiceFilter):
    def get_field_choices(self):
        choices = self.extra.get('choices', [])
        if isinstance(choices, Callable):
            choices = choices()
        return choices

    @property
    def field(self):
        if not hasattr(self, '_field'):
            field_kwargs = self.extra.copy()

            if settings.DISABLE_HELP_TEXT:
                field_kwargs.pop('help_text', None)

            field_kwargs.update(choices=self.get_field_choices())
            self._field = self.field_class(label=self.label, **field_kwargs)
        return self._field
Now you can use this class as a replacement and pass choices as a lambda, like this:
resource_date = LazyMultipleChoiceFilter(
    field_name='dated_resource__date',
    choices=lambda: [
        (d, d.strftime('%Y-%m-%d')) for d in
        sorted(resource_models.DatedResource.objects.all().values_list('date', flat=True).distinct())
    ],
    label="Resource Date"
)
Whenever an instance of the filter is created, the choices will be updated dynamically. You can also pass choices statically (without a lambda) if you want the default behavior.
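The key trick in get_field_choices() can be isolated in plain Python: accept either a static list or a zero-argument callable, and call the latter on demand:

```python
# Normalize choices: a callable is invoked to get fresh values,
# a plain list is returned as-is (the default, static behavior).
from typing import Callable

def resolve_choices(choices):
    if isinstance(choices, Callable):
        return choices()
    return choices

print(resolve_choices([('a', 'A')]))        # [('a', 'A')]
print(resolve_choices(lambda: [('b', 'B')]))  # [('b', 'B')]
```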
I am trying to migrate my models to use Guardian permissions. At this point I have:
class Data(models.Model):
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)

    class Meta:
        permissions = (
            ('view', 'View'),
            ('edit', 'Edit'),
            ('owner', 'Owner'),
        )
I created one migration that added the new permissions, and in a custom migration I am trying to assign the permissions like this:
def assignDataPermissions(apps, schema_editor):
    Data = apps.get_model('api', 'Data')
    for data in Data.objects.all():
        assign_perm('api.owner', data.user, data)
class Migration(migrations.Migration):

    dependencies = [
        ('api', '0169_auto_20180304_1619'),
    ]

    operations = [
        migrations.RunPython(assignDataPermissions),
    ]
This fails with
guardian.exceptions.NotUserNorGroup: User/AnonymousUser or Group instance is required (got EmailUser object).
Is there a better/proper way of migrating to Guardian? If not, how do I make it see my custom User class?
I ended up using a higher-level workaround.
Inside the migration, data.user is actually an object of __fake__.EmailUser, while get_user_model() returns custom_user.models.EmailUser. As a result, Guardian fails the isinstance(identity, get_user_model()) check.
My workaround (hack?) is to explicitly fetch the EmailUser object corresponding to data.user from the database, like so:
def assignDataPermissions(apps, schema_editor):
    Data = apps.get_model('api', 'Data')
    User = get_user_model()
    for data in Data.objects.all():
        user = User.objects.get(id=data.user_id)
        assign_perm('api.owner', user, data)
In general, loading external libraries in migrations is error-prone.
Try something low-level like this:
def assignDataPermissions(apps, schema_editor):
    Data = apps.get_model('api', 'Data')
    Permission = apps.get_model('auth', 'Permission')
    owner_api = Permission.objects.get(content_type__app_label='api', codename='owner')
    UserObjectPermission = apps.get_model('guardian', 'UserObjectPermission')
    for data in Data.objects.all():
        UserObjectPermission.objects.create(permission=owner_api, user=data.user, content_object=data)
In my application's models I need a way of linking Problems and Solutions -- every Problem can have multiple Solutions and a given Solution can map back to multiple Problems.
Solution is an abstract base class, since there can be many varieties of Solutions. So, I figured out I need a mapping table ProblemSolutionMapping which uses a GenericForeignKey to accommodate all those child classes. But I'm trying to figure out how to limit the classes to just the children of Solutions and not all the classes available in the whole application, which is what is currently happening.
# Thanks to http://stackoverflow.com/a/23555691/1149759
class Solution(models.Model):
    ...

    @classmethod
    def get_subclasses(cls):
        content_types = ContentType.objects.filter(app_label=cls._meta.app_label)
        models = [ct.model_class() for ct in content_types]
        return [model for model in models
                if (model is not None and
                    issubclass(model, cls) and
                    model is not cls)]

    class Meta:
        abstract = True


class ProblemSolutionMapping(models.Model):
    problem = models.ForeignKey(Problem)
    content_type = models.ForeignKey(ContentType,
                                     limit_choices_to=Solution.get_subclasses())  # <==== This is the issue
    object_id = models.PositiveIntegerField()
    content_object = GenericForeignKey('content_type', 'object_id')
The issue is that when I start up my Django app, the call to ContentType.objects.filter(app_label=cls._meta.app_label) throws the error:
django.core.exceptions.AppRegistryNotReady: Models aren't loaded yet.
Not sure what to do -- I tried making the mapping table the last one in the relevant models file (all the child classes are defined above it in the same file), but it made no difference. Is this something that I have to move into the admin form? Or is there some other way to do this at the model level?
(Django 1.9, in case it matters.)
Thanks in advance for your help!
Referencing a model at import time is no longer supported as of Django 1.7. You should use your models only after all applications are loaded. So you should either statically pass a list to limit_choices_to, or use Q objects like this:
limit_choices_to=models.Q(app_label = 'app', model = 'a') | models.Q(app_label = 'app', model = 'b')
You can also limit what is shown to the user at the form level.
So I arrived here looking for the answer. Based on Mehran's post, I developed the approach below, which is similar to yours, except that limit_choices_to calls a method that returns a Q object built at runtime.
Below is the part that is similar to your get_subclasses.
@classmethod
def get_subclasses(cls, *args, **kwargs):
    for app_config in apps.get_app_configs():
        for app_model in app_config.get_models():
            model_classes = [c.__name__ for c in inspect.getmro(app_model)]
            if cls.__name__ in model_classes:
                yield app_model
This creates the Q filter(s) for us (in my implementation, this is just a plain old method not attached to any class, but I suppose it could be):
def get_content_choices():
    query_filter = None

    for cls in Solution.get_subclasses():
        app_label, model = cls._meta.label_lower.split('.')
        current_filter = models.Q(app_label=app_label, model=model)

        if query_filter is None:
            query_filter = current_filter
        else:
            query_filter |= current_filter

    return query_filter
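The accumulation loop above is plain OR-chaining, equivalent to a functools.reduce over the filters; the minimal Q stand-in below (not Django's real Q, just a hypothetical sketch) records what gets OR-ed:

```python
# OR-combining a list of filter objects: the loop in get_content_choices()
# is equivalent to functools.reduce(operator.or_, filters).
import functools
import operator

class Q:
    # Hypothetical stand-in for django.db.models.Q: records an OR tree.
    def __init__(self, **kwargs):
        self.children = [kwargs]

    def __or__(self, other):
        combined = Q()
        combined.children = self.children + other.children
        return combined

filters = [Q(app_label='app', model=m) for m in ('a', 'b', 'c')]
query_filter = functools.reduce(operator.or_, filters)
print(len(query_filter.children))  # 3
```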
And finally, in our model:
class ProblemSolutionMapping(models.Model):
    ...
    content_type = models.ForeignKey(ContentType, limit_choices_to=get_content_choices())
    ...
When Django creates tables it gives them names of the form app_class. I'm in the position of retrofitting a different (but basically similar in content) database to a Django installation. My table names are not prepended with app_.
I could recreate my database and its tables accordingly but I'd like to see if Django has the flexibility to modify how it handles table names.
That is, I have a table coi_fs. Can I change the Django installation such that it refers not to app_coi_fs but simply to coi_fs?
If you already have a database I would recommend using the database introspection option. This will create the models needed to use your current database as is.
$ django-admin.py inspectdb > models.py
To answer your original question though, from the docs (https://docs.djangoproject.com/en/dev/ref/models/options/#table-names), you can use the db_table meta property.
models.py
class DjangoSite(models.Model):
    domain = models.CharField(max_length=100)
    name = models.CharField(max_length=50)

    class Meta:
        db_table = u'site'
Of course you can do it in Django: just change the model's metaclass:
from django.db.models.base import ModelBase

class ModelMetaClass(ModelBase):
    def __new__(cls, name, bases, attrs):
        new_class = super().__new__(cls, name, bases, attrs)
        new_class._meta.db_table = some_awesome_logic_implemented_by_you()
        return new_class
and then in the Model:
class DjangoSite(models.Model, metaclass=ModelMetaClass):
    # __metaclass__ = ModelMetaClass  # Python 2.x
    class Meta:
        pass
and you are done!
If you want to universally rename all tables, you can edit the Django source. This is dangerous, but if you explicitly want to change the Django installation, have at it: https://github.com/django/django/blob/master/django/db/models/options.py
old options.py
self.db_table = "%s_%s" % (self.app_label, self.model_name)
new options.py
self.db_table = self.model_name
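For reference, the mechanism the metaclass answer relies on can be shown without Django: a metaclass's __new__ runs once per class definition and may rewrite attributes such as a table name (the lowercasing rule here is just an example, not Django's actual logic):

```python
# A metaclass intercepts class creation and can set attributes on the
# new class before any instance exists, which is how Django's ModelBase
# builds _meta (including db_table).

class TableNameMeta(type):
    def __new__(cls, name, bases, attrs):
        new_class = super().__new__(cls, name, bases, attrs)
        # Example table-naming rule applied at class-creation time.
        new_class.db_table = name.lower()
        return new_class

class CoiFs(metaclass=TableNameMeta):
    pass

print(CoiFs.db_table)  # coifs
```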
I am trying to add a dynamic Meta attribute to all of my Django models using model inheritance, but I can't get it to work. I have a permission that I want to add to all my models like this:
class ModelA(models.Model):
    class Meta:
        permissions = (('view_modela', 'Can view Model A'),)

class ModelB(models.Model):
    class Meta:
        permissions = (('view_modelb', 'Can view Model B'),)
I tried creating an abstract base class like this:
class CustomModel(models.Model):
    def __init__(self, *args, **kwargs):
        self._meta.permissions.append(('view_' + self._meta.module_name,
                                       u'Can view %s' % self._meta.verbose_name))
        super(CustomModel, self).__init__(*args, **kwargs)

class ModelA(CustomModel):
    ....

class ModelB(CustomModel):
    ...
but it's not working. Is this the right approach? Because Django uses introspection to construct the Model classes, I'm not sure if adding permissions during the __init__() of the class will even work. With my current implementation every time I access a model instance it appends another tuple of the permissions.
Your instinct is right that this won't work. In Django, permissions are stored in the database, which means that:
they need to be available at the class level when syncdb is run in order to populate the auth_permission table (and your approach requires an instance, which won't be made during syncdb)
even if you did add it to _meta.permissions in __init__, the User object wouldn't pick it up in any permission check calls because those consult the permissions table in the DB (and a cache of that table, at that).
Your goal can't be accomplished using inheritance. What you actually need here is a Python metaclass.
This metaclass re-writes your ModelA and ModelB class definitions dynamically before they are defined, thus it doesn't require a ModelA instance, and is available to syncdb. Since Django's models also use metaclasses to build the Meta object in the first place, the only requirement is that your metaclass must inherit from the same metaclass as Django's models.
Here's some sample code (Python 2):
from django.db.models.base import ModelBase

class CustomModelMetaClass(ModelBase):
    def __new__(cls, name, bases, attrs):
        klas = super(CustomModelMetaClass, cls).__new__(cls, name, bases, attrs)
        klas._meta.permissions.append(
            (
                'view_{0.module_name}'.format(klas._meta),
                u'Can view {0.verbose_name}'.format(klas._meta))
        )
        return klas

class ModelA(models.Model):
    __metaclass__ = CustomModelMetaClass

    test = models.CharField(max_length=5)
Python 3:
from django.db.models.base import ModelBase

class CustomModelMetaClass(ModelBase):
    def __new__(cls, name, bases, attrs):
        klas = super().__new__(cls, name, bases, attrs)
        klas._meta.permissions.append(
            (
                'view_{0.module_name}'.format(klas._meta),
                'Can view {0.verbose_name}'.format(klas._meta))
        )
        return klas

class ModelA(models.Model, metaclass=CustomModelMetaClass):
    test = models.CharField(max_length=5)
Note that permissions in this case will be written only on migrate. If you need to change permissions dynamically at run time base on the user, you'll want to provide your own authentication backend.
Try to use a custom manager:
# create a custom manager
class DynTableNameManager(models.Manager):
    # overwrite all() (example)
    # provide table_name
    def all(self, table_name):
        from django.db import connection
        cursor = connection.cursor()
        cursor.execute("""
            SELECT id, name
            FROM %s
            """ % table_name)
        result_list = []
        for row in cursor.fetchall():
            p = self.model(id=row[0], name=row[1])
            result_list.append(p)
        return result_list

# create a dummy table
class DummyTable(models.Model):
    name = models.CharField(max_length=200)

    objects = DynTableNameManager()
use like this:
f = DummyTable.objects.all('my_table_name')
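One caveat: interpolating table_name into the SQL with % is an injection risk if the name can ever come from user input, and table names cannot be passed as query parameters. A common safeguard is an explicit allow-list check, sketched here with stdlib sqlite3 (ALLOWED_TABLES and the schema are hypothetical examples):

```python
# Table names can't be bound as SQL parameters, so validate them against
# a fixed allow-list before formatting them into the query string.
import sqlite3

ALLOWED_TABLES = {'my_table_name', 'other_table'}

def fetch_all(conn, table_name):
    if table_name not in ALLOWED_TABLES:
        raise ValueError('unknown table: %s' % table_name)
    cur = conn.execute('SELECT id, name FROM %s' % table_name)
    return cur.fetchall()

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE my_table_name (id INTEGER, name TEXT)')
conn.execute("INSERT INTO my_table_name VALUES (1, 'foo')")
print(fetch_all(conn, 'my_table_name'))  # [(1, 'foo')]
```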