I need to loop over all m2m relations of a model instance and copy them to a new model instance.
source_id = request.GET.get('source_id', 1)
obj = Artist.objects.create(title='New artist')
source_obj = Artist.objects.get(id=source_id)
if source_obj.galleries.count():
    obj.galleries = source_obj.galleries.all()
if source_obj.suggested_artists.count():
    obj.suggested_artists = source_obj.suggested_artists.all()
Currently I am doing it like this, but I want to loop over all m2m fields and copy the related data to obj.
I want something like:
for m2m_rel in source_obj.m2m_relations:
    print getattr(source_obj, m2m_rel).count()
    print getattr(source_obj, m2m_rel).all()
Any suggestions?
You can access the m2m relation entries like this:
for field in source_obj._meta.many_to_many:
    source = getattr(source_obj, field.attname)
    for item in source.all():
        # do something with item...
        print repr(item)
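The introspect-and-copy idea can be sketched without the ORM: given a list of relation names, pull each collection off the source object with getattr and copy its contents onto the target. A rough pure-Python stand-in (class and attribute names are illustrative, plain lists play the role of related managers):

```python
class ArtistStub:
    """Plain-object stand-in for a model with two m2m-like attributes."""
    m2m_fields = ['galleries', 'suggested_artists']

    def __init__(self):
        self.galleries = []
        self.suggested_artists = []

source = ArtistStub()
source.galleries = ['g1', 'g2']
source.suggested_artists = ['a1']

obj = ArtistStub()
# The generic loop the question asks for: iterate relation names,
# read each collection off the source, and copy it to the target.
for name in ArtistStub.m2m_fields:
    setattr(obj, name, list(getattr(source, name)))
```

In the real Django version, `ArtistStub.m2m_fields` corresponds to `source_obj._meta.many_to_many` and the list copy corresponds to assigning `source.all()` to the target manager.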
If you're trying to clone a model instance, you can use a generic clone_objects function like the one below. The function will clone a list of objects and return a new list of cloned objects (with new IDs):
# import Python's copy library
import copy

def clone_objects(objects):
    """
    Generic model object cloner function.
    """
    def clone(obj):
        """Return an identical copy of the instance with a new ID."""
        if not obj.pk:
            raise ValueError('Instance must be saved before it can be cloned.')
        duplicate = copy.copy(obj)
        # Setting pk to None tricks Django into thinking this is a new object.
        duplicate.pk = None
        duplicate.save()
        # ... but the trick loses all ManyToMany relations.
        for field in obj._meta.many_to_many:
            source = getattr(obj, field.attname)
            destination = getattr(duplicate, field.attname)
            for item in source.all():
                destination.add(item)
        return duplicate

    if not hasattr(objects, '__iter__'):
        objects = [objects]
    objs = []
    for obj in objects:
        # clone() already saves the duplicate, so no extra save is needed.
        objs.append(clone(obj))
    return objs
The main part of the "cloning" code is from this snippet: Clone model mixin
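The core of the trick (shallow-copy the instance, then reset its pk) can be illustrated with a plain Python object, no Django required; a minimal sketch with an illustrative stand-in class:

```python
import copy

class FakeModel:
    """Stand-in for a Django model instance (illustration only)."""
    def __init__(self, pk, title):
        self.pk = pk
        self.title = title

original = FakeModel(pk=7, title='New artist')
duplicate = copy.copy(original)
# Resetting pk is what makes the ORM treat the copy as a new, unsaved row
# the next time save() is called; the field values are carried over as-is.
duplicate.pk = None
```

The m2m copy loop in clone() is then needed because the shallow copy carries over plain field values, but the m2m rows in the through table still point only at the original pk.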
obj = Foo.objects.get(pk=<some_existing_pk>)
obj.pk = None
obj.save()

class Foo2(Foo):
    pass
This method works if we are cloning objects to the same model.
What to do if we have another model Foo2, which is inherited from Foo and consists of the same fields as Foo?
I want to save obj to Foo2. Is there a shortcut to do this?
The best way would be to grab the field values of the query result as a dict:
data = Foo.objects.filter(pk=<some_existing_pk>).values()[0]
# pop the pk
data.pop('id')
# pass the dict as **kwargs
Foo2.objects.create(**data)
Note that .values() exists on the queryset, not on the instance, and create() already saves the object, so a second .save() call is redundant.
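The mechanics of this answer boil down to: take a row as a dict, drop the old primary key, and feed the rest to the other model's constructor. A pure-Python sketch of the same steps (Foo2Stub is a hypothetical stand-in for the target model; no ORM involved):

```python
# A row fetched as a dict, as .values()[0] would return it.
data = {'id': 42, 'name': 'spam', 'size': 3}

# Drop the old primary key so the target table assigns a fresh one.
data.pop('id')

class Foo2Stub:
    """Hypothetical stand-in for the Foo2 model constructor."""
    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            setattr(self, key, value)

# Equivalent of Foo2.objects.create(**data).
obj2 = Foo2Stub(**data)
```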
Problem
I am using the django-model-utils InheritanceManager. I have a Notification(models.Model) superclass which I use to create many notification subclasses such as PostNotification(Notification), CommentNotification(Notification), etc., and when trying to run CommentNotification.objects.bulk_create(list_of_comment_notification_objects), I get the following traceback:
File "/home/me/.virtualenvs/project/local/lib/python2.7/site-packages/django/db/models/query.py", line 429, in bulk_create
raise ValueError("Can't bulk create a multi-table inherited model")
ValueError: Can't bulk create a multi-table inherited model
and upon inspecting the query.py file, we can see the code that causes the error:
for parent in self.model._meta.get_parent_list():
    if parent._meta.concrete_model is not self.model._meta.concrete_model:
        raise ValueError("Can't bulk create a multi-table inherited model")
Environment
Django Model Utils version: 3.1.1
Django version: 1.11.7
Python version: 2.7.3
Example
PostNotification.objects.bulk_create(
    [PostNotification(related_user=user, post=instance) for user in users]
)
throws the above exception
What I have tried, and originally thought was a success:
I thought that simply running BaseClass.objects.bulk_create(list_of_SubClass_objects) instead of SubClass.objects.bulk_create(list_of_SubClass_objects) would work and return a list of SubClass values, but subsequently running SubClass.objects.all() would return an empty result. The bulk_create() would only create a Notification base class object for each item in the list.
Found a hacky solution. I hope it works in your case. The trick is to create a model (which is not an inherited one) dynamically that has some meta (db_table) set. And use this dynamic model to create Child objects in bulk (in other words, write into Child's DB table).
class Parent(models.Model):
    name = models.CharField(max_length=10)

class Child(Parent):
    phone = models.CharField(max_length=12)

# just an example. Should be expanded to work properly.
field_type_mapping = {
    'OneToOneField': models.IntegerField,
    'CharField': models.CharField,
}

def create_model(Model, app_label='children', module='', options=None):
    """
    Create specified model
    """
    model_name = Model.__name__

    class Meta:
        managed = False
        db_table = Model._meta.db_table

    if app_label:
        # app_label must be set using the Meta inner class
        setattr(Meta, 'app_label', app_label)
    # Update Meta with any options that were provided
    if options is not None:
        for key, value in options.items():
            setattr(Meta, key, value)
    # Set up a dictionary to simulate declarations within a class
    attrs = {'__module__': module, 'Meta': Meta}
    # Add in any fields that were provided
    fields = dict()
    for field in Model._meta.fields:
        if field.attname == 'id':
            continue
        if field.model.__name__ == model_name:
            field_class_name = type(field).__name__
            fields[field.attname] = field_type_mapping[field_class_name]()
    # Create the class, which automatically triggers ModelBase processing
    attrs.update(fields)
    model = type(f'{model_name}Shadow', (models.Model,), attrs)
    return model

mod = create_model(Child)
parents = [Parent(name=str(i)) for i in range(15)]
parents = Parent.objects.bulk_create(parents)
children = [mod(phone=parent.name, parent_ptr_id=parent.id) for parent in parents]
mod.objects.bulk_create(children)
I've done a custom implementation of bulk_create that seems to be working for my case (only one parent relationship and not autoincremented pk):
from django.db import models, transaction

class MultiTableChildQueryset(models.QuerySet):

    def bulk_create(self, objs, batch_size=None):
        assert batch_size is None or batch_size > 0
        if not objs:
            return objs
        self._for_write = True
        objs = list(objs)
        parent_model = self.model._meta.pk.related_model
        parent_objs = []
        for obj in objs:
            parent_values = {}
            for field in [f for f in parent_model._meta.fields if hasattr(obj, f.name)]:
                parent_values[field.name] = getattr(obj, field.name)
            parent_objs.append(parent_model(**parent_values))
            setattr(obj, self.model._meta.pk.attname, obj.id)
        parent_model.objects.bulk_create(parent_objs, batch_size=batch_size)
        with transaction.atomic(using=self.db, savepoint=False):
            self._batched_insert(objs, self.model._meta.local_fields, batch_size)
        return objs
A slightly easier-to-read version of Moises' answer:
from typing import List, TypeVar

from django.db import transaction
from django.db.models import Model, QuerySet

M = TypeVar('M', bound=Model)

def multi_inheritance_table_bulk_insert(data: List[M]) -> None:
    """
    Bulk insert data into a multi-inheritance table.
    """
    if not data:
        return
    model = data[0].__class__
    local_fields = model._meta.local_fields
    parent_model = model._meta.pk.related_model
    parent_fields = parent_model._meta.local_fields
    parent_objects = [
        parent_model(**{field.name: getattr(obj, field.name) for field in parent_fields})
        for obj in data
    ]
    parent_model.objects.bulk_create(parent_objects)
    for parent, obj in zip(parent_objects, data):
        obj.pk = parent.pk
    queryset = QuerySet(model)
    queryset._for_write = True
    with transaction.atomic(using=queryset.db, savepoint=False):
        queryset._batched_insert(
            data,
            local_fields,
            batch_size=None,
        )
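Both snippets rely on the same two-phase idea: bulk-insert the parent rows first, then copy each freshly assigned parent pk onto the matching child before inserting the child rows. A pure-Python sketch of that pk propagation (class names are illustrative; fake_bulk_create simulates the database handing out auto-increment ids):

```python
class ParentRow:
    """Stand-in for the parent model; the class attribute fakes a sequence."""
    _next_pk = 1

    def __init__(self, name):
        self.name = name
        self.pk = None

class ChildRow:
    """Stand-in for the child model in multi-table inheritance."""
    def __init__(self, name, phone):
        self.name = name
        self.phone = phone
        self.pk = None

def fake_bulk_create(parents):
    # Simulates the database assigning auto-increment primary keys.
    for parent in parents:
        parent.pk = ParentRow._next_pk
        ParentRow._next_pk += 1

children = [ChildRow(name=str(i), phone='555-%04d' % i) for i in range(3)]
parents = [ParentRow(name=child.name) for child in children]

# Phase one: insert the parents, which assigns their pks.
fake_bulk_create(parents)

# Phase two: each child reuses its parent's pk, mirroring the implicit
# OneToOne parent link that multi-table inheritance creates.
for parent, child in zip(parents, children):
    child.pk = parent.pk
```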
I have a ModelBase, and ModelA, ModelB.
I want to change ModelA instance to ModelB instance. (I can handle the difference of attributes they have)
I've seen related questions, but they don't quite work for me.
How can I create an inherited django model instance from an existing base model instance?
Change class of child on django models
EDIT
When you have Place - Restaurant/Bar relationship,
I think it's quite reasonable to be able to switch a restaurant to a bar.
I had to deal with the same problem, and both yuvi's and arctelix's answers did not work for me. yuvi's solution gives an error, and arctelix's solution creates a new object with a new pk.
The goal here is to change the subclass model while keeping the original superclass as it is, with the old pk.
First: delete the old subclass and keep the superclass. Check the Django documentation.
Second: add the new subclass with its fields and pass the superclass to it.
Check this question.
Example: A place could be a restaurant or a caffe, and you want to change a restaurant place to a caffe, as follows:
class Place(models.Model):
    name = models.CharField(max_length=50)
    address = models.CharField(max_length=80)

class Caffe(Place):
    serves_hot_dogs = models.BooleanField(default=False)
    serves_pizza = models.BooleanField(default=False)

class Restaurant(Place):
    serves_tea = models.BooleanField(default=False)
    serves_coffee = models.BooleanField(default=False)

# get the object to be changed
rest = Restaurant.objects.get(pk=1)  # arbitrary number

# delete the subclass while keeping the parent
rest.delete(keep_parents=True)

place = Place.objects.get(pk=1)  # the primary key must be the same as the deleted restaurant

# Create a caffe and pass the original place
caffe = Caffe(place_ptr_id=place.pk)  # this leaves the parent fields empty

# update parent fields
caffe.__dict__.update(place.__dict__)

# add other fields as required
# ...

# save the caffe
caffe.save()
I would create an entirely new instance of the second model with the same values of their shared attributes, then delete the old one. Seems like the cleanest way to me.
If ModelBase is abstract:
instance = ModelA.objects.get(pk=1)  # arbitrary

# find parent class fields:
fields = [f.name for f in ModelBase._meta.fields]

# get the values from the ModelA instance
values = dict([(x, getattr(instance, x)) for x in fields])

# assign same values to new instance of second model
new_instance = ModelB(**values)

# add any additional information to new instance here
new_instance.save()  # save new one
instance.delete()  # remove the old one
If ModelBase is not abstract, however, you'll have to do an extra workaround:
fields = [f.name for f in ModelBase._meta.fields if f.name != 'id']
# ... other parts are the same ...
new_instance.modelbase_ptr = instance.modelbase_ptr  # re-assign related parent
instance.delete()  # delete this first!
new_instance.save()
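The getattr-based field copy at the heart of this answer generalizes to any pair of classes that share attribute names; a minimal pure-Python sketch (class and field names are illustrative stand-ins for the Django models):

```python
class ModelAStub:
    """Stand-in for the source model instance."""
    def __init__(self, name, address):
        self.name = name
        self.address = address

class ModelBStub:
    """Stand-in for the target model; shares the parent fields."""
    def __init__(self, name, address, extra=None):
        self.name = name
        self.address = address
        self.extra = extra

# What ModelBase._meta.fields would list (minus 'id').
shared_fields = ['name', 'address']

instance = ModelAStub(name='Alice', address='Main St 1')
# Collect the shared values off the old instance ...
values = {field: getattr(instance, field) for field in shared_fields}
# ... and hand them to the new model's constructor.
new_instance = ModelBStub(**values)
```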
In yuvi's answer, manually assigning modelbase_ptr and saving fails, since instance.modelbase_ptr is deleted prior to the save.
Building on yuvi's answer, here is a more explicit example that works generically for abstract and non-abstract conversions of:
ModelBase -> ModelChild
ModelChild -> ModelBase
ModelChild -> ModelChild
It optionally preserves the original id, and it follows the methodology recommended in the Django docs.
ex_model = ModelA
new_model = ModelB

ex_instance = ex_model.objects.get(pk=1)  # arbitrary

# find fields required for new_model:
new_fields = [f.name for f in new_model._meta.fields]

# make new dict of existing field : value
new_fields_dict = dict([(x, getattr(ex_instance, x, None)) for x in new_fields])

# Save temp copy as new_model with new id
# modelbase_ptr will be created automatically as required
new_fields_dict.pop('modelbase_ptr', None)
temp_instance = new_model(**new_fields_dict)
temp_instance.pk = None
temp_instance.id = None
temp_instance.save()

# you must set all your related fields here
temp_instance.copy_related(ex_instance)
ex_instance.delete()

# (optional) Save final copy as new_model with original id
final_instance = new_model(**new_fields_dict)
final_instance.save()
final_instance.copy_related(temp_instance)
temp_instance.delete()

# here are the removed fields, handle as required
removed_fields = [f.name for f in ex_model._meta.fields if f.name not in new_fields_dict.keys()]
removed_fields_dict = dict([(x, getattr(ex_instance, x, None)) for x in removed_fields])
In class ModelBase:

def copy_related(self, from_obj):
    # include all your related fields here
    # ('from' is a reserved word in Python, hence the from_obj name)
    self.related_field = from_obj.related_field.all()
    self.related_field_a = from_obj.related_field_a.all()
Suppose the following model class:
class Bookmark(models.Model):
    owner = models.ForeignKey(UserProfile, related_name='bookmarkOwner')
    parent = models.ForeignKey(UserProfile, related_name='bookmarkParent')
    sitter = models.ForeignKey(UserProfile, related_name='bookmarkSitter')
How can I get sitter objects from owner objects?
user = UserProfile.objects.get(pk=1)
UserProfile.objects.filter(bookmarkOwner=user)
returns an empty queryset, and I cannot specify the sitter variable.
I believe you can do something like this, if you want to avoid using a loop:
pks = some_user_profile.bookmarkOwner.values_list('sitter', flat=True)
sitters = UserProfile.objects.filter(pk__in=pks).all()
Alternatively, you might want to experiment with setting up a many-to-many field and using the through parameter. See the Django docs: https://docs.djangoproject.com/en/2.0/ref/models/fields/#manytomanyfield
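The two-query pattern above (collect foreign keys with values_list, then filter with pk__in) can be mimicked with plain data to see exactly what it does; a rough sketch with dicts standing in for rows (all values are illustrative):

```python
# Rows as plain dicts: bookmarks reference user-profile pks.
profiles = [
    {'pk': 1, 'name': 'owner'},
    {'pk': 2, 'name': 'sitter-a'},
    {'pk': 3, 'name': 'sitter-b'},
]
bookmarks = [
    {'owner': 1, 'sitter': 2},
    {'owner': 1, 'sitter': 3},
    {'owner': 2, 'sitter': 3},
]

# Step 1: the values_list('sitter', flat=True) part, restricted to
# the bookmarks owned by profile 1.
pks = [b['sitter'] for b in bookmarks if b['owner'] == 1]

# Step 2: the UserProfile.objects.filter(pk__in=pks) part.
sitters = [p for p in profiles if p['pk'] in pks]
```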
You should do:
objs = Bookmark.objects.filter(owner=user)
# This will return all bookmarks related to the user profile.
for obj in objs:
    print obj.owner   # gives owner object
    print obj.parent  # gives parent object
    print obj.sitter  # gives sitter object
If there is only one Bookmark object for a user profile (no multiple entries), then you should use the .get method instead (which returns a single object):
obj = Bookmark.objects.get(owner=user)
print obj.owner
print obj.parent
print obj.sitter
Why is Django executing statements such as this:
SELECT (1) AS [a] FROM [my_table]
WHERE ([my_table].[id] = ?
AND NOT ([my_table].[id] = ? )) (1, 1)
This happens when calling is_valid() on a formset created the following way:
MyFormSet = modelformset_factory(Table, fields=['my_field'], extra=0)
my_form_set = MyFormSet(request.POST, queryset=Table.objects.all())
where Table and MyForm are as simple as, say:
class Table(models.Model):
    my_field = models.CharField(max_length=10)

class MyForm(forms.ModelForm):
    class Meta:
        model = Table
Hint: I looked at the call stack and the code responsible for it (in django/forms/models.py) is below:
def _perform_unique_checks(self, unique_checks):
    import pdb; pdb.set_trace()
    bad_fields = set()
    form_errors = []
    for unique_check in unique_checks:
        # Try to look up an existing object with the same values as this
        # object's values for all the unique field.
        lookup_kwargs = {}
        for field_name in unique_check:
            lookup_value = self.cleaned_data[field_name]
            # ModelChoiceField will return an object instance rather than
            # a raw primary key value, so convert it to a pk value before
            # using it in a lookup.
            if isinstance(self.fields[field_name], ModelChoiceField):
                lookup_value = lookup_value.pk
            lookup_kwargs[str(field_name)] = lookup_value
        qs = self.instance.__class__._default_manager.filter(**lookup_kwargs)
        # Exclude the current object from the query if we are editing an
        # instance (as opposed to creating a new one)
        if self.instance.pk is not None:
            qs = qs.exclude(pk=self.instance.pk)
Basically, the pk is both included in the uniqueness check and excluded from it. It looks like Django could be smarter here and avoid such inefficiency.
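The redundancy is easy to see outside SQL: when the unique field is the pk itself, filtering on pk == x and then excluding pk == x can never match anything, so the query could be skipped entirely. A tiny sketch of the same predicate logic over plain dicts (illustrative data):

```python
rows = [{'id': 1, 'my_field': 'a'}, {'id': 2, 'my_field': 'b'}]

def unique_check(rows, pk):
    # Mirrors filter(pk=pk) followed by exclude(pk=pk): the two
    # conditions contradict each other, so the result is always empty.
    matched = [r for r in rows if r['id'] == pk]
    return [r for r in matched if r['id'] != pk]

result = unique_check(rows, 1)  # empty regardless of the data
```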
I haven't looked at it in detail, but I think you are right that Django could shortcut this query. Please file a ticket at http://code.djangoproject.com/.
Looks like this has already been fixed in trunk (by adding new functionality that also fixes this particular problem).