In this topic I found a good way to prevent cascade deletion of related objects when it's not necessary.
class Factures(models.Model):
    idFacture = models.IntegerField(primary_key=True)
    idLettrage = models.ForeignKey('Lettrage', db_column='idLettrage', null=True, blank=True)

class Paiements(models.Model):
    idPaiement = models.IntegerField(primary_key=True)
    idLettrage = models.ForeignKey('Lettrage', db_column='idLettrage', null=True, blank=True)

class Lettrage(models.Model):
    idLettrage = models.IntegerField(primary_key=True)

    def delete(self):
        """Detaches factures and paiements from the current Lettrage before deleting."""
        self.factures_set.clear()
        self.paiements_set.clear()
        super(Lettrage, self).delete()
But this method seems to fail when we use a ForeignKey field with the "related_name" parameter.
As far as I can tell, the clear() method works fine and saves the instances of the disassociated objects. But then, while deleting, Django uses another in-memory copy of those very objects, and since that copy is still associated with the object we are trying to delete - whoosh, bye-bye to the related rows :)
The database was architected before me, and in a somewhat odd way, so I can't get rid of these related_names in a reasonable amount of time. Has anybody heard of a workaround for this kind of trouble?
What about re-reading the object and deleting that instead?
to_delete = self.__class__.objects.get(pk=self.pk)
to_delete.delete()
That way the deleted object is a fresh copy. The remaining problem is doing everything else the original delete() method has to do properly, like sending signals, returning the correct value, etc.
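For example, folded into the delete() override from the question, it might look like this (just a sketch: super(Lettrage, to_delete) bypasses the override so the fresh copy is deleted without recursing, and signals and the return value are whatever Model.delete() normally provides):

class Lettrage(models.Model):
    idLettrage = models.IntegerField(primary_key=True)

    def delete(self):
        """Detaches factures and paiements, then deletes a freshly loaded copy."""
        self.factures_set.clear()    # use the related_name accessors here if they are set
        self.paiements_set.clear()
        # Re-read so the instance being deleted carries no stale cached relations
        to_delete = self.__class__.objects.get(pk=self.pk)
        return super(Lettrage, to_delete).delete()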
This is a duplicate of Django dynamic models.FileField Storage, but that question hasn't been answered with a correct solution yet.
I have a similar use case: I need to dynamically change the storage based on a model field.
I have tried using a callable for storage (https://docs.djangoproject.com/en/3.1/topics/files/#using-a-callable), but I think this callable gets called before the model field values are initialized.
Edit:
The code sample I have:
class MediaDocument(models.Model):
    file_name = models.CharField(max_length=255)
    object_storage_name = models.CharField(max_length=255, null=True)
    file = models.FileField(upload_to=mediadocument_directory_path, storage=select_storage(object_storage_name))
I want to do something like this:
def select_storage(object_storage_name):
    if object_storage_name == 'alibaba OSS':
        return AlibabaStorage
    else:
        return MediaStorage
Please suggest how I can do this.
Luckily, I found this article on the internet: https://medium.com/@hiteshgarg14/how-to-dynamically-select-storage-in-django-filefield-bc2e8f5883fd
It solves a similar use case.
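One way to get per-instance storage selection, roughly in the spirit of that article, is a custom FileField whose pre_save picks the storage from the instance. A sketch (assuming AlibabaStorage, MediaStorage and mediadocument_directory_path exist as in the question):

from django.db import models
from django.db.models.fields.files import FieldFile

def select_storage(instance):
    # Pick a storage backend from the instance's field value
    # (AlibabaStorage / MediaStorage are assumed to be defined elsewhere)
    if instance.object_storage_name == 'alibaba OSS':
        return AlibabaStorage()
    return MediaStorage()

class DynamicStorageFieldFile(FieldFile):
    def __init__(self, instance, field, name):
        super().__init__(instance, field, name)
        # Reads and writes on this file go through the storage matching this instance
        self.storage = select_storage(instance)

class DynamicStorageFileField(models.FileField):
    attr_class = DynamicStorageFieldFile

    def pre_save(self, model_instance, add):
        # By pre_save time the instance's fields are populated,
        # so the storage can be chosen per instance before the upload happens
        self.storage = select_storage(model_instance)
        return super().pre_save(model_instance, add)

class MediaDocument(models.Model):
    file_name = models.CharField(max_length=255)
    object_storage_name = models.CharField(max_length=255, null=True)
    file = DynamicStorageFileField(upload_to=mediadocument_directory_path)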
I have a database containing a list of ingredients. I'd like to avoid duplicate entries in this table. I don't want to use the unique keyword, for two reasons:
My uniqueness constraints are a bit more sophisticated than a simple equality check.
I don't want to raise an exception when a pre-existing ingredient model is created; instead I just want that model returned, so that I can write Ingredient(ingredient_name='tomato') and go on with my day rather than wrapping all of that in a try clause. This will let me easily add ingredients to my recipe table on the fly.
One solution is simply to have a wrapper function like create_ingredient, but I don't find that particularly elegant, and more importantly it's not robust against some other developer down the line simply forgetting to use the wrapper. So instead I'm playing around with the pre_init and post_init signals.
Here's what I have so far:
class Ingredient(models.Model):
    ingredient_name = models.CharField(max_length=200)
    recipes = models.ManyToManyField(Recipe, related_name='ingredients')

    def __str__(self):
        return self.ingredient_name

class Name(models.Model):
    main_name = models.CharField(max_length=200, default=None)
    equivalent_name = models.CharField(max_length=200, primary_key=True, default=None)
def _add_ingredient(sender, args, **kwargs):
    if 'ingredient_name' not in kwargs['kwargs']:
        return
    kwargs['kwargs']['ingredient_name'] = kwargs['kwargs']['ingredient_name'].lower()
    # check if an equivalent name exists; otherwise make this one the main name
    try:
        kwargs['kwargs']['ingredient_name'] = Name.objects.filter(
            equivalent_name=kwargs['kwargs']['ingredient_name']
        )[0].main_name
    except IndexError:
        name = Name(main_name=kwargs['kwargs']['ingredient_name'],
                    equivalent_name=kwargs['kwargs']['ingredient_name'])
        name.save()

pre_init.connect(_add_ingredient, Ingredient)
So far so good. This actually works and replaces ingredient_name when needed before the model is initialized. Now what I'd like is to check whether the ingredient in question already exists, and have the initializer return the existing one if it does. I think I need to play around with post_init to do this, but I don't know how to modify the particular instance that's being created. Here's what I mean by that:
def _finalize_ingredient(sender, instance, **kwargs):
    try:
        # doesn't work: rebinding the local name doesn't touch the instance the caller holds
        instance = Ingredient.objects.filter(ingredient_name=instance.ingredient_name)[0]
    except IndexError:
        pass

post_init.connect(_finalize_ingredient, Ingredient)
As I've noted in the comment, I don't expect this to work because instance = ... doesn't actually modify the instance; it just rebinds the variable name (incidentally, if you try to run this, all sorts of terrible things happen which I don't care to understand, because I know this is flat-out wrong). So how do I actually do this? I really hope wrapper functions aren't the cleanest option here. I'm a big fan of OOP and gosh darn it, I want an OOP solution to this (which, as I've said, I think would in the long run be much more robust and safer than wrappers).
I realize, of course, that I could add an add_ingredient method to Recipe which would do all of this for me, but I really like the idea of containing all of this in my Ingredient class, as it would guarantee the proper database behavior under any circumstance. I'm also curious whether (and how) the post_init signal can be used to completely override the created object in a given circumstance.
By the way, some of you may be wondering why I don't have a ForeignKey in my Name class connecting the Name table to the Ingredient table. After all, isn't that essentially what my check in _add_ingredient accomplishes? One reason is that if I did, I'd end up with the same problem I'm trying to solve here: to create an ingredient on the fly and add it to my recipe, I could simply create a Name object when creating an Ingredient object, but that would raise an exception if it corresponded to a main_name already in use (rather than simply returning the object I need).
I believe you are looking for get_or_create(), which is built into Django.
You mention:
One solution is simply to have a wrapper function like create_ingredient, but I don't find that to be particularly elegant and more specifically it's not robust to some other developer down the line simply forgetting to use the wrapper.
Well, look at it the other way around: what if you actually need to create a "duplicate" ingredient? Then it's nice to have the possibility.
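For reference, a minimal example of get_or_create() with the model above; it returns the existing row when there is one and creates it otherwise:

# Returns an (object, created) tuple; no exception if 'tomato' already exists
tomato, created = Ingredient.objects.get_or_create(ingredient_name='tomato')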
I've come up with something that is as elegant and robust as I think is possible given what I'm after. I still had to define an add_ingredient method, but I get the robustness that I need. I've made it so that it can be generalized to any class with a primary key, and the Name table contains the information that defines the name uniqueness of any such table:
class Name(models.Model):
    main_name = models.CharField(max_length=200, default=None)
    equivalent_name = models.CharField(max_length=200, primary_key=True, default=None)

def _pre_init_unique_fetcher(sender, args, **kwargs):
    pk_name = sender._meta.pk.name
    if pk_name not in kwargs['kwargs']:
        return
    kwargs['kwargs'][pk_name] = kwargs['kwargs'][pk_name].lower()
    # check if an equivalent name exists; otherwise make this one the main name
    try:
        kwargs['kwargs'][pk_name] = Name.objects.filter(
            equivalent_name=kwargs['kwargs'][pk_name]
        )[0].main_name
    except IndexError:
        name = Name(main_name=kwargs['kwargs'][pk_name],
                    equivalent_name=kwargs['kwargs'][pk_name])
        name.save()
    sender._input_dict = kwargs['kwargs']

def _post_init_unique_fetcher(sender, instance, **kwargs):
    pk_name = sender._meta.pk.name
    pk_instance = instance.__dict__[pk_name]
    filter_dict = {}
    filter_dict[pk_name] = pk_instance
    try:
        post_init.disconnect(_post_init_unique_fetcher, sender)
        instance.__dict__ = sender.objects.filter(**filter_dict)[0].__dict__
        post_init.connect(_post_init_unique_fetcher, sender)
        for key in sender._input_dict:
            instance.__dict__[key] = sender._input_dict[key]
        del sender._input_dict
    except IndexError:
        post_init.connect(_post_init_unique_fetcher, sender)
    except:
        post_init.connect(_post_init_unique_fetcher, sender)
        raise

unique_fetch_models = [Ingredient, Recipe, WeekPlan]
for unique_fetch_model in unique_fetch_models:
    pre_init.connect(_pre_init_unique_fetcher, unique_fetch_model)
    post_init.connect(_post_init_unique_fetcher, unique_fetch_model)
Now what this does is load any new model instance with the pre-existing data (rather than the default values) if an entry with the same name already exists. The reason I still need an add_ingredient method in my Recipe class is that I can't call Ingredient.objects.create() for a pre-existing ingredient without raising an exception, even though I can create the model and immediately save it. This has to do with how Django handles the primary_key designation: if you create the model and then save it, Django assumes you're just updating the existing entry with that key, whereas create() tries to insert a new row, which conflicts with the primary_key designation. So now I can do things like recipe.add_ingredient(Ingredient(ingredient_name='tomato', vegetarian=True)).
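Concretely, the save-versus-create difference looks like this (illustrative, assuming the name field is the model's primary key as in the generalized setup above):

ing = Ingredient(ingredient_name='tomato')
ing.save()                                           # INSERT the first time, UPDATE if 'tomato' exists
Ingredient.objects.create(ingredient_name='tomato')  # forces an INSERT, so it raises IntegrityError
                                                     # if 'tomato' already exists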
I'm looping through a list of objects and saving them. I need the newly generated id (or pointer id) right after the save, but it is None.
Here is my code:
for category in category_list:
    saved_category = category.save()
    print saved_category.parentCategory_ptr_id
    print saved_category.id
This saves my object after the routine is run, but again, does not give me the id at this line.
Here is my model:
class ParentCategory(models.Model):
    name = models.CharField(max_length=255)

class Category(ParentCategory):
    description = models.CharField(max_length=255)
The category list was created like so:
category_list = []
for row in value_list:
    category = Category(description=row.description)
    category_list.append(category)
return category_list
What am I doing wrong?
The problem is with:
saved_category = category.save()
save() returns None, so don't reassign anything; just call:
category.save()
The original object in the list is the one that carries the id after saving.
There is no need to reassign:
category.save()
When you call the save() method, the object is saved to the database and an id (primary key) is assigned to the object.
Saving objects (official Django docs)
The same question was asked on Stack Overflow and answered correctly by Daniel Roseman.
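To illustrate (a quick sketch with the models above):

category = Category(description='Dairy')
print category.id                      # None - not saved yet
category.save()                        # INSERT; Django fills in the primary key
print category.id                      # now set on this same object
print category.parentcategory_ptr_id   # note: the inheritance pointer attribute is lowercased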
I don't think the object is getting saved. It seems the object you are creating lacks sufficient data to satisfy the database constraints. If you have a try/except somewhere, it may be swallowing the error you would otherwise have seen. Try adding blank=True, null=True to name in ParentCategory, or provide a name when creating the object. I hope this works.
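For example, the suggested change would be (a sketch):

class ParentCategory(models.Model):
    name = models.CharField(max_length=255, blank=True, null=True)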
I've got a model that looks something like this:
class SecretKey(Model):
    user = ForeignKey('User', related_name='secret_keys')
    created = DateTimeField(auto_now_add=True)
    updated = DateTimeField(auto_now=True)
    key = CharField(max_length=16, default=randstr(length=16))
    purpose = PositiveIntegerField(choices=SecretKeyPurposes)
    expiry_date = DateTimeField(default=datetime.datetime.now()+datetime.timedelta(days=7), null=True, blank=True)
You'll notice that the default value for key is a random 16-character string. Problem is, I think this value is getting cached and being used several times in a row. Is there any way I can get a different string every time? (I don't care about uniqueness/collisions)
Yes: that default is evaluated only once, when the model class is first loaded, not each time you create a new instance of SecretKey.
The solution is to make the default value a callable; in that case the function is called each time a new instance is created.
def my_random_key():
    return randstr(16)

class SecretKey(Model):
    key = CharField(max_length=16, default=my_random_key)
You could, of course, also set the value in the model's __init__ function, but callables are cleaner and will still work with standard syntax like model = SecretKey(key='blah').
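Note that the expiry_date default in the question has the same issue: datetime.datetime.now() + datetime.timedelta(days=7) is evaluated once when the class is loaded. The same callable trick fixes it (a sketch):

import datetime

def default_expiry():
    # evaluated each time an instance is created, not once at import time
    return datetime.datetime.now() + datetime.timedelta(days=7)

class SecretKey(Model):
    ...
    expiry_date = DateTimeField(default=default_expiry, null=True, blank=True)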
I'm trying to store sections of a document in a Django app. The model looks like:
class Section(models.Model):
    project = models.ForeignKey(Project)
    parent_section = models.ForeignKey('Section', blank=True, null=True, related_name='child_set')
    predecessor_section = models.ForeignKey('Section', blank=True, null=True, related_name='predecessor_set')
    name = models.CharField(max_length=100)
    text = models.TextField(blank=True, null=True)
I create a whole lot of sections, link them up (parent_section, predecessor_section) and store them by calling each of their save methods. However, when I look at the table after saving, parent_section_id and predecessor_section_id are not set, even though I had objects attached to them before saving.
I assume it has to do with the fact that some parent_section instances don't have an id assigned yet, since they haven't been stored, but using manual transactions didn't solve the problem.
Any thoughts on that?
Cheers,
Max
Objects do not have an id until you save them with the Django ORM.
So I'd say you need to save() the object first, then reference it in your parent/child sections (and re-save those sections).
However, an alternative to storing predecessor and successor pointers is to store a sequence_index (spaced by 10 to allow later inserts without reordering) and order by that index.
Try doing a save() on all the objects, then update their relations, and then save() all of them again.
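Applied to the bulk case from the question, that could look like this (a sketch; sections is assumed to be the list of linked-but-unsaved Section objects, and re-assigning the cached related object copies its now-existing id onto the foreign key column):

# First pass: give every section a primary key
for section in sections:
    section.save()

# Second pass: re-assign the cached relations so their ids get copied, then save again
for section in sections:
    section.parent_section = section.parent_section
    section.predecessor_section = section.predecessor_section
    section.save()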
When you assign a foreign key, the related (target) object's id is copied. Since at the moment of assigning the relations (parent_section, predecessor_section) the related objects don't have an id yet, you get a funky result:
A = Section(name='A')
B = Section(name='B')
B.parent_section = A
A.save()
B.save()
B.parent_section     # this will say A
B.parent_section_id  # this will say None
But this should work:
A = Section(name='A')
B = Section(name='B')
A.save()
B.save()
B.parent_section = A
B.parent_section     # this will say A
B.parent_section_id  # this will say A.id
B.save()             # don't forget this one :)