Let us say I have a model which contains related (foreign key) fields. Likewise, those Foreign Key fields may refer to models which may or may not contain related fields. Note that relational fields in Django may be one-to-one, many-to-one, or many-to-many.
Now, given an instance of a model, I want to recursively and dynamically get all instances of the models related to it, either directly or indirectly down the line. Conceptually, I want to perform a traversal of the related objects and return them.
Example:
class Model1(models.Model):
    rfield1 = models.ForeignKey("Model2")
    rfield2 = models.ManyToManyField("Model3")
    normalfield1 = models.CharField(max_length=50)

class Model2(models.Model):
    sfield = models.ForeignKey("Model3")
    normalfield = models.CharField(max_length=50)

class Model3(models.Model):
    normalfield = models.CharField(max_length=50)
Let's say I have an instance model1 of the model Model1, and I want to get the objects directly related to it, i.e. all Model2 and Model3 objects, and also those which are indirectly related, i.e. all Model3 objects related to the Model2 objects retrieved previously. I also want to consider the case of a one-to-one field where the related field is defined on the OTHER model.
Also, note that it might not be the case that I know the model of the instance I'm currently working on. In the previous example, I may not know that model1 is an instance of the Model1 model. So I want to do all of this dynamically.
In order to do this, I think I need a way to get all related fields of an object.
How to get all the related fields of an object?
And how should I use them to get the actual related objects?
Or is there a better way to do this? Thank you very much!
UPDATE:
I already know how to perform 1, and 2 basically follows directly from 1. :) Update later.
If you have model1, getting all of its many-to-many field names (etc.) is easy, since this is well known and they are all stored in the meta's local_many_to_many list:
[field.name for field in model1._meta.local_many_to_many]
The foreign keys are a bit trickier, since they are stored with all the other fields in the meta's local_fields list. Hence we need to check that a field has a relation of some sort. This can be done as follows:
[field.name for field in model1._meta.local_fields if field.rel]
This method requires no knowledge of your models. Further interrogation can be done on the field object if the name is not enough.
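On newer Django versions (1.8+), the public _meta.get_fields() API covers both cases, including reverse relations defined on other models. Here is a rough sketch of a recursive traversal built on it; collect_related is an illustrative name, not a Django API, and a real version would want a depth limit and smarter querying:

def collect_related(instance, seen=None):
    """Recursively collect every object reachable from `instance` via
    forward FK/OneToOne/M2M fields and reverse relations (a sketch:
    no depth limit, no query batching)."""
    if seen is None:
        seen = set()
    key = (instance.__class__, instance.pk)
    if key in seen:
        return []
    seen.add(key)

    found = []
    for field in instance._meta.get_fields():
        if not field.is_relation:
            continue
        # Reverse relations are auto-created ForeignObjectRel objects and are
        # reached through their accessor name; forward fields through field.name.
        if field.auto_created and not field.concrete:
            accessor = field.get_accessor_name()
        else:
            accessor = field.name
        value = getattr(instance, accessor, None)
        if value is None:
            continue
        if field.many_to_many or field.one_to_many:
            # Managers: forward/reverse M2M and reverse FK
            for obj in value.all():
                found.append(obj)
                found.extend(collect_related(obj, seen))
        else:
            # Single object: forward FK, or OneToOne in either direction
            found.append(value)
            found.extend(collect_related(value, seen))
    return found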
I have an Item model that is reverse-related to two other models (ItemComponent and/or ItemComponentCategory). The idea is that I'd like to be able to validate that Items have no more than 4 relations to the two other models, combined, before being saved.
class Item(models.Model):
    name = models.CharField(max_length=40, unique=True)

class ItemComponent(models.Model):
    parent_item = models.ForeignKey(Item, related_name='components', on_delete=models.CASCADE)

class ItemComponentCategory(models.Model):
    parent_item = models.ForeignKey(Item, related_name='categories', on_delete=models.CASCADE)
I'd like to create a validation that raises an error before saving either the Item, ItemComponent, or ItemComponentCategory objects if the saved objects will result in > 4 object relations between them.
I have tried adding something like this to the clean methods for all three:
def clean(self):
    if (self.parent_item.components.count() + self.parent_item.categories.count()) > 4:
        raise ValidationError(_('Items can have no more than 4 components and/or component categories'))
This seems to work as long as the Items and their relations are already saved with 4 and you're trying to add more relations.
However, if I create a TabularInline in the ItemAdmin to add these 'sub types', if you will, I can create a new Item, add as many of these sub types as I like, and save it with no problem.
What am I missing here?
It seems my issue is related to timing and order of processes.
When saving an admin page with inline references, the model of the main page is saved first, then the inline objects. This poses an issue when trying to use a model's clean() method to validate against the relation, because the relation simply does not exist yet.
As mentioned in THIS article, it seems my solution is to modify the ModelAdmin methods to ensure that the inline objects are saved first, and then validate the main object after they're saved.
Also, I think this save_related method of ModelAdmin will come in handy.
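For illustration, a rough sketch of what that could look like with ModelAdmin.save_related (the ItemAdmin and inline classes below are assumed, not taken from the question; note that raising ValidationError here surfaces as a server error rather than a friendly form message, so a polished version would validate in the inline formsets' clean() instead):

from django.contrib import admin
from django.core.exceptions import ValidationError

class ItemComponentInline(admin.TabularInline):
    model = ItemComponent

class ItemComponentCategoryInline(admin.TabularInline):
    model = ItemComponentCategory

@admin.register(Item)
class ItemAdmin(admin.ModelAdmin):
    inlines = [ItemComponentInline, ItemComponentCategoryInline]

    def save_related(self, request, form, formsets, change):
        # Let the inline objects save first so the reverse relations exist...
        super().save_related(request, form, formsets, change)
        # ...then check the combined total on the freshly saved Item.
        item = form.instance
        if item.components.count() + item.categories.count() > 4:
            raise ValidationError(
                'Items can have no more than 4 components and/or component categories'
            )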
Is it possible to prefetch objects on the target instance without a related name (related_name="+")? Sure, I know it's not a problem with a related name, but I'm not really sure if it's possible without it.
Here is the example code:
from django.db import models

class Parent(models.Model):
    name = models.CharField(max_length=50)

class Child(models.Model):
    parent = models.ForeignKey(to=Parent, related_name="+", on_delete=models.CASCADE)
    name = models.CharField(max_length=50)

Parent.objects.all().prefetch_related('child_set')
Maybe it's possible using the Prefetch(lookup, queryset=None, to_attr=None) object, because it takes the queryset in the argument list?
Looked through the code a bit and found this line:
rel_obj_descriptor = getattr(instance.__class__, through_attr, None)
Here instance is the model instance, and through_attr is the field name of related instance to be fetched. This line basically tries to get a related descriptor to perform the prefetch query. In your case rel_obj_descriptor would contain None.
To answer your question: no, it is not possible, at least for a foreign key; there may be some hack for many-to-many relationships, as Django appears to use some internal descriptors for them.
I would advise you to simply not set related_name="+", since you want to use the backwards relation here. You say "It's because of separation of concerns between multiple apps", but that does not make much sense. Don't we set a foreign key to the user model from various other models anyway and still use the related name? Does the point of separation of concerns arise there (the user model is in a separate app)?
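A minimal sketch of that advice, using an explicit related_name (the name children is just an example):

from django.db import models
from django.db.models import Prefetch

class Parent(models.Model):
    name = models.CharField(max_length=50)

class Child(models.Model):
    # Expose the reverse relation instead of hiding it with "+"
    parent = models.ForeignKey(Parent, related_name="children", on_delete=models.CASCADE)
    name = models.CharField(max_length=50)

# The reverse accessor now exists, so prefetching works:
parents = Parent.objects.prefetch_related("children")

# Or with a Prefetch object to customise the prefetched queryset:
parents = Parent.objects.prefetch_related(
    Prefetch("children", queryset=Child.objects.order_by("name"))
)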
try
parent = Parent.objects.get(id=pk)
parent.child_set.all()
I don't know if having related_name = '+' prevents this situation, but if you never define related_name, you can definitely use this method.
I think this is best explained with some simple model code (I'm writing this from scratch so possible syntax issues - unimportant here):
class Car(models.Model):
    make = models.CharField(...)
    model = models.CharField(...)

class StatisticType(models.Model):
    name = models.CharField(...)

class Statistic(models.Model):
    car = models.ForeignKey('Car')
    stype = models.ForeignKey('StatisticType')
    data = models.CharField(...)

    class Meta:
        unique_together = (('car', 'stype'),)
We have a car with some hard-coded stats and we have some database-controlled statistics. I might add Colours, Wheel Size, etc. The point is it's editable from the admin, so neither I nor the client needs to climb through the data, but it's limited so users can only pick one of each stat (you can't define "Colours" twice).
So I'm trying to write the data input form for this now and I want a list of optional ModelForms that I can chuck on the page. I've got the simplest ModelForm possible:
class StatisticForm(forms.ModelForm):
    class Meta:
        model = Statistic
The tricky part (in my head) is generating an instance of this ModelForm for each StatisticType, regardless of it existing yet. That is to say if a Car object doesn't have a Colour assigned to it, the form still shows. Similarly, if it does, that instance of a Statistic is loaded in the ModelForm.
In my view, how do I generate a list of these things, regardless of there being a pre-existing instance of any given Statistic?
This seems like it should be a stupidly simple thing to do but it's late on Friday and everything looks skwonky.
Sounds like you might want to leverage an inline model formset factory.
That would allow you to create as many instances of your Statistic object as you need. If you're needing to create instances of your StatisticType on the fly, that's a bit different.
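For illustration, a sketch of how an inline formset could drive such a view, assuming the Car/Statistic models above (the view name, template, and URL name are made up):

from django.forms.models import inlineformset_factory
from django.shortcuts import get_object_or_404, redirect, render

# One form per Statistic row attached to a Car; the extra blank forms let the
# user add stats that don't exist yet. Pinning exactly one form per
# StatisticType would need initial data on top of this.
StatisticFormSet = inlineformset_factory(Car, Statistic, fields='__all__', extra=3)

def edit_car_stats(request, car_id):  # hypothetical view
    car = get_object_or_404(Car, pk=car_id)
    if request.method == 'POST':
        formset = StatisticFormSet(request.POST, instance=car)
        if formset.is_valid():
            formset.save()
            return redirect('car-detail', pk=car.pk)  # hypothetical URL name
    else:
        formset = StatisticFormSet(instance=car)
    return render(request, 'car_stats.html', {'formset': formset})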
When Django instantiates forms, for a foreign key, m2m or choice field, it will only accept choices that it deems "valid", and will complain if you add a choice using JavaScript that doesn't exist in a related model or set of choices server-side.
So, if you need to make StatisticTypes on the fly, and then populate formset instances with this new value, I would suggest using Knockout.js. It's very good at keeping lots of DOM elements in sync when data changes.
I've searched around stack overflow for an answer to this (probably simple) question, but most of the solutions I see seem overly complicated and hard to understand.
I have a model "Post" which is an abstract base class. Models "Announcement" and "Event" inherit from Post.
Right now I'm keeping related lists of Events and Announcements in other models. For instance, I have "removed_events" and "removed_announcements" fields in another model.
However, in my project, "removed_events" and "removed_announcements" are treated exactly the same way. There is no need to disambiguate between a "removed event" and a "removed announcement." In other words, a field keeping track of "removed_posts" would be sufficient.
I don't know how to (or perhaps can't) create a field "removed_posts," since Post is abstract. However, right now I feel like I'm repeating myself in the code (and having to do a lot of clutter-some checks to figure out whether the post I'm looking at is an event or an announcement and add it to the appropriate removed field).
What is the best option here? I could make Posts non-abstract, but Post objects themselves should never be created, and I don't think I can enforce this on a non-abstract object.
My understanding of databases is weak, but I'm under the impression that making Post non-abstract would complicate the database due to joins. Is this a big deal?
Finally, there are other fields in other models where I'd like to condense things that amount to an event_list and an announcement_list into a post_list, but those fields do need to be disambiguated. I could filter the post_list based on post type, but the call to filter() would be slower than being able to directly access the event and announcement lists separately, wouldn't it? Any suggestions here?
Thanks a ton for reading through this.
There are two kinds of model subclassing in Django: abstract base classes and multi-table inheritance.
Abstract Base Classes aren't ever used by themselves, and do not have a database table or any form of identification. They are simply a way of shortening code, by grouping sets of common fields in code, not in the database.
For example:
class Address(models.Model):
    street = ...
    city = ...

    class Meta:
        abstract = True

class Employee(Address):
    name = ...

class Employer(Address):
    employees = ...
    company_name = ...
This is a contrived example, but as you can see, an Employee isn't an Address, and neither is an Employer. They just both contain fields relating to an address. There are only two tables in this example; Employee, and Employer - and both of them contain all the fields of Address. An employer address can not be compared to an employee address at the database level - an address doesn't have a key of its own.
Now, with multi-table inheritance, (remove the abstract=True from Address), Address does have a table all to itself. This will result in 3 distinct tables; Address, Employer, and Employee. Both Employer and Employee will have a unique foreign key (OneToOneField) back to Address.
You can now refer to an Address without worrying about what type of address it is.
for address in Address.objects.all():
    try:
        print address.employer
    except Employer.DoesNotExist:  # must have been an employee
        print address.employee
Each address will have its own primary key, which means it can be saved in a fourth table on its own:
class FakeAddresses(models.Model):
    address = models.ForeignKey(Address)
    note = ...
Multi-table Inheritance is what you're after, if you need to work with objects of type Post without worrying about what type of Post it is. There will be an overhead of a join if accessing any of the Post fields from the subclass; but the overhead will be minimal. It is a unique index join, which should be incredibly quick.
Just make sure that, if you need access to the Post, you use select_related on the queryset.
Events.objects.select_related(depth=1)
That will avoid additional queries to fetch the parent data, but will result in the join occurring. So only use select_related if you need the Post.
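Applied to the original models, a minimal multi-table sketch might look roughly like this (the Feed model and field names are only illustrative stand-ins for the "other models" holding the lists):

class Post(models.Model):
    # Concrete base class: Post gets its own table and primary key, and
    # Announcement/Event each get a table joined to it via an implicit OneToOneField.
    title = models.CharField(max_length=100)  # hypothetical common field

class Announcement(Post):
    pass

class Event(Post):
    pass

class Feed(models.Model):
    # A single field can now track removed Announcements and Events alike.
    removed_posts = models.ManyToManyField(Post)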
Two final notes; if a Post can be both an Announcement AND an Event, then you need to do the traditional thing, and link to Post via a ForeignKey. No subclassing will work in this case.
The last thing is that if the joins are performance critical between the parent and the children, you should use abstract inheritance; and use Generic Relations to refer to the abstract Posts from a table that is much less performance critical.
Generic Relations essentially store data like this:
class GenericRelation(models.Model):
    model = ...
    model_key = ...

class DeletedPosts(models.Model):
    post = models.ForeignKey(GenericRelation)
That will be a lot more complicated to join in SQL (django helps you with that), but it will also be less performant than a simple OneToOne join. You should only need to go down this route if the OneToOne joins are severely harming performance of your application which is probably unlikely.
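In terms of Django's actual contenttypes framework (the schematic above is pseudocode), a sketch of the same idea on current Django versions might look like this:

from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from django.db import models

class DeletedPost(models.Model):
    # The schematic's "model" / "model_key" pair becomes a ContentType FK
    # plus the primary key of the linked row.
    content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
    object_id = models.PositiveIntegerField()
    post = GenericForeignKey('content_type', 'object_id')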
Generic relationships and foreign keys are your friends on the path to success. Define an intermediate model where one side is generic; the other side then gets a related list of polymorphic models. It's just a little more complicated than a standard m2m join model, in that the generic side has two columns: one to ContentType (actually an FK) and the other to the PK of the actual linked model instance. You can also restrict which models can be linked using standard FK parameters.
You'll get used to it quickly.
(Now that I have an actual keyboard to write with, here is the example:)
class Post(models.Model):
    class Meta: abstract = True
    CONCRETE_CLASSES = ('announcement', 'event',)
    removed_from = generic.GenericRelation('OwnerRemovedPost',
        content_type_field='content_type',
        object_id_field='post_id',
    )

class Announcement(Post): pass

class Event(Post): pass

class Owner(models.Model):
    # non-polymorphic m2m
    added_events = models.ManyToManyField(Event, null=True)

    # polymorphic m2m-like property
    def removed_posts(self):
        # can't use ManyToManyField with through.
        # can't return a QuerySet b/c it would be a union.
        return [i.post for i in self.removed_post_items.all()]

    def removed_events(self):
        # using Post's GenericRelation
        return Event.objects.filter(removed_from__owner=self)

class OwnerRemovedPost(models.Model):
    content_type = models.ForeignKey(ContentType,
        limit_choices_to={'name__in': Post.CONCRETE_CLASSES},
    )
    post_id = models.PositiveIntegerField()
    post = generic.GenericForeignKey('content_type', 'post_id')
    owner = models.ForeignKey(Owner, related_name='removed_post_items')

    class Meta:
        unique_together = (('content_type', 'post_id'),)  # to fake FK constraint
You can't filter into the related collection like a classic many-to-many, but with the proper methods in Owner, and using the concrete classes' managers smartly, you get everywhere you want.
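For example, usage of the models above might look roughly like this (illustrative only):

from django.contrib.contenttypes.models import ContentType

# Mark an Event as removed for a given owner:
event = Event.objects.create()
owner = Owner.objects.create()
OwnerRemovedPost.objects.create(
    content_type=ContentType.objects.get_for_model(event),
    post_id=event.pk,
    owner=owner,
)

# Mixed list of removed posts (Announcements and Events alike):
owner.removed_posts()

# Only the removed Events, via the GenericRelation defined on Post:
owner.removed_events()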
I've got two models: Common and ARecord. ARecord has a ForeignKey relationship to Common. I want to ensure that ARecord is unique with a combination of items from ARecord and Common.
class Common(models.Model):
    NAIC_number = models.CharField(max_length=5)
    file_location_state = models.CharField(max_length=2)
    file_location_code = models.CharField(max_length=2)

class ARecord(models.Model):
    common = models.ForeignKey(Common)
    coverage_code = models.CharField(max_length=6)
    record_type = models.CharField(max_length=1)

    class Meta:
        unique_together = ('coverage_code', 'common__NAIC_number')
However, when I attempt to access the foreign key object property via the usual double underscore, I get a model validation error.
`arecord.arecord: "unique_together" refers to common__NAIC_number, a field that doesn't exist. Check your syntax.`
This seems like it should be possible, and a slightly different question was asked that indicates it is, but perhaps I'm missing something obvious?
As Manoj implies, you can't do this with unique_together, because that is a database constraint and the sort of thing you want can't be done with database constraints.
Instead, you want to do this programmatically, probably via model validation, which will ensure that no instances are created that violate your constraint.
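A rough sketch of what that validation might look like on ARecord (a sketch only: it is enforced when full_clean() is called, and unlike a real database constraint it is subject to race conditions):

from django.core.exceptions import ValidationError
from django.db import models

class ARecord(models.Model):
    common = models.ForeignKey(Common, on_delete=models.CASCADE)
    coverage_code = models.CharField(max_length=6)
    record_type = models.CharField(max_length=1)

    def clean(self):
        # Reject a second ARecord whose coverage_code and related
        # Common.NAIC_number match an existing row.
        clash = (
            ARecord.objects
            .filter(coverage_code=self.coverage_code,
                    common__NAIC_number=self.common.NAIC_number)
            .exclude(pk=self.pk)
        )
        if clash.exists():
            raise ValidationError(
                'An ARecord with this coverage_code and NAIC_number already exists.'
            )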
This doesn't make sense to me. The documentation defines unique_together thus:
This is a list of lists of fields that must be unique when considered together. It's used in the Django admin and is enforced at the database level (i.e., the appropriate UNIQUE statements are included in the CREATE TABLE statement).
(Emphasis added)
I don't know how a UNIQUE statement can be added at the database level for such a case (using one column in the current table and another in a different table accessed through a foreign key). I hope those who know better about databases will correct me if I am wrong.
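For contrast, a sketch of what unique_together can express: only columns that live on the same table, such as the foreign key column itself plus coverage_code (the generated SQL shown is approximate):

class ARecord(models.Model):
    common = models.ForeignKey(Common, on_delete=models.CASCADE)
    coverage_code = models.CharField(max_length=6)

    class Meta:
        # Both columns are on the arecord table, so the database can enforce this:
        unique_together = ('common', 'coverage_code')

# The resulting CREATE TABLE contains, roughly:
#   UNIQUE ("common_id", "coverage_code")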