Add relations before adding them to the DB in Django

If I have a relation like this in Django:

from django.db import models

class Reporter(models.Model):
    pass

class Article(models.Model):
    reporter = models.ForeignKey(Reporter)
and I want to create a new reporter together with its articles, I first have to save() the Reporter to the DB, and only then can I add the articles.
But sometimes I would like to prepare everything "offline" (in the sense of: before pushing anything to the DB), i.e. create a Reporter object, add articles to it, and maybe afterwards still modify some attributes of the Reporter object.
Then, when everything is done, I want to push it all to the DB together. But of course, if I use Reporter.article_set.add() before calling Reporter.save(), I get an error, because Django tries to add the articles and their foreign keys to the DB immediately. Is there any way to prevent this and prepare my object, including the relations, "offline"?
My own approach would be to add a set_articles() method to Reporter and then override the save() method so that it checks whether any articles have been set and adds them after saving the Reporter.
But before I start improvising, I would like to know whether there are already any solutions for this within Django.

The ModelForm save() method accepts an optional commit keyword argument, which accepts either True or False. If you call save() with commit=False, it returns an object that hasn't yet been saved to the database. In that case, it's up to you to call save() on the resulting model instance.
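If you are working with plain model instances rather than a ModelForm, another option is to keep the unsaved Article objects in an ordinary Python list and only attach them once the Reporter has a primary key, saving everything inside a single transaction. A minimal sketch, assuming a hypothetical headline field on Article for illustration:

from django.db import transaction

reporter = Reporter()
# headline is a hypothetical field used only for illustration
articles = [Article(headline="First"), Article(headline="Second")]

# ... keep modifying reporter and the articles in memory ...

with transaction.atomic():
    reporter.save()                  # now reporter has a primary key
    for article in articles:
        article.reporter = reporter  # attach the FK once the PK exists
    Article.objects.bulk_create(articles)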

Related

Migrations can't run due to evaluation of QuerySet in Serializer?

Trying to solve this chicken-and-egg issue. I've defined a new model which an existing model foreign keys into, and made the new migrations, including a data migration that provisions some defaults for the new model.
In the serializer for the existing model, I wanted to define a default like so:
new_thing = serializers.PrimaryKeyRelatedField(
    queryset=NewThing.objects.all(),
    default=NewThing.objects.get(id=DEFAULT_NEW_THING_ID),
)
I don't understand Django's specific mechanism here, but this seems to cause problems while running the migrations (although makemigrations seems fine). It seems the migration runs a series of app checks on the views and eventually the serializer, where it evaluates the NewThing QuerySet and returns an error that the relation for NewThing does not exist (since the migration hasn't been run yet).
You should never run actual queries in class-level definitions; this holds in Django generally as well as in DRF. get() is a query and will hit the database at definition time, whereas the queryset argument is lazy and therefore fine.
If you want a default that is based on an actual object, you should handle it at create time by defining the create() method, in which you check whether the value was supplied.
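A minimal sketch of that idea, assuming a DRF ModelSerializer for the existing model (ExistingThing and its serializer name are illustrative; DEFAULT_NEW_THING_ID is the constant from the question):

from rest_framework import serializers

class ExistingThingSerializer(serializers.ModelSerializer):
    # the queryset argument is lazy, so it is safe at class-definition time
    new_thing = serializers.PrimaryKeyRelatedField(
        queryset=NewThing.objects.all(),
        required=False,
    )

    class Meta:
        model = ExistingThing  # hypothetical existing model
        fields = "__all__"

    def create(self, validated_data):
        # only query for the default when the client did not supply a value
        if "new_thing" not in validated_data:
            validated_data["new_thing"] = NewThing.objects.get(id=DEFAULT_NEW_THING_ID)
        return super().create(validated_data)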

Why is an auto_now=True field designed not to update when using QuerySet.update()?

From the Django docs:
The field is only automatically updated when calling Model.save(). The field isn’t updated when making updates to other fields in other ways such as QuerySet.update(), though you can specify a custom value for the field in an update like that.
Both have to execute an update query, so what is the reason behind save() updating the auto_now=True field and QuerySet.update() not updating it?
An update query is meant to be faster than the usual change-a-field-and-save() pattern, so it does not call the save() method, which handles updating auto_now fields, sending signals, and so on. If you're not sure what you're doing, it's always a good idea to explicitly call save() on a model. Advanced and "less restricted" methods such as update() or bulk_create() are faster and meant for editing data at the DB level. From the Django docs:
Finally, realize that update() does an update at the SQL level and, thus, does not call any save() methods on your models, nor does it emit the pre_save or post_save signals (which are a consequence of calling Model.save()).
If you were hoping for a more technical explanation, then the update query probably doesn't bother to check if the table has an auto_now field. It would require some data gathering and make the process slower. If you do want to update the field, you can update it explicitly.
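For example, assuming a hypothetical MyModel with a modified field declared with auto_now=True (field names here are illustrative), you can pass the timestamp yourself when you go through update():

from django.utils import timezone

# save() goes through Model.save(), so the auto_now field is handled automatically
obj = MyModel.objects.get(pk=1)
obj.name = "new name"
obj.save()

# update() bypasses save(), so set the timestamp explicitly
MyModel.objects.filter(active=True).update(name="new name", modified=timezone.now())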

Django Haystack RealTimeSearchIndex on ManyToMany Relationships strange behaviour

Following a related (as yet unanswered) question, I did some investigation and found that the current implementation of Django Haystack's RealTimeSearchIndex makes no attempt to also update on related field (Many to Many) changes. I thought this would be an easy fix - after all, I could just extend RealTimeSearchIndex like this:
class RealTimeM2MSearchIndex(RealTimeSearchIndex):
    def _setup_save(self, model):
        signals.m2m_changed.connect(self.update_object, sender=model)
        signals.post_save.connect(self.update_object, sender=model)
But then I realized (or at least assumed, since it's not working) that this only works if the M2M field is defined on the model itself, and not if it's the "reverse" side of the M2M relationship. Trying to fix that, I then did something like the following:
signals.m2m_changed.connect(self.update_object, sender=model.related_field.through)
Where related_field is the name of the specific Model on the other side of the ManyToMany definition. Strangely enough, upon running, Django then complains that the Model has no such field, related_field.
And sure enough, if I inspect the object, Django has not yet extended the model to have the related_field field. If I inspect the same object when displaying a view, however, it does have that related_field.
Summary
So the problem seems to be that Django's automatic behavior to add an attribute to the reverse side of an M2M relationship has yet to happen when Haystack runs its code. How can I overcome this obstacle, and allow Haystack's RealTimeSearchIndex to also update on related field changes?
I think the simplest solution is to just use the built in RealTimeSearchIndex, and add a signal listener in your models.py to reindex the model on m2m_changed, or whenever. See my answer to the other question - you could easily modify it to index on m2m_changed instead of post_save.
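A rough sketch of that suggestion, assuming the Haystack 1.x site registry that RealTimeSearchIndex ships with (MyModel and its regions field are illustrative names):

from django.db.models import signals
import haystack

def reindex_on_m2m_change(sender, instance, action, **kwargs):
    # only reindex once the m2m change has actually been written
    if action in ("post_add", "post_remove", "post_clear"):
        haystack.site.get_index(MyModel).update_object(instance)

signals.m2m_changed.connect(reindex_on_m2m_change, sender=MyModel.regions.through)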
Just tried implementing this myself, and your problem is the value of the sender argument in this line:
signals.m2m_changed.connect(self.update_object, sender=model)
I read the documentation for the m2m_changed signal and the sender will be something like MyModel.my_field.through so you need to use that. This means you can't have a generic class as you are trying to do, but will need to define the _setup_save method in each case, with the m2m_changed signal connected for each ManyToMany field that model has.
For example, if your model had two ManyToManyFields called region and sector, you could do:
# we implement these to force the update when the ManyToMany fields change
def _setup_save(self, model):
    signals.m2m_changed.connect(self.update_object,
                                sender=MyModel.sector.through)
    signals.m2m_changed.connect(self.update_object,
                                sender=MyModel.region.through)
    signals.post_save.connect(self.update_object, sender=model)
You should also really define the _teardown_save() method:
def _teardown_save(self, model):
    signals.m2m_changed.disconnect(self.update_object,
                                   sender=MyModel.sector.through)
    signals.m2m_changed.disconnect(self.update_object,
                                   sender=MyModel.region.through)
    signals.post_save.disconnect(self.update_object, sender=model)
(This is based on code I have tested; it appears to work - certainly no errors about fields not existing.)
Update: Just read your question more closely. Is it possible that your model has the ManyToManyField added dynamically? Is the call to register() after you have defined all your classes?

Adding an item to django-cart (through ./manage shell)

I would like to add an Item entry into my database in my Django app, but I am having problems. I'm still learning Django (who isn't?), but I've done DB entries before. Part of the trouble is things like the Cart instance and the ContentType instances.
Generally I start with...
item1 = Item(Cart(...), ContentType(...), quantity='4', <etc.>)
And depending on what I put in, it will let me do that, but when I do item1.save(), it yells at me, and unfortunately the stack trace is hardly helpful. Or, maybe it's just me.
Any suggestions?
First suggestion is to post the stacktrace or even just the main exception; it's always more helpful to know what it's yelling.
My guess is first that you are passing in positional arguments and the model doesn't know what to do with which argument.
My second guess is that you are passing in unsaved instances like Item(Cart()...) to non-nullable foreign key fields, so Django or the database will complain if you don't pass in an actual Cart instance with an ID defined.
So, explicitly define which fields you are passing to the constructor, and ensure you are passing in saved instances (not Cart() but Cart.objects.get(id=X)):
cart = Cart.objects.latest('id')
item = Item(cart=cart, etc.)
item.save()
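For example, assuming django-cart's Item stores the product through Django's contenttypes framework with cart, quantity, unit_price, content_type and object_id fields (these names are an assumption; check the Item model in your installed version), a shell session could look like this:

from django.contrib.contenttypes.models import ContentType
from cart.models import Cart, Item   # assumes django-cart is installed as the 'cart' app
from myshop.models import Product    # hypothetical product model

cart = Cart.objects.latest('id')
product = Product.objects.get(id=1)

item = Item(
    cart=cart,
    quantity=4,
    unit_price=product.price,  # assumed field on the hypothetical Product model
    content_type=ContentType.objects.get_for_model(product),
    object_id=product.pk,
)
item.save()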

Django - logical delete

I want to make the following modification to the Django framework.
I want it to create a "deleted" field for each model I create.
I want records to be marked as deleted when I delete them from the admin page, instead of being physically deleted.
I do not want records marked as deleted to be listed.
I'm new to Django and I'm seeing whether I can do what I want with it easily. I need this change because it's the way we currently work.
So far these are the changes I have made. I would like to understand how the whole Django framework works inside, but I'm far from that. Is there any documentation online which clearly explains how the framework's internal parts/files/modules/classes work together, the specific role of each one, etc.?
In the base.py file, in the ModelBase metaclass, below this code:
for obj_name, obj in attrs.items():
    new_class.add_to_class(obj_name, obj)
I added,
from django.db import models
new_class.add_to_class('deleted', models.BooleanField())
When it creates a model it adds the "deleted" field to it.
In the base.py file, in the delete() method, I replaced what was there with:
self.deleted = True
self.save()
So now it marks a record as deleted instead of physically deleting it.
Now what I want is for those records not to be listed.
I don't understand why you're modifying the framework code instead of putting your deleted field in a model base class that all of your models extend from.
Nevertheless, a nice way to filter those records out would be to add a custom manager to the model (or your base model class, if you choose to create one). In this manager, override the get_query_set method (renamed get_queryset in later Django versions) as described here. In your overridden method, add an exclude(deleted=True) filter.
Take a look at the django-logicaldelete app. You just inherit your models from its provided Model class and you get logical delete for all of them.
It comes with an admin class as well, so you can manage logically deleted models there too.
Override the delete() method in your model class and set the deleted attribute there.
Create a custom manager which filters on the deleted attribute and set it as the default one (objects = MyDeletedManager()).
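A minimal sketch of that approach, using an abstract base class instead of patching base.py (LogicalDeleteModel and MyDeletedManager are illustrative names):

from django.db import models

class MyDeletedManager(models.Manager):
    def get_queryset(self):
        # hide logically deleted rows by default
        return super().get_queryset().filter(deleted=False)

class LogicalDeleteModel(models.Model):
    deleted = models.BooleanField(default=False)

    objects = MyDeletedManager()    # default manager: excludes deleted rows
    all_objects = models.Manager()  # escape hatch that still sees everything

    class Meta:
        abstract = True

    def delete(self, *args, **kwargs):
        # mark the row as deleted instead of removing it
        self.deleted = True
        self.save(update_fields=["deleted"])

Every model that should behave this way then inherits from LogicalDeleteModel instead of models.Model.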