I'm not entirely sure how this engine works, but let me set up a scenario. I have a Django-Mongo project with say a model class like:
class BaseModel(models.Model):
    created_at = models.DateTimeField(null=False, auto_now_add=True)
    modified_at = models.DateTimeField(null=False, auto_now=True)
I create a bunch of base models and everything is all lovely in the project and I get lots of data. Then, later the project grows and we need to expand BaseModel. It suddenly becomes:
class BaseModel(models.Model):
    created_at = models.DateTimeField(null=False, auto_now_add=True)
    modified_at = models.DateTimeField(null=False, auto_now=True)
    active = models.BooleanField(default=True)
So I get that mongo will allow me to change this without having to touch the database, but say the very next thing I do is do a query like:
BaseModel.objects.filter(active=True)
Should I expect old records that need the default but have not been accessed yet to show up in this query? Or do I need to do something specific for the query engine to realize that the value I'm looking for matches the default?
OK, I found the answer after some trial and error. When Django executes the query, it looks at what Mongo has in the database and ignores any field defaults. However, once you fetch an object, Django puts the default into the in-memory object (but not into the Mongo record).
This seems kind of gross. What it means is that before we can query on the refactored field, we need to do something like this once:
for mod in BaseModel.objects.all():
    mod.save()
Needless to say, this is potentially a really expensive operation. Thankfully it only has to be run once, but still.
If there are better alternatives that'd be great.
Related
I have a strange situation where Django seems to be giving me records that do not actually exist in the database when queried via a related_name on a foreign key. Here's a simplified example:
Let's say I have a Person model and a Pet model, where each Pet has an owner, which is a foreign key on Person:
class Pet(models.Model):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
    name = models.CharField(max_length=50, null=False, db_index=True)
    owner = models.ForeignKey("Person", null=False, related_name="pets", on_delete=models.CASCADE)
    relationship = models.IntegerField(null=False, choices=PetRelationship.choices(), db_index=True)
Now, I have the below function that retrieves a person's pets:
def pet_tester(person):
    for pet in person.pets.filter(relationship=PetRelationship.FRIENDLY):
        pet_id = pet.id
        LOGGER.info(f"*************** pet.id = {pet_id}")
        LOGGER.info(f"******* Pet exists? = {Pet.objects.filter(id=pet_id).exists()}")
        ...
Note that this is NOT a "minimal reproducible example". I can only reproduce it in my much larger application.
For some reason (in the real app), the "exists" query is coming back as False. The output is like this:
*************** pet.id = 123e4567-e89b-12d3-a456-426614174000
******* Pet exists? = False
If I query the actual database (PostgreSQL) directly, outside of Django, that pet ID sure enough does NOT exist. However, the person.pets.filter query returns it just the same.
I do not understand how this is even possible. It JUST retrieved the pet from the database (or so it would seem - it even has a UUID), but it's not really there after all when I immediately try to query it back.
It seems like the Django "filter" results are somehow inconsistent with the underlying database, but I don't know why. This is very reproducible in my app (it happens every time, consistently). It doesn't appear to be a corrupt database, as it happens consistently even on brand new machines, with the app and database freshly installed.
I know you don't know what else my app is doing. I obviously can't post all the source code, but I can assure you that the beginning of the actual function is pretty much verbatim of the pet_tester function above. I only changed the class/field names. There are no extra lines in the actual app between the "for" line and the 3 lines below it.
I'm mostly looking for ideas about how this can possibly happen, and what to look for in the app's other code.
It turns out that there is another service that is deleting the "pets" from the database. There must be some kind of caching in Django, and since the deletion happened outside of Django, Django didn't invalidate its "person.pets" cache.
I am currently using django-shortuuidfield to generate a unique UUID primary key on the Customer model as shown below
class Customer(models.Model):
    customer_id = ShortUUIDField()
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    name = models.CharField(max_length=255, blank=False)
This guarantees that the customer_id field is unique across space and time. However, it generates an id similar to B9fcKdMDHbwKCBoADjbNyA, and I want to prefix it with cust_ to make it cust_B9fcKdMDHbwKCBoADjbNyA. How do I achieve this without making multiple db calls?
I also looked into django-hashid-field, which supports prefixes out of the box, but it does not guarantee a UUID, and at larger scale we may run into unique-constraint failures, which is not desirable.
Any thoughts on this? Let me know...
Probably a stale thread, but I just recently came across this problem myself. I ended up just forking and modifying ShortUUIDField module to include a prefix and suffix optionally. Kind of a hacky job I have to admit, but I thought I'd share for posterity.
https://github.com/nick-fournier/django-customshortuuidfield
Use it like:
class Business(models.Model):
    id = CustomShortUUIDField(primary_key=True, prefix="biz_")
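If forking the field isn't appealing, another option is a plain CharField with a module-level default callable that builds the prefixed id in Python, so no extra DB call is needed. This is a sketch under assumptions: make_customer_id and the base62 helper are hypothetical names, and the encoding is only similar in spirit to shortuuid (the real library uses a different alphabet):

```python
import string
import uuid

# Hypothetical 62-character alphabet; shortuuid itself uses a different
# (57-character) alphabet, so this is illustrative, not wire-compatible.
ALPHABET = string.digits + string.ascii_uppercase + string.ascii_lowercase

def short_uuid() -> str:
    """Base62-encode a random UUID4 into a compact string (<= 22 chars)."""
    n = uuid.uuid4().int
    chars = []
    while n:
        n, rem = divmod(n, 62)
        chars.append(ALPHABET[rem])
    return "".join(reversed(chars)) or ALPHABET[0]

def make_customer_id() -> str:
    """Module-level callable: serializable in migrations, unlike a lambda."""
    return "cust_" + short_uuid()

# The field would then be declared roughly like this (one insert, no extra
# query; uniqueness still enforced by the unique=True constraint):
#
# class Customer(models.Model):
#     customer_id = models.CharField(max_length=27, unique=True,
#                                    default=make_customer_id, editable=False)
```

Since the value is derived from a full UUID4, the collision risk is the same as with ShortUUIDField; the prefix is just cosmetic.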
I've run into an issue that I really haven't dealt with before. I have a task to upgrade from django 1 ==> 2. (django 1 doesn't require on_delete when dealing with relationships)
I have a couple of crucial models that have relationships inside, but I definitely don't want to CASCADE those records. For example, if a user deletes their account, I don't want their expenses to be deleted. Maybe we need to keep those expense instances for tax records later, etc.
I have read that DO_NOTHING can also be dangerous.
With a model like this, what would be the best course of action when dealing with the ForeignKeys?
I appreciate all the help in advance.
class Expenses(models.Model):
    user = models.ForeignKey(Account, null=True, blank=True,
                             on_delete=models.?)
    event = models.ForeignKey(Event, null=True, blank=True,
                              on_delete=models.?)
    payee = models.CharField(max_length=128, null=True, blank=True)
    category = models.ForeignKey(Category, blank=True, null=True,
                                 related_name='expense_category', on_delete=models.?)
I have a task to upgrade from django 1 to 2. (django 1 doesn't require on_delete when dealing with relationships)
In django-1.x, if you did not specify on_delete, it used CASCADE [Django-doc], so by specifying it explicitly you can in fact make it safer.
I have read that DO_NOTHING can also be dangerous.
Well, most databases will raise an integrity error for this, since the row would then refer to a user that no longer exists. So DO_NOTHING is not in itself dangerous: on most databases the delete will simply fail by raising an IntegrityError.
With a model like this, what would be the best course of action when dealing with the ForeignKeys?
Perhaps PROTECT [Django-doc] is more appropriate here, since it simply prevents deleting the object while it is still referenced.
The best solution, however, depends on a large number of details, so it is worth looking at all the possible on_delete=… strategies [Django-doc].
Not sure if this is a bug in Django, or if it just doesn't support what I'm trying to do (or how I'm doing it).
A snippet of my model:
class UserProfile(models.Model):
    user = models.OneToOneField(User, primary_key=True, related_name='profile')
    login_hash = models.CharField(max_length=36, blank=True, null=True, default=uuid.uuid4())
...
As you can see, I've set the default for login_hash to a call to uuid.uuid4().
It works fine... however, creating new UserProfiles in quick succession (even, it seems, within a few minutes of each other; I haven't timed it precisely) results in the same login_hash for multiple users.
It appears that Django (I'm on 1.7.4) is caching the result of uuid4() for some period of time, which is not good for what I'm trying to do.
SOLUTION (the one I'm using): I've simply set an 'on insert' trigger on the database, so that when I insert a new record the database generates the UUID, but only on inserts/new records.
Is there a way to do it within Django, so that I can keep it database-agnostic?
It works fine... however, creating new UserProfiles in quick succession results in the same login_hash for multiple users.
As the code is currently written, you're calling uuid.uuid4() once, at the point the UserProfile class is defined (i.e. at import time). The resulting value then becomes the default for every new row.
What you instead want to do is pass a callable as the default, like so: default=uuid.uuid4.
Also, for a CharField I'd strongly suggest not allowing NULL values in addition to blank values (the Django convention is to store empty strings rather than NULL). It's also not clear whether you really want to allow blank values for this field, but let's assume you do. You should end up with this:
login_hash = models.CharField(max_length=36, blank=True, default=uuid.uuid4)
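The evaluated-once pitfall is easy to demonstrate without Django at all. In the toy field below (a stand-in, not Django's API), get_default mimics what Django does: call the default if it is callable, otherwise reuse the stored value:

```python
import uuid

class Field:
    """Toy stand-in for a Django model field (not Django's API)."""

    def __init__(self, default=None):
        self.default = default

    def get_default(self):
        # Mimics Django: call the default if it is callable, else reuse the value
        return self.default() if callable(self.default) else self.default

broken = Field(default=uuid.uuid4())  # evaluated ONCE, at class-definition time
fixed = Field(default=uuid.uuid4)     # a callable, evaluated per new row

assert broken.get_default() == broken.get_default()   # same value every time
assert fixed.get_default() != fixed.get_default()     # fresh value each call
```

So the behavior isn't time-based caching at all; the "cache window" the question observed is simply the lifetime of the loaded module in a given server process.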
I have a foreign key from one model to another model in a different database (I know I shouldn't do this, but if I properly take care of referential integrity it shouldn't be a problem).
The thing is that everything works fine... the whole system does (relationships in either direction; the router takes care of it). But when I try to delete the referenced model (the one that doesn't have the foreign key attribute), Django still wants to walk the relationship to check whether it is empty, and since the related objects are in another database, it doesn't find them in this one.
I tried setting on_delete=models.DO_NOTHING with no success. I also tried clearing the relationship (but it turns out clear() doesn't take a using argument, so that doesn't work either), and tried emptying the relationship with delete(objects...), again with no success.
Now I am pretty sure the problem is in super(Object, self).delete(). I cannot do super(Object, self).delete(using=other_database), because the self object is not in the other database; only the RelatedManager is. So I don't know how to make Django understand that I don't even want it to check that relationship, which by the way was already emptied before the super(Object, self).delete() call.
I was wondering whether there is some method I can override to make Django skip this check.
More graphical:
DB1: "default" database (orders app)
from django.db import models
from shop.models import Order
class IOrder(models.Model):
    name = models.CharField(max_length=20, unique=True, blank=False, null=False)
    order = models.ForeignKey(Order, related_name='iorders', blank=True, null=True)
DB2: "other" database
class Order(models.Model):
    description = models.CharField(max_length=20, blank=False, null=False)

    def delete(self):
        # Delete related IOrder rows (if any) from the 'default' database
        for iorder in self.iorders.using('default'):
            iorder.delete()
        # Remove myself
        super(Order, self).delete()
The problem happens when super(Order, self).delete() is called: it cannot find the table (iorder) in this database, because it is in 'default'.
Any ideas? Thanks in advance.
I already resolved my issue by replacing super(Order, self).delete() with a raw SQL delete command. Still, I would love to know whether there is a more proper way of doing this.