Say I have some groups with some items. I would like items to have a unique index within a group:
class Item(models.Model):
    group = models.ForeignKey(Group, null=True)
    index = models.IntegerField(default=0)

    class Meta:
        unique_together = ('group', 'index')

    def save(self, *args, **kwargs):
        if self.pk is None and self.group_id is not None:
            self.index = Item.objects.filter(group_id=self.group_id).count() + 1
        return super(Item, self).save(*args, **kwargs)
But this is problematic because the update is not atomic, i.e. another transaction may add another row after I calculate index and before I write to the database.
I understand that I can get it working with catching IntegrityError and retrying. But I wonder if there's a good way to populate it atomically as part of the insert command, with SQL something like:
insert into app_item (group_id, "index")
select 25, count(*) + 1 from app_item where group_id = 25
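The insert-select above can be sketched end-to-end with plain sqlite3 (schema simplified to the two relevant columns; "index" is quoted because it is an SQL keyword), showing that successive inserts get consecutive per-group indices:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    'CREATE TABLE app_item (id INTEGER PRIMARY KEY, group_id INTEGER, '
    '"index" INTEGER, UNIQUE (group_id, "index"))'
)

def insert_item(conn, group_id):
    # Compute the next per-group index inside the INSERT itself, so there is
    # no window between reading the count and writing the row.
    conn.execute(
        'INSERT INTO app_item (group_id, "index") '
        'SELECT ?, COUNT(*) + 1 FROM app_item WHERE group_id = ?',
        (group_id, group_id),
    )

insert_item(conn, 25)
insert_item(conn, 25)
rows = conn.execute(
    'SELECT "index" FROM app_item WHERE group_id = 25 ORDER BY "index"'
).fetchall()
print(rows)  # [(1,), (2,)]
```

Note that under concurrent transactions with snapshot isolation two sessions can still compute the same count, so the unique constraint plus an IntegrityError retry remains the safety net.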
Since you're using Django models, what does an index field give you over the primary key that Django adds to every model unless you specify otherwise? Unless you need items to have sequential indices within a group, which would still be problematic when objects are deleted, the id field would appear to satisfy the requirement as you outlined it.
If you do have to have this field, an option would be to take advantage of signals: write a receiver for the model's post_save signal and handle it there.
If you have issues with the signals or custom save approaches, one possible alternative is to calculate the index at runtime by making it a property:
class Item(models.Model):
    group = models.ForeignKey(Group, null=True)

    def _get_index(self):
        "Returns the index of the item within its group"
        return Item.objects.filter(group_id=self.group_id).count() + 1

    index = property(_get_index)
Now use index like a field on Item instances.
There is a race condition when I create a new instance of the Order model.
There is a daily_id field that restarts from one each day for every category; that is, every category has its own daily id sequence.
class Order(models.Model):
    daily_id = models.SmallIntegerField(default=0)
    category = models.ForeignKey(Category, on_delete=models.PROTECT, related_name="orders")
    declare_time = models.DateField()
    ...
The daily_id field of a new record is calculated using this method:
def get_daily_id(category, declare_time):
    try:
        last_order = Order.objects.filter(declare_time=declare_time,
                                          category=category).latest('daily_id')
        return last_order.daily_id + 1
    except Order.DoesNotExist:
        # No order has been registered on the declare_time date yet.
        return 1
The problem is that when two different users register orders in the same category at the same time, it is highly likely that the orders get duplicate daily_id values.
I have tried the @transaction.atomic decorator on the post method of a DRF APIView and it didn't work!
You must use an auto-increment id and add a view that computes your semantic order, for example:
SELECT *, ROW_NUMBER() OVER (PARTITION BY MyDayDate ORDER BY id_autoinc) AS daily_id
FROM MyOrderTable
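A minimal sketch of that approach with sqlite3 (table and column names invented for the example): orders store only their auto-increment id, and daily_id is derived on read with ROW_NUMBER(), so no two orders can ever collide. In Django, the same annotation should be expressible with Window(expression=RowNumber(), partition_by=..., order_by=...).

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # window functions need SQLite >= 3.25
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY AUTOINCREMENT, "
    "category_id INTEGER, declare_time TEXT)"
)
conn.executemany(
    "INSERT INTO orders (category_id, declare_time) VALUES (?, ?)",
    [(1, "2024-01-01"), (1, "2024-01-01"), (2, "2024-01-01"), (1, "2024-01-02")],
)

# daily_id restarts at 1 for every (category, day) pair; ties are broken by
# the auto-increment id, so it reflects insertion order.
rows = conn.execute(
    "SELECT id, ROW_NUMBER() OVER ("
    "  PARTITION BY category_id, declare_time ORDER BY id"
    ") AS daily_id FROM orders ORDER BY id"
).fetchall()
print(rows)  # [(1, 1), (2, 2), (3, 1), (4, 1)]
```

Because daily_id is computed at read time, concurrent inserts only compete for the auto-increment id, which the database already serializes.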
I have a class/model in my models.py that can receive a TextField. I want to add a column to my database that stores the length of this TextField. What is the best way of doing that?
You can add a field that stores the character count value to your model:
class YourModel(models.Model):
    my_text_field = models.TextField()
    char_count = models.DecimalField(max_digits=3, decimal_places=0, blank=True, null=True)
And override the model save method to update the field on save:
def save(self, *args, **kwargs):
    if self.my_text_field:
        self.char_count = len(self.my_text_field)
    super().save(*args, **kwargs)
Adjust the field options (like max_digits) to suit your needs.
It depends how you want to use the length. The answer with the model property (@property) is good if you don't want to do complex queries involving the length, such as filtering by length, ordering, etc.
The solution with an extra database field solves the problem above, but it puts on you the duty of keeping the char_count column updated. If you need the length in your queries, a better way is to use the database LENGTH function, which Django exposes without any changes to your model. Examples, with an Article model that has a text CharField:
Order by article text length:
from django.db.models.functions import Length
sorted_qs = Article.objects.order_by(Length('text'))
If you print str(sorted_qs.query), you can see ORDER BY LENGTH("app_article"."text") ASC in the query.
Filter by length greater than x:
Article.objects.annotate(text_len=Length('text')).filter(text_len__gte=x)
See the doc on aggregation.
If you have a large database table, consider adding an index on the length. In PostgreSQL, that would be:
CREATE INDEX app_article_text_length_idx ON app_article (LENGTH(text));
You can define a property in your model that returns the length of the TextField.
Suppose your model is like this:
class ModelName(models.Model):
    str_field = models.TextField()

    @property
    def length(self):
        return len(self.str_field)
If my_model is an instance of this model, the code below returns the length of str_field:
my_model.length
This approach doesn't add a column to the database table, but you can still access the length of str_field.
I have three models defined like this:
class Equipment(models.Model):
    hostname = models.CharField(max_length=128)

class Interface(models.Model):
    internal_ip = models.GenericIPAddressField()
    index = models.IntegerField()
    equipment = models.ForeignKey("Equipment", on_delete=models.CASCADE)

class Event(models.Model):
    equipment = models.ForeignKey("Equipment", on_delete=models.CASCADE)
    index = models.IntegerField(null=True)
    datetime = models.DateTimeField(auto_now=True)
When an Event is created, it may or may not have an index. This index refers to the index of an Interface.
events = Event.objects.all()
events = events.annotate(interface_ip=Case(
    When(Q(index__isnull=False) & Q(equipment__interface__index=F('index')),
         then=F('equipment__interface__internal_ip')),
    output_field=models.GenericIPAddressField(),
))
When I apply this annotation, the events get multiplied by the interfaces the equipment has. This makes sense, because it's evaluating which interfaces, in the given equipment, match the index specified by F('index').
How can I simply exclude the ones that don't match?
From my point of view, Q(equipment__interface__index=F('index')) should compare whether interface__index equals index and skip the row if it doesn't (then again, I see why it behaves this way, since the Case defaults to None when nothing matches).
I wish I had control over how the Events get inserted into the table, but unfortunately I don't: an external system makes direct inserts into the MySQL database and simply dumps the value into the index field.
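One way to get exactly one row per event is a correlated subquery instead of a join: in Django that would be annotating with Subquery(Interface.objects.filter(equipment=OuterRef('equipment'), index=OuterRef('index')).values('internal_ip')[:1]). A minimal sqlite3 sketch of the same query shape (schema and data simplified for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE interface (equipment_id INTEGER, idx INTEGER, internal_ip TEXT);
    CREATE TABLE event (id INTEGER PRIMARY KEY, equipment_id INTEGER, idx INTEGER);
    INSERT INTO interface VALUES (1, 1, '10.0.0.1'), (1, 2, '10.0.0.2');
    INSERT INTO event (equipment_id, idx) VALUES (1, 2), (1, NULL);
""")

# A correlated subquery yields at most one interface per event, so events are
# never multiplied; events with no matching interface simply get NULL.
rows = conn.execute("""
    SELECT e.id,
           (SELECT i.internal_ip FROM interface i
            WHERE i.equipment_id = e.equipment_id AND i.idx = e.idx
            LIMIT 1) AS interface_ip
    FROM event e ORDER BY e.id
""").fetchall()
print(rows)  # [(1, '10.0.0.2'), (2, None)]
```

To drop the non-matching events entirely, a .filter(interface_ip__isnull=False) after the annotation should do it.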
models:
class CouponUsage(models.Model):
    coupon = models.ForeignKey('Coupon', on_delete=models.CASCADE, related_name="usage")
    date = models.DateTimeField(auto_now_add=True)

class Coupon(models.Model):
    name = models.CharField(max_length=255)
    capacity = models.IntegerField()

    @property
    def remaining(self):
        usage = self.usage.count()
        return self.capacity - usage
views:
def use_coupon(request):
    coupon = Coupon.objects.get(condition)
    if coupon.remaining > 0:
        # do something
I don't know how to handle concurrency issues in the code above. I believe one possible bug is that while the if clause in the view is executing, another CouponUsage object can be created.
How do I go about handling that, i.e. prevent CouponUsage objects from being created while inside the if clause in the view?
One way of doing this would be to rely on the database integrity checks and transactions. Assuming your capacity must always be in the range [0, +infinity) you could change your Coupon model to use a PositiveIntegerField instead of an IntegerField:
class Coupon(models.Model):
    name = models.CharField(max_length=255)
    capacity = models.PositiveIntegerField()
Then you need to update your Coupon capacity every time a CouponUsage is created. You can override the save() method to reflect this change:
from django.db import models, transaction

class CouponUsage(models.Model):
    coupon = models.ForeignKey('Coupon', on_delete=models.CASCADE, related_name="usage")
    date = models.DateTimeField(auto_now_add=True)

    @transaction.atomic
    def save(self, *args, **kwargs):
        if not self.pk:  # This is an insert; you may want to raise an error otherwise
            # The magic is here: this runs at the database level,
            # so there is no problem with stale in-memory values
            self.coupon.capacity = models.F('capacity') - 1
            self.coupon.save()
        super().save(*args, **kwargs)
Now whenever a CouponUsage is created you update the capacity of the associated Coupon. The key here is that instead of reading the value from the database into Python memory, updating it, and then saving (which could lead to inconsistent results), the update to capacity is made at the database level using an F expression. This guarantees that no two transactions use the same value.
Notice also that by using a PositiveIntegerField instead of an IntegerField, the database guarantees that capacity cannot fall below 0. Therefore, if you try to create a CouponUsage such that the Coupon capacity would become negative, an exception is raised, preventing the creation of that CouponUsage.
You now need to take advantage of this in your code by doing something like the following:
def use_coupon(request):
    coupon = Coupon.objects.get(condition)
    try:
        usage = CouponUsage.objects.create(coupon=coupon)
        # Do whatever you want here; you already 'consumed' a coupon
    except IntegrityError:  # Check for the specific exception
        # Sorry, no capacity left
        pass
If, after consuming the coupon, you need to do things that may fail and would require 'reverting' the usage, you can enclose your whole use_coupon function in a transaction.
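The mechanics can be sketched with plain sqlite3 (a CHECK constraint standing in for PositiveIntegerField; table and function names invented): the decrement and the constraint check happen inside one transaction, so overselling is impossible no matter how the calls interleave.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE coupon (id INTEGER PRIMARY KEY, "
    "capacity INTEGER CHECK (capacity >= 0))"
)
conn.execute("INSERT INTO coupon (id, capacity) VALUES (1, 1)")

def use_coupon(conn, coupon_id):
    try:
        with conn:  # one transaction: commit on success, rollback on error
            # The decrement runs in the database, like an F() expression;
            # the CHECK constraint rejects it if capacity would drop below 0.
            conn.execute(
                "UPDATE coupon SET capacity = capacity - 1 WHERE id = ?",
                (coupon_id,),
            )
        return True
    except sqlite3.IntegrityError:
        return False  # no capacity left

print(use_coupon(conn, 1))  # True
print(use_coupon(conn, 1))  # False
```

The second call fails cleanly and the rollback leaves capacity at 0, which mirrors the IntegrityError branch in the view above.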
Say I have a model like
class Product(models.Model):
    name = models.CharField(max_length=64)
    [...]
and another like
class Price(models.Model):
    product = models.ForeignKey('Product')
    date = models.DateField()
    value = models.FloatField()
    [...]
and I want to display products in a ModelAdmin, with the last registered price in a column.
So far, the only way I have found is to add the following method to the Product model:
@property
def last_price(self):
    price = Price.objects.filter(product=self.pk).order_by("-date")
    return price[0].value if price else None
and then adding last_price to the list_display in Product's ModelAdmin. I don't like this approach because it ends up doing a query for each row displayed.
Is there a better way to do this in code? Or do I have to create a column in the Product table and cache the last price there?
Thanks!
To reduce the queries for each entry, use the following:
Price.objects.filter(product=self.pk).order_by("-date").select_related('product')
This avoids an extra query for the related product on each Price object. Hope it is helpful.
A cleaner version of what you have would be:
def last_price(self):
    try:
        # latest() raises DoesNotExist rather than returning None
        return self.price_set.latest('date').value
    except Price.DoesNotExist:
        return None
But this still involves queries for each item.
If you want to avoid this, I would suggest adding a latest_price column to Product. Then you could set up a post_save signal for Price that updates the related Product's latest_price (this could be a ForeignKey or the value itself).
Update
Here is a receiver that updates the product's latest price when you save a Price. Obviously this assumes that you are saving Price models in chronological order, so the last one saved holds the latest value.
@receiver(post_save, sender=Price)
def update_product_latest_price(sender, instance, created, **kwargs):
    if created:
        instance.product.latest_price = instance.value
        instance.product.save()
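If you'd rather not maintain a denormalized column, a single query can fetch every product with its latest price: in Django that should be an annotation with Subquery(Price.objects.filter(product=OuterRef('pk')).order_by('-date').values('value')[:1]) inside the ModelAdmin's get_queryset(). The SQL shape it generates, sketched with sqlite3 (names invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE price (product_id INTEGER, date TEXT, value REAL);
    INSERT INTO product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO price VALUES (1, '2024-01-01', 9.5), (1, '2024-02-01', 10.0);
""")

# One correlated subquery per product row instead of one Python-level query
# per row; products with no prices get NULL.
rows = conn.execute("""
    SELECT p.name,
           (SELECT pr.value FROM price pr
            WHERE pr.product_id = p.id
            ORDER BY pr.date DESC LIMIT 1) AS last_price
    FROM product p ORDER BY p.id
""").fetchall()
print(rows)  # [('widget', 10.0), ('gadget', None)]
```

With the annotation in get_queryset(), the admin column can read the precomputed value instead of querying per row, and it stays correct even when prices are saved out of chronological order.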