I do want to keep some info about removed users (like username) to show in forum posts. How can I achieve that?
So far I have this:
class Post(models.Model):
(...)
    creator = models.ForeignKey(User, blank=True, null=True, on_delete=models.SET(get_sentinel_user))

@receiver(pre_delete, sender=User, dispatch_uid='user_delete_signal')
def create_sentinel_user(sender, instance, using, **kwargs):
    SentinelUser.objects.get_or_create(username=instance.username + " (left)")

def get_sentinel_user():
    return SentinelUser.objects.latest('id')
However, if I use that in the admin, it doesn't work: for some reason get_sentinel_user runs sooner than pre_delete, and therefore the sentinel user doesn't exist yet.
If your ultimate goal is to keep data about the user, the Django user docs suggest using the is_active flag rather than deleting users. That way you maintain access to the user information, don't need to worry about your ForeignKey consistency, and avoid the overhead of creating a dummy user for every user you delete.
You could then back that up by using on_delete=models.PROTECT to make sure you still protect the consistency of your database.
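A minimal sketch of that combination (is_active plus PROTECT), reusing the Post model from the question; untested:
from django.contrib.auth.models import User
from django.db import models

class Post(models.Model):
    # PROTECT refuses to delete a User who still owns posts;
    # deactivate the account instead: user.is_active = False; user.save()
    creator = models.ForeignKey(User, blank=True, null=True,
                                on_delete=models.PROTECT)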
If for some reason you have to delete your users, I would override the delete() on Post rather than trying to use signals.
So, a sample solution could be something like:
def delete(self):
    self.creator = SentinelUser.objects.get_or_create(username=self.creator.username + " (left)")[0]
    self.save()
    return super(Post, self).delete()
Is there a way to block the delete function on a model object when a non-admin user tries to call it?
For example:
class Product(models.Model):
    name = models.CharField(max_length=200)
    show = models.BooleanField()
    logs = models.TextField()
And now, if somewhere in the project code we call product.delete(), we want to block it at the model level: check whether the user is an admin, and if not, hide the product and record the attempt in a field:
class Product(models.Model):
    name = models.CharField(max_length=200)
    show = models.BooleanField(default=True)
    logs = models.TextField()

    def delete(self, *args, **kwargs):
        if request.user.is_superuser:
            super().delete(*args, **kwargs)
        else:
            self.show = False
            self.logs = 'the user tried to remove this ad at: 08.11 04.11.2022'
            self.save()
It's the responsibility of the view to check that the user (in request.user) has sufficient privilege to perform the requested operation. There's no standard way for a Django object method to obtain the current user, and a programmer with access through the Django shell >>> obj.delete() has to be prevented by other means (such as the risk of losing his job). There simply is no Django logged-in user in that context, but the DB to which it has access ought not to be a production one.
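For example, a plain function-based view could do the check like this (a sketch, untested; the Product model comes from the question, the URL name is made up):
from django.shortcuts import get_object_or_404, redirect

def delete_product(request, pk):
    product = get_object_or_404(Product, pk=pk)
    if request.user.is_superuser:
        product.delete()
    else:
        # non-admins only hide the product and leave a trace in the logs field
        product.show = False
        product.logs = 'user %s tried to remove this ad' % request.user
        product.save()
    return redirect('product-list')  # hypothetical URL name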
Out on a limb, it is possible to disable the object's delete() method completely by overriding it to be a no-op or to raise an exception. Deleting such an object would then require other means: either relying on a CASCADE when some other object is deleted, or using raw SQL (the psql command, if PostgreSQL) to remove its row from the DB table.
(I haven't done the latter, but there are certain objects in the project I am working on which should never require deletion under any normal circumstances, and for which there are no delete views or similar. They will accumulate at the rate of around one per week in production, and removing ones which are completely stale is a problem which can safely be deferred until the year 2100 or later :-)
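If you do decide to disable delete() entirely, a rough sketch (untested) might be:
from django.core.exceptions import PermissionDenied
from django.db import models

class Product(models.Model):
    # ... fields as in the question ...

    def delete(self, *args, **kwargs):
        # refuse ORM-level deletes; rows can then only disappear via
        # a CASCADE from a related object or via raw SQL
        raise PermissionDenied('Product rows may not be deleted through the ORM.')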
I'm trying to use CurrentUserDefault with a field that can be null:
# model
class Package(models.Model):
# User can be empty because we allow anonymous donations
owner = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
null=True,
blank=True,
)
# serializer
class PackageSerializer(serializers.ModelSerializer):
owner = serializers.HiddenField(default=serializers.CurrentUserDefault())
Everything works fine when a user is logged in. However, if a user is not authenticated I get this:
ValueError at /api/organizations/village/packages/
Cannot assign "<django.contrib.auth.models.AnonymousUser object at 0x7fc97ad6d940>": "Package.owner" must be a "User" instance.
Is there a reason why CurrentUserDefault doesn't work with anonymous users?
P.S. I know I can use this instead of CurrentUserDefault and it will work:
class AuthorizedUserOrNone:
requires_context = True
def __call__(self, serializer_field):
user = serializer_field.context["request"].user
if user.is_authenticated:
return user
return None
def __repr__(self):
return "%s()" % self.__class__.__name__
What do you return if user.is_anonymous? Work that out, add it to __call__(self, serializer_field), and you should be good. CurrentUserDefault does not appear to handle anonymous users.
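For instance, wiring the question's AuthorizedUserOrNone class in as the field default should give you None for anonymous requests (a sketch, untested; the Meta field list is an assumption):
class PackageSerializer(serializers.ModelSerializer):
    # falls back to None for anonymous requests instead of raising ValueError
    owner = serializers.HiddenField(default=AuthorizedUserOrNone())

    class Meta:
        model = Package
        fields = ['owner']  # plus whatever other fields you expose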
Try:
get_object_or_404(User, fk_user=self.request.user)
This answer provides more helpful context: Django REST Framework - CurrentUserDefault use
Based on what I'm reading in the traceback, unless you dive deeper into the code base behind the CurrentUserDefault class, I don't think there is a standard way of handling this. My limited research suggests that CurrentUserDefault is not designed to handle anonymous requests (which fits, given the word "User" in the class name). The traceback seems to align with this.
If I am correct, and you're dealing with an edge-case, then more modifications will likely be required to accomplish what you're looking for. Good luck. Sorry I couldn't help with a direct answer.
Note: the above is untested.
See: Django REST documentation
class Badge(SafeDeleteModel):
owner = models.ForeignKey(settings.AUTH_USER_MODEL,
blank=True, null=True,
on_delete=models.PROTECT)
restaurants = models.ManyToManyField(Restaurant)
identifier = models.CharField(max_length=2048) # not unique at a DB level!
I want to ensure that for any badge, for a given restaurant, it must have a unique identifier. Here are the 4 ideas I have had:
idea #1: using unique_together -> Does not work with M2M fields, as explained [in the documentation](https://docs.djangoproject.com/en/2.1/ref/models/options/#unique-together)
idea #2: overriding the save() method. Does not fully work with M2M, because save() is not called when add() or remove() is used.
idea #3: using an explicit through model, but since I'm live in production, I'd like to avoid the risk of migrating important structures like these. EDIT: after thinking about it, I don't see how it would actually help.
idea #4: using an m2m_changed signal to check uniqueness any time the add() method is called.
I went with idea #4 and thought everything was OK with this signal...
@receiver(m2m_changed, sender=Badge.restaurants.through)
def check_uniqueness(sender, **kwargs):
    badge = kwargs.get('instance', None)
    action = kwargs.get('action', None)
    restaurant_pks = kwargs.get('pk_set', None)
    if action == 'pre_add':
        for restaurant_pk in restaurant_pks:
            if Badge.objects.filter(identifier=badge.identifier).filter(restaurants=restaurant_pk):
                raise BadgeNotUnique(MSG_BADGE_NOT_UNIQUE.format(
                    identifier=badge.identifier,
                    restaurant=Restaurant.objects.get(pk=restaurant_pk)
                ))
...until today, when I found lots of badges in my database with the same identifier but no restaurant (which should not happen at the business level).
I realized there is no atomicity between save() and the signal, which means that if the user hits a uniqueness error while trying to create a badge, the badge is still created, just without any restaurants linked to it.
So, the question is: how do you ensure at the model level that if the signal raises an error, the save() is not committed?
Thanks!
I see two separate issues here:
You want to enforce a particular constraint on your data.
If the constraint is violated, you want to revert previous operations. In particular, you want to revert the creation of the Badge instance if any Restaurants are added in the same request that violate the constraint.
Regarding 1, your constraint is complicated because it involves multiple tables. That rules out database constraints (well, you could probably do it with a trigger) or simple model-level validation.
Your code above is apparently effective at preventing adds that violate the constraint. Note, though, that this constraint could also be violated if the identifier of an existing Badge is changed. Presumably you want to prevent that as well? If so, you need to add similar validation to Badge (e.g. in Badge.clean()).
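For example, a clean() along these lines would catch identifier changes on an existing Badge (a sketch, untested, reusing the models from the question; it runs when full_clean() is called, e.g. by ModelForms or the admin):
from django.core.exceptions import ValidationError

class Badge(SafeDeleteModel):
    # ... fields as in the question ...

    def clean(self):
        super().clean()
        if self.pk is None:
            return  # nothing linked yet; the m2m_changed signal covers adds
        clash = (Badge.objects
                 .exclude(pk=self.pk)
                 .filter(identifier=self.identifier,
                         restaurants__in=self.restaurants.all()))
        if clash.exists():
            raise ValidationError(
                'Another badge already uses this identifier for one of these restaurants.')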
Regarding 2, if you want the creation of the Badge instance to be reverted when the constraint is violated, you need to make sure the operations are wrapped in a database transaction. You haven't told us about the views where these objects are created (custom views? Django admin?) so it's hard to give specific advice. Essentially, you want to have this:
from django.db import transaction

with transaction.atomic():
    badge_instance.save()
    badge_instance.restaurants.add(...)
If you do, an exception thrown by your M2M pre_add signal will roll back the transaction, and you won't get the leftover Badge in your database. Note that admin views are run in a transaction by default, so this should already be happening if you're using the admin.
Another approach is to do the validation before the Badge object is created. See, for example, this answer about using ModelForm validation in the Django admin.
I'm afraid the correct way to achieve this really is by adapting the "through" model. But remember that at the database level this "through" table already exists, so your migration would simply add a unique constraint. It's a rather simple operation that doesn't involve any risky data migration; we do it often in production environments.
Take a look at this example; it pretty much sums up everything you need.
You can specify your own connecting model for your M2M models, and then add a unique_together constraint in the Meta class of the membership model:
class Badge(SafeDeleteModel):
...
restaurants = models.ManyToManyField(Restaurant, through='BadgeMembership')
class BadgeMembership(models.Model):
restaurant = models.ForeignKey(Restaurant, null=False, blank=False, on_delete=models.CASCADE)
badge = models.ForeignKey(Badge, null=False, blank=False, on_delete=models.CASCADE)
class Meta:
unique_together = (("restaurant", "badge"),)
This creates an object that's between the Badge and Restaurant which will be unique for each badge per restaurant.
Optional: Save check
You can also add a custom save function in which you manually check for uniqueness and raise an exception yourself.
from django.db.models import Q

class BadgeMembership(models.Model):
    restaurant = models.ForeignKey(Restaurant, null=False, blank=False, on_delete=models.CASCADE)
    badge = models.ForeignKey(Badge, null=False, blank=False, on_delete=models.CASCADE)

    def save(self, *args, **kwargs):
        # Only check when the object is new; updating won't change the pair
        if self.pk is None:
            membership_count = BadgeMembership.objects.filter(
                Q(restaurant=self.restaurant) &
                Q(badge=self.badge)
            ).count()
            if membership_count > 0:
                raise BadgeNotUnique(...)
        super(BadgeMembership, self).save(*args, **kwargs)
Here is the situation I hit.
I have both User and UserProfile. I would like to add additional logic to the model, and since I can't add it to the Django User model, I have to add it to UserProfile. However, all my models currently have ForeignKey(User). Should I keep them like that, or should I use ForeignKey(UserProfile) on my other models?
Example for my view if I keep the ForeignKey(User):
def myview(request):
    user = request.user
    userProfile = user.get_profile()
    neededStuff = userProfile.get_needed_stuff()
and then in the UserProfile model:
def get_needed_stuff(self):
user= self.user # Or actually, is this right
goals = Goal.objects.get(<conditions that i wont bother writing here>)
return goals
So for this case, and for further development of the site, which foreign key should I use?
I think you should use User. UserProfile is custom and can differ from project to project, so if you reuse the same code in another project, it could break for that reason. It is also always easy to get the user object in code, and from there it's no problem to get the profile with user.get_profile() as you show (and the profile is not always needed). So in general, I think it will be easier for other modules if you pass them just the user object (or id) rather than the profile.
Another possible solution: write your own class that is responsible for users. Give it methods to return the profile, return the stuff needed, or whatever you want, all by passing the user object plus parameters describing what you want.
So in short, I'm for using User for foreign keys, because in my opinion it's just more logical: the User model is always the main one (you always have it) and UserProfile is just an extension.
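If you go the helper-class route, a minimal sketch (untested; the class and method names are made up) could look like this:
class UserService:
    def __init__(self, user):
        self.user = user

    def profile(self):
        return self.user.get_profile()

    def needed_goals(self):
        # same query the question sketches, keyed off the plain User object
        return Goal.objects.filter(user=self.user)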
If you just want all the goals belonging to a specific user, add a foreign key to User in your Goal model.
class Goal(models.Model):
    user = models.ForeignKey(User, on_delete=models.CASCADE)

def myview(request):
    goals = Goal.objects.filter(user=request.user)
Or alternatively, save all the goals for a user on your UserProfile model and do
def myview(request):
    user_profile = request.user.get_profile()
    goals = user_profile.goals
...or use a method to do processing to calculate them
goals = user_profile.calculate_goals()
I've been pondering the same thing myself for one of my sites, but I decided to use UserProfile rather than User.
I'm not sure if it's the right decision, but it just seems more flexible.
If we set up a profile how Django recommends:
class Profile(models.Model):
user = models.ForeignKey(User, unique=True)
Then when you delete the User object from the Django admin, it deletes the profile too. This is because the profile has a foreign key to User, and Django wants to protect referential integrity. However, I want this functionality even when the pointer is going the other way. For example, on my Profile class I have:
shipper = models.ForeignKey(Shipper, unique=True, blank=True, null=True)
carrier = models.ForeignKey(Carrier, unique=True, blank=True, null=True)
affiliat = models.ForeignKey(Affiliate, unique=True, blank=True, null=True, verbose_name='Affiliate')
And I want it so that if you delete the Profile, it deletes the associated shipper/carrier/affiliate objects (don't ask me why Django made "affiliate" some weird keyword), because shippers, carriers and affiliates are types of users, and it doesn't make sense for them to exist without the rest of the data (no one would be able to log in as one).
The reason I didn't put the keys on the other objects, is because then Django would have to internally join all those tables every time I wanted to check which type the user was...
While using a post_delete signal as described in bernardo's answer is an OK approach that will work well, I try to avoid signals as much as humanly possible, as I feel they convolute your code unnecessarily by adding behavior to standard functionality in places one might not be expecting it.
I prefer the overriding method; however, the example given by Felix does have one fatal flaw: the delete() function it is overriding looks like this:
def delete(self, using=None):
using = using or router.db_for_write(self.__class__, instance=self)
assert self._get_pk_val() is not None, "%s object can't be deleted because its %s attribute is set to None." % (self._meta.object_name, self._meta.pk.attname)
collector = Collector(using=using)
collector.collect([self])
collector.delete()
Notice the parameter 'using'; in most cases we call delete() with no arguments, so we may not even have known it was there. In the example above this parameter gets buried because we override without looking at the superclass functionality; if someone were to pass the 'using' parameter when deleting a Profile, it would cause unexpected behavior. To avoid that, we make sure to preserve the argument along with its default, like so:
class Profile(models.Model):
# ...
def delete(self, using=None):
if self.shipper:
self.shipper.delete()
if self.carrier:
self.carrier.delete()
if self.affiliat:
self.affiliat.delete()
super(Profile, self).delete(using)
One pitfall of the overriding approach, however, is that delete() does not get called per DB record on bulk deletes. This means that if you want to delete multiple Profiles at once (by calling .delete() on a Django queryset, for example) and keep the overridden behavior, you will need to either leverage the delete signal (as described by bernardo) or iterate through the records and delete them individually (expensive and ugly).
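For the iterating option, a tiny sketch (the queryset filter here is just an example, not from the question):
# delete one by one so the overridden Profile.delete() runs;
# queryset.delete() would bypass it
for profile in Profile.objects.filter(user__is_active=False):  # example filter, an assumption
    profile.delete()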
A better way to do this, one that works with both the object's delete method and the queryset's delete method, is to use the post_delete signal, as you can see in the documentation.
In your case, your code would be quite similar to this:
from django.db import models
from django.dispatch import receiver
@receiver(models.signals.post_delete, sender=Profile)
def handle_deleted_profile(sender, instance, **kwargs):
    if instance.shipper:
        instance.shipper.delete()
    if instance.carrier:
        instance.carrier.delete()
    if instance.affiliat:
        instance.affiliat.delete()
This only works on Django 1.3 or greater, because the post_delete signal was added in that version.
You can override the delete() method of the Profile class and delete the other objects in this method before you delete the actual profile.
Something like:
class Profile(models.Model):
# ...
def delete(self):
if self.shipper:
self.shipper.delete()
if self.carrier:
self.carrier.delete()
if self.affiliat:
self.affiliat.delete()
super(Profile, self).delete()