When I fetch a client I want to also prefetch all related probes and probe channels, and when I fetch a probe I also want to prefetch all related probe channels.
But it seems that when I get a Client object, or view it in the admin, the get_queryset methods of both the client manager and the probe manager run, so the probe channels get prefetched twice. This shows up both in the number of SQL queries in the debug toolbar and in print statements added to the get_queryset methods.
So I've got:
class ClientManager(models.Manager):
    def get_queryset(self, *args, **kwargs):
        return super(ClientManager, self).get_queryset(*args, **kwargs).prefetch_related('probe_set__probechannel_set')

class Client(models.Model):
    objects = ClientManager()

class ProbeManager(models.Manager):
    def get_queryset(self, *args, **kwargs):
        return super(ProbeManager, self).get_queryset(*args, **kwargs).prefetch_related('probechannel_set')

class Probe(models.Model):
    Client = models.ForeignKey('Client')
    objects = ProbeManager()

class ProbeChannel(models.Model):
    Probe = models.ForeignKey('Probe')
Is there a way to avoid this? I have read about _base_manager being used for related objects, but the _base_manager here is just models.Manager. I'm using Django 1.11 and this trendy new thing called Python 2.7.
I'm still not sure whether this is intended Django behaviour, but the prefetches are definitely firing twice. The answer I've worked out to stop the ProbeManager prefetches from firing as well was simple: avoid using that manager.
So in the Probe class I've added
objects_without_prefetch = models.Manager()
and then used a Prefetch object, Prefetch('probe_set', Probe.objects_without_prefetch.all()), instead of simply using 'probe_set', which was triggering the Probe manager's own prefetch.
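A minimal sketch of how that wiring can look, assuming the models above (field definitions omitted):

from django.db import models
from django.db.models import Prefetch

class ClientManager(models.Manager):
    def get_queryset(self, *args, **kwargs):
        # The inner queryset uses the plain manager, so ProbeManager's own
        # prefetch_related never runs; probechannel_set is prefetched once.
        return super(ClientManager, self).get_queryset(*args, **kwargs).prefetch_related(
            Prefetch('probe_set', queryset=Probe.objects_without_prefetch.all()),
            'probe_set__probechannel_set')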
Thanks anyway to those who took the time to think about this and answer!
Related
When my program performs a soft deletion, the softly deleted rows are marked as inactive or deleted (e.g. person.deleted=True). The question is: what is the best way to make sure that every retrieval of data from this table returns only the active records, without having to add the deleted=False argument to every filter() call (which is not only repetitive, but also error-prone)?
You can try creating a custom object manager for your model. This may or may not be enough, depending on your requirements and further project implementation.
class PersonManager(models.Manager):
    def all(self, *args, **kwargs):
        return super(PersonManager, self).filter(deleted=False)

    def deleted(self, *args, **kwargs):
        return super(PersonManager, self).filter(deleted=True)

    # ...

class Person(models.Model):
    # ...
    objects = PersonManager()
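A common variant of this (a sketch, not part of the original answer) is to override get_queryset() instead of all(), so that filter(), get(), exclude() and every other queryset method also hides soft-deleted rows:

from django.db import models

class PersonManager(models.Manager):
    def get_queryset(self):
        # Every queryset produced by this manager excludes soft-deleted rows.
        return super(PersonManager, self).get_queryset().filter(deleted=False)

class Person(models.Model):
    deleted = models.BooleanField(default=False)
    objects = PersonManager()       # Person.objects.filter(...) only sees live rows
    all_objects = models.Manager()  # escape hatch that still sees deleted rows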
Update: Another convenient way to do that is with django-livefield
Which option is best, 1 or 2?
1.
class TopicForm(forms.Form):
    name = forms.CharField(required=True)
    body = RichTextFormField(required=True)

    def save(self, request):
        t = models.Topic(user=request.user,
                         site=get_current_site(request),
                         name=self.cleaned_data['name'],
                         body=self.cleaned_data['body'])
        t.slug = slugify(self.name)
        t.body_html = seo.nofollow(seo.noindex(self.body))
        t.ip = utils.get_client_ip(request)
        t.save()
or 2.
class Topic(models.Model):
    ...
    def save(self, *args, **kwargs):
        self.slug = slugify(self.name)
        self.body_html = seo.nofollow(seo.noindex(self.body))
        self.ip = utils.get_client_ip(request)
        super(Topic, self).save(*args, **kwargs)
The difference is that the first version is only applied when modifying objects through the form, while the second is applied whenever the model is saved (though that is still a subset of all the ways in which database rows can be modified in Django). Even if you currently only create objects through forms, I think it's still a useful distinction to keep in mind.
It looks to me like a mixture of the two makes sense in your case. A slug is something you will always want to set based on the name - that is, it's inherent to the model itself. On the other hand, the client IP is inextricably tied to the notion of creating an object with a form via a web request.
Of course, you are in a better position to know about the specifics of this model, but that is the general way I would approach the question.
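A minimal sketch of that split, reusing the names from the question (slugify, seo, utils, get_current_site and RichTextFormField are assumed to exist exactly as in the original code):

class Topic(models.Model):
    ...
    def save(self, *args, **kwargs):
        # Inherent to the model: derived from name/body no matter who saves it.
        self.slug = slugify(self.name)
        self.body_html = seo.nofollow(seo.noindex(self.body))
        super(Topic, self).save(*args, **kwargs)

class TopicForm(forms.Form):
    name = forms.CharField(required=True)
    body = RichTextFormField(required=True)

    def save(self, request):
        t = models.Topic(user=request.user,
                         site=get_current_site(request),
                         name=self.cleaned_data['name'],
                         body=self.cleaned_data['body'])
        # Request-specific: only the form/view layer has the request in scope.
        t.ip = utils.get_client_ip(request)
        t.save()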
It depends. If this should apply to every Topic, then it is better in the model. That way you are assured that every Topic object has correct values, even those edited from the admin interface.
The form should be used only to validate data from the user, while the model is the appropriate place to automate this kind of task (generating data before saving the object). Be careful though: this shouldn't raise exceptions or invalidate data.
Personally I would prefer the second option. The model should define the business logic too, while forms should just handle user I/O. This way your application stays consistent even when used programmatically (imported and called from other code).
You shouldn't use 2. It's better to use a signal like pre_save or post_save.
Source: https://docs.djangoproject.com/en/dev/topics/signals/
from django.db.models.signals import pre_save
from django.dispatch import receiver

@receiver(pre_save, sender=Topic)
def topic_pre_save_handler(sender, instance, **kwargs):
    instance.slug = slugify(instance.name)
    instance.body_html = seo.nofollow(seo.noindex(instance.body))
    # The request object is not available inside a signal handler, so the IP
    # still has to be set wherever the request is in scope (e.g. the form):
    # instance.ip = utils.get_client_ip(request)
The tests that I have written for my Django application have been working perfectly during initial development where I have been using SQLite. Now that I am getting ready to deploy I have setup a MySQL server (as that is what I'll be deploying to) but now some of my tests are failing.
Lastly the tests that are failing don't fail when I manually test the functionality.
What could be going on?
I'm not doing anything unusual, all of the views do some database shenanigans and return a response. There isn't anything timing related (no threading or anything).
The tests all inherit from django.test.TestCase and I'm not using any fixtures.
Here is an example of a test that fails.
class BaseTest(TestCase):
    def setUp(self):
        super(BaseTest, self).setUp()
        self.userCreds = dict(username='test', password='a')

        # Create an admin user
        admin = models.User.objects.create_superuser(
            email='', username='admin', password='a')

        # Create a user and grant them a licence
        user = models.User.objects.create_user(
            email='some@address.com', first_name="Mister",
            last_name="Testy", **self.userCreds)
        profile = models.getProfileFor(user)
        node = profile.createNode(
            '12345', 'acomputer', 'auser',
            'user@email.com', '0456 987 123')
        self.node = node
class TestClientUIViews(BaseTest):
    def test_toggleActive(self):
        url = reverse('toggleActive') + '?nodeId=%s' % self.node.nodeId
        self.assertFalse(self.node.active)

        # This should fail because only authenticated users can toggle a node active
        resp = self.client.get(url)
        self.assertEqual(resp.status_code, 403)
        self.assertFalse(self.node.active)

        # Login and make sure visiting the url toggles the active state
        self.client.login(**self.userCreds)
        resp = self.client.get(url)
        self.assertEqual(resp.status_code, 200)
        self.assertTrue(self.node.active)

        resp = self.client.get(url)
        self.assertEqual(resp.status_code, 200)
        self.assertFalse(self.node.active)
And here is what the model looks like:
import datetime as datetimeM

class Node(models.Model):
    @property
    def active(self):
        '''
        Activation state gets explicitly tracked in its own table but is
        exposed as a property for the sake of convenience
        '''
        activations = NodeActivation.objects \
            .filter(node=self) \
            .order_by('-datetime')
        try:
            return activations[0].active
        except IndexError:
            return False

    @active.setter
    def active(self, state):
        if self.active != state:
            NodeActivation.objects.create(node=self, active=state)

class NodeActivation(models.Model):
    node = models.ForeignKey("Node")
    datetime = models.DateTimeField(default=datetimeM.datetime.now)
    active = models.BooleanField(default=False)
My local MySQL is 5.5.19 (so it's using InnoDB) but I get the same failures on the deployment server, which is running 5.1.56. The tests fail regardless of the storage engine.
And as I mentioned at the beginning, if I switch back to use a SQLite database, all the tests go back to passing.
With more of the actual code now revealed, I'll offer the following hypothesis as to why this test is failing.
The NodeActivation model is incorrect. The datetime field should be:
datetime = models.DateTimeField(auto_now=True)
Writing datetime.datetime.now() (with parentheses) in a model definition would be evaluated only once, at class-definition time; and even with the callable passed as in your code, MySQL's DATETIME column stores no fractional seconds (before MySQL 5.6.4), so rows created in quick succession end up with identical timestamps.
Every time the setter creates a new NodeActivation record within the same second, the record gets the same date/time as the previous one.
Your getter only ever returns a single result, but since both activation records carry the same date/time, the ordering may depend on the database backend. There will be two NodeActivation records in your database at the end of the test, and which one comes back first is indeterminate.
By changing the active property on the Node model class to this:
@property
def active(self):
    '''
    Activation state gets explicitly tracked in its own table but is
    exposed as a property for the sake of convenience
    '''
    activations = NodeActivation.objects \
        .filter(node=self) \
        .order_by('-id')
    try:
        return activations[0].active
    except IndexError:
        return False
the problem goes away.
Note the change to the order_by call.
The records were getting created so quickly that ordering by datetime wasn't deterministic, hence the erratic behaviour. SQLite stores its timestamps with sub-second precision, which is presumably why it wasn't a problem when using it as the backing database.
NOTE: Thanks to Austin Phillips for the tip (check out comments in his answer)
I had a similar problem (not sure if it's the same) going from SQLite to PostgreSQL, using a setUp method like yours for the initial database objects.
Example:
def setUp(self):
    user = User.objects.create_user(username='john', password='abbeyRd', email='john@thebeatles.com')
    user.save()
I found that this would work fine on SQLite, but on PostgreSQL I was getting database connection failures (it would still build the database like you said, but couldn't run the tests without a connection error).
Turns out, the fix is to run your setup method as follows:
@classmethod
def setUpTestData(cls):
    user = User.objects.create_user(username='john', password='abbeyRd', email='john@thebeatles.com')
    user.save()
I don't remember the exact reason why this works and the other one didn't, but it had something to do with setUp(self) re-creating the data (and hitting the database) before every single test, while setUpTestData(cls) creates it only once for the whole test class. I might not have the details perfect; all I really know is that it works!
Hope that helps! I'm not sure it addresses your exact issue, but it took me a good few hours to figure out, so I hope it helps you and maybe others in the future.
I have a model which is overriding save() to slugify a field:
class MyModel(models.Model):
    name = models.CharField(max_length=200)
    slug = models.SlugField(max_length=200)

    def save(self, *args, **kwargs):
        self.slug = slugify(self.name)
        super(MyModel, self).save(*args, **kwargs)
When I run loaddata to load a fixture, this save() does not appear to be called, because the slug field ends up empty in the database. Am I missing something?
I can get it to work with a pre_save signal, but that feels like a bit of a hack and it would be nice to get save() working.
def mymodel_pre_save(sender, **kwargs):
    instance = kwargs['instance']
    instance.slug = slugify(instance.name)

pre_save.connect(mymodel_pre_save, sender=MyModel)
Thanks in advance.
No, you're not. save() is NOT called by loaddata, by design (calling it would be far more resource intensive, I suppose). Sorry.
EDIT: According to the docs, pre-save is not called either (even though apparently it is?).
Data is saved to the database as-is, according to https://docs.djangoproject.com/en/dev/ref/django-admin/#what-s-a-fixture
I'm doing something similar now - I need a second model to have a parallel entry for each of the first model in the fixture. The second model can be enabled/disabled, and has to retain that value across loaddata calls. Unfortunately, having a field with a default value (and leaving that field out of the fixture) doesn't seem to work - it gets reset to the default value when the fixture is loaded (The two models could have been combined otherwise).
So I'm on Django 1.4, and this is what I've found so far:
You're correct that save() is not called. There's a special DeserializedObject that does the insertion, by calling save_base() on the Model class - overriding save_base() on your model won't do anything since it's bypassed anyway.
@Dave is also correct: the current docs still say the pre_save signal is not called, but it is. It's behind a condition: if origin and not meta.auto_created
origin is the class for the model being saved, so I don't see why it would ever be falsy.
meta.auto_created has been False so far with everything I've tried, so I'm not yet sure what it's for. Looking at the Options object, it seems to have something to do with abstract models.
So yes, the pre_save signal is indeed being sent.
Further down, there's a post_save signal behind the same condition that is also being sent.
Using the post_save signal works. My models are more complex, including a ManyToMany on the "Enabled" model, but basically I'm using it like this:
from django.db import models
from django.db.models.signals import post_save

class Info(models.Model):
    name = models.TextField()

class Enabled(models.Model):
    info = models.ForeignKey(Info)

def create_enabled(sender, instance, *args, **kwargs):
    if Info == sender:
        Enabled.objects.get_or_create(id=instance.id, info=instance)

post_save.connect(create_enabled)
And of course, initial_data.json only defines instances of Info.
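One related detail worth noting (an aside, not from the original answer): pre_save and post_save receive a raw keyword argument that is True when the instance is being saved by loaddata, so a handler can distinguish fixture loads from ordinary saves if it ever needs to. A sketch, using the handler above:

def create_enabled(sender, instance, raw=False, **kwargs):
    # raw is True while loaddata is deserializing the fixture; the handler
    # still runs either way, but the flag is available if fixture loads
    # ever need different treatment.
    if sender is Info:
        Enabled.objects.get_or_create(id=instance.id, info=instance)

post_save.connect(create_enabled)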
I have two models: Activity and Place.
The Activity model has a ReferenceProperty to the Place model.
This was working fine until the Place table started growing and now
when trying to edit an Activity via django admin I get a memory error
from Google (it doesn't happen if I remove that field from the Activity
admin's fieldsets)
The widget used to edit the ReferenceProperty uses Place.all() to get
the possible values.
As both Activity and Place are sharded by a city property I would like
to optimize the widget's choice query from Place.all() to just the
relevant places, for example Place.all().filter("city =", )
I couldn't find a way to override the query in the docs and I was
wondering if the above is even possible? and if so, how?
Managed to optimize the query by overriding the admin form:
class ActivityAdminForm(forms.ModelForm):
    def __init__(self, *args, **kwargs):
        super(ActivityAdminForm, self).__init__(*args, **kwargs)
        self.fields['place'].queryset = <... my query ...>

class ActivityAdmin(admin.ModelAdmin):
    form = ActivityAdminForm
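For illustration only, a hypothetical version of that query following the city-sharding idea from the question (self.instance.city is assumed to exist on the Activity being edited):

self.fields['place'].queryset = Place.all().filter('city =', self.instance.city)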