Set the maximum recursion depth while serializing a Django model with Foreign key to JSON - django

I have Django models created for Google's App Engine:
```python
class A(db.Model):
    propA = db.ReferenceProperty(B)

class B(db.Model):
    propB = db.ReferenceProperty(C)

class C(db.Model):
    propC = db.ReferenceProperty(B)  # note the cycle: A -> B -> C -> B -> ...
```
I have written a custom Django serializer that fetches the data for the ReferenceProperty fields and serializes it along with the initial model.
The problem occurs when I try to serialize an instance of Model A. My custom serializer follows propA to Model B, whose propB references Model C, whose propC references Model B again, so the recursion goes on and on. Is there any way to stop the recursion after a depth of, say, 2?
My serializer is a customized version of link text
P.S: I am willing to publish my code if that seems to be needed. I have not attached the code yet since I am not at my development machine.
Thanks,
Arun Shanker Prasad.

Just modify your functions to take a 'depth' argument. Any time you follow a ReferenceProperty, call the function with depth one less than the depth that was passed in. If a function is called with depth==0, return None, or whatever other placeholder value is suitable in your case.
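For illustration, here is a minimal sketch of that idea. `serialize_entity` is a hypothetical helper standing in for the custom serializer, and the default depth of 2 is just an example:
```python
from google.appengine.ext import db

def serialize_entity(entity, depth=2):
    """Turn a datastore entity into a dict, following ReferenceProperty
    links only while depth > 0."""
    if entity is None:
        return None
    result = {}
    for name, prop in entity.properties().items():
        value = getattr(entity, name)  # dereferences ReferenceProperty fields
        if isinstance(prop, db.ReferenceProperty):
            # Only recurse while there is depth budget left; otherwise
            # fall back to a placeholder value.
            result[name] = serialize_entity(value, depth - 1) if depth > 0 else None
        else:
            result[name] = value
    return result
```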

Why don't you just do recursion properly? Any recursive operation must have a base case, otherwise it will continue forever, as your problem indicates.

I'm trying to find a serializer that works with Google App Engine and follows relationships. Would it be possible for you to post the modified code you used to do this?

Related

Python/Django model dictionary allows one type of update, but not another

I am working on some Django/Python code.
Basically, the backend of my code gets sent a dict of parameters named 'p'. These values all come off Django models.
When I tried to override them as such:
p['age']=25
I got a 'model error'. Yet, if I write:
p.age=25
it works fine.
My suspicion is that, internally, choice #1 tries to set a new value on an instance of a class created by Django that objects to being overridden, while choice #2 has Python simply rebind an attribute of the same name ('age') on that instance, without regard for the prior origin, type, or class of what Django created.
All of this is in a RESTful framework, and actually in test code. So even if I am right I don't believe it changes anything for me in reality.
But can anyone explain why one type of assignment to an existing dict works, and the other fails?
p is a model instance (an object), not a dict. Django built it that way.
As such, attribute assignment (p.age) lets you change an attribute of the object, while item assignment (p['age']) is not something a model instance supports.
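For what it's worth, a minimal sketch of the difference, using a hypothetical Person model rather than the original code:
```python
from django.db import models

class Person(models.Model):
    age = models.IntegerField(default=0)

p = Person(age=20)

p.age = 25        # works: attribute assignment updates the model field
# p['age'] = 25   # fails: model instances do not implement __setitem__,
                  # so item assignment raises a TypeError
```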

Odoo v7 to v8 translation

I am converting OpenERP code from version 7 to version 8 and I have come across a strange structure. In version 7 we can use fields.function, and one of its attributes is store. The store parameter allows the current field to be updated when fields of other objects are changed.
In the new API, the store attribute only accepts True or False. I was wondering if I have to inherit the other models and modify their fields so that they update the field on the model in question using "onchange".
No, in Odoo 8 the store functionality still works fine. You can search the add-ons for some interesting examples and learn from them.
Here is an example I found online:
http://www.odoo.yenthevg.com/saving-and-resizing-images-in-odoo-8/
Go through it.
In v8 there is no fields.function field, and with that we also no longer need store with field parameters, but you can achieve the same thing very easily by using the @api.depends decorator, which does the same thing as store with fields did.
```
openerp.api.depends(*args)
    Return a decorator that specifies the field dependencies of a "compute"
    method (for new-style function fields).
```
So you can declare that your field is to be recalculated whenever the fields it depends on change.
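For example, a sketch of a computed, stored field in the new API; the model and field names below are illustrative, not taken from the question:
```python
from openerp import models, fields, api

class SaleOrderLine(models.Model):
    _inherit = 'sale.order.line'

    total_weight = fields.Float(
        string='Total Weight',
        compute='_compute_total_weight',
        store=True,  # stored in the database and refreshed when dependencies change
    )

    @api.depends('product_id.weight', 'product_uom_qty')
    def _compute_total_weight(self):
        for line in self:
            line.total_weight = line.product_id.weight * line.product_uom_qty
```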

Django - Unit Testing an AdminForm

I am very new to unit testing and am probably doing something wrong, but when I simulate a post to update a model via the admin backend it seems like my save_model method in my AdminForm isn't being called. I am trying to test this method - what am I doing wrong?
My second, less relevant question is in general how can I make sure a method is being called when I use unit testing? Is there some way to list all the methods that were hit?
Below is the code my test is running. In my save_model method in my AdminForm for this model, I set this model's foobar attribute to the username of the currently signed in user. Below is my test:
```python
self.client = Client()
self.client.login(username='username', password='password')
# self.dict is a dictionary of field names and values for mymodel to be updated
response = self.client.post('/admin/myapp/mymodel/%d/' % self.mymodel.id, self.dict)
self.assertEqual(response.status_code, 200)  # passes
self.assertEqual(self.mymodel.foobar, 'username')  # fails
self.client.logout()
```
It fails because it says that self.mymodel.foobar is an empty string. That was what it should have been before the update. No value for foobar is passed in the self.dict but my save_model method is designed to set it on its own when the update happens. It is also worth noting that my code works correctly and save_model seems to work fine, just my test is failing. Since I am a total noob at TDD, I'm sure the issue is with my test and not my code. Thoughts?
From the code it looks like the problem is that, after posting the form, you don't reload self.mymodel from the database. If you hold a reference to a model object stored in the database, and one or more of the fields on that object is changed in the database, then you will need to reload the object from the database to see the updated values. As detailed in this question, you can do this with something like:
```python
self.mymodel = MyModelClass.objects.get(id=self.mymodel.id)
```
To answer your second question, probably the most useful way to see what is happening would be to use logging to output what is happening in your save_model method - this will not only help you debug the issue during testing, but also if you encounter any issues in this method when running your application. The django guide to logging gives an excellent introduction:
https://docs.djangoproject.com/en/dev/topics/logging/
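For example, a sketch of what such logging could look like inside save_model on a ModelAdmin; the names here are illustrative and not from the original code:
```python
import logging

from django.contrib import admin

logger = logging.getLogger(__name__)

class MyModelAdmin(admin.ModelAdmin):
    def save_model(self, request, obj, form, change):
        # Log that the hook ran, which helps both in tests and in production.
        logger.debug("save_model called for %r (change=%s)", obj, change)
        obj.foobar = request.user.username
        super(MyModelAdmin, self).save_model(request, obj, form, change)
```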

Creating a Django callable object for Field.default

I'm trying to create a callable object to return the default value to use for a field when creating a new instance.
The logic for the value is dependent on other data in the model. I tried creating a separate class but have not hit on the right combination of factors. Example:
in models.py:
```python
class Box(models.Model):
    inv_id = models.CharField(max_length=16, default=gen_inv_id())
```
The callable object will need to query the database model and increment a table value. I tried creating a class in a separate .py module under the app, but it needs a method to return a value. OO is not my strong suit at this point. I think the model has become invalid and the method depends on it, so it seems like a chicken-and-egg scenario has emerged.
Thanks for any help.
Since forever (pre-1.0 days) the default keyword has supported callables. The issue with your code is that you're not passing in a callable (default=gen_inv_id) but the result of calling one (default=gen_inv_id()).
So you probably want to do:
```python
class Box(models.Model):
    inv_id = models.CharField(max_length=16, default=gen_inv_id)
```
Check out the docs for the latest version that describes this:
https://docs.djangoproject.com/en/1.4/ref/models/fields/#default
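As a sketch, a callable default might look like this; the counting logic and ID format are assumptions, not your actual gen_inv_id:
```python
from django.db import models

def gen_inv_id():
    # Derive the next inventory ID from the number of existing rows.
    # (A production version would want a race-safe counter, e.g. a dedicated
    # counter row updated inside a transaction.)
    return 'INV-%06d' % (Box.objects.count() + 1)

class Box(models.Model):
    inv_id = models.CharField(max_length=16, default=gen_inv_id)
```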
I've run into this before. One thing you can do is override the class's save method, so that you first save the parameters you need to do the computation, then do the computation and re-save. If you're overriding the save method you'll need to call the parent's save (I forget the exact notation).
Edit: the notation is super(Model, self).save(*args, **kwargs).
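A minimal sketch of that save-override approach, with illustrative field names:
```python
from django.db import models

class Box(models.Model):
    inv_id = models.CharField(max_length=16, blank=True)

    def save(self, *args, **kwargs):
        if not self.inv_id:
            # Compute the value from other data once the instance exists;
            # the format here is only an example.
            self.inv_id = 'INV-%06d' % (Box.objects.count() + 1)
        super(Box, self).save(*args, **kwargs)
```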

Using Django ORM to serialize data and related members

I'm currently working on a REST API, using Django. I started with the nice djangorestframework, and I loved using its "View" class.
But I'm facing a serialization problem.
I do not like doing the serialization with the Serializer classes.
The main goal is to prepare a sort of giant dict with all the info and hand it to a renderer class, which translates it into XML, JSON or YAML depending on the "Accept:" HTTP header. The goal is classy, but 60% of the CPU time is spent on creating the "GIANT DICT".
This dict can be created using Django models, but I think instantiating classes and objects on the fly is VERY inefficient. I'm trying to use some QuerySet methods to filter which model members I want and to get a simple dict back: the .values() method, but unfortunately I can't access the m2m and foreign key fields from my models.
Have you already tried this? Any thoughts?
You could use the QuerySet's iterator method:
... For a QuerySet which returns a large number of objects that you only need to access once, this can result in better performance and a significant reduction in memory.
Your code should look like:
```python
for obj in SomeModel.objects.values_list('id', 'name').iterator():
    # do something
    pass
```
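On the related-fields part of the question: .values() and .values_list() can follow a ForeignKey with the double-underscore syntax, while m2m relations are usually handled separately. A minimal sketch with hypothetical Book/Author/Tag models (not from the question):
```python
from django.db import models

class Author(models.Model):
    name = models.CharField(max_length=100)

class Tag(models.Model):
    label = models.CharField(max_length=50)

class Book(models.Model):
    title = models.CharField(max_length=200)
    author = models.ForeignKey(Author)
    tags = models.ManyToManyField(Tag)

# ForeignKey fields can be traversed directly in values()/values_list():
rows = Book.objects.values('id', 'title', 'author__name')

# m2m fields expand into one row per related object in values(), so it is
# usually easier to build the dict yourself, e.g. with prefetch_related
# (available since Django 1.4):
data = [
    {'id': b.id, 'title': b.title, 'tags': [t.label for t in b.tags.all()]}
    for b in Book.objects.prefetch_related('tags')
]
```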