Creating a Django callable object for Field.default - django

I'm trying to create a callable object to return the default value to use for a field when creating a new instance.
The logic for the value is dependent on other data in the model. I tried creating a separate class but have not hit on the right combination of factors. Example:
in models.py:
class Box(models.Model):
    inv_id = models.CharField(max_length=16, default=gen_inv_id())
The callable object will need to query the database model and increment a table value. I tried creating a class in a separate .py module under the app, but it needs a method that returns a value. OO is not my strong suit at this point. The model seems to depend on the callable and the callable on the model, so it feels like a chicken/egg scenario has emerged.
Thanks for any help.

The default keyword has supported callables since forever (pre-1.0 days). The issue with your code is that you're not passing in a callable (default=gen_inv_id), but the result of calling one (default=gen_inv_id()).
So you probably want to do:
class Box(models.Model):
    inv_id = models.CharField(max_length=16, default=gen_inv_id)
Check out the docs for the latest version, which describe this:
https://docs.djangoproject.com/en/1.4/ref/models/fields/#default
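To see why the parentheses matter, here's a plain-Python sketch of the check Django makes internally. gen_inv_id and its counter are made up for illustration; the real one would query the database:

```python
import itertools

_counter = itertools.count(1)

def gen_inv_id():
    # Hypothetical id generator; a real one would hit the database.
    return 'INV-{:06d}'.format(next(_counter))

def get_default(default):
    # Django's Field.get_default() does essentially this check:
    return default() if callable(default) else default

print(get_default(gen_inv_id))    # called per instance: INV-000001
print(get_default(gen_inv_id))    # a fresh value each time: INV-000002

stale = gen_inv_id()              # evaluated once, at class-definition time
print(get_default(stale))         # the same value forever after
```

Passing gen_inv_id hands Django the function, so it runs for every new instance; passing gen_inv_id() bakes in whatever value the function returned when models.py was imported.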

I've run into this before. One thing you can do is override the model's save method, so that you first save the fields you need for the computation, then do the computation and save again. If you're overriding the save method you'll need to call the parent class's save (I forget what the exact notation is).
edit: the notation is super(MyModel, self).save(*args, **kwargs), where MyModel is the class that defines the overridden method
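The pattern looks like this. ModelStandIn below is a dummy base class standing in for django.db.models.Model (just so the sketch runs on its own), and the inv_id value is hypothetical:

```python
class ModelStandIn:
    # Stand-in for django.db.models.Model, so the sketch is self-contained.
    def save(self, *args, **kwargs):
        self.saved = True

class Box(ModelStandIn):
    def save(self, *args, **kwargs):
        # Compute the dependent default once the other data is in place.
        if not getattr(self, 'inv_id', None):
            self.inv_id = 'INV-000001'  # hypothetical computed value
        super(Box, self).save(*args, **kwargs)  # always call the parent save

box = Box()
box.save()
print(box.inv_id)  # INV-000001
print(box.saved)   # True -- the parent save still ran
```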

Related

Python/Django model dictionary allows one type of update, but not another

I am working on some Django/Python code.
Basically, the backend of my code gets sent a dict of parameters named 'p'. These values all come off Django models.
When I tried to override them as such:
p['age']=25
I got a 'model error'. Yet, if I write:
p.age=25
it works fine.
My suspicion is that, internally, choice #1 tries to set a new value via item assignment on an instance of a class created by Django that objects to being used that way, whereas choice #2 has Python simply rebind an attribute of the same name ('age') on the instance, without regard for the prior origin, type, or class of what Django created.
All of this is in a RESTful framework, and actually in test code. So even if I am right I don't believe it changes anything for me in reality.
But can anyone explain why one type of assignment to an existing dict works, and the other fails?
p is an object (a model instance), not a dict. Django built it that way.
As such, attribute assignment (p.age) lets you change an attribute of the object, while item assignment (p['age']) requires dictionary-like behaviour the class doesn't provide.
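A minimal sketch of the difference. ModelInstance here is a made-up stand-in for whatever class Django built; the point is only that it exposes fields as attributes, not keys:

```python
class ModelInstance:
    # Stand-in: fields are plain attributes, as on a Django model instance.
    def __init__(self, age):
        self.age = age

p = ModelInstance(age=30)

p.age = 25            # attribute assignment: always allowed on plain objects
print(p.age)          # 25

try:
    p['age'] = 25     # item assignment needs __setitem__, which the class lacks
except TypeError as e:
    print(type(e).__name__)  # TypeError
```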

Secure-by-default django ORM layer---how?

I'm running a Django shop where we serve each of our clients an object graph which is completely separate from the graphs of all the other clients. The data is moderately sensitive, so I don't want any of it to leak from one client to another, nor for one client to delete or alter another client's data.
I would like to structure my code such that I by default write code which adheres to the security requirements (No hard guarantees necessary), but lets me override them when I know I need to.
My main fear is that in a Twig.objects.get(...), I forget to add client=request.client, and likewise for Leaf.objects.get where I have to check that twig__client=request.client. This quickly becomes error-prone and complicated.
What are some good ways to get around my own forgetfulness? How do I make this a thing I don't have to think about?
One candidate solution I have in mind is this:
Set the default object manager as DANGER = models.Manager() on my abstract base class(es).
Have a method ok(request) on said base classes which applies .filter(leaf__twig__branch__trunk__root__client=request.client) as applicable.
Use MyModel.ok(request) instead of MyModel.objects wherever feasible.
Can this be improved upon? One not so nice issue is when a view calls a model method, e.g. branch.get_twigs_with_fruit, I now have to either pass a request for it to run through ok or I have to invoke DANGER. I like neither :-\
Is there some way of getting access to the current request? I think that might mitigate the situation...
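The split I have in mind looks roughly like this, sketched with plain-Python stand-ins. FakeRequest, the row dicts, and the list-based DANGER are all made up; a real implementation would put the scoping in a models.Manager subclass:

```python
class FakeRequest:
    # Stand-in for a Django request carrying the authenticated client.
    def __init__(self, client):
        self.client = client

class Twig:
    # Stand-in rows; in Django these would sit behind models.Manager().
    DANGER = [
        {'id': 1, 'client': 'acme'},
        {'id': 2, 'client': 'globex'},
    ]

    @classmethod
    def ok(cls, request):
        # The scoped accessor: only the requesting client's rows get out.
        return [row for row in cls.DANGER if row['client'] == request.client]

request = FakeRequest(client='acme')
print(Twig.ok(request))   # only acme's rows
print(Twig.DANGER)        # everything -- reach for this name deliberately
```

The loud DANGER name is the point: the safe spelling is the short, convenient one, and the unscoped escape hatch is ugly enough that it stands out in code review.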
I'll explain a different problem I had; however, I think the solution might be something to look into.
Once I was working on a project to visualize data, where I needed a really big table that would store all the data for all visualizations. That turned out to be a big problem, because I would have to do things like Model.objects.filter(visualization=5), which was not very elegant and not efficient.
To make things simpler and more efficient I ended up creating dynamic models on the fly. Essentially I would create a separate table in the db on the fly and then store data for only that one visualization in it. My code is something like:
from django.db import models
from django.db.models.base import ModelBase

def get_model_class(table_name):
    class DynamicModelBase(ModelBase):
        def __new__(cls, name, bases, attrs):
            name = '{}_{}'.format(name, table_name)
            return super(DynamicModelBase, cls).__new__(cls, name, bases, attrs)

    class Data(models.Model):
        # fields here
        __metaclass__ = DynamicModelBase

        class Meta(object):
            db_table = table_name

    return Data

dynamic_model = get_model_class('foo')
This was useful for my purposes because it allowed queries to be much faster; and getting back to your issue, I think something like this can be useful because it makes sure that each client's data is separated not only via a foreign key, but actually separated in the db.
Using this method is pretty straightforward, except that before using the model you have to call the function to get it for each client. To make things more efficient you can cache/memoize the results of the function call so that it does not have to recompute the same thing more than once.
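The memoization step can be as simple as functools.lru_cache. Here is a plain-Python stand-in just to show the caching behaviour; type() replaces the Django metaclass machinery above, and the names are made up:

```python
import functools

@functools.lru_cache(maxsize=None)
def get_model_class(table_name):
    # Stand-in for the Django version: build a class whose name and
    # db_table embed the per-client table name, once per table name.
    return type('Data_{}'.format(table_name), (object,), {'db_table': table_name})

a = get_model_class('client_foo')
b = get_model_class('client_foo')
print(a is b)          # True -- cached, not recomputed
print(a.__name__)      # Data_client_foo
```

Caching also matters for correctness with real Django models: defining the same model class twice under one name can trigger app-registry warnings, so returning the cached class avoids that too.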

How to do multiple save object calls in a Django view, but commit only once

I have a Django view in which I conditionally call my_model.save() on a single object in multiple spots. my_model is a normal model class.
Each save() is committed at once in Django, and thus the database gets hit several times in the worst case. To prevent this, I defined a boolean variable save_model and set it to True whenever the object is modified. At the end of my view, I check this boolean and call save on my object if needed.
Is there a simpler way of doing this? I tried Django's transaction.commit_on_success as a view decorator, but the save calls appear to get queued and committed anyway.
You could look into django-dirtyfields.
Simply use DirtyFieldsMixin as a mixin to your model. You will then be able to check if an object has changed (using obj.is_dirty()) before doing a save().
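The idea behind the mixin can be sketched in plain Python. This illustrates dirty-checking in general, not django-dirtyfields' actual implementation; DirtyTracker and its method names are made up:

```python
class DirtyTracker:
    # Remembers a clean snapshot of the fields and compares against it.
    def __init__(self, **fields):
        self.__dict__.update(fields)
        self._snapshot = dict(fields)

    def is_dirty(self):
        current = {k: v for k, v in self.__dict__.items() if k != '_snapshot'}
        return current != self._snapshot

    def save(self):
        # A real model would hit the database here; we just reset the snapshot.
        self._snapshot = {k: v for k, v in self.__dict__.items() if k != '_snapshot'}

obj = DirtyTracker(attr1=False, attr2=False)
print(obj.is_dirty())   # False -- nothing changed yet
obj.attr1 = True
print(obj.is_dirty())   # True -- worth saving
obj.save()
print(obj.is_dirty())   # False again
```

With this in place, the view's final step collapses to `if obj.is_dirty(): obj.save()` and the hand-rolled boolean flag disappears.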
You can use transaction support everywhere in your code, the Django docs say it explicitly:
Although the examples below use view functions as examples, these decorators and context managers can be used anywhere in your code that you need to deal with transactions
But this isn't the thing transactions are for. You can get rid of your boolean variable using some existing app for that, like django-dirtyfields.
But it smells like a bad design. Why do you need to call save multiple times? Are you sure there is no way to call it only once?
There can be two approaches for this, but they are similar. The first is to call save() once, just before returning the response:
def my_view(request):
    obj = Mymodel.objects.get(...)
    if cond1:
        obj.attr1 = True
    elif cond2:
        obj.attr2 = True
    else:
        obj.attr1 = False
        obj.attr2 = False
    obj.save()
    return .......
The second one is your approach.
There is no other way around this, short of defining your own decorator or some similar machinery: one way or another, you need to check whether there was any modification to your model (i.e., whether you want to save changes to your data).

Move a python / django object from a parent model to a child (subclass)

I am subclassing an existing model. I want many of the members of the parent class to now, instead, be members of the child class.
For example, I have a model Swallow. Now, I am making EuropeanSwallow(Swallow) and AfricanSwallow(Swallow). I want to take some, but not all, Swallow objects and make them either EuropeanSwallow or AfricanSwallow, depending on whether they are migratory.
How can I move them?
It's a bit of a hack, but this works:
swallow = Swallow.objects.get(id=1)
swallow.__class__ = AfricanSwallow
# set any required AfricanSwallow fields here
swallow.save()
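The hack works because reassigning __class__ changes which type Python considers the instance to be, leaving its attribute dict untouched. A plain-Python sketch (these Swallow classes are dummies, not models, and migratory is a made-up attribute):

```python
class Swallow:
    def __init__(self, name):
        self.name = name

class AfricanSwallow(Swallow):
    migratory = False  # hypothetical subclass-specific attribute

s = Swallow('sam')
s.__class__ = AfricanSwallow   # only the instance's type changes

print(isinstance(s, AfricanSwallow))  # True
print(s.name)                         # sam -- existing attributes survive
print(s.migratory)                    # False -- found on the new class
```

In the Django case you must still populate the subclass's own table row (the "set any required AfricanSwallow fields here" step) before calling save(), since the attribute swap alone knows nothing about the child table.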
I know this is much later, but I needed to do something similar and couldn't find much. I found the answer buried in some source code here, but also wrote an example class-method that would suffice.
class AfricanSwallow(Swallow):
    @classmethod
    def save_child_from_parent(cls, swallow, new_attrs):
        """
        Inputs:
        - swallow: instance of Swallow we want to turn into an AfricanSwallow
        - new_attrs: dictionary of new attributes for AfricanSwallow
        Adapted from:
        https://github.com/lsaffre/lino/blob/master/lino/utils/mti.py
        """
        parent_link_field = AfricanSwallow._meta.parents.get(swallow.__class__, None)
        new_attrs[parent_link_field.name] = swallow
        for field in swallow._meta.fields:
            new_attrs[field.name] = getattr(swallow, field.name)
        s = AfricanSwallow(**new_attrs)
        s.save()
        return s
However, I couldn't figure out how to get my form validation to work with this method, so it could certainly be improved further; a database refactoring is probably the best long-term solution...
Depends on what kind of model inheritance you'll use. See
http://docs.djangoproject.com/en/dev/topics/db/models/#model-inheritance
for the three classic kinds. Since it sounds like you want Swallow objects that rules out Abstract Base Class.
If you want to store different information in the db for Swallow vs AfricanSwallow vs EuropeanSwallow, then you'll want to use MTI (multi-table inheritance). The biggest problem with MTI as the official Django docs recommend it is that polymorphism doesn't work properly. That is, if you fetch a Swallow object from the DB which is actually an AfricanSwallow object, you won't get an instance of AfricanSwallow. (See this question.) Something like django-model-utils' InheritanceManager can help overcome that.
If you have actual data you need to preserve through this change, use South migrations. Make two migrations -- first one that changes the schema and another that copies the appropriate objects' data into subclasses.
I suggest using django-model-utils' InheritanceCastModel. This is one implementation I like. You can find many more in djangosnippets and some blogs, but after going through them all I chose this one. Hope it helps.
Another (outdated) approach: If you don't mind keeping parent's id you can just create brand new child instances from parent's attrs. This is what I did:
ids = [s.pk for s in Swallow.objects.all()]
# I get an ids list to avoid memory problems with long lists
for i in ids:
    p = Swallow.objects.get(pk=i)
    c = AfricanSwallow(att1=p.att1, att2=p.att2, .....)
    p.delete()
    c.save()
Once this runs, a new AfricanSwallow instance is created to replace each initial Swallow instance.
Maybe this will help someone :)

Set the maximum recursion depth while serializing a Django model with Foreign key to JSON

I have Django-style models created for Google's App Engine:
class A(Model):
    propA = ReferenceProperty(B)
class B(Model):
    propB = ReferenceProperty(C)
class C(Model):
    propC = ReferenceProperty(B)
I have written custom Django serializer which will fetch the data for the ReferenceProperty(s) and serialize that along the initial model.
The problem occurs when I try to serialize an instance of Model A. My custom serializer will fetch propA, which references Model B; Model B references Model C, which in turn references Model B again, so the recursion goes on and on. Is there any way to stop the recursion after a depth of, say, 2?
My serializer is a customized version of link text
P.S: I am willing to publish my code if that seems to needed. I have not currently attached the code since I am not at my development machine.
Thanks,
Arun Shanker Prasad.
Just modify your functions to take a 'depth' argument. Any time you follow a ReferenceProperty, call the function with depth one less than the depth that was passed in. If a function is called with depth==0, return None, or whatever other placeholder value is suitable in your case.
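The depth budget looks like this in a plain-Python sketch. serialize and Node are made up for illustration; the real serializer would walk ReferenceProperty fields rather than __dict__:

```python
def serialize(obj, depth=2):
    # Spend one unit of depth for each reference we follow; stop at zero.
    if depth == 0:
        return None  # placeholder once the budget is exhausted
    out = {}
    for key, value in vars(obj).items():
        if hasattr(value, '__dict__'):   # looks like a reference to another entity
            out[key] = serialize(value, depth - 1)
        else:
            out[key] = value
    return out

class Node:
    pass

a, b = Node(), Node()
a.name, a.ref = 'a', b
b.name, b.ref = 'b', a   # a cycle, like A -> B -> C -> B in the question

print(serialize(a, depth=2))  # {'name': 'a', 'ref': {'name': 'b', 'ref': None}}
```

Because the budget shrinks on every recursive call, even a cyclic object graph terminates after at most `depth` hops.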
Why don't you just do recursion properly? Any recursive operation must have a base case, otherwise it will continue forever, as your problem indicates.
I'm trying to find a serializer that works with Google App Engine and follows relationships. Would it be possible for you to post the modified code you used to do this?