I have a little problem: how can I serialize a Django queryset that uses defer()?
I have this model:
class Evento(models.Model):
    nome = models.CharField(max_length=100)
    descricao = models.CharField(max_length=200, null=True)
    data_inicio = models.DateTimeField()
    data_fim = models.DateTimeField()
    preco = models.DecimalField(max_digits=6, decimal_places=2)
    consumiveis = models.CharField(max_length=5)
    dress_code = models.CharField(max_length=6)
    guest_list = models.CharField(max_length=15)
    local = models.ForeignKey(Local)
    user = models.ManyToManyField(User, null=True, blank=True)

    def __unicode__(self):
        return unicode('%s %s' % (self.nome, self.descricao))
My query is this:
eventos_totais = Evento.objects.defer("user").filter(data_inicio__gte=default_inicio,
                                                     data_fim__lte=default_fim)
It works fine, I think (how can I check whether the query has really deferred the user field?), but when I do:
json_serializer = serializers.get_serializer("json")()
eventos_totais = json_serializer.serialize(eventos_totais,
                                           ensure_ascii=False,
                                           use_natural_keys=True)
it always follows the natural keys for user and local. I need natural keys in this query because of the local field, but I do not need the user field.
To serialize a subset of your model's fields, you need to pass the fields argument to serializers.serialize():
from django.core import serializers
data = serializers.serialize('xml', SomeModel.objects.all(), fields=('name','size'))
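Applied to the queryset from the question, that would look something like the following sketch: the field list is simply the Evento fields minus user, so the serializer never touches the deferred many-to-many, while use_natural_keys still applies to local.

from django.core import serializers

eventos_totais = Evento.objects.filter(data_inicio__gte=default_inicio,
                                       data_fim__lte=default_fim)
data = serializers.serialize('json', eventos_totais,
                             ensure_ascii=False,
                             use_natural_keys=True,
                             fields=('nome', 'descricao', 'data_inicio', 'data_fim',
                                     'preco', 'consumiveis', 'dress_code',
                                     'guest_list', 'local'))

With fields set, the serializer simply skips user, so the defer() is no longer needed for the output.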
Ref: Django Docs
I have database tables with a 'TYPE' column and many other fields. In many cases, certain column values are null based on the value of 'TYPE'.
E.g. say I have a product table, with TYPE being either 'car' or 'helicopter'. The columns are:
vertical speed, horizontal speed, and horn amplitude.
In the case of 'car' types, vertical speed should always be null, and in the case of 'helicopter', horn amplitude should always be null.
In Flask-Admin, is there any way to hide fields from being submitted based on the currently selected TYPE value?
It is fine if it is a UI level change (i.e. no backend validation is required for security/consistency purposes).
In my real-life scenario, there are over 10 columns, with 5+ being null in some cases, so it would be very helpful if those fields could be removed from the UI (it makes the form very long and prone to errors).
I am using Flask-SQLAlchemy as the backend for my Flask-Admin.
Fun question. Here is a working solution. Basically, you have product types, and each type has certain valid attributes (e.g. car and honking loudness). You can also have general attributes regardless of the type, e.g. the name of each product.
In on_form_prefill, you check which fields are valid for the product type, throw the invalid fields out of the form, and return the form. It's actually pretty straightforward.
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
import flask_admin as admin
from flask_admin.contrib import sqla

app = Flask(__name__)
app.secret_key = 'arstt'
db = SQLAlchemy(app)


class Product(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    type = db.Column(db.String)
    name = db.Column(db.String)
    vertical_speed = db.Column(db.Integer)
    rotor_rpm = db.Column(db.Integer)
    honking_loudness = db.Column(db.Integer)

    def __str__(self):
        return "{}".format(self.name)


class ProductView(sqla.ModelView):
    general_product_attributes = ['name']
    product_attributes_per_product_type_dict = {
        'driving': ['honking_loudness'],
        'flying': ['rotor_rpm', 'vertical_speed']
    }

    def on_form_prefill(self, form, id):
        product = self.get_one(id)
        form_attributes = (self.general_product_attributes +
                           self.product_attributes_per_product_type_dict[product.type])
        for field in list(form):
            if field.name not in form_attributes:
                delattr(form, field.name)
        return form


db.create_all()

admin = admin.Admin(app, name='Example: SQLAlchemy', template_mode='bootstrap3')
admin.add_view(ProductView(Product, db.session))

helicopter = Product(type='flying', name='helicopter1', vertical_speed=99)
car = Product(type='driving', name='car2', honking_loudness=33)
db.session.add(helicopter)
db.session.add(car)
db.session.commit()
Note that this only works for the edit form; all attributes are still displayed on the create form, because at that point it is not yet certain what type a product will be.
You can override create_form for the create form and on_form_prefill for the edit form.
In these functions you can pass parameters to the fields using form_widget_args:
def create_form(self):
    form = super(JobsView, self).create_form()
    # kw = decide which fields to show or hide
    # kw["vertical_speed"]["class"] = "hide"
    self.form_widget_args = kw
    return form


def on_form_prefill(self, form, id):
    # kw = decide which fields to show or hide
    # kw["vertical_speed"]["class"] = "hide"
    self.form_widget_args = kw
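For instance, a concrete kw for the Product model above might be built like this (a sketch; it assumes a CSS class named hide is available in your admin templates):

def on_form_prefill(self, form, id):
    product = self.get_one(id)
    if product.type == 'driving':
        # hide the flying-only fields for cars
        kw = {'vertical_speed': {'class': 'hide'},
              'rotor_rpm': {'class': 'hide'}}
    else:
        # hide the driving-only field for helicopters
        kw = {'honking_loudness': {'class': 'hide'}}
    self.form_widget_args = kw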
Maybe I misunderstand the purpose of Django's update_or_create method.
Here is my Model:
from django.db import models
import datetime
from vc.models import Cluster


class Vmt(models.Model):
    added = models.DateField(default=datetime.date.today, blank=True, null=True)
    creation_time = models.TextField(blank=True, null=True)
    current_pm_active = models.TextField(blank=True, null=True)
    current_pm_total = models.TextField(blank=True, null=True)
    # ... more simple fields ...
    cluster = models.ForeignKey(Cluster, null=True)

    class Meta:
        unique_together = (("cluster", "added"),)
Here is my test:
from django.test import TestCase
from .models import *
from vc.models import Cluster
from django.db import transaction


# Create your tests here.
class VmtModelTests(TestCase):

    def test_insert_into_VmtModel(self):
        count = Vmt.objects.count()
        self.assertEqual(count, 0)

        # create a Cluster
        c = Cluster.objects.create(name='test-cluster')

        Vmt.objects.create(
            cluster=c,
            creation_time='test creaetion time',
            current_pm_active=5,
            current_pm_total=5,
            # ... more simple fields ...
        )
        count = Vmt.objects.count()
        self.assertEqual(count, 1)
        self.assertEqual('5', c.vmt_set.all()[0].current_pm_active)

        # let's test that we cannot add that same record again
        try:
            with transaction.atomic():
                Vmt.objects.create(
                    cluster=c,
                    creation_time='test creaetion time',
                    current_pm_active=5,
                    current_pm_total=5,
                    # ... more simple fields ...
                )
            self.fail(msg="Should violated integrity constraint!")
        except Exception as ex:
            template = "An exception of type {0} occurred. Arguments:\n{1!r}"
            message = template.format(type(ex).__name__, ex.args)
            self.assertEqual("An exception of type IntegrityError occurred.", message[:45])

        Vmt.objects.update_or_create(
            cluster=c,
            creation_time='test creaetion time',
            # notice we are updating current_pm_active to 6
            current_pm_active=6,
            current_pm_total=5,
            # ... more simple fields ...
        )
        count = Vmt.objects.count()
        self.assertEqual(count, 1)
On the last update_or_create call I get this error:
IntegrityError: duplicate key value violates unique constraint "vmt_vmt_cluster_id_added_c2052322_uniq"
DETAIL: Key (cluster_id, added)=(1, 2018-06-18) already exists.
Why wasn't the model updated? Why did Django try to create a new record that violated the unique constraint?
update_or_create(defaults=None, **kwargs) basically has two parts:
the **kwargs, which specify the "filter" criteria that determine whether such an object is already present; and
the defaults, which is a dictionary containing the fields mapped to the values that should be used when we create a new row (in case the filtering fails to find a row), or the values that should be updated (in case we do find such a row).
The problem here is that you make your filters too restrictive: you add several filters, and as a result the database does not find such a row. So what happens? Django then aims to create a row with those filter values (and since defaults is missing, no extra values are added). But it turns out that the combination of cluster and added already exists, so the database refuses to add the row.
So this line:
Model.objects.update_or_create(field1=val1,
                               field2=val2,
                               defaults={
                                   'field3': val3,
                                   'field4': val4
                               })
is semantically approximately equal to:
try:
    item = Model.objects.get(field1=val1, field2=val2)
except Model.DoesNotExist:
    Model.objects.create(field1=val1, field2=val2, field3=val3, field4=val4)
else:
    item = Model.objects.filter(
        field1=val1,
        field2=val2,
    ).update(
        field3=val3,
        field4=val4,
    )
(but the original call is done atomically).
You should thus probably write:
Vmt.objects.update_or_create(
    cluster=c,
    creation_time='test creaetion time',
    defaults={
        'current_pm_active': 6,
        'current_pm_total': 5,
    }
)
(or something similar)
You should separate your fields into:
Fields that should be searched for
Fields that should be updated
For example, if I have the model:
class User(models.Model):
    username = models.CharField(max_length=200)
    nickname = models.CharField(max_length=200)
And I want to search for username='Nikolas' and update that instance's nickname to 'Nik' (or create it if no User with username 'Nikolas' exists), I should write this code:
User.objects.update_or_create(
    username='Nikolas',
    defaults={'nickname': 'Nik'},
)
See https://docs.djangoproject.com/en/3.1/ref/models/querysets/.
This is already well answered above.
To be clearer: the **kwargs of update_or_create() are the parameters you filter on to check whether that data already exists in the DB, much like:
select some_column from table_name where column1='' and column2='';
Filtering by **kwargs gives you objects. If you wish to update any data/column of those filtered objects, you should pass those values in the defaults param of update_or_create().
So, if an object is found based on the filter, the defaults values are picked up and used for the update.
And if no matching object is found based on the filter, it goes ahead and creates an entry from both the filter values and the defaults passed.
I am trying to filter my queryset in a view based on a relation between two fields.
However, I always get an error that my field is not defined.
My model has several calculated columns, and I want to get only the records where the value of field A is greater than the value of field B.
So this is my model:
class Material(models.Model):
    version = IntegerVersionField()
    code = models.CharField(max_length=30)
    name = models.CharField(max_length=30)
    min_quantity = models.DecimalField(max_digits=19, decimal_places=10)
    max_quantity = models.DecimalField(max_digits=19, decimal_places=10)
    is_active = models.BooleanField(default=True)

    def _get_totalinventory(self):
        from inventory.models import Inventory
        return Inventory.objects.filter(warehouse_Bin__material_UOM__UOM__material=self.id,
                                        is_active=True).aggregate(Sum('quantity'))
    totalinventory = property(_get_totalinventory)

    def _get_totalpo(self):
        from purchase.models import POmaterial
        return POmaterial.objects.filter(material=self.id, is_active=True).aggregate(Sum('quantity'))
    totalpo = property(_get_totalpo)

    def _get_totalso(self):
        from sales.models import SOproduct
        return SOproduct.objects.filter(product__material=self.id, is_active=True).aggregate(Sum('quantity'))
    totalso = property(_get_totalso)

    def _get_total(self):
        return self.totalinventory + self.totalpo - self.totalso
    total = property(_get_total)
And this is the line in my view where I try to get the conditional queryset:
po_list = MaterialFilter(request.GET, queryset=Material.objects.filter(total__lte=min_quantity))
But I am getting an error that min_quantity is not defined.
What could be the problem?
EDIT:
My problem got solved, thank you @Moses Koledoye, but in the same code I now have a different issue:
Cannot resolve keyword 'total' into field. Choices are: am, author, author_id, bom, bomversion, code, creation_time, description, id, inventory, is_active, is_production, itemgroup, itemgroup_id, keywords, materialuom, max_quantity, min_quantity, name, pomaterial, produce, product, slug, trigger_quantity, uom, updated_by, updated_by_id, valid_from, valid_to, version, warehousebin
Basically, it doesn't show any of the calculated fields I have in my model.
Django cannot write a query that is conditioned on a field whose value is unknown. You need to use an F expression for this:
from django.db.models import F
queryset = Material.objects.filter(total__lte = F('min_quantity'))
And your FilterSet becomes:
po_list = MaterialFilter(request.GET, queryset = Material.objects.filter(total__lte=F('min_quantity')))
From the docs:
An F() object represents the value of a model field or annotated column. It makes it possible to refer to model field values and perform database operations using them without actually having to pull them out of the database into Python memory.
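For instance, with the Material model above, F() lets a comparison between two columns of the same row run entirely in SQL (a sketch using only real database fields, not the Python properties):

from django.db.models import F

# materials whose maximum stock level is less than twice the minimum level,
# evaluated in the database rather than in Python
low_headroom = Material.objects.filter(max_quantity__lt=F('min_quantity') * 2)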
I'm using Django 1.8.4 on my dev machine with SQLite, and I have these models:
class ModelA(Model):
    field_a = CharField(verbose_name='a', max_length=20)
    field_b = CharField(verbose_name='b', max_length=20)

    class Meta:
        unique_together = ('field_a', 'field_b',)


class ModelB(Model):
    field_c = CharField(verbose_name='c', max_length=20)
    field_d = ForeignKey(ModelA, verbose_name='d', null=True, blank=True)

    class Meta:
        unique_together = ('field_c', 'field_d',)
I've run the proper migrations and registered the models in the Django admin. Using the admin, I've done these tests:
I'm able to create ModelA records, and Django prohibits me from creating duplicate records - as expected!
I'm not able to create identical ModelB records when field_d is not empty.
But I am able to create identical ModelB records when field_d is empty.
My question is: how do I apply unique_together with a nullable ForeignKey?
The most recent answer I found for this problem is 5 years old... I do think Django has evolved and the issue may not be the same.
Django 2.2 added a new constraints API which makes addressing this case much easier within the database.
You will need two constraints:
The existing tuple constraint; and
The remaining keys minus the nullable key, with a condition
If you have multiple nullable fields, I guess you will need to handle the permutations.
Here's an example with a triple of fields that must be unique together, where only one of the fields is allowed to be NULL:
from django.db import models
from django.db.models import Q
from django.db.models.constraints import UniqueConstraint


class Badger(models.Model):
    required = models.ForeignKey(Required, ...)
    optional = models.ForeignKey(Optional, null=True, ...)
    key = models.CharField(db_index=True, ...)

    class Meta:
        constraints = [
            UniqueConstraint(fields=['required', 'optional', 'key'],
                             name='unique_with_optional'),
            UniqueConstraint(fields=['required', 'key'],
                             condition=Q(optional=None),
                             name='unique_without_optional'),
        ]
UPDATE: the previous version of my answer was functional but badly designed; this one takes into account some of the comments and other answers.
In SQL, NULL does not equal NULL. This means that if you have two objects where field_d == None and field_c == "somestring", they are not considered equal, so you can create both.
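You can see this directly with the models from the question; both of the saves below succeed, because the two NULLs are never considered equal (a minimal sketch):

ModelB.objects.create(field_c='somestring', field_d=None)
ModelB.objects.create(field_c='somestring', field_d=None)  # no IntegrityError: NULL != NULL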
You can override Model.validate_unique to add your check:
class ModelB(Model):
    # ...

    def validate_unique(self, exclude=None):
        if ModelB.objects.exclude(id=self.id).filter(field_c=self.field_c,
                                                     field_d__isnull=True).exists():
            raise ValidationError("Duplicate ModelB")
        super(ModelB, self).validate_unique(exclude)
If used outside of forms, you have to call full_clean or validate_unique yourself.
Take care to handle the race condition though.
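For example (a sketch), outside of a form you would trigger the check explicitly before saving:

b = ModelB(field_c='somestring', field_d=None)
b.validate_unique()  # raises ValidationError if a duplicate already exists
b.save()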
@ivan, I don't think there's a simple way for Django to manage this situation. You need to think of all the creation and update operations, which don't always come from a form. Also, you should think about race conditions...
And because you don't enforce this logic at the DB level, it's possible that there will actually be duplicated records, and you should check for them while querying results.
And about your solution: it can be good for forms, but I wouldn't expect the save method to raise a ValidationError.
If possible, it's better to delegate this logic to the DB. In this particular case, you can use two partial indexes. There's a similar question on Stack Overflow: Create unique constraint with null columns.
So you can create a Django migration that adds two partial indexes to your DB.
Example:
# Assume that the app name is just `example`
from django.db import migrations

CREATE_TWO_PARTIAL_INDEX = """
    CREATE UNIQUE INDEX model_b_2col_uni_idx ON example_model_b (field_c, field_d)
    WHERE field_d IS NOT NULL;
    CREATE UNIQUE INDEX model_b_1col_uni_idx ON example_model_b (field_c)
    WHERE field_d IS NULL;
"""

DROP_TWO_PARTIAL_INDEX = """
    DROP INDEX model_b_2col_uni_idx;
    DROP INDEX model_b_1col_uni_idx;
"""


class Migration(migrations.Migration):
    dependencies = [
        ('example', 'PREVIOUS MIGRATION NAME'),
    ]

    operations = [
        migrations.RunSQL(CREATE_TWO_PARTIAL_INDEX, DROP_TWO_PARTIAL_INDEX)
    ]
Add a clean method to your model - see below:
def clean(self):
    if Variants.objects.filter("""Your filter """).exclude(pk=self.pk).exists():
        raise ValidationError("This variation is duplicated.")
I think this is a clearer way to do it for Django 1.2+.
In forms it will be raised as a non_field_error with no 500 error; in other cases, like DRF, you have to check this case manually, because otherwise it will be a 500 error.
But it will always check for unique_together!
class BaseModelExt(models.Model):
    is_cleaned = False

    def clean(self):
        for field_tuple in self._meta.unique_together[:]:
            unique_filter = {}
            unique_fields = []
            null_found = False
            for field_name in field_tuple:
                field_value = getattr(self, field_name)
                if getattr(self, field_name) is None:
                    unique_filter['%s__isnull' % field_name] = True
                    null_found = True
                else:
                    unique_filter['%s' % field_name] = field_value
                    unique_fields.append(field_name)
            if null_found:
                unique_queryset = self.__class__.objects.filter(**unique_filter)
                if self.pk:
                    unique_queryset = unique_queryset.exclude(pk=self.pk)
                if unique_queryset.exists():
                    msg = self.unique_error_message(self.__class__, tuple(unique_fields))
                    raise ValidationError(msg)
        self.is_cleaned = True

    def save(self, *args, **kwargs):
        if not self.is_cleaned:
            self.clean()
        super().save(*args, **kwargs)
One possible workaround not mentioned yet is to create a dummy ModelA object to serve as your NULL value. Then you can rely on the database to enforce the uniqueness constraint.
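A minimal sketch of that idea (the get_sentinel_model_a_pk helper and its marker values are made up for illustration):

def get_sentinel_model_a_pk():
    # one well-known ModelA row that stands in for "no value"
    sentinel, _ = ModelA.objects.get_or_create(field_a='__none__', field_b='__none__')
    return sentinel.pk


class ModelB(Model):
    field_c = CharField(verbose_name='c', max_length=20)
    # no null=True: the sentinel row is used instead of NULL, so unique_together applies
    field_d = ForeignKey(ModelA, verbose_name='d', default=get_sentinel_model_a_pk)

    class Meta:
        unique_together = ('field_c', 'field_d',)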
So I'm trying to return a JSON object for a project. I've spent a few hours trying to get Django to just return the JSON.
Here's the view that we've been working with:
def json(request, first_name):
    user = User.objects.all()
    # user = User.objects.all().values()
    result = simplejson.dumps(user, default=json_util.default)
    return HttpResponse(result)
Here's my model:
class User(Document):
    gender = StringField(choices=['male', 'female', 'Unknown'])
    age = IntField()
    email = EmailField()
    display_name = StringField(max_length=50)
    first_name = StringField(max_length=50)
    last_name = StringField(max_length=50)
    location = StringField(max_length=50)
    status = StringField(max_length=50)
    hideStatus = BooleanField()
    photos = ListField(EmbeddedDocumentField('Photo'))
    profile = ListField(EmbeddedDocumentField('ProfileItem'))
    allProfile = ListField(EmbeddedDocumentField('ProfileItem'))  # only return for your own profile
This is what it's returning:
[<User: User object>, <User: User object>] is not JSON serializable
Any thoughts on how I can just return the JSON?
With MongoEngine 0.8 or greater, objects and querysets have a to_json() method.
>>> User.objects.to_json()
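Plugged into the view from the question, that might look like this (a sketch, assuming MongoEngine 0.8+; the view is renamed to avoid shadowing anything called json):

from django.http import HttpResponse

def user_json(request, first_name):
    users = User.objects.all()
    return HttpResponse(users.to_json(), content_type='application/json')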
simplejson.dumps() doesn't know how to "reach into" your custom objects; the default function, json_util.default, must just be calling str() or repr() on your documents. (Is json_util custom code you've written? If so, showing its source here could prove my claim.)
Ultimately, your default function will need to be able to make sense of the MongoEngine documents. I can think of at least two ways that this might be implemented:
Write a custom default function that works for all MongoEngine documents by introspecting their _fields attribute (though note that the leading underscore means that this is part of the private API/implementation detail of MongoEngine and may be subject to change in future versions)
Have each of your documents implement an as_dict method which returns a dictionary representation of the object. This would work similarly to the to_mongo method provided on documents by MongoEngine, but shouldn't return the _types or _cls fields (again, these are implementation details of MongoEngine).
I'd suggest you go with option #2: the code will be cleaner and easier to read, better encapsulated, and won't require using any private APIs.
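A minimal sketch of option #2, wired back into the original view (the as_dict body is only an illustration; list whichever fields you actually want to expose):

class User(Document):
    # ... fields as above ...

    def as_dict(self):
        # explicit, hand-picked dictionary representation
        return {
            'gender': self.gender,
            'age': self.age,
            'email': self.email,
            'display_name': self.display_name,
        }


def json_view(request, first_name):
    result = simplejson.dumps([u.as_dict() for u in User.objects.all()])
    return HttpResponse(result, content_type='application/json')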
As dcrosta suggested, you can do something like this; I hope it helps.
Document definition:
class MyDocument(Document):
    # Your document definition

    def to_dict(self):
        return mongo_to_dict_helper(self)
helper.py:
from mongoengine import StringField, ListField, IntField, FloatField


def mongo_to_dict_helper(obj):
    return_data = []
    for field_name in obj._fields:
        if field_name in ("id",):
            continue
        data = obj._data[field_name]
        if isinstance(obj._fields[field_name], StringField):
            return_data.append((field_name, str(data)))
        elif isinstance(obj._fields[field_name], FloatField):
            return_data.append((field_name, float(data)))
        elif isinstance(obj._fields[field_name], IntField):
            return_data.append((field_name, int(data)))
        elif isinstance(obj._fields[field_name], ListField):
            return_data.append((field_name, data))
        else:
            # You can define your logic for returning elements
            pass
    return dict(return_data)