I have a model with one field (this is a synthetic example):
from django.db import models
from django.core.validators import MinValueValidator, MaxValueValidator
from django.utils.translation import ugettext_lazy as _

class Tank(models.Model):
    oxygen = models.PositiveSmallIntegerField(
        _("Oxygen %"),
        help_text="%",
        default=21,
        validators=[MinValueValidator(21.0), MaxValueValidator(50.0)],
        null=True,
    )
And I parse some files with data. I want to validate the input data before writing it to a model instance. Something like this:
oxygen = get_raw_data()
Tank.oxygen.validate(oxygen) # this is wrong I know :)
# if value is valid do something
# else do something
What should I write instead of Tank.oxygen.validate(oxygen)?
I could duplicate the validation logic or validate the data when saving the model instance, but maybe somebody knows a better solution.
You need to actually create an instance with the data, then call full_clean() on it:
my_tank = Tank(oxygen=oxygen)
my_tank.full_clean()
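If you want to branch on validity rather than let the exception propagate (as in your pseudocode), a minimal sketch, assuming your get_raw_data() helper and a hypothetical handle_invalid() function, could look like this:
from django.core.exceptions import ValidationError

oxygen = get_raw_data()  # raw value parsed from your files
my_tank = Tank(oxygen=oxygen)
try:
    # full_clean() runs the field validators plus clean_fields(), clean() and unique checks
    my_tank.full_clean()
except ValidationError as e:
    # e.message_dict maps field names to lists of error messages
    handle_invalid(oxygen, e.message_dict)  # hypothetical error handler
else:
    my_tank.save()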
If you only want to validate one field, then I suggest you use a forms.Field subclass to do it:
from django import forms
from django.core.exceptions import ValidationError

oxygen_field = forms.IntegerField(required=True, min_value=21, max_value=50)

def is_oxygen_valid(value):
    try:
        oxygen_field.clean(value)
    except ValidationError:
        return False
    else:
        return True
Testing:
>>> is_oxygen_valid(None)
False
>>> is_oxygen_valid(11)
False
>>> is_oxygen_valid(55)
False
>>> is_oxygen_valid('hello')
False
>>> is_oxygen_valid(list())
False
>>> is_oxygen_valid(45)
True
I'm assuming that you are going to be validating the oxygen field first, and then deciding what to do depending on the result. You could adapt this to any other field you need. Just find the appropriate FormField class, and any Validators you might need, and use that instead.
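For example (just a sketch with a hypothetical date field, not something from your model), the same pattern works for any other type:
from django import forms
from django.core.exceptions import ValidationError

# hypothetical: validate a date string parsed from a file
date_field = forms.DateField(required=True, input_formats=['%d/%m/%Y'])

def is_date_valid(value):
    try:
        date_field.clean(value)
    except ValidationError:
        return False
    return True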
Even though I put these lines in my settings.py:
LANGUAGE_CODE = 'pt-br'
TIME_ZONE = 'America/Sao_Paulo'
USE_I18N = True
USE_L10N = True
USE_TZ = True
DECIMAL_SEPARATOR = ','
DATE_INPUT_FORMATS = ['%d/%m/%Y']
DATE_FORMAT = r'd/m/Y'
As specified here: https://docs.djangoproject.com/en/1.10/ref/settings/#decimal-separator
Even with L10N set to False (although the language code alone should already set the decimal separator to a comma), Django still won't recognize the comma as the decimal separator.
Actual error :
ValueError: could not convert string to float: '123,123'
The field is just a default FloatField, I'm not using forms.
What could be causing it to not recognize the comma?
This is the views.py code:
import json

from django.apps import apps

def new_object(request):
    data = json.loads(request.body.decode("utf-8"))
    model_name = data.get('model')
    model = apps.get_model(app_label='cadastroimoveis', model_name=model_name)
    obj = model(**data.get('fields'))
    obj.save()
The request sent is just JSON with the fields as strings.
Edit: I just checked, and not even DATE_INPUT_FORMATS is working; it is still expecting the default formats.
The problem is you seem to be confusing model fields with form fields. The form field offers localization, and it works:
>>> from django.db.models.fields import FloatField
>>> from django.forms.fields import FloatField as FloatFormField
>>> model_field = FloatField()
>>> form_field = model_field.formfield(localize=True)
>>> isinstance(form_field, FloatFormField)
True
>>> form_field.to_python('123,123')
123.123
The model field does not:
>>> model_field.to_python('123,123')
ValidationError: [u"'123,123' value must be a float."]
The model field is nowhere documented as supporting localization, and I can see nothing in the source to suggest that it should be supported.
The line obj = model(**data.get('fields')) shows that you are not using forms at all, you're just consuming a JSON data source and ramming that directly into the model. So, I think the better option for you is to pre-process the JSON data, since Django doesn't seem to support what you're looking for.
You might like to use the sanitize_separators helper function, since that's what forms.fields.FloatField uses to clean the data. Demo:
>>> from django.utils.formats import sanitize_separators
>>> sanitize_separators('123,123')
'123.123'
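For example, here is a hedged sketch of pre-processing the incoming JSON in your view (the NUMERIC_FIELDS set is hypothetical; you would only want to sanitize values you know are numbers, since sanitize_separators() would also rewrite commas in ordinary text):
import json

from django.apps import apps
from django.utils.formats import sanitize_separators

NUMERIC_FIELDS = {'area', 'valor'}  # hypothetical names of your numeric fields

def new_object(request):
    data = json.loads(request.body.decode("utf-8"))
    model = apps.get_model(app_label='cadastroimoveis', model_name=data.get('model'))
    # convert '123,123' style strings to '123.123' only for known numeric fields
    fields = {
        key: sanitize_separators(value) if key in NUMERIC_FIELDS else value
        for key, value in data.get('fields').items()
    }
    obj = model(**fields)
    obj.save()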
This seems to be a Django bug, and there is something similar reported here.
But I'm pretty sure you can overcome this by using forms. I believe that when you try to update that FloatField value in your admin you see '123.123', and this is because the value is kept in the database's numeric column, which cannot store a ',' comma. But you can use forms and define the form field as follows to see a comma instead:
your_field_name = forms.FloatField(localize=True)
Setting localize=True does two things: it makes the field use a TextInput instead of a NumberInput, and it forces Django to format the number based on the localization settings (doc).
But if you don't use forms, the only way to overcome this is to sanitize your numbers before assigning them to the model field, and you can use the formats.sanitize_separators() function from django.utils:
from django.utils import formats
model.your_field_name = formats.sanitize_separators(value_from_user)
model.save()
Django ChoiceField "Validates that the given value exists in the list of choices."
I want a ChoiceField (so I can input choices in the view) but I don't want Django to check if the choice is in the list of choices. It's complicated to explain why but this is what I need. How would this be achieved?
You could create a custom ChoiceField and override validate() to skip the validation:
from django import forms

class ChoiceFieldNoValidation(forms.ChoiceField):
    def validate(self, value):
        pass
I'd like to know your use case, because I really can't think of any reason why you would need this.
Edit: to test, make a form:
class TestForm(forms.Form):
    choice = ChoiceFieldNoValidation(choices=[('one', 'One'), ('two', 'Two')])
Provide "invalid" data, and see if the form is still valid:
form = TestForm({'choice': 'not-a-valid-choice'})
form.is_valid() # True
The best way to do this, from the looks of it, is to create a forms.CharField and use a forms.Select widget. Here is an example:
from django import forms

from .models import PurchaserChoice  # wherever your model is defined

class PurchaserChoiceForm(forms.ModelForm):
    floor = forms.CharField(required=False, widget=forms.Select(choices=[]))

    class Meta:
        model = PurchaserChoice
        fields = ['model', ]
For some reason, overriding the validator alone did not do the trick for me.
As another option, you could write your own validator
from django.core.exceptions import ValidationError

def validate_all_choices(value):
    # here have your custom logic
    pass
and then in your form
class MyForm(forms.Form):
    my_field = forms.ChoiceField(validators=[validate_all_choices])
Edit: another option could be defining the field as a CharField but then rendering it manually in the template as a select with your choices. This way, it can accept anything without needing a custom validator.
I'm using django 1.5.4
Here's a minimal example of the issue I'm facing.
The model:
#models.py
from django.db import models

class SampleModel(models.Model):
    spam = models.BooleanField()
The form:
#forms.py
from django.forms import ModelForm

from .models import SampleModel

class SampleModelForm(ModelForm):
    class Meta:
        model = SampleModel
        fields = ('spam', )
From the django shell:
>>> data = {} #intentionally blank
>>> form = SampleModelForm(data)
>>> is_valid = form.is_valid() #is_valid is True
>>> form.save() # model instance is created with "spam" set to False by default.
Am I validating the form incorrectly? form.is_valid validates fields of other types correctly.
The docs indicate that all fields are required by default, but is_valid returns True without the boolean field key being present.
I need to ensure that the boolean field is present in the input data.
As of now, I'm manually checking if the field is present and is of type bool. Do you think I should override form.is_valid and add this check so that it can be reused for other models too?
It turns out (from code inspection; the docs don't say) that model BooleanFields have blank=True set automatically in their __init__ method, thus making the automatically created model form field not required. This makes sense upon consideration (False counts as blank, so BooleanFields need blank=True), but it's not obvious when you just read the docs.
If you want it to be required to be True, the usual form field overrides apply - declare the field yourself or set its required attribute to be True somewhere before validating (I usually use the form's __init__ method). If you want it to allow True or False but not Python None, it's harder.
Specifically, the standard widget for a BooleanField is a checkbox. Browsers do not submit anything for unchecked checkboxes, so the checkbox widget treats the absence of the field from the submit as False. There's no way to distinguish the user not selecting the checkbox from cases in which the input really is bad. You could use a different widget (say, a RadioSelect) to at least make it possible for the browser to submit something for False, but you still have the problem that the BooleanField's to_python method converts its value to a boolean before validating, so you'd have to subclass and override.
Fortunately, False is not considered empty by the validation code, so if you do that you'll be able to set required=True and don't need custom cleaning methods.
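For the first case (requiring the box to be checked), a minimal sketch of the __init__ override mentioned above, reusing the question's form:
from django.forms import ModelForm

from .models import SampleModel

class SampleModelForm(ModelForm):
    class Meta:
        model = SampleModel
        fields = ('spam', )

    def __init__(self, *args, **kwargs):
        super(SampleModelForm, self).__init__(*args, **kwargs)
        # required=True on a forms.BooleanField means a False/missing value fails validation
        self.fields['spam'].required = True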
I'm still trying to understand the correct way to validate a Django model object using a custom validator at the model level. I know that validation is usually done within a form or model form. However, I want to ensure the integrity of my data at the model level if I'm interacting with it via the ORM in the Python shell. Here's my current approach:
from django.db import models
from django.core import validators
from django.core.exceptions import ValidationError

def validate_gender(value):
    """ Custom validator """
    if value not in ('m', 'f', 'M', 'F'):
        raise ValidationError(u'%s is not a valid value for gender.' % value)

class Person(models.Model):
    name = models.CharField(max_length=128)
    age = models.IntegerField()
    gender = models.CharField(max_length=1, validators=[validate_gender])

    def save(self, *args, **kwargs):
        """ Override Person's save """
        self.full_clean(exclude=None)
        super(Person, self).save(*args, **kwargs)
Here are my questions:
Should I create a custom validation function, designate it as a validator, and then override the Person's save() function as I've done above? (By the way, I know I could validate my gender choices using the 'choices' field option but I created 'validate_gender' for the purpose of illustration).
If I really want to ensure the integrity of my data, should I not only write Django unit tests for testing at the model layer but also equivalent database-level unit tests using Python/Psycopg? I've noticed that Django unit tests, which raise ValidationErrors, only test the model's understanding of the database schema using a copy of the database. Even if I were to use South for migrations, any database-level constraints are limited to what Django can understand and translate into a Postgres constraint. If I need a custom constraint that Django can't replicate, I could potentially enter data into my database that violates that constraint if I'm interacting with the database directly via the psql terminal.
Thanks!
I had a similar misunderstanding of the ORM when I first started with Django.
No, don't put self.full_clean() inside of save. Either
A) use a ModelForm (which will cause all the same validation to occur - note: ModelForm.is_valid() won't call Model.full_clean explicitly, but will perform the exact same checks as Model.full_clean). Example:
from django import forms
from django.shortcuts import redirect

from .models import Person

class PersonForm(forms.ModelForm):
    class Meta:
        model = Person

def add_person(request):
    if request.method == 'POST':
        form = PersonForm(request.POST, request.FILES)
        if form.is_valid():  # Performs your validation, including ``validate_gender``
            person = form.save()
            return redirect('some-other-view')
    else:
        form = PersonForm()
    # ... return response with ``form`` in the context for rendering in a template
Also note, forms aren't for use only in views that render them in templates - they're great for any sort of use, including an API, etc. After running form.is_valid() and getting errors, you'll have form.errors which is a dictionary containing all the errors in the form, including a key called '__all__' which will contain non-field errors.
B) Simply use model_instance.full_clean() in your view (or other logical application layer), instead of using a form, but forms are a nice abstraction for this.
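A hedged sketch of option B, using the Person model from the question and a hypothetical application-layer helper:
from django.core.exceptions import ValidationError

def create_person(name, age, gender):
    # hypothetical helper: validates at the model level without a form
    person = Person(name=name, age=age, gender=gender)
    try:
        person.full_clean()
    except ValidationError as e:
        # e.message_dict maps field names (and '__all__') to lists of errors
        return None, e.message_dict
    person.save()
    return person, None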
As for your second question, I don't really have a solution, but I've never run into such a problem, even in large projects (the current project I work on with my company has 146 tables), and I don't suspect it'll be a concern in your case either.
I generally check obj.pk to know if the object is saved. This won't work, however, if you have primary_key=True set on some field. E.g. I set user = models.OneToOneField(User, primary_key=True) on my UserProfile.
What is the canonical way to find out if a Django model is saved to db?
Nowadays you can check for:
self._state.adding
This flag is True for objects that have not yet been added to the database; QuerySet.iterator() sets it to False on the objects it loads from the database. You can't use this value in the __init__() method yet, as it's set after the object is constructed.
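A quick illustration in the shell (MyModel here is just a hypothetical model):
>>> obj = MyModel()
>>> obj._state.adding        # not saved yet
True
>>> fetched = MyModel.objects.first()   # assuming at least one row exists
>>> fetched._state.adding    # loaded from the database
False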
Important note (as of 6 May '19): If your models use UUID fields (or another method of internal ID generation), use self._state.adding as mentioned in the comments.
Actually, obj.pk is the most canonical way. Django itself often doesn't "know" if the object is saved or not. According to the django model instance reference, if there is a primary key set already, it checks on save() calls by selecting for the id in the database before any insert.
Even if you set user = models.OneToOneField(..., primary_key=True) the .pk attribute will still point to the correct primary key (most likely user_id) and you can use it and set it as if it was the same property.
If you want to know after an object has been saved, you can catch the post_save signal. This signal is fired on model saves, and if you want you can add your own application-specific attribute to the model, for example obj.was_saved = True. I think django avoids this to keep their instances clean, but there's no real reason why you couldn't do this for yourself. Here is a minimal example:
from django.db.models.signals import post_save

from myapp.models import MyModel

def save_handler(sender, instance, **kwargs):
    instance.was_saved = True

post_save.connect(save_handler, sender=MyModel)
You can alternately have this function work for all models in your app by simply connecting the signal without specifying the sender= argument. Beware though, you can create undefined behaviours if you override a property on someone else's model instance that you are importing.
Let's say obj is an instance of MyModel. Then we could use the following block of code to check if there already is an instance with that primary key in the database:
if obj.pk is None:
    # Definitely doesn't exist, since there's no `pk`.
    exists = False
else:
    # The `pk` is set, but it doesn't guarantee the object exists in the db.
    try:
        obj_from_db = MyModel.objects.get(pk=obj.pk)
        exists = True
    except MyModel.DoesNotExist:
        exists = False
This is better than checking whether obj.pk is None, because you could do
obj = MyModel()
obj.pk = 123
then
obj.pk is None # False
This is quite likely when you use a natural key as the primary key instead of the autoincrement id field.
Or, as Matthew pointed out in the comments, you could do
obj.delete()
after which you still have
obj.pk is None # False
@Crast's answer was good, but I think incomplete. The code I use in my unit tests for determining if an object is in the database is as follows. Below it, I will explain why I think it is superior to checking if obj.pk is None.
My solution
from django.test import TestCase

class TestCase(TestCase):

    def assertInDB(self, obj, msg=None):
        """Test for obj's presence in the database."""
        fullmsg = "Object %r unexpectedly not found in the database" % obj
        fullmsg += ": " + msg if msg else ""
        try:
            type(obj).objects.get(pk=obj.pk)
        except obj.DoesNotExist:
            self.fail(fullmsg)

    def assertNotInDB(self, obj, msg=None):
        """Test for obj's absence from the database."""
        fullmsg = "Object %r unexpectedly found in the database" % obj
        fullmsg += ": " + msg if msg else ""
        try:
            type(obj).objects.get(pk=obj.pk)
        except obj.DoesNotExist:
            return
        else:
            self.fail(fullmsg)
Notes: Use the above code with care if you use custom managers on your models named something other than objects. (I'm sure there's a way to get Django to tell you what the default manager is.) Further, I know that /assert(Not)?InDB/ are not PEP 8 method names, but I used the style the rest of the unittest package uses.
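A hypothetical test using these helpers might look like this (Thing is a made-up model with the default objects manager):
class ThingTests(TestCase):  # the TestCase subclass defined above
    def test_create_then_delete(self):
        thing = Thing.objects.create()
        self.assertInDB(thing)
        Thing.objects.filter(pk=thing.pk).delete()  # delete via the queryset, thing.pk stays set
        self.assertNotInDB(thing)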
Justification
The reason I think assertInDB(obj) is better than assertIsNotNone(obj.pk) is because of the following case. Suppose you have the following model.
from django.db import models

class Node(models.Model):
    next = models.OneToOneField('self', null=True, related_name='prev')
Node models a doubly linked list: you can attach arbitrary data to each node using foreign keys and the tail is the Node obj such that obj.next is None. By default, Django adds the SQL constraint ON DELETE CASCADE to the primary key of Node. Now, suppose you have a list nodes of length n such that nodes[i].next == nodes[i + 1] for i in [0, n - 1). Suppose you call nodes[0].delete(). In my tests on Django 1.5.1 on Python 3.3, I found that nodes[i].pk is not None for i in [1, n) and only nodes[0].pk is None. However, my /assert(Not)?InDB/ methods above correctly detected that nodes[i] for i in [1, n) had indeed been deleted.