Django: Get list of model fields?

I've defined a User class which (ultimately) inherits from models.Model. I want to get a list of all the fields defined for this model. For example, phone_number = CharField(max_length=20). Basically, I want to retrieve anything that inherits from the Field class.
I thought I'd be able to retrieve these by taking advantage of inspect.getmembers(model), but the list it returns doesn't contain any of these fields. It looks like Django has already gotten hold of the class, added all its magic attributes, and stripped out what was actually defined. So... how can I get these fields? Django probably has a function for retrieving them for its own internal purposes?

Django versions 1.8 and later:
You should use get_fields():
[f.name for f in MyModel._meta.get_fields()]
The get_all_field_names() method is deprecated starting from Django 1.8 and will be removed in 1.10.
The documentation page linked above provides a fully backwards-compatible implementation of get_all_field_names(), but for most purposes the previous example should work just fine.
Django versions before 1.8:
model._meta.get_all_field_names()
That should do the trick.
That requires an actual model instance. If all you have is the model class (a subclass of django.db.models.Model), then call myproject.myapp.models.MyModel._meta.get_all_field_names() directly.
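In current Django versions _meta behaves the same whether you reach it through the class or an instance, so either form of the field-name listing works. A minimal sketch (MyModel and the import path are placeholders):
from myapp.models import MyModel  # hypothetical import path

# On the class:
names_from_class = [f.name for f in MyModel._meta.get_fields()]

# On an instance (same Options object behind the scenes):
obj = MyModel()
names_from_instance = [f.name for f in obj._meta.get_fields()]

assert names_from_class == names_from_instance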

As most of the answers are outdated, here is an update for Django 2.2.
Here, posts is your app (posts, blog, shop, etc.).
1) From the model (docs: https://docs.djangoproject.com/en/stable/ref/models/meta/)
from posts.models import BlogPost

all_fields = BlogPost._meta.fields
# or
all_fields = BlogPost._meta.get_fields()
Note that:
all_fields = BlogPost._meta.get_fields()
will also return some reverse relationships which, for example, you cannot display in a view (see the sketch after point 3 below for filtering these out).
As in my case:
Organisation._meta.fields
(<django.db.models.fields.AutoField: id>, <django.db.models.fields.DateField: created>...
and
Organisation._meta.get_fields()
(<ManyToOneRel: crm.activity>, <django.db.models.fields.AutoField: id>, <django.db.models.fields.DateField: created>...
2) From instance
from posts.models import BlogPost
bp = BlogPost()
all_fields = bp._meta.fields
3) From parent model
Let's suppose that we have Post as the parent model, you want to see all the fields in a list, and you want the parent fields to be read-only in edit mode.
from django.contrib import admin
from posts.models import BlogPost

@admin.register(BlogPost)
class BlogPostAdmin(admin.ModelAdmin):
    all_fields = [f.name for f in BlogPost._meta.fields]
    parent_fields = BlogPost.get_deferred_fields(BlogPost)

    list_display = all_fields
    readonly_fields = parent_fields
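Following up on the note in point 1: the reverse relations that get_fields() returns can be filtered out via each field's concrete flag. A minimal sketch (Organisation is just the example model used above; this also drops ManyToManyField entries, since only column-backed fields report concrete == True):
concrete_fields = [f for f in Organisation._meta.get_fields() if getattr(f, "concrete", False)]
concrete_names = [f.name for f in concrete_fields]
# <ManyToOneRel: crm.activity> has concrete == False, so it is excluded here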

The get_all_field_names() method mentioned in other answers here has been deprecated since 1.8. From now on it's get_fields().
>>> from django.contrib.auth.models import User
>>> User._meta.get_fields()

I find adding this to django models quite helpful:
def __iter__(self):
    for field_name in self._meta.get_all_field_names():
        value = getattr(self, field_name, None)
        yield (field_name, value)
This lets you do:
for field, val in object:
    print field, val
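If all you need are the editable concrete fields and their values, Django's built-in helper gives a similar effect without adding __iter__. A minimal sketch (instance is any model instance; note this skips non-editable fields such as AutoField primary keys):
from django.forms.models import model_to_dict

for field_name, value in model_to_dict(instance).items():
    print(field_name, value)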

This does the trick. I've only tested it in Django 1.7.
your_fields = YourModel._meta.local_fields
your_field_names = [f.name for f in your_fields]
Model._meta.local_fields does not contain many-to-many fields. You should get them using Model._meta.local_many_to_many.
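If you want both kinds in a single list, the two attributes can simply be concatenated, since each is a plain list. A minimal sketch:
your_fields = YourModel._meta.local_fields + YourModel._meta.local_many_to_many
your_field_names = [f.name for f in your_fields]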

It is not clear whether you have an instance of the class or the class itself when you are trying to retrieve the fields, but either way, consider the following code:
Using an instance
instance = User.objects.get(username="foo")
instance.__dict__  # returns a dictionary with all fields and their values
instance.__dict__.keys()  # returns a view of the field names
list(instance.__dict__.keys())  # returns a list with all field names
Using a class
User._meta.__dict__.get("fields") # returns the fields
# to get the field names consider looping over the fields and calling __str__()
for field in User._meta.__dict__.get("fields"):
    field.__str__()  # e.g. 'auth.User.id'
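One caveat with the instance-based approach (an observation, not part of the original answer): instance.__dict__ also contains Django's internal _state entry and uses attribute names such as occupation_id for foreign keys, so you may want to filter out the private keys:
field_values = {k: v for k, v in instance.__dict__.items() if not k.startswith("_")}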

def __iter__(self):
    field_names = [f.name for f in self._meta.fields]
    for field_name in field_names:
        value = getattr(self, field_name, None)
        yield (field_name, value)
This worked for me in django==1.11.8

A detail not mentioned by others:
[f.name for f in MyModel._meta.get_fields()]
returns, for example,
['id', 'name', 'occupation']
while
[f.get_attname() for f in MyModel._meta.get_fields()]
returns
['id', 'name', 'occupation_id']
If
reg = MyModel.objects.first()
then
reg.occupation
returns, for example,
<Occupation: Dev>
and
reg.occupation_id
returns
1
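A minimal sketch that turns this into a field-name-to-attribute-name mapping, using the concrete fields only (MyModel is the placeholder model from above):
name_to_attname = {f.name: f.attname for f in MyModel._meta.concrete_fields}
# e.g. {'id': 'id', 'name': 'name', 'occupation': 'occupation_id'}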

MyModel._meta.get_all_field_names() was deprecated several versions back and removed in Django 1.10.
Here's the backwards-compatible suggestion from the docs:
from itertools import chain

list(set(chain.from_iterable(
    (field.name, field.attname) if hasattr(field, 'attname') else (field.name,)
    for field in MyModel._meta.get_fields()
    # For complete backwards compatibility, you may want to exclude
    # GenericForeignKey from the results.
    if not (field.many_to_one and field.related_model is None)
)))
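Wrapped into a reusable helper under the same assumptions as the snippet above (the function name here is ours, not Django's):
from itertools import chain

def get_all_field_names(model):
    # Backwards-compatible stand-in for the removed Options.get_all_field_names()
    return list(set(chain.from_iterable(
        (field.name, field.attname) if hasattr(field, 'attname') else (field.name,)
        for field in model._meta.get_fields()
        if not (field.many_to_one and field.related_model is None)
    )))

field_names = get_all_field_names(MyModel)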

Just to add: I am using the self object (where self has a model attribute, as in a ModelAdmin), and this worked for me:
[f.name for f in self.model._meta.get_fields()]

At least with Django 1.9.9 -- the version I'm currently using -- note that .get_fields() also "considers" any foreign model as a field, which may be problematic. Say you have:
class Parent(models.Model):
    id = UUIDField(primary_key=True)

class Child(models.Model):
    parent = models.ForeignKey(Parent)
It follows that
>>> map(lambda field: field.name, Parent._meta.get_fields())
['id', 'child']
while, as shown by @Rockallite,
>>> map(lambda field: field.name, Parent._meta.local_fields)
['id']

So before I found this post, I successfully found this to work.
Model._meta.fields
It works similarly to
Model._meta.get_fields()
I'm not sure what the difference is in the results, if there is one (as shown earlier in the thread, get_fields() additionally returns reverse relations when the model has any). I ran this loop and got the same output:
for field in Model._meta.fields:
    print(field.name)

Sometimes we need the DB column names as well:
def get_db_field_names(instance):
    your_fields = instance._meta.local_fields
    db_field_names = [f.name + '_id' if f.related_model is not None else f.name for f in your_fields]
    model_field_names = [f.name for f in your_fields]
    return db_field_names, model_field_names
Call the method to get the fields:
db_field_names, model_field_names = get_db_field_names(Mymodel)
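Django already stores both names on each field, so an alternative sketch (same idea, using the built-in column and name attributes) is:
def get_db_field_names(model):
    fields = model._meta.local_fields
    # column is the actual database column name (e.g. 'organisation_id'),
    # name is the model-level field name (e.g. 'organisation')
    return [f.column for f in fields], [f.name for f in fields]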

I combined multiple answers from this thread (thanks!) and came up with the following generic solution:
class ReadOnlyBaseModelAdmin(ModelAdmin):
    def has_add_permission(self, request):
        return request.user.is_superuser

    def has_delete_permission(self, request, obj=None):
        return request.user.is_superuser

    def get_readonly_fields(self, request, obj=None):
        return [f.name for f in self.model._meta.get_fields()]
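Using it is then just a matter of subclassing per model. A sketch (MyModel is a placeholder):
from django.contrib import admin

@admin.register(MyModel)
class MyModelAdmin(ReadOnlyBaseModelAdmin):
    pass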

Why not just use that:
manage.py inspectdb
Example output:
class GuardianUserobjectpermission(models.Model):
    id = models.IntegerField(primary_key=True)  # AutoField?
    object_pk = models.CharField(max_length=255)
    content_type = models.ForeignKey(DjangoContentType, models.DO_NOTHING)
    permission = models.ForeignKey(AuthPermission, models.DO_NOTHING)
    user = models.ForeignKey(CustomUsers, models.DO_NOTHING)

    class Meta:
        managed = False
        db_table = 'guardian_userobjectpermission'
        unique_together = (('user', 'permission', 'object_pk'),)
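Note that inspectdb only reflects what already exists in the database; the usual workflow is to redirect its output to a file and trim it down to the models you care about:
python manage.py inspectdb > models.py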

Related

Django "ValueError: Can't bulk create a multi-table inherited model"

Problem
I am using the django-model-utils InheritanceManager. I have a parent Notification(models.Model) class which I use to create many notification subclasses such as PostNotification(Notification), CommentNotification(Notification), etc. When trying to run CommentNotification.objects.bulk_create(list_of_comment_notification_objects), I get the following traceback:
File "/home/me/.virtualenvs/project/local/lib/python2.7/site-packages/django/db/models/query.py", line 429, in bulk_create
raise ValueError("Can't bulk create a multi-table inherited model")
ValueError: Can't bulk create a multi-table inherited model
and upon inspecting query.py, we see that this is what raises the error:
for parent in self.model._meta.get_parent_list():
    if parent._meta.concrete_model is not self.model._meta.concrete_model:
        raise ValueError("Can't bulk create a multi-table inherited model")
Environment
Django Model Utils version: 3.1.1
Django version: 1.11.7
Python version: 2.7.3
Example
PostNotification.objects.bulk_create(
    [PostNotification(related_user=user, post=instance) for user in users]
)
throws the above exception
What I have tried (and thought was a success originally):
I thought that simply running
BaseClass.objects.bulk_create(list_of_SubClass_objects) instead of SubClass.objects.bulk_create(list_of_SubClass_objects) would work and return a list of SubClass values, but subsequently running SubClass.objects.all() returns an empty result. The bulk_create() only creates a Notification base class object for each item in the list.
I found a hacky solution; I hope it works in your case. The trick is to dynamically create a model (one that is not inherited) with its Meta db_table pointed at the child's table, and then use this dynamic model to create Child objects in bulk (in other words, to write directly into the Child's DB table).
class Parent(models.Model):
    name = models.CharField(max_length=10)

class Child(Parent):
    phone = models.CharField(max_length=12)

# just an example. Should be expanded to work properly.
field_type_mapping = {
    'OneToOneField': models.IntegerField,
    'CharField': models.CharField,
}

def create_model(Model, app_label='children', module='', options=None):
    """
    Create specified model
    """
    model_name = Model.__name__

    class Meta:
        managed = False
        db_table = Model._meta.db_table

    if app_label:
        # app_label must be set using the Meta inner class
        setattr(Meta, 'app_label', app_label)

    # Update Meta with any options that were provided
    if options is not None:
        for key, value in options.items():
            setattr(Meta, key, value)

    # Set up a dictionary to simulate declarations within a class
    attrs = {'__module__': module, 'Meta': Meta}

    # Add in any fields that were provided
    fields = dict()
    for field in Model._meta.fields:
        if field.attname == 'id':
            continue
        if field.model.__name__ == model_name:
            field_class_name = type(field).__name__
            print(field.attname)
            fields[field.attname] = field_type_mapping[field_class_name]()

    # Create the class, which automatically triggers ModelBase processing
    attrs.update(fields)
    model = type(f'{model_name}Shadow', (models.Model,), attrs)
    return model

mod = create_model(Child)

parents = [Parent(name=i) for i in range(15)]
parents = Parent.objects.bulk_create(parents)
children = [mod(phone=parent.name, parent_ptr_id=parent.id) for parent in parents]
mod.objects.bulk_create(children)
I've done a custom implementation of bulk_create that seems to be working for my case (only one parent relationship and a non-auto-incremented pk):
from django.db import models, transaction

class MultiTableChildQueryset(models.QuerySet):
    def bulk_create(self, objs, batch_size=None):
        assert batch_size is None or batch_size > 0
        if not objs:
            return objs
        self._for_write = True
        objs = list(objs)

        # Create the parent rows first, copying over the fields the child shares with the parent
        parent_model = self.model._meta.pk.related_model
        parent_objs = []
        for obj in objs:
            parent_values = {}
            for field in [f for f in parent_model._meta.fields if hasattr(obj, f.name)]:
                parent_values[field.name] = getattr(obj, field.name)
            parent_objs.append(parent_model(**parent_values))
            setattr(obj, self.model._meta.pk.attname, obj.id)
        parent_model.objects.bulk_create(parent_objs, batch_size=batch_size)

        # Then insert the child rows directly
        with transaction.atomic(using=self.db, savepoint=False):
            self._batched_insert(objs, self.model._meta.local_fields, batch_size)

        return objs
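To use it, attach the queryset as the child model's manager. A sketch, assuming a Child(Parent) pair like the one in the earlier example:
class Child(Parent):
    phone = models.CharField(max_length=12)

    objects = MultiTableChildQueryset.as_manager()

# Child.objects.bulk_create(children) now runs the custom implementation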
A slightly easier-to-read version of Moises's answer:
from typing import List, TypeVar

from django.db import transaction
from django.db.models import Model, QuerySet

M = TypeVar('M', bound=Model)

def multi_inheritance_table_bulk_insert(data: List[M]) -> None:
    """
    Bulk insert data into a multi-inheritance table.
    """
    if not data:
        return

    model = data[0].__class__
    local_fields = model._meta.local_fields
    parent_model = model._meta.pk.related_model
    parent_fields = parent_model._meta.local_fields

    parent_objects = [
        parent_model(**{field.name: getattr(obj, field.name) for field in parent_fields})
        for obj in data
    ]
    parent_model.objects.bulk_create(parent_objects)

    for parent, obj in zip(parent_objects, data):
        obj.pk = parent.pk

    queryset = QuerySet(model)
    queryset._for_write = True
    with transaction.atomic(using=queryset.db, savepoint=False):
        queryset._batched_insert(
            data,
            local_fields,
            batch_size=None,
        )
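Usage is then a single call with a list of unsaved child instances. A sketch, reusing the Child model from the earlier example (parent fields are passed straight to the child constructor):
children = [Child(name=str(i), phone="555-000%d" % i) for i in range(3)]
multi_inheritance_table_bulk_insert(children)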

Django unique_together with nullable ForeignKey

I'm using Django 1.8.4 in my dev machine using Sqlite and I have these models:
class ModelA(Model):
    field_a = CharField(verbose_name='a', max_length=20)
    field_b = CharField(verbose_name='b', max_length=20)

    class Meta:
        unique_together = ('field_a', 'field_b',)

class ModelB(Model):
    field_c = CharField(verbose_name='c', max_length=20)
    field_d = ForeignKey(ModelA, verbose_name='d', null=True, blank=True)

    class Meta:
        unique_together = ('field_c', 'field_d',)
I've run the proper migrations and registered the models in the Django Admin. Using the Admin I've done these tests:
I'm able to create ModelA records, and Django prohibits me from creating duplicate records - as expected!
I'm not able to create identical ModelB records when field_d is not empty.
But I am able to create identical ModelB records when field_d is empty.
My question is: How do I apply unique_together for nullable ForeignKey?
The most recent answer I found for this problem is about five years old... I think Django has evolved and the issue may not be the same.
Django 2.2 added a new constraints API which makes addressing this case much easier within the database.
You will need two constraints:
The existing tuple constraint; and
The remaining keys minus the nullable key, with a condition
If you have multiple nullable fields, I guess you will need to handle the permutations.
Here's an example with a triple of fields that must be unique together, where only one of them is nullable:
from django.db import models
from django.db.models import Q
from django.db.models.constraints import UniqueConstraint

class Badger(models.Model):
    required = models.ForeignKey(Required, ...)
    optional = models.ForeignKey(Optional, null=True, ...)
    key = models.CharField(db_index=True, ...)

    class Meta:
        constraints = [
            UniqueConstraint(fields=['required', 'optional', 'key'],
                             name='unique_with_optional'),
            UniqueConstraint(fields=['required', 'key'],
                             condition=Q(optional=None),
                             name='unique_without_optional'),
        ]
UPDATE: the previous version of my answer was functional but badly designed; this one takes into account some of the comments and other answers.
In SQL NULL does not equal NULL. This means if you have two objects where field_d == None and field_c == "somestring" they are not equal, so you can create both.
You can override Model.validate_unique to add your check:
class ModelB(Model):
    # ...
    def validate_unique(self, exclude=None):
        if ModelB.objects.exclude(id=self.id).filter(field_c=self.field_c,
                                                     field_d__isnull=True).exists():
            raise ValidationError("Duplicate ModelB")
        super(ModelB, self).validate_unique(exclude)
If used outside of forms you have to call full_clean or validate_unique.
Take care to handle the race condition though.
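Outside of forms this means calling full_clean() (or validate_unique()) yourself before saving; a minimal sketch:
b = ModelB(field_c="somestring", field_d=None)
b.full_clean()  # raises ValidationError if a duplicate with a NULL field_d already exists
b.save()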
@ivan, I don't think there's a simple way for Django to manage this situation. You need to think of all the creation and update operations that don't always come from a form. Also, you should think about race conditions...
And because you don't enforce this logic at the DB level, it's possible that there will actually be duplicate records, and you would have to check for them while querying results.
As for your solution, it can be good for a form, but don't expect the save method to raise ValidationError.
If it's possible, it's better to delegate this logic to the DB. In this particular case, you can use two partial indexes. There's a similar question on StackOverflow: Create unique constraint with null columns.
So you can create Django migration, that adds two partial indexes to your DB
Example:
# Assume that the app name is just `example`
CREATE_TWO_PARTIAL_INDEX = """
    CREATE UNIQUE INDEX model_b_2col_uni_idx ON example_model_b (field_c, field_d)
    WHERE field_d IS NOT NULL;
    CREATE UNIQUE INDEX model_b_1col_uni_idx ON example_model_b (field_c)
    WHERE field_d IS NULL;
"""

DROP_TWO_PARTIAL_INDEX = """
    DROP INDEX model_b_2col_uni_idx;
    DROP INDEX model_b_1col_uni_idx;
"""

class Migration(migrations.Migration):
    dependencies = [
        ('example', 'PREVIOUS MIGRATION NAME'),
    ]

    operations = [
        migrations.RunSQL(CREATE_TWO_PARTIAL_INDEX, DROP_TWO_PARTIAL_INDEX)
    ]
Add a clean method to your model - see below:
def clean(self):
    if Variants.objects.filter("""Your filter """).exclude(pk=self.pk).exists():
        raise ValidationError("This variation is duplicated.")
I think this is a clearer way to do it for Django 1.2+.
In forms it will be raised as a non-field error with no 500 error; in other cases, like DRF, you have to check this case manually, because otherwise it will be a 500 error.
But it will always check for unique_together!
class BaseModelExt(models.Model):
    is_cleaned = False

    def clean(self):
        for field_tuple in self._meta.unique_together[:]:
            unique_filter = {}
            unique_fields = []
            null_found = False
            for field_name in field_tuple:
                field_value = getattr(self, field_name)
                if getattr(self, field_name) is None:
                    unique_filter['%s__isnull' % field_name] = True
                    null_found = True
                else:
                    unique_filter['%s' % field_name] = field_value
                    unique_fields.append(field_name)
            if null_found:
                unique_queryset = self.__class__.objects.filter(**unique_filter)
                if self.pk:
                    unique_queryset = unique_queryset.exclude(pk=self.pk)
                if unique_queryset.exists():
                    msg = self.unique_error_message(self.__class__, tuple(unique_fields))
                    raise ValidationError(msg)
        self.is_cleaned = True

    def save(self, *args, **kwargs):
        if not self.is_cleaned:
            self.clean()
        super().save(*args, **kwargs)
One possible workaround not mentioned yet is to create a dummy ModelA object to serve as your NULL value. Then you can rely on the database to enforce the uniqueness constraint.

Can django autocomplete_light show something other than the model's string representation?

I'm using django autocomplete_light for a model search, but would like something different from the model's default string (i.e. __unicode__()) to be shown in the autocompleted results. Is this possible?
Yes, use AutocompleteListBase:
class YourAutocompleteClass(autocomplete_light.AutocompleteListBase):
    names = model_name.objects.values_list('user__email', flat=True)
    choices = [v for v in names]

autocomplete_light.register(YourAutocompleteClass)
Yes, by overriding choice_label and returning the value to display for a choice.
Example:
class BookAutocomplete(autocomplete_light.AutocompleteModelBase):
    search_fields = ['title']
    model = Book

    def choice_label(self, choice):
        return '"{0.title}" by {0.author}'.format(choice)
You can override the get_result_label method form the BaseQuerySetView.
(In the following example, Select2QuerySetView inherits from BaseQuerySetView)
class MyModelAutocompleteView(autocomplete.Select2QuerySetView):
    def get_queryset(self):
        return MyModel.objects.filter(name__icontains='foo')

    def get_result_label(self, result):
        return '{0} is a choice'.format(result)

Filtering on DateTimeField with Django Rest Framework

I have a model with a DateTimeField:
class MyShell(models.Model):
    created = models.DateTimeField(auto_now=True)
I have an api linked to it using Django Rest Framework:
class ShellMessageFilter(django_filters.FilterSet):
    created = django_filters.DateTimeFilter(name="created", lookup_type="gte")

    class Meta:
        model = ShellMessage
        fields = ['created']

class ShellListViewSet(viewsets.ModelViewSet):
    """
    List all ShellMessages
    """
    serializer_class = ShellMessageSerializer
    queryset = ShellMessage.objects.all()
    filter_class = ShellMessageFilter
When I hit my API using the following URL it works perfectly:
http://127.0.0.1:8000/api/shell/?created=2014-07-17
# It returns all ShellMessages with a created date greater than the one provided in the URL
But I want to do more than that by filtering based on a date and a time. I tried the following URL without success:
http://127.0.0.1:8000/api/shell/?created=2014-07-17T10:36:34.960Z
# It returns an empty array, whereas there are items with a created field greater than 2014-07-17T10:36:34.960Z
If you know how to proceed... I can't find any good information or examples in the django-filter documentation...
Simpler solution if you don't care about fractions of seconds: replace the "T" with space (%20):
http://127.0.0.1:8000/api/shell/?created=2014-07-17%2010:36:34
Worked for me.
This may not be what you want, but you could simply convert from Unix time. E.g.:
from datetime import datetime

def filter_unix_dt(queryset, value):
    if not value:
        return queryset
    try:
        unix_time = int(value)
        t = datetime.fromtimestamp(unix_time)
        result = queryset.filter(created__gte=t)
        return result
    except ValueError:
        return queryset

class ShellMessageFilter(django_filters.FilterSet):
    created = django_filters.DateTimeFilter(action=filter_unix_dt)

    class Meta:
        model = ShellMessage
        fields = ['created']
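The request then passes a Unix timestamp instead of an ISO string, e.g. (illustrative value only):
http://127.0.0.1:8000/api/shell/?created=1405593394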
The issue and solution are documented in this DRF issue page: https://github.com/tomchristie/django-rest-framework/issues/1338
TL;DR: A Django ISO conversion 'issue' is preventing DRF from working as you are expecting. A fix for this has been written, allowing you to use IsoDateTimeField instead of DateTimeField. Just replacing the T with a space in your request param value also works.
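For reference, a sketch of the FilterSet using django-filter's ISO-aware filter (the exact keyword arguments, name/lookup_type versus field_name/lookup_expr, depend on your django-filter version):
import django_filters

class ShellMessageFilter(django_filters.FilterSet):
    created = django_filters.IsoDateTimeFilter(field_name="created", lookup_expr="gte")

    class Meta:
        model = ShellMessage
        fields = ['created']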

Can't Return JSON object using MongoEngine Pymongo with Django?

So I'm trying to return a JSON object for a project. I've spent a few hours trying to get Django to just return the JSON.
Here's the view that we've been working with:
def json(request, first_name):
    user = User.objects.all()
    # user = User.objects.all().values()
    result = simplejson.dumps(user, default=json_util.default)
    return HttpResponse(result)
Here's my model:
class User(Document):
    gender = StringField(choices=['male', 'female', 'Unknown'])
    age = IntField()
    email = EmailField()
    display_name = StringField(max_length=50)
    first_name = StringField(max_length=50)
    last_name = StringField(max_length=50)
    location = StringField(max_length=50)
    status = StringField(max_length=50)
    hideStatus = BooleanField()
    photos = ListField(EmbeddedDocumentField('Photo'))
    profile = ListField(EmbeddedDocumentField('ProfileItem'))
    allProfile = ListField(EmbeddedDocumentField('ProfileItem'))  # only returned for your own profile
This is what it's returning:
[<User: User object>, <User: User object>] is not JSON serializable
Any thoughts on how I can just return the JSON?
With MongoEngine 0.8 or greater, objects and querysets have a to_json() method.
>>> User.objects.to_json()
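So the view from the question can be reduced to something like this (a sketch, assuming MongoEngine >= 0.8):
from django.http import HttpResponse

def json_view(request, first_name):
    users = User.objects.all()
    return HttpResponse(users.to_json(), content_type="application/json")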
simplejson.dumps() doesn't know how to "reach into" your custom objects; the default function, json_util.default must just be calling str() or repr() on your documents. (Is json_util custom code you've written? If so, showing its source here could prove my claim.)
Ultimately, your default function will need to be able to make sense of the MongoEngine documents. I can think of at least two ways that this might be implemented:
1) Write a custom default function that works for all MongoEngine documents by introspecting their _fields attribute (though note that the leading underscore means this is part of the private API/implementation detail of MongoEngine and may be subject to change in future versions).
2) Have each of your documents implement an as_dict method which returns a dictionary representation of the object. This would work similarly to the to_mongo method provided on documents by MongoEngine, but shouldn't return the _types or _cls fields (again, these are implementation details of MongoEngine).
I'd suggest you go with option #2: the code will be cleaner and easier to read, better encapsulated, and won't require using any private APIs.
As dcrosta suggested, you can do something like this; hope it helps.
Document definition
class MyDocument(Document):
    # Your document definition

    def to_dict(self):
        return mongo_to_dict_helper(self)
helper.py:
from mongoengine import StringField, ListField, IntField, FloatField

def mongo_to_dict_helper(obj):
    return_data = []
    for field_name in obj._fields:
        if field_name in ("id",):
            continue
        data = obj._data[field_name]
        if isinstance(obj._fields[field_name], StringField):
            return_data.append((field_name, str(data)))
        elif isinstance(obj._fields[field_name], FloatField):
            return_data.append((field_name, float(data)))
        elif isinstance(obj._fields[field_name], IntField):
            return_data.append((field_name, int(data)))
        elif isinstance(obj._fields[field_name], ListField):
            return_data.append((field_name, data))
        else:
            # You can define your logic for returning elements
            pass
    return dict(return_data)
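The view then serializes the dict form of each document instead of the Document objects themselves (a sketch along the lines of the original view, assuming User gains a to_dict like MyDocument above):
def json(request, first_name):
    users = User.objects.all()
    result = simplejson.dumps([u.to_dict() for u in users])
    return HttpResponse(result)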