I am writing a Flask application from scratch for the first time; I have written several applications using Django.
I am following a tutorial where a few options are specified on model fields, e.g.
email = db.Column(db.String(128), nullable=False, unique=True)
date_created = db.Column(db.DateTime, default=db.func.current_timestamp())
date_modified = db.Column(db.DateTime, default=db.func.current_timestamp(),
                          onupdate=db.func.current_timestamp())
I want to find a list of all available options and their default behavior, the way Django documents its field options.
While looking for constraints, I found these Column options in the SQLAlchemy documentation:
name
type_
*args
autoincrement
default
doc
key
index
info
nullable
onupdate
primary_key
server_default
server_onupdate
quote
unique
system
comment
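Those are the parameters accepted by SQLAlchemy's Column constructor. As a minimal, hedged sketch of how a few of them combine in a Flask-SQLAlchemy model (the User model and its field names are only illustrative):

from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()


class User(db.Model):
    id = db.Column(db.Integer, primary_key=True, autoincrement=True)
    # index=True creates a plain (non-unique) index; doc= is Python-side documentation only.
    email = db.Column(db.String(128), nullable=False, unique=True, index=True,
                      doc="Login email address")
    # default= is applied by SQLAlchemy at INSERT time from Python;
    # server_default= is emitted into the CREATE TABLE DDL as a DEFAULT clause.
    status = db.Column(db.String(16), default="new", server_default="new",
                       comment="Rendered as a SQL comment on backends that support it")
    date_created = db.Column(db.DateTime, server_default=db.func.now())
    date_modified = db.Column(db.DateTime, default=db.func.current_timestamp(),
                              onupdate=db.func.current_timestamp())

The authoritative list of each parameter's default behaviour is the Column entry in the SQLAlchemy API documentation.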
I am preparing tables for use in an analytics tool (Microsoft Power BI).
It would be helpful to show relationships between the tables to assist in report building.
However, I don't want to use foreign keys with constraints because the method of data updating and integrity checking doesn't need constraints, and it will actually get in the way. Some of the alternate backends this code supports are for cloud databases that don't offer real foreign keys.
I mention that because it means I am trying to define a ForeignKey field like this:
order_guid = models.ForeignKey("Dear_sales_header",to_field="order_guid",db_constraint=False,on_delete=models.DO_NOTHING)
The migration file has this:
operations = [
    migrations.AlterField(
        model_name='sales_fact',
        name='order_guid',
        field=models.ForeignKey(db_constraint=False, on_delete=django.db.models.deletion.DO_NOTHING,
                                to='dear_zoho_analytics.Dear_sales_header', to_field='order_guid'),
    ),
]
This table is routed to a different database
python manage.py migrate --database=dear_analytics
does indicate that the migration file was applied (it is 0026)
Applying dear_zoho_analytics.0026_auto_20210217_2245... OK
But when I inspect the PostgreSQL schema in my IDE, the column in sales_fact has been renamed to
order_guid_id
so it looks like I have done something wrong: this seems to reference the id field of the "parent table" dear_sales_header, yet I need it to refer to dear_sales_header.order_guid, which is unique but not the primary key.
Partial extract of Models:
class AnalyticsTable:
    # A dummy class used to mark the DearAnalytics tables. Creating a subclass
    # of models.Model breaks things in the ORM such as bulk update.
    pass


class Sales_fact(models.Model, AnalyticsTable):
    analytics_table_name = "sales_fact"
    # order_guid = models.CharField(max_length=1024, null=True, blank=True, help_text="")
    order_guid = models.ForeignKey("Dear_sales_header", to_field="order_guid",
                                   db_constraint=False, on_delete=models.DO_NOTHING)
    source_dear = models.CharField(max_length=1024, null=True, blank=True,
                                   help_text="Link to Dear instance which is the source of the data")


class Dear_sales_header(models.Model, AnalyticsTable):
    analytics_table_name = "dear_sales_header"
    source_dear = models.CharField(max_length=1024, null=True, blank=True,
                                   help_text="Link to Dear instance which is the source of the data")
    order_guid = models.CharField(max_length=1024, unique=True, help_text="")
    global_sale_status = models.CharField(max_length=1024, null=True, blank=True, help_text="SO Header status")
    order_status = models.CharField(max_length=1024, null=True, blank=True, help_text="")
Django adds the suffix _id to the ForeignKey's column name, as documented:
Behind the scenes, Django appends "_id" to the field name to create
its database column name.
If you want the column name to carry no additions, you should set db_column:
The name of the database column to use for this field. If this isn’t
given, Django will use the field’s name.
order_guid = models.ForeignKey(
    "Dear_sales_header", to_field="order_guid", db_constraint=False,
    db_column='order_guid', on_delete=models.DO_NOTHING,
)
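A quick, hedged way to confirm the generated DDL before applying it is sqlmigrate against the routed database (the migration number here is hypothetical):

python manage.py sqlmigrate dear_zoho_analytics 0027 --database=dear_analytics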
In an MRE I am not able to reproduce any undesired behavior.
Django properly maps the ForeignKey to the column you set; also, when setting db_constraint=True it properly adds back the right constraint and drops it as the backward operation:
constraint testbench_sales_fact_order_guid_f2dab1c5_fk_testbench
references testbench_dear_sales_header (order_guid)
deferrable initially deferred
I have an application that shares a database between a large Django application and a set of microservices (written in Go). If a record is deleted by a microservice, such that a record in a different table has a foreign key that references it, and if I've specified on_delete=models.CASCADE in the respective ForeignKey field, which of the following is true?
The dependent record will be deleted by PostgreSQL, or
the dependent record should be deleted "manually" by the microservice (this is what I currently do),
or is there a way to specify how I want on_delete to be handled?
Here's a bit of an example for completeness:
from django.contrib.gis.db import models


class Parent(models.Model):
    name = models.CharField(max_length=128)


class Child(models.Model):
    name = models.CharField(max_length=128)
    parent = models.ForeignKey(
        Parent,
        null=False,
        on_delete=models.CASCADE)
and in Go:
_, err = db.Exec(`DELETE FROM module_parent WHERE ID = $1`, parentID)
or do I have to do:
_, err = db.Exec(`DELETE FROM module_child WHERE parent_id = $1`, parentID)
...
_, err = db.Exec(`DELETE FROM module_parent WHERE id = $1`, parentID)
The second is true, per the Django docs for ForeignKey.on_delete:
on_delete doesn’t create a SQL constraint in the database. Support for database-level cascade options may be implemented later.
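If you want PostgreSQL itself to cascade the delete, so the Go services do not have to clean up child rows, one common workaround is to replace Django's auto-generated constraint with one that carries ON DELETE CASCADE via a RunSQL migration. This is only a sketch under assumptions: the app label, table and constraint names below are illustrative, and Django's real constraint name is hash-suffixed, so look it up in the database first.

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('module', '0001_initial'),  # hypothetical app label / previous migration
    ]

    operations = [
        migrations.RunSQL(
            sql="""
                ALTER TABLE module_child
                    DROP CONSTRAINT module_child_parent_id_fk,  -- use the real, hash-suffixed name
                    ADD CONSTRAINT module_child_parent_id_fk
                        FOREIGN KEY (parent_id) REFERENCES module_parent (id)
                        ON DELETE CASCADE
                        DEFERRABLE INITIALLY DEFERRED;
            """,
            reverse_sql=migrations.RunSQL.noop,
        ),
    ]

Otherwise, keep deleting child rows first and the parent second in the Go services, as you do now; Django's on_delete=models.CASCADE is only honoured when the delete goes through the ORM.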
I need to append to mail_d_list only if the flag is set in the model.
I am calling this function with the airport_code already coming in from the user. Now I would like to add the user to the email list only if they have the corresponding flag selected.
Each user in the model has six boolean flags, one flag for each possible report. The text for the flag is in the middle.
I have tried .get() and .filter().
Models.py
class Aerodrome(models.Model):
    ''' Aerodrome model for storing three-letter airport codes (iata),
    airport description and the database partition information. '''
    iata = models.CharField(max_length=3, primary_key=True)
    customer = models.CharField(max_length=5)
    name = models.CharField(max_length=100)
    partition_code = models.CharField(max_length=6)


class DistributionList(models.Model):
    ''' List of all email addresses that are to receive
    automated emails from the system '''
    email = models.CharField(max_length=40, primary_key=True)
    name = models.CharField(max_length=40)
    receive_emails = models.BooleanField()
    receives_report = models.ManyToManyField(Aerodrome)
script.py
for user in DistributionList.objects.filter(receives_report__iata=airport_code):
    mail_d_list.append(user.email)
This is definitely the wrong approach.
You already have the aerodromes defined in a separate model. You should define a ManyToManyField as the relationship between them, rather than defining six separate boolean flag fields on your DistributionList model. Then your script can filter by that relationship.
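With the ManyToManyField shown in the question's models, the filter can also enforce the opt-in flag and build the list in one query; a small sketch (reusing receive_emails from the model above as the opt-in flag):

mail_d_list = list(
    DistributionList.objects.filter(
        receives_report__iata=airport_code,
        receive_emails=True,
    ).values_list('email', flat=True)
)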
I'm normalizing a database associated with a Django project and will be moving fields to different tables. As part of the implementation process, I'd like to throw a deprecation warning to my colleagues if they attempt to use the old attributes after adding the new tables before I actually remove the columns.
class Asset(Model):
    model = models.CharField(max_length=64, blank=True, null=True)
    part_number = models.CharField(max_length=32, blank=True, null=True)  # this will be a redundant column to be deprecated
    company = models.ForeignKey('Company', models.CASCADE, blank=True, null=True)  # this will be a redundant column to be deprecated
    # other database fields as attributes and class methods
My understanding is that I would need to add something along the lines of warnings.warn('<field name> is deprecated', DeprecationWarning) somewhere in the class, but where would I add it?
Perhaps you could use Django's system check framework (introduced in Django 1.7).
Some interesting examples, using the system-check-framework for deprecation of custom fields, are provided in the migrations docs.
It seems you can also use this approach to mark standard fields on your model.
Applied to the example from the original post, the following works for me (tested in Django 3.1.6).
class Asset(Model):
    ...
    company = models.ForeignKey('Company', models.CASCADE, blank=True, null=True)
    company.system_check_deprecated_details = dict(
        msg='The Asset.company field has been deprecated.',
        hint='Use OtherModel.other_field instead.',
        id='fields.W900',  # pick a unique ID for your field.
    )
    ...
...
See the system check API reference for more detailed information, e.g. about the "unique ID".
The following warning will then show, whenever you call runserver, migrate, or other commands, as mentioned in the docs:
System check identified some issues:
WARNINGS:
myapp.Asset.company: (fields.W900) The Asset.company field has been deprecated.
HINT: Use OtherModel.other_field instead.
Also nice to know (from the docs):
... For performance reasons, checks are not run as part of the WSGI stack that is used in deployment. ...
You can use django_deprecation.DeprecatedField:
pip install django-deprecation
then use it like this:
class Album(models.Model):
    name = DeprecatedField('title')
https://github.com/openbox/django-deprecation
I do something similar to this - turn the field into a property and handle the warning there. Note that this will still break any queries you make that filter on the field - just helps with accessing the attribute from instances.
class NewAsset(Model):
    model = models.CharField(max_length=64, blank=True, null=True)


class Asset(Model):
    @property
    def model(self):
        log.warning('Stop using this')
        return NewAsset.model
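Tying this back to the warnings.warn idea from the question, a hedged sketch that proxies the old attribute to a renamed field and raises a DeprecationWarning on access (the new_company name is purely illustrative):

import warnings

from django.db import models


class Asset(models.Model):
    model = models.CharField(max_length=64, blank=True, null=True)
    # New home for the relation; the old company column can be dropped later.
    new_company = models.ForeignKey('Company', models.CASCADE, blank=True, null=True)

    @property
    def company(self):
        warnings.warn('Asset.company is deprecated; use Asset.new_company instead.',
                      DeprecationWarning, stacklevel=2)
        return self.new_company

As noted above, ORM lookups such as .filter(company=...) will not go through the property; only attribute access on instances is covered.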
I have a custom user model that takes in the following fields:
class MyUser(AbstractBaseUser, PermissionsMixin):
    username = models.CharField(max_length=60, unique=True)
    email = models.EmailField(max_length=120, unique=True)
    full_name = models.CharField(max_length=50, blank=True)
    is_active = models.BooleanField(default=True)
I know username and email automatically get a database index because they are set to unique=True.
Does it make sense to add db_index=True for my full_name and is_active fields?
I have a search feature that checks if the regex matches the full_name. In my model managers, I also filter against is_active=True.
I would like to optimize queries, but don't want to add indexes to my database if it isn't necessary.
Thank you!
Yes, just add db_index=True to your field, like this:
full_name = models.CharField(max_length=50, blank=True, db_index=True)
After you migrate, your database will create an index for this field.
Refer to this: https://docs.djangoproject.com/en/1.9/ref/models/fields/#db-index
Field.db_index
If True, a database index will be created for this field.
Whether the index is used for full_name if you search using a regex is doubtful. You want to make sure of that by looking at the output of the query planner (e.g. Postgres: EXPLAIN ANALYZE select * from myuser where full_name = ...;).
If it is not used (for example because the regex lower-cases everything), then you have to create a different index, for example on the lower-cased value, or even a GIN index (for partial matches); see the sketch below.
Django Admin search query not hitting Postgres index
http://blog.it-agenten.com/2015/04/tuning-django-orm-text-queries/
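As a concrete, hedged example of that second suggestion, a trigram GIN index via django.contrib.postgres can make partial and case-insensitive matches on full_name indexable. This assumes PostgreSQL with the pg_trgm extension enabled (for example through TrigramExtension() in a migration), and the model is stripped down to the two fields in question; the index name is illustrative:

from django.contrib.postgres.indexes import GinIndex
from django.db import models


class MyUser(models.Model):
    full_name = models.CharField(max_length=50, blank=True)
    is_active = models.BooleanField(default=True, db_index=True)

    class Meta:
        indexes = [
            # gin_trgm_ops indexes trigrams of full_name so icontains/iregex-style
            # searches can use the index instead of a sequential scan.
            GinIndex(fields=['full_name'], name='myuser_full_name_trgm',
                     opclasses=['gin_trgm_ops']),
        ]

Whether the planner actually picks the index still depends on the query, so re-check with EXPLAIN ANALYZE after adding it.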