Why is CICharField not creating a database column? - django

I am trying to use the Django CITextExtension PostgreSQL extension to create case-insensitive text fields in my database.
Concretely, here is how I am trying to use django.contrib.postgres.fields.CICharField in my Expression model to create a case-insensitive CharField in my database:
from django.db import models
from django.contrib.postgres.fields import CICharField

class Expression(models.Model):
    expression = CICharField(max_length=64, unique=True)
When I open the admin, however, and go into the Expressions area I get:
OperationalError at /admin/dictionary/expression/
no such column: dictionary_expression.expression
I could not find anything at all on the internet about this particular issue. Why is CICharField not creating a column in my database?

It could be because you are using an SQLite database: the "no such column" OperationalError is SQLite's wording, and CICharField only works on PostgreSQL (it relies on the citext extension). Make sure you are using a PostgreSQL database.
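A minimal sketch of the PostgreSQL-side setup, assuming an app named dictionary (the database name and credentials below are placeholders): point ENGINE at PostgreSQL and enable the citext extension with a CITextExtension migration operation, so it runs before the migration that adds the CICharField.

# settings.py  (placeholder credentials)
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'mydb',
        'USER': 'myuser',
        'PASSWORD': 'secret',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}

# dictionary/migrations/0002_enable_citext.py  (hypothetical migration)
from django.contrib.postgres.operations import CITextExtension
from django.db import migrations

class Migration(migrations.Migration):

    dependencies = [
        ('dictionary', '0001_initial'),  # adjust to your migration history
    ]

    operations = [
        CITextExtension(),
    ]

After switching the backend, run manage.py migrate again so the column actually gets created in PostgreSQL.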

Related

Does Django use case sensitive filenames for FileField with MySQL backend?

I'm using Django 3.1.3 and have a model with a FileField field. When I upload a file with the same name but different case, the logic I have reports that it is a duplicate. For example:
Files <QuerySet ['92/565/20191222_152213.jpg', '92/565/cs_731__DSC8110.jpg', '92/565/ADs-2.MP4']>, this 92/565/ADS-2.mp4
The logic is...
other_pieces_in_room = Piece.objects.filter(room__id=self.room_id)
other_files_in_room = other_pieces_in_room.values_list('file', flat=True)
mylog.debug("Files {}, this {}".format(other_files_in_room, file.name))
if file.name in other_files_in_room:
    raise forms.ValidationError("...")
Model (relevant fields) is:
class Piece(models.Model):
    file = models.FileField(upload_to=media_location)
    room = models.ForeignKey(Room, on_delete=models.CASCADE)
Any thoughts as to what is going on?
For my case, the database was created with the default settings. The FileField in Django is stored in the database as a varchar and when doing:
show full columns from gallery_piece;
I could see that the field is using collation utf8_general_ci (which is case-insensitive).
I altered the database and all the char/varchar table fields to use utf8mb4 character set and utf8mb4_bin collation, using commands like:
ALTER TABLE gallery_exhibit MODIFY file varchar(100) CHARACTER SET utf8mb4 COLLATE utf8mb4_bin;
Once done, the filename (and the other char-type fields) are treated as case sensitive. I did read that, with the default collation, the upload validation is case sensitive and so passes, but a database integrity error is then encountered when saving an entry whose name differs only in case.
For production, when I create the database, I'll need to do:
CREATE DATABASE mydb
CHARACTER SET utf8mb4
COLLATE utf8mb4_bin;
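On the Django side, the connection character set can also be set explicitly in settings.py. A sketch assuming the default MySQL backend (database name and credentials are placeholders); the collation itself still has to be set on the database and tables as shown above:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'mydb',
        'USER': 'myuser',
        'PASSWORD': 'secret',
        'OPTIONS': {
            'charset': 'utf8mb4',
        },
    }
}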

Verify database integrity using Django

There is a complex app with the back-end powered by Django. It contains many-to-one, many-to-many, and polymorphic (https://django-polymorphic.readthedocs.io/en/stable/) relations.
Django migrations are used to update the database models.
Problem
A very complex migration for the existing database fails. There are Foreign Key constraint related problems similar to:
ERROR: insert or update on table "foo" violates foreign key constraint "D286496390ec910156ccc566ec44e73f"
DETAIL: Key (bar_id)=(123) is not present in table "bar".
So in other words there is a record which references a non-existing record in the database.
Question
Is it possible to somehow use Django to iterate over each object in the database and validate it contains no "broken" records (referencing non-existent records)?
Short Answer
Yes, you can use the Django shell (./manage.py shell) to iterate over your models and check them.
Long Answer
$ ./manage.py shell
>>> from your_app.models import MyCoolModel
>>> for item in MyCoolModel.objects.all():
...     if item.foreign_field < 0:  # do the check here
...         # fix the invalid record
...         ...
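For the specific problem in the question (broken foreign keys), here is a more concrete sketch of such a check, using the Foo/bar names from the error message as stand-ins for your real model and its foreign key: follow each foreign key and record the rows whose target no longer exists.

from django.core.exceptions import ObjectDoesNotExist
from your_app.models import Foo  # placeholder model with a `bar` foreign key

broken = []
for obj in Foo.objects.all():
    try:
        obj.bar  # following the FK queries the DB and fails if the target row is gone
    except ObjectDoesNotExist:
        broken.append(obj.pk)

print("Rows with a broken bar_id reference:", broken)

Repeating that loop per model (or per foreign key field) will surface every record that would violate the constraint during the migration.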

unique_together does not replace primary key

In my Django app, I want to insert a record with a composite primary key. Apparently this should be possible by making use of "unique_together". I'm quite sure this code was working in the past, but for some reason it does not seem to be working now. This code used to run on a Linux VM, and now I'm hosting it on Google App Engine. However, I don't see how that could be the cause of this error.
class TermsAndConditionsDocument(models.Model):
    organization = models.ForeignKey(Organization, on_delete=models.CASCADE, verbose_name=_("Organization"))
    language = models.CharField(_('Language'), choices=LANGUAGE_CHOICES, max_length=5, help_text=_("The language of the content."))
    content = models.TextField()

    class Meta:
        unique_together = ('organization', 'language')
The error:
IntegrityError at /transactions/settings/terms_and_conditions
null value in column "id" violates not-null constraint
DETAIL: Failing row contains (null, nl-BE, <p>B</p>, 10).
According to what I've read, using "unique_together" should cause Django to not need or include an ID as primary key. I checked the database, and the ID field DOES exist. Where are the database constraint and the ID field still coming from?
Apparently, as pointed out in the comments, a primary key "id" field is always added, even if you don't need it. It's supposed to get out of your way, so you don't even notice its existence. In my case, it required me to give it a value when I created a new record, which is not how things are supposed to work.
A while back I migrated this database from one Postgres database to another Postgres database. I used an SQL dump and load method for this. Some sequences seem to have been lost during that migration.
Because there are no sequences, some fields now lacked autoincrement capabilities, explaining the IntegrityError on insertion.
In order to fix this, I did the following:
1) Export the current data:
manage.py dumpdata > data.json
2) Drop your database and create a new empty one.
3) Run database migrations:
manage.py migrate
4) Load the data again, excluding some default data already recreated by Django.
manage.py loaddata --exclude auth.permission --exclude contenttypes data.json
This procedure seems to have recreated the sequences while also keeping the data.
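If the sequences still exist but are merely out of sync, an alternative that avoids the full dump/reload is to let Django regenerate the sequence-reset SQL and feed it back into the database (replace your_app with your app label):

python manage.py sqlsequencereset your_app | python manage.py dbshell

In my case the sequences were actually missing, so the dump/migrate/load procedure above was needed.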
unique_together only creates a database-level unique constraint (https://docs.djangoproject.com/en/2.2/ref/models/options/#unique-together); it does not replace the primary key.
You could create a custom primary key with the primary_key option (https://docs.djangoproject.com/en/2.2/ref/models/fields/#django.db.models.Field.primary_key), but you can only do that for a single field.
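A sketch of what that would look like, using a hypothetical single code field as the primary key (only one field can carry primary_key=True):

class TermsAndConditionsDocument(models.Model):
    # 'code' is a made-up example field; only one field can be the primary key
    code = models.CharField(max_length=20, primary_key=True)
    organization = models.ForeignKey(Organization, on_delete=models.CASCADE)
    language = models.CharField(max_length=5, choices=LANGUAGE_CHOICES)
    content = models.TextField()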
But I suggest just keeping the auto-increment id field; that works better with Django.
As for the error: are you saving a model, or doing a raw import?

Getting the list of timezones supported by PostgreSQL in Django using RawSQL

I am trying to get the list of all the timezones supported by the PostgreSQL database in my Django project, so I can validate timestamps with timezones before sending them to the database. I asked another question and got an answer regarding the PostgreSQL query here:
How to get the list of timezones supported by PostgreSQL?
Using that, I am trying to do the following:
from django.db.models.expressions import RawSQL
RawSQL("SELECT name, abbrev, utc_offset, is_dst FROM pg_timezone_names;", [])
However, it does not seem to work. I saw the docs for RawSQL, and it usually has a model attached to it, which I can't really have here. How do I solve this issue? Thanks.
The following should work for you:
from django.db import connection

with connection.cursor() as cursor:
    cursor.execute("SELECT name, abbrev, utc_offset, is_dst FROM pg_timezone_names")
    zones = cursor.fetchall()
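For the validation use case from the question, a small follow-up sketch (the zone name is the first column of each returned row; tz_name stands for whatever value you need to check):

valid_names = {row[0] for row in zones}
if tz_name not in valid_names:
    raise ValueError("Unknown timezone: {}".format(tz_name))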

Django: How to insert sql statements into sqlite3 database and configure with models

I'm using Django for a project. I have a .sql file containing insert statements. How can I get them into my sqlite3 database and have them work with my models?
The project "fortune" has a models.py file that looks like the following:
class fortune(models.Model):
    id = models.IntegerField(primary_key=True)
    category = models.CharField(max_length=50)
    length = models.IntegerField()
    aphorism = models.CharField(max_length=5000)
I have a .sql file with a list of INSERT statements like the following:
INSERT INTO "fortune_fortune" VALUES(1,'fortunes',127,'Arbitrary Text');
When I run .schema on my db.sqlite3 file which is configured with my project I see:
CREATE TABLE fortune_fortune(id integer, category varchar(50), length integer, aphorism varchar(5000));
I've tried using .read in my sqlite shell with no luck. I've tried typing "sqlite3 file.sqlite3 < file.sql" in bash as well. There is something I'm missing here, but I can't seem to ID the problem.
Thanks for your help.
OK, wait... normally you don't use raw SQL statements to insert data into the db if you work with Django.
To insert data into the db, you work with the Django ORM, which is way more fun than these ugly SQL statements:
new_fortune = fortune(category='new cat', length=19, aphorism='i love life')
new_fortune.save()
Then, as a result, you will have one new row in the fortune table in your db. Just read the Django docs and you will feel happy!
And one more thing: class names are conventionally capitalized (e.g. Fortune, not fortune).
To your issue:
In older Django versions (this hook was removed when migrations replaced syncdb), Django provided a hook for passing the database arbitrary SQL that was executed just after the CREATE TABLE statements. You could use it to populate default records, or to create SQL functions, views, triggers, etc.
The hook was simple: Django just looked for a file called sql/<modelname>.sql in your app directory, where <modelname> is the model's name in lowercase.
more in docs
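On current Django versions the equivalent is a data migration using RunSQL. A minimal sketch, assuming the app is named fortune and its first migration is 0001_initial (adjust both to your project), that loads the same kind of INSERT statements through manage.py migrate:

# fortune/migrations/0002_load_fortunes.py  (hypothetical file name)
from django.db import migrations

class Migration(migrations.Migration):

    dependencies = [
        ('fortune', '0001_initial'),
    ]

    operations = [
        migrations.RunSQL(
            "INSERT INTO fortune_fortune (id, category, length, aphorism) "
            "VALUES (1, 'fortunes', 127, 'Arbitrary Text');"
        ),
    ]

The SQL string could also be read from your existing .sql file before being passed to RunSQL.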