How to store a very small integer (4 bits) in my database?
Hi.🙌
In my model, I want to store a very small integer in my DB and I don't want to use SmallIntegerField, because Django will store it as 2 bytes (16 bits) in the DB, which is more than I need. How can I store 4-bit integers, or even smaller, in PostgreSQL?
Thanks for your help.🙏
As Adrian said, the smallest storage for PostgreSQL itself is 2 bytes. There is nothing smaller than that.
If you want to enforce a constraint, add a maximum-value check in the model's save method or attach a validator to the Django field.
2 bytes is the smallest; there is nothing smaller than that.
If you really want a database with such constraints, you can install an extension and make your own DB field by following this:
https://dba.stackexchange.com/questions/159090/how-to-store-one-byte-integer-in-postgresql
Related
How can we set up a field in a model to accept more than one value?
We can simply make relation fields using a foreign key or many-to-many, but in the case of a simple int, float, or any other simple type, how can we achieve a multi-value field?
If it is only about the database field, you could simply serialize your values into a string, e.g. the values 1, 2, 3 become "1,2,3" in the field. Simply override the getter and setter for that field (using `@property`) to serialize and unserialize the values each time the field is accessed.
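A sketch of that getter/setter idea in plain Python (class and attribute names are hypothetical; in Django the `_values_raw` attribute would be backed by a `CharField`):

```python
# Hedged sketch of serializing a list of ints into a single string field.
class Measurement:
    def __init__(self):
        self._values_raw = ""  # what the CharField would store, e.g. "1,2,3"

    @property
    def values(self):
        """Unserialize "1,2,3" back into [1, 2, 3] on access."""
        if not self._values_raw:
            return []
        return [int(v) for v in self._values_raw.split(",")]

    @values.setter
    def values(self, ints):
        """Serialize [1, 2, 3] into "1,2,3" for storage."""
        self._values_raw = ",".join(str(i) for i in ints)


m = Measurement()
m.values = [1, 2, 3]
```

The trade-off is that the values are opaque to the database: you can't filter or index on individual entries.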
Another approach is to use a JSONField (see the docs). This has wider support: for example, searchability via querysets (at least on PostgreSQL) and several third-party JSON form fields. You'd need to validate that the JSON supplied is a list of integers.
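The validation mentioned above could look roughly like this (the function name is mine); the same check could be wrapped in a Django field validator:

```python
# Hedged sketch: check that a JSON string decodes to a list of integers,
# the shape you'd want before storing it in a JSONField.
import json


def validate_int_list(raw_json):
    """Return the parsed list, or raise ValueError if it isn't a list of ints."""
    data = json.loads(raw_json)
    if not isinstance(data, list) or not all(
        # exclude bools, which are a subclass of int in Python
        isinstance(item, int) and not isinstance(item, bool)
        for item in data
    ):
        raise ValueError("JSON value must be a list of integers")
    return data
```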
I want to know the maximum length of a TextField in Django models when using PostgreSQL. I know MySQL uses the longtext data type to store the content, which can hold up to 4 gigabytes, but what is the maximum length of a TextField in Django models when using PostgreSQL, and can it be increased?
Both TEXT and VARCHAR have an upper limit of 1 GB, and there is no performance difference between them (according to the PostgreSQL documentation).
I have two models, User and Course, which should be matched. To provide a basis for this matching, each model can rank instances of the other. I'm trying to figure out a database structure to efficiently store the preferences. The problem is that the number of ranks is unknown (it depends on the number of users/courses).
I'm using Django with Postgres, if this helps.
I tried storing the preferences as a field containing a pickled dictionary, but this causes problems when a model stored in the dict is updated, and it takes a long time to load.
I'm thankful for any tips or suggestions for improvement.
Is there an advantage to using UUIDField (a native PostgreSQL datatype) with Django over a self-generated unique key?
Currently I use a random-generated alphanumeric ID field on my models and I am wondering if the Postgres native datatype and the UUIDField are better for this purpose and whether there's a reason to switch over.
I generate the ID using random letters and digits; it's 25 chars long. I put a db_index on it for faster retrieval. I don't shard my DB. The reason is that, for business purposes, some models cannot have consecutive IDs.
Switching to UUID will be an advantage, particularly if you have a large number of records. Lookups and inserts ought to be a tiny bit faster, and you will save 9 bytes of storage per row, since a UUID is only 128 bits (16 bytes) versus your 25-character key.
However, that doesn't mean your home-made primary key is a bad idea. Far from it. It's a good one, and a similar approach is used by Instagram, who also happen to be using PostgreSQL and Django. Their solution uses only 64 bits and also manages to squeeze information about object creation time into the primary key.
Their primary purpose is sharding, but it works very well even for non-sharded DBs. Just use some random number for the 13 bits that represent their sharding information. They have a SQL sample at the link above.
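The storage arithmetic above can be checked with the stdlib `uuid` module; the documented Django pattern for a UUID primary key is `models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)`:

```python
# Checking the size claim: a UUID is 128 bits (16 bytes), while a
# 25-character alphanumeric key stored as text needs at least 25 bytes.
import uuid

key = uuid.uuid4()
uuid_bytes = len(key.bytes)          # 16 bytes = 128 bits
text_key_bytes = len("a" * 25)       # 25 bytes for a 25-char ASCII key
saved_per_row = text_key_bytes - uuid_bytes
```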
"What is the biggest integer the model field that this application instance can handle?"
We have sys.maxint, but I'm looking for the database+model combination. We have IntegerField, SmallIntegerField, PositiveSmallIntegerField, and a couple of others besides. They could all vary between each other and between database types.
I found the IntegerRangeField custom field example here on Stack Overflow. I might have to use that and guess at the lowest common denominator? Or rethink the design, I suppose.
Is there an easy way to work out the biggest integer an IntegerField, or its variants, can cope with?
It depends on your database backend.
Use ./manage.py sql your_app_name to check the generated types for your DB columns (the sql command was removed in Django 1.9; use ./manage.py sqlmigrate in newer versions), then look through your database documentation for the type ranges.
For MySQL: http://dev.mysql.com/doc/refman/5.1/en/numeric-types.html
For PostgreSQL: http://www.postgresql.org/docs/8.1/static/datatype.html#DATATYPE-TABLE
It can't be done easily. Just set a constant and use that:
MAX_SMALL_INT = 32767
position = models.PositiveSmallIntegerField(default=MAX_SMALL_INT)