Context:
I'm playing around with setting up a DRF project using the PostgreSQL sample database located here: PostgreSQL Sample DB
Problem:
The sample database is already set up with the intermediate tables film_category and film_actor. Running manage.py inspectdb generates models for these intermediate tables explicitly (FilmCategory and FilmActor), and they serve no purpose in the code since they only contain the IDs of the two related fields. If I were to create them using the Django ORM I could just declare:
class Film(models.Model):
    ...
    actors = models.ManyToManyField(Actor, related_name='films')
Django creates these tables "behind the curtain" so they take up no space in my code. I attempted to just set up a ManyToManyField like so:
actors = models.ManyToManyField(Actor, db_table='film_actor', related_name='films')
categories = models.ManyToManyField(Category, db_table='film_category', related_name='films')
When attempting to migrate, however, this fails giving me the following error:
psycopg2.errors.DuplicateTable: relation "film_actor" already exists
I don't think I want to create this ManyToManyField without explicitly telling it which db_table to use, because I believe that would generate an entirely new intermediate table and I would lose access to all the data already stored in the original sample database's intermediate tables.
I was able to get it to work without errors and the expected operations function normally by doing:
actors = models.ManyToManyField(Actor, through='FilmActor', related_name='films')
But now I have explicitly defined FilmActor and FilmCategory models sitting in my models.py that I cannot remove without causing errors:
class FilmActor(models.Model):
    actor = models.ForeignKey(Actor, models.CASCADE)
    film = models.ForeignKey(Film, models.CASCADE)
    last_update = models.DateTimeField()

class FilmCategory(models.Model):
    film = models.ForeignKey(Film, models.CASCADE)
    category = models.ForeignKey(Category, models.CASCADE)
    last_update = models.DateTimeField()
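For reference, a minimal sketch of how the working pieces fit together under this approach (assuming the Actor and Category models that inspectdb generated and the sample database's table names; the Meta options mirror what inspectdb typically emits for existing tables):

class Film(models.Model):
    # ... fields generated by inspectdb ...
    actors = models.ManyToManyField(Actor, through='FilmActor', related_name='films')
    categories = models.ManyToManyField(Category, through='FilmCategory', related_name='films')

    class Meta:
        managed = False   # inspectdb marks existing tables as unmanaged
        db_table = 'film'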
Has anyone dealt with explicitly defined intermediate tables generated from an existing DB with inspectdb? Is there a way to get rid of those generated models while still allowing the normal ManyToMany operations? Technically what I want to do is working; I just feel like having those two intermediate tables explicitly declared as models in my code, when they carry no additional data (other than "last_update"), is icky.
Related
I'm having an issue designing the database schema for an app that includes a model with an image and that is preloaded with some default data. The idea is that when the application starts it is prepopulated with default data (static images), but after that users can upload their own images.
To simplify, let's say the model has only two fields, shared_one and image:
class Some(models.Model):
    shared_one = models.CharField()
    image = [???]
I tried to define image as a FileField, but that doesn't work with static files: when creating a new record, the Some model will try to save a new image file into the MEDIA_ROOT directory.
Then I found out that Django provides a field type for pointing at existing files on disk - FilePathField - but obviously it doesn't handle files uploaded by users.
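To illustrate the difference (a hedged sketch; the Example model and its field arguments are placeholders, and using STATIC_ROOT as the FilePathField path is only an assumption based on the STATIC_ROOT usage later in this post):

from django.conf import settings
from django.db import models

class Example(models.Model):
    # FileField stores user uploads under MEDIA_ROOT/<upload_to> at save time
    upload = models.FileField(upload_to='user_images/')
    # FilePathField only stores a path chosen from files that already exist on disk
    preset = models.FilePathField(path=settings.STATIC_ROOT)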
Then I came up with the idea of making Some an abstract class, like this:
class Some(models.Model):
    class Meta:
        abstract = True

    shared_one = models.CharField()

class DefaultSome(Some):
    media = models.FilePathField()

class UsersSome(Some):
    media = models.FileField()
The issue with that approach is that DefaultSome and UsersSome are now completely different things - I cannot, for example, access media from another model using a ForeignKey to Some; I'd need to create a separate set of models (and database tables) for every new model that uses media, which seems very inefficient.
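For example (a hedged sketch with a hypothetical Gallery model): a ForeignKey cannot target an abstract model, so there is no single relation that covers both concrete subclasses.

class Gallery(models.Model):
    # Rejected by Django's system checks: relations to abstract models are not
    # allowed, so this would have to point at DefaultSome or UsersSome instead.
    item = models.ForeignKey('Some', on_delete=models.CASCADE)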
I also tried creating Some as an unmanaged model, but that didn't work at all. With the models defined like this:
class Some(models.Model):
    class Meta:
        managed = False

    shared_one = models.CharField()

class DefaultSome(Some):
    media = models.FilePathField()

class UsersSome(Some):
    media = models.FileField()
I tried to run a script that generates the default data. The script looks like this:
# it's defined in a migration file and is meant to run with the migrations
import os
from django.conf import settings

def populate_default_data(apps, schema_editor):
    DefaultSome = apps.get_model('myamazingapp', 'DefaultSome')
    DefaultSome.objects.create(
        shared_one="SomeSomeSome",
        media=os.path.join(settings.STATIC_ROOT, "media_file.jpg")
    )
But it fails with the error: django.db.utils.OperationalError: no such table: myamazingapp_some. (???)
Is my data structure badly designed? Should I use a completely different approach?
Probably a fake migration has been performed. Open your database, open the your_project.django_migrations table, and search for the name of the migration file that created the model. Delete that migration record from the table and run the migration again.
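A hedged sketch of the same check from a Django shell; MigrationRecorder is the internal model Django uses for the django_migrations table, and the app and migration names below are placeholders:

from django.db.migrations.recorder import MigrationRecorder

Migration = MigrationRecorder.Migration
# List what Django believes has been applied for the app
print(Migration.objects.filter(app='myamazingapp').values_list('name', flat=True))
# Deleting a record makes Django treat that migration as unapplied,
# so the next `migrate` will run it again:
# Migration.objects.filter(app='myamazingapp', name='0001_initial').delete()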
Right now, I have an abstract model with several models that inherit its fields. But I have discovered the power of proxy models, and I want to use them in my app. This is the picture right now:
class BaseModel(models.Model):
    field_1 = models.CharField(max_length=10)
    field_2 = models.CharField(max_length=10)
    field_3 = models.CharField(max_length=10)

    class Meta:
        abstract = True

class Model1(BaseModel):
    def __unicode__(self):
        return self.field_1

class Model2(BaseModel):
    def __unicode__(self):
        return self.field_1
And this is what I want:
class BaseModel(models.Model):
    field_1 = models.CharField(max_length=10)
    field_2 = models.CharField(max_length=10)
    field_3 = models.CharField(max_length=10)

class Model1(BaseModel):
    class Meta:
        proxy = True

    def __unicode__(self):
        return self.field_1

class Model2(BaseModel):
    class Meta:
        proxy = True

    def __unicode__(self):
        return self.field_1
The problem appears when I remove the "abstract = True" line. When I try to migrate, this is the message:
You are trying to add a non-nullable field 'basemodel_ptr' to Model1 without a default; we can't do that (the database needs something to populate existing rows).
OK, got it. I read that "ptr" is a pointer to the parent model, which is BaseModel, but I don't have any BaseModel table, and I cannot get one until I migrate. How can I fix this?
The migrations for this are not trivial.
Currently, your database has two tables:
yourapp_model1
yourapp_model2
Both have the same columns, but they use different sequences, meaning that their primary keys clash: both tables start counting their IDs (i.e. PKs) from 1 onwards. There is a very good chance that there is an instance of Model1 with pk=1 as well as an instance of Model2 with pk=1, and they are not the same object.
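To illustrate the clash with hypothetical data:

m1 = Model1.objects.get(pk=1)  # row 1 of yourapp_model1
m2 = Model2.objects.get(pk=1)  # row 1 of yourapp_model2 - an unrelated object with the same pk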
This is the point of having an abstract model and concrete implementations: share Django code (business logic) while separating the data in the DB. For example, because it is semantically different (different types).
The point of proxy models is the exact opposite: while the data is located in one table in the DB, the proxy models allow you to implement different behaviour but based on the same DB data.
If you are now migrating from abstract models to proxy models, it means that what you once considered different types will now become the same type (from a database point of view). As mentioned at the beginning, you will have to move the data from several tables into one and regenerate the IDs for at least part of it. It also means that some URLs that currently reference these records will change, cease to exist, or point to a different resource.
Unless you can start from scratch (no live data that you have to support) you should really appreciate the magnitude of this change.
Should you not have the need to support live data:
drop the database
recreate the database
remove all migration files
recreate all migration files from scratch
call migrate
Data migration of live data:
Note that there are other ways to do this. Search for "Django data migration".
create the new model structure with new names (no collisions)
run makemigrations, which will pick up the new model structure and create new, empty tables in the database, leaving the old tables and their data untouched
create a management command that reads in the old model instances and copies them into the new tables (a sketch of such a command follows this list)
once this command has been run in production, you can deprecate the old models and remove them, making sure that everything that depended on them now uses the new tables
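A hedged sketch of such a management command, assuming the old concrete models are still importable as OldModel1 and OldModel2 and the new shared model is NewBaseModel (all of these names are placeholders for your actual models):

from django.core.management.base import BaseCommand

from myapp.models import OldModel1, OldModel2, NewBaseModel  # placeholder names


class Command(BaseCommand):
    help = "Copy rows from the old per-type tables into the new shared table."

    def handle(self, *args, **options):
        for old_model in (OldModel1, OldModel2):
            for obj in old_model.objects.all():
                # New primary keys are assigned here, so anything that stored
                # the old IDs (URLs, foreign keys, bookmarks) must be remapped.
                NewBaseModel.objects.create(
                    field_1=obj.field_1,
                    field_2=obj.field_2,
                    field_3=obj.field_3,
                )
        self.stdout.write(self.style.SUCCESS("Copied old rows into the new table."))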
I am used to defining the ORM models and leaving Django responsible for creating the tables. But in a project I am involved in, I have to create a simple CRUD application as a frontend for an existing database. The database was created by building the tables manually. So I have two tables, Table1 and Table2, which have a many-to-many relationship through Tables12. Tables12 looks like the table Django would normally create for a ManyToManyField: it has two fields, the IDs of the two models. After running Django's inspectdb, Django successfully created the models from the SQLite database. Many-to-many tables like Tables12 came out like the following (as stated above):
class Tables12(models.Model):
    table1 = models.ForeignKey(Table1)
    table2 = models.ForeignKey(Table2)

    class Meta:
        managed = False
        db_table = "Tables12"
        unique_together = (("table1_id", "table2_id"),)
Trying the following gives me an error:
>>> table2 = Table2.objects.get(pk=1)
>>> tables12 = Tables12.objects.filter(table2=table2)
>>> tables12
OperationalError: no such column: Tables12.id
I am guessing Django's ORM expects an id field in every model it creates. How can I bypass this behavior? Is there a way to edit the models so they look more like Django's usual ORM style but still work against the existing DB's tables? Something like:
class Table1(models.Model):
    # pre-existing fields
    table2 = models.ManyToManyField(Table2)
or
class Table2(models.Model):
    # pre-existing fields
    table1 = models.ManyToManyField(Table1)
but without destroying database records and without recreating the tables from scratch.
You can remove the Tables12 model, and specify the db_table argument to a ManyToManyField:
class Table1(models.Model):
    tables2 = models.ManyToManyField(Table2, db_table='Tables12')
You would still not be able to query the Tables12 join table directly (Django's implicit through model still exists and still assumes an id field that the actual table lacks), but this would allow you to use the JOINs Django generates for a ManyToManyField:
table1 = Table1.objects.get(pk=1)
tables2 = table1.tables2.all()
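The reverse side of the relation works the same way (a small hedged sketch; with no related_name set, Django's default reverse accessor is table1_set):

table2 = Table2.objects.get(pk=1)
tables1 = table2.table1_set.all()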
This would still not allow you to use Django to write to the join table, but it allows you to use the data that's in it. If at all possible, I'd strongly recommend adding a primary key.
I am gradually replacing a legacy database front end with a Django-based system. All models have managed = False, to preserve the original DB structure.
However, I have struck a problem where a field in a table is computed. The field is defined in (pseudo) SQL as full_name = fname || ' ' || lname.
I can define the full_name field as a CharField and read it with no problems; however, any attempt to update the record results in an update error on that field.
I can use a @property, but that replicates the functionality in Django rather than reading the result from the DB itself. That approach will cause problems with more complex fields (in tables I have yet to get to) that are defined with UDFs.
What is really needed is a 'read_only' or 'computed' attribute on the model itself; what is the best way to achieve this?
Do you just want to define a method in your class? Something like:
class Person(models.Model):
    fname = models.CharField(...)
    lname = models.CharField(...)

    def fullname(self):
        return self.fname + " " + self.lname
(not exactly sure what Managed=False means...)
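A quick usage note on that method (hedged): the value is computed in Python when called and is never read from, or written to, the database's full_name column, which is exactly the limitation the question raises.

person = Person.objects.first()
print(person.fullname())  # built from fname and lname in Python, not from the DB column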
If you are trying to do a calculation on a database model and pass the value of one model field to another field of the same model using a defined function, then this solution might help you. For example, let's assume you have an investment company and you pay 20% per month on the capital each user invested. You would want to pass the value of the capital field to a function that calculates the percentage interest, and then feed that function into another field, monthly_payment, which gets saved in the database.
1) pip install django-computed-property
2) Add 'computed_property' to the installed apps in your project settings.
3) In your models.py, import computed_property, then:
import computed_property
from django.db import models

class Investment(models.Model):
    name = models.CharField(max_length=200)
    capital = models.FloatField(null=False)
    percentage = models.CharField(max_length=5)
    duration = models.CharField(max_length=10, default="1 months")
    monthly_payment = computed_property.ComputedFloatField(
        compute_from='monthly_percentage', null=False, blank=False
    )
Then the function that performs the calculation goes like this (defined on the same model class, since compute_from refers to it by name):

    @property
    def monthly_percentage(self):
        return (20 / 100) * self.capital
Note: what I discovered is that if you use the built-in Django fields, be it FloatField or IntegerField, this function won't read the amount you are passing in for the 20% calculation. I hope this works for you; it all worked for me as stated. Cheers.
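A hedged usage sketch, assuming django-computed-property fills in the computed field when the instance is saved:

inv = Investment.objects.create(
    name="Plan A", capital=1000.0, percentage="20%", duration="1 months"
)
print(inv.monthly_payment)  # expected: 200.0, i.e. 20% of the capital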
I have a Shops model and would like each shop to be able to log in to my application. Following as best I can the guide at http://scottbarnham.com/blog/2008/08/21/extending-the-django-user-model-with-inheritance/ and various other googlings, I've got part of the way there, but I've run into a problem. When I try to log in as a shop, I get the following error:
OperationalError at /login/
(1054, "Unknown column 'shops.user_ptr_id' in 'field list'")
Shops model:
class Shops(User):
    shop_id = models.AutoField(primary_key=True)
    shop_code = models.CharField(unique=True, max_length=5)
    shop_type_fk = models.ForeignKey(ShopTypes,
                                     null=True,
                                     db_column='shop_type_id',
                                     blank=True)
    address_fk = models.ForeignKey(Addresses, db_column='address_id')
    phone_number = models.CharField(max_length=30)
    #email = models.EmailField(max_length=255, blank=True)
    description = models.TextField(blank=True)
    does_gift_aid = models.NullBooleanField(null=True, blank=True)

    objects = UserManager()

    class Meta:
        db_table = u'shops'
I've sync'd the database, so surely it should have made the column user_ptr_id. Does anyone know where I'm going wrong?
"I've sync'd the database, so surely it should have made the column user_ptr_id."
What makes you think that? Especially in light of this clear statement in the docs for syncdb:
Syncdb will not alter existing tables

syncdb will only create tables for models which have not yet been installed. It will never issue ALTER TABLE statements to match changes made to a model class after installation. Changes to model classes and database schemas often involve some form of ambiguity and, in those cases, Django would have to guess at the correct changes to make. There is a risk that critical data would be lost in the process.

If you have made changes to a model and wish to alter the database tables to match, use the sql command to display the new SQL structure and compare that to your existing table schema to work out the changes.
It does sound like you had an existing shops table before changing it to inherit from User (as Daniel notes), and syncdb does not update the schema for existing tables.
You need to drop the table and then run syncdb, if possible. Otherwise you need to go into your database and add the user_ptr_id field manually, if you know how to do that. The definition should look something like this:
"user_ptr_id" integer NOT NULL UNIQUE REFERENCES "auth_user" ("id")