I am new to using the Django framework. I want to use the same PostgreSQL database tables for 2 different projects. I was able to access the same database from two different projects, but how do I access the tables?
Firstly, I have this model in project1:
class employees(models.Model):
    employeeID = models.IntegerField()
    employeeName = models.CharField(max_length=100)
This creates a project1_employees table in the database. I want to access this table in project2. If I define the same model in project2 and migrate, it creates a new table project2_employees in the same database. These are two entirely different projects.
For the second project, change the Meta of your model this way:
class employees(models.Model):
    employeeID = models.IntegerField()
    employeeName = models.CharField(max_length=100)

    class Meta:
        managed = False
        db_table = 'project1_employees'
And then make fake migrations:
python manage.py migrate --fake
Use managed = False in the Meta class of the second project's model. This prevents migrate from creating the table. For more information see:
https://docs.djangoproject.com/en/3.0/ref/models/options/#managed
What you need to consider is whether you think the projects will diverge enough that they will end up needing separate databases/tables.
Related
Let's suppose I have the following model:
class Test(models.Model):
    field_one = models.CharField(max_length=80)
Now, we have created 2-3 model objects with the field_one field.
p1 = Test(field_one="Object1")
p1.save()
p2 = Test(field_one="Object2")
p2.save()
Later, I realised that I need to add another field field_two to my Test model.
class Test(models.Model):
    field_one = models.CharField(max_length=80)
    field_two = models.IntegerField(default=3)
Now, running makemigrations & migrate and then starting the server prompts the following error:
django.db.utils.ProgrammingError: column mainapp_test.field_two does not exist
I understand that this error occurs because my 2 existing objects in the PostgreSQL database don't have the field_two column.
Is there any effective way to add the field_two column to my existing objects with some default value? Or how else can I solve this problem?
Django Version: 2.0
Django ORM DB: PostgreSQL
When you add a field to an existing model, you must either provide a default value in the code, set null=True/blank=True, or provide a one-off default while migrating.
Since you are providing a default in the code, the migration should run without issues. At least in my experience, I've added several BooleanFields with default=False to an existing model with thousands of entries, and I never got a ProgrammingError.
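For reference, a rough sketch of what makemigrations generates for the new column (the file name and dependency are assumptions; the mainapp label is taken from the error message above). The default=3 is what lets the migration backfill the 2 existing rows:

# mainapp/migrations/0002_test_field_two.py (file name is an assumption)
from django.db import migrations, models

class Migration(migrations.Migration):

    dependencies = [
        ('mainapp', '0001_initial'),  # assumed previous migration
    ]

    operations = [
        migrations.AddField(
            model_name='test',
            name='field_two',
            # default=3 is used to populate the column on existing rows
            field=models.IntegerField(default=3),
        ),
    ]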
Have you tried shutting down the Postgres backend before running makemigrations and migrate? I would think Django would do this but that's the only thing I can think of. Also, obviously, shut down the Django server if it's still running.
I added a field to one of my models, but in the 'models' folder I have two other Python files which contain only view models, from which I query views in my database. When I run the makemigrations command, the new migration file that is created also adds these view models to my database as tables (which I don't want). How can I ignore these changes and only commit the one addition of a field to an actual table in the database?
I think I may have to delete the migrations.CreateModel... entries in the new migration file and only keep the migrations.AddField..., then run the 'migrate' command. I didn't proceed with this because I'm not sure, and maybe it will mess up my database in some way.
Thanks in advance to anyone who can help.
When you make a model for a database view, you must add a Meta class with managed = False and db_table, like this:
class MyViewModel(models.Model):
    field = models.CharField(max_length=100)

    class Meta:
        managed = False
        db_table = 'database_view_name'
When you write this and run makemigrations, the generated migration contains this model, but when you run migrate, that migration doesn't change anything in the database.
You can also create the view itself from a migration written in Python; see migrations.RunPython for more details.
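For example, a sketch of a view-creating migration using migrations.RunPython (the app label, dependency, view name, and SQL are assumptions; schema_editor.execute runs raw SQL):

from django.db import migrations

def create_view(apps, schema_editor):
    # assumed view definition; adjust the SQL to your own tables
    schema_editor.execute(
        "CREATE VIEW database_view_name AS SELECT id, field FROM myapp_sometable"
    )

def drop_view(apps, schema_editor):
    schema_editor.execute("DROP VIEW database_view_name")

class Migration(migrations.Migration):

    dependencies = [
        ('myapp', '0002_previous'),  # assumed previous migration
    ]

    operations = [
        migrations.RunPython(create_view, drop_view),
    ]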
How can I create a Django REST Framework API that connects to already existing MySQL tables instead of creating them through models.py? My models.py shows something like this:
class Author(models.Model):
    first_name = models.CharField(max_length=20)
    last_name = models.CharField(max_length=20)

    def __str__(self):
        return f'{self.first_name} {self.last_name}'
Instead of this, I need to take data directly from existing tables in MySQL.
For that you need to define a model class for your existing table, with a Meta class pointing at the table name.
For example, if RandomTable(id INT(10), name VARCHAR(10)) is your existing MySQL table, then the models.py for it will be:
class AppnameRandomTable(models.Model):
    id = models.IntegerField(primary_key=True, db_column="id")  # name of the column in the existing db
Inside that class you will also need to put the name of your existing table in the Meta section:

    class Meta:
        db_table = "RandomTable"  # your existing MySQL table name
Time-saving hack: on the terminal run "python manage.py inspectdb" and you will automatically get all the column definitions from there.
You can just copy and paste them into your class, because for reading and writing columns you need to declare their fields in your class even if the table already exists in MySQL.
python manage.py inspectdb > models.py
If you run that command it will create a models.py in the project's root directory. Once you've done that you can either move it directly into the project or create a models folder and break it down into areas of concern from there. You will likely have to do the work of adding related_name='foo' to a lot of fields that have relationships with other models. That can be time-consuming, but it works.
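For example, one of the generated models might end up looking like this after you add related_name by hand (model and field names here are hypothetical; inspectdb emits managed = False and on_delete=models.DO_NOTHING for you):

class Book(models.Model):
    title = models.CharField(max_length=200)
    # related_name added manually so reverse lookups read as author.books
    author = models.ForeignKey(
        'Author', on_delete=models.DO_NOTHING, related_name='books'
    )

    class Meta:
        managed = False
        db_table = 'book'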
My web application allows users to load/create tables in the Postgres database. I know the Django ORM needs a model definition in models.py for each table in the database to access it. How can I access the user's uploaded tables in the app without creating a new model definition on the fly each time a new table is uploaded? I was thinking about creating a generic model definition that decomposes the table into its components like this:
models.py
class Table(models.Model):
    filename = models.CharField(max_length=255)

class Attribute(models.Model):
    table = models.ForeignKey(Table)
    name = models.CharField(max_length=255)
    type = models.IntegerField()
    width = models.IntegerField()
    precision = models.IntegerField()

class Row(models.Model):
    table = models.ForeignKey(Table)

class AttributeValue(models.Model):
    row = models.ForeignKey(Row)
    attribute = models.ForeignKey(Attribute)
    value = models.CharField(max_length=255, blank=True, null=True)
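For illustration, a sketch of how a single uploaded row would fan out across these models (the column names and type codes are made up); every cell becomes its own AttributeValue row, which is part of why bulk loading is slow:

table = Table.objects.create(filename='upload.csv')
name_attr = Attribute.objects.create(table=table, name='name', type=1, width=50, precision=0)
age_attr = Attribute.objects.create(table=table, name='age', type=2, width=3, precision=0)

row = Row.objects.create(table=table)
AttributeValue.objects.create(row=row, attribute=name_attr, value='Alice')
AttributeValue.objects.create(row=row, attribute=age_attr, value='42')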
The problem with such a generic model is that all the tables are mixed into 4 tables (not useful in the admin interface), and it's really slow to create rows when you have a lot of them. Do you have any suggestions for this case?
Edit: Could it be viable to use a separate database to store those tables and use a router plus manage.py inspectdb to update its models.py each time a user adds or deletes a table? (like in this post) I wonder what would happen if two users add a table at the same time?
I think you should look into dynamic models like here:
https://code.djangoproject.com/wiki/DynamicModels
or here:
http://dynamic-models.readthedocs.org/en/latest/
Good luck, because it's not an easy path, my friend :)
You'll probably need to use raw SQL queries for doing this.
If the schema of the tables you are expecting is predefined, you can use a database router to link some model to a specific table name for each user.
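A minimal sketch of such a database router, assuming the user tables live in their own app and database alias (both names are made up); it would be listed in settings.DATABASE_ROUTERS:

class UserTablesRouter:
    """Send all models in the 'usertables' app to the 'user_uploads' database."""

    def db_for_read(self, model, **hints):
        if model._meta.app_label == 'usertables':
            return 'user_uploads'
        return None

    def db_for_write(self, model, **hints):
        return self.db_for_read(model, **hints)

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label == 'usertables':
            return db == 'user_uploads'
        return None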
Seems like this should be "easy" or at least documented somewhere, I just can't find it.
Let's say I have a model:
class A(models.Model):
    users = models.ManyToManyField('auth.User', blank=True)
Now I want to migrate to have a through table to add fields to the ManyToMany relation...
class AUsers(models.Model):
    user = models.ForeignKey('auth.User')
    a = models.ForeignKey('A')
    new_field = models.BooleanField()

class A(models.Model):
    users = models.ManyToManyField('auth.User', blank=True, through='AUsers')
Then I do:
% ./manage.py schemamigration app --auto
Not totally surprising, it tells me it is going to drop the original auto-created through table and create a new one for AUsers. What's the best practice at this point? Is there a decent way to migrate to the new through table? Do I use db_table in Meta? Or do I just not use through=... right away, then do a schemamigration --auto, then a datamigration to copy the current table (somehow, not sure...), and then add the through relation and let it kill the table?
What's the trick here? Is this really that hard?
You should be able to do this pretty easily.
First of all, make sure that the manual through table that you are creating has the same table name in the database as the one Django originally created automatically.
So, first, let's consider a manual through model before your change:
class AUsers(models.Model):
    user = models.ForeignKey('auth.User')
    a = models.ForeignKey('A')

    class Meta:
        db_table = 'appname_a_user'
That should be functionally (almost) identical to the ManyToManyField you used to have. Actually, you could make an empty migration and apply it, and then use --auto for your changes (but don't).
Now, add your field like you did in your sample code above, and then run ./manage.py schemamigration appname manual_through_table --empty. That will give you an empty migration named ####_manual_through_table.py.
In the migration itself, there will be a forwards and a backwards method. Each one needs just one line:
def forwards(self, orm):
    db.add_column('appname_a_user', 'new_field', self.gf('django.db.models.fields.BooleanField')(default=False))

def backwards(self, orm):
    db.delete_column('appname_a_user', 'new_field')
That should get you what you are after.
If anyone comes across this question when trying to do the same thing with the modern migrations framework, here are the steps:
1. Create a new model class that exactly matches the built-in through table.
2. Use the Meta class to set the table name to match the existing table.
3. Generate a migration, which will create the new table and set it as the through for the field.
4. Without running that migration, edit it to wrap it in a migrations.SeparateDatabaseAndState migration, where the auto-generated steps go in the state_operations field and the database operations are empty (see the sketch after this list).
5. Modify your through table as required, making sure to generate new migrations as normal.
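A rough sketch of what the edited migration from step 4 ends up looking like (names follow the example above; the exact auto-generated operations will differ, this is only the general shape):

from django.db import migrations, models

class Migration(migrations.Migration):

    dependencies = [
        ('appname', '0005_previous'),  # assumed previous migration
    ]

    operations = [
        migrations.SeparateDatabaseAndState(
            state_operations=[
                # the operations makemigrations generated, moved here verbatim
                migrations.CreateModel(
                    name='AUsers',
                    fields=[
                        ('id', models.AutoField(primary_key=True)),
                        ('user', models.ForeignKey(to='auth.User', on_delete=models.CASCADE)),
                        ('a', models.ForeignKey(to='appname.A', on_delete=models.CASCADE)),
                    ],
                    options={'db_table': 'appname_a_user'},
                ),
                migrations.AlterField(
                    model_name='a',
                    name='users',
                    field=models.ManyToManyField(blank=True, through='appname.AUsers', to='auth.User'),
                ),
            ],
            database_operations=[],  # the existing table is left untouched
        ),
    ]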
As mentioned in a comment, the first step may be simplified using db.rename_table as described here, which gives this through model:
class AUsers(models.Model):
    user = models.ForeignKey('auth.User')
    a = models.ForeignKey('A')

    class Meta:
        unique_together = (('user', 'a'),)
Then, create a migration with --auto (this way you'll have the names of the DB tables visible), and replace the content with:
class Migration(SchemaMigration):

    def forwards(self, orm):
        db.rename_table('appname_a_user', 'appname_auser')

    def backwards(self, orm):
        db.rename_table('appname_auser', 'appname_a_user')
I just applied it in my project without issues.