I wish to make a custom model which has a foreign key relationship to the django_migrations table. Can this be done? If so, how exactly do I import the Django migrations model?
Edit: here is my plan.
1: I have one database in which there are many tables in the 'public' schema which are used for application-wide data. I suspect this system will get very large, so for scalability I would like each client to have its own schema. These schemas will be generated dynamically when a client is created.
2: The client schemas are controlled by a separate Python/Django app which has its own set of tables/migrations.
3: I want the system to be flexible so that in the future, when new features are created, I can deploy those features to a few specific clients without having to deploy to everyone. Each client schema may therefore be at a different migration level, hence the need to track which migration level each client is at (a sketch of one way to model this follows).
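For reference, a minimal sketch of one way to model this. django_migrations has no public model class (there is only an internal one in django.db.migrations.recorder), so the usual trick is to mirror the table with an unmanaged model; everything here apart from the table's own columns is illustrative:

from django.db import models

class DjangoMigration(models.Model):
    # Mirrors the real django_migrations table; managed = False means
    # Django will never try to create or alter it.
    app = models.CharField(max_length=255)
    name = models.CharField(max_length=255)
    applied = models.DateTimeField()

    class Meta:
        managed = False
        db_table = 'django_migrations'

class Client(models.Model):
    # Hypothetical client model tracking the migration level of its schema.
    schema_name = models.CharField(max_length=63)
    migration_level = models.ForeignKey(DjangoMigration, on_delete=models.PROTECT)

One caveat: rows in django_migrations are deleted when a migration is reversed, so a hard foreign key constraint can be fragile; storing the app/name pair as plain columns is a more defensive variant.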
Related
I'm looking for a best-practice guide/solution for the following situation.
I have a Django project with a MySQL DB which I created and manage. Every 5 minutes I have to import data from a second (external, not managed by me) DB in order to perform some actions. I have read rights for the external DB and all the necessary information.
I have read the Django docs regarding the usage of multiple databases (register the DB in settings.py, migrate using the --database flag, query/access data by routing to the DB, short version) and multiple questions on this matter on Stack Overflow.
So my plan is:
Register the second database in settings.py, use inspectdb to add the needed tables to the models, migrate, and define a method which reads data from the external DB and adds it to the internal (own) DB.
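A sketch of the first step (aliases, engine and credentials are placeholders):

# settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'own_db',
        # ... user, password, host
    },
    'external': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'external_db',
        'USER': 'readonly_user',
        'PASSWORD': '...',
        'HOST': 'external.example.com',
        'PORT': '3306',
    },
}

Since only 5 of the roughly 250 tables are needed, listing them explicitly, as in python manage.py inspectdb --database=external table1 table2 > legacy_models.py, keeps the generated models down to just those tables.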
However I do have some questions:
Do I have to register the external DB if I don't manage it?
(Most probably yes, in order to use the ORM or cursors to access the data.)
How can I migrate the model if I don't manage the DB and don't have write permissions? I also don't need all the tables (around 250, but only 5 are needed).
(Is a fake migration an option worth considering? I would use inspectdb and migrate only the necessary tables.)
Because I only need to retrieve data from the external DB and not write back, would it suffice to have a method that constantly gets the latest data, like the second solution suggested in this answer?
Any thoughts/ideas/suggestions are welcome!
I would not use Django's ORM for this, but rather just access the DB with psycopg2 and SQL, get the columns you care about into dicts, and work with those. Otherwise any minor change to the external DB's tables may break your Django app, because the models no longer match. That could create more headaches than an ORM is worth.
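A minimal sketch of that approach, assuming a Postgres external DB as the answer implies (with MySQL, a driver such as mysqlclient follows the same cursor pattern); all connection parameters and names are placeholders:

import psycopg2
import psycopg2.extras

conn = psycopg2.connect(host='external.example.com', dbname='external_db',
                        user='readonly_user', password='...')
try:
    # RealDictCursor returns each row as a dict keyed by column name,
    # so the rest of the code is decoupled from the table layout.
    with conn.cursor(cursor_factory=psycopg2.extras.RealDictCursor) as cur:
        cur.execute("SELECT id, name, updated_at FROM source_table")
        rows = cur.fetchall()
finally:
    conn.close()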
I am connecting to a legacy MySQL database in the cloud from my Django project.
I need to fetch data from it, and insert data into its tables if required.
Do I need to write models for the tables which are already present in the DB?
If so, there are 90-plus tables; what happens if I create a model for a single table?
How do I talk to the database other than by creating models and migrating? Is there a better way in Django, or are models the better way?
When I create a model, what happens in the backend? Does it create those tables again in the same database?
There are several ways to connect to a legacy database; the two I use are either creating a model for the data you need from the legacy database, or using raw SQL.
For example, if I'm going to be connecting to the legacy database for the foreseeable future, performing both reads and writes, I'll create a model containing only the fields I need from the foreign table, configured against a secondary database. That method is well documented, but a bit more time consuming.
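A sketch of such a model, mapping only the needed columns; the table and field names are hypothetical:

from django.db import models

class LegacyCustomer(models.Model):
    name = models.CharField(max_length=100)
    email = models.CharField(max_length=255)

    class Meta:
        managed = False          # Django never creates or migrates this table
        db_table = 'customers'   # the existing table in the legacy DB

Queries then go through the secondary alias, e.g. LegacyCustomer.objects.using('my_secondary_db').all().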
However, if I'm only reading data from a legacy database which will be retired, I'll create a read-only user on the legacy database and use raw SQL to grab what I need, like so:
from django.db import connections

# Cursor against the secondary database alias defined in settings.DATABASES
cursor = connections["my_secondary_db"].cursor()
cursor.execute("SELECT * FROM my_table")
for row in cursor.fetchall():
    insert_data_into_my_new_system_model(row)  # your own import routine
I'm doing this right now with a legacy SQL Server database from our old website as we migrate our user and product data to Django/PostgreSQL. This has served me well over the years and saved a lot of time. I've used this to create sync routines all within a single app as Django management commands, and then when the legacy database is done being migrated to Django, I've completely deleted the single app containing all of the sync routines, for a clean break. Good luck!
I am doing a POC in Django, and I was trying to create the admin console module for inserting, updating and deleting records through the Django admin console via models; that was working fine.
I have 2 questions.
1. I need model objects for existing tables which live in a particular schema, say schema1.table1.
As of now I was doing the POC in the public schema.
So, can this be done against a fixed, defined schema, and if yes, how? Any reference would be very helpful.
2. I also want to update only a few columns of the table through the console, while the rest of the columns are handled automatically, like the current timestamp and created date. Is this possible through the default Django console? If yes, kindly share any reference.
Steps for 1:
What I have done so far is create a class in models.py with the attributes author, title, body and timeofpost.
Then I used sqlmigrate after makemigrations app to create the table, and after migrating I have been using the Django admin console to insert and update records in that table. But this is for the POC only.
Now I need to do the same, but for existing tables, so that I can insert or update records in those existing tables through the admin console.
Also, the tables are getting created in the public schema by default. But I am using Postgres, and the existing tables are present in different schemas; I want to insert, update and delete rows in these existing tables.
I am stuck here because I don't know how to configure models against existing database tables, in schemas other than public, so that I can interact with them through the Django console.
Steps for 2:
I also want the user to give input for only some of the columns. For example, the time of creation should not be given as input by the user; rather, it should be taken care of automatically when the row is created or updated.
Thanks
In order for Django to "interact" with an existing database you need to create a model for it, which can be done automatically as shown here. This assumes that your "external" database isn't going to change often, because you'll have to keep your models in sync, which is tricky; there are other approaches if you need that.
As for working with multiple database schemas: is there a reason you can't put your POC table in the same database as the others? Django supports multiple databases, but it is harder to set up. See here.
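If the tables really must stay in another schema, one Postgres-specific trick (a sketch; the alias and schema names are placeholders) is to add a second DATABASES entry pointing at the same database but with a different search_path, so unqualified table names resolve to that schema:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mydb',
        # ...
    },
    'schema1': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mydb',
        'OPTIONS': {'options': '-c search_path=schema1,public'},
        # ...
    },
}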
Finally, it sounds like you are interested in setting the Django default field attribute. For an example involving the current time, see here.
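For instance (a minimal sketch; the model is illustrative):

from django.db import models

class Post(models.Model):
    title = models.CharField(max_length=200)
    created = models.DateTimeField(auto_now_add=True)  # set once on insert
    updated = models.DateTimeField(auto_now=True)      # refreshed on every save()

Because auto_now and auto_now_add fields are non-editable, the admin hides them from the form automatically, which matches the "handled automatically" behaviour asked about in question 2.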
I was wondering whether it could produce any problems if I directly add rows to a model's table, or remove some. I thought maybe Django records the number of rows in all tables? Or could this mess up the auto-generated IDs?
I don't think it matters, but I'm using MySQL.
No, it's not a problem, because Django does the same thing that you do "directly" to the database: it executes SQL statements. The auto-generated ID is handled by the database server (the MySQL server in this case) no matter where the SQL queries come from, whether the MySQL client or Django.
Since you can have Django work on a pre-existing database (one that wasn't created by Django), I don't think you will have problems if you access/write the tables of your own app (you might want to avoid modifying Django's internal tables like auth, permission, content_type etc. until you are familiar with them).
When you create a model through Django, Django doesn't store the row count or anything like that (unless your app does), so it's fine to create the model with Django on the SQL database and then have another app write to and read from that same SQL table.
If you use Django signals, those will not be triggered by modifying the SQL table directly in the DB, so you might want to pay attention to side effects like that; see the sketch below.
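To illustrate that caveat (a sketch; the model and handler are hypothetical), the receiver below fires when rows are saved through the ORM, but not when they are inserted with raw SQL behind Django's back:

from django.db.models.signals import post_save
from django.dispatch import receiver

from myapp.models import Article  # hypothetical model

@receiver(post_save, sender=Article)
def on_article_saved(sender, instance, created, **kwargs):
    # Runs on Article(...).save() and Article.objects.create(),
    # never on a direct INSERT executed in the MySQL client.
    if created:
        print('New article:', instance.pk)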
Your RDBMS handles its own auto-generated IDs, referential integrity, counts etc., so you don't have to worry about messing that up.
On a Django app with a self-made (but close to available plugin methods) multi-tenant implementation, I would like to run a migration (a simple add_column this time) with South that applies to all schemas. I have a configuration very close to this one.
I would like to skip any pure SQL queries if possible. I can get a list of the schema names from the ORM properly, but then I wonder whether I can access the tables of the various schemas in a somewhat proper way.
I have a hook to change the DB_HOST and DB_SCHEMA via parameters at some level, but I don't think I can loop cleanly this way inside the forwards migration method of South.
This question is quite high-level, but I mainly wonder whether somebody has had to face the same kind of question, and I am curious to know if there is some clever way to handle it!
Regards,
Matt
This is an outline of a solution, as posted on the South mailing list. The question as phrased is a little different from the one that was posted on the list: There, it was also mentioned that there are "common" tables, shared between all tenants, in a separate schema. Rmatt's own answer refers to this as the public schema.
The basic idea of my solution: Save the migration history for each database (schema) in the schema. To do this, we need to employ some database and Django tricks.
This implies that history records for migrations of apps on the public schema are saved in the public schema, while history for migrations of tenant apps is saved in the tenant schema -- effectively sharding the migration history table. Django does not really support this kind of sharding; it is easy enough to set up the writing by instance content, but there's no way to set up the reading.
So I suggested creating, per tenant, a "tenant-helper" schema containing one view, named south_migrationhistory, which is a union of the south_migrationhistory tables from the public and tenant schemata. Then, set up a database router for the South MigrationHistory model, instructing it to:
syncdb to both public and tenant schemata
read always from tenant-helper schema
write to public or tenant schema, according to the app the migration belongs to
The result allows proper treatment of dependencies from the tenant app migrations to the public app migrations; and it means all you need to do, to migrate forwards, is loop on the migrate --all (or syncdb --migrate) command -- no need to fake backward migrations. The migrations for the public schema will be run with the migrations for the first tenant in the loop, and all other tenants will "see" them.
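A sketch of such a router, using the pre-1.7 router API that South relied on; the aliases 'public', 'tenant' and 'tenant_helper' and the app list are illustrative:

PUBLIC_APPS = {'common_app'}  # apps whose tables live in the public schema

class SouthHistoryRouter(object):
    def _is_history(self, model):
        return model._meta.db_table == 'south_migrationhistory'

    def db_for_read(self, model, **hints):
        # Always read history through the helper schema's union view.
        if self._is_history(model):
            return 'tenant_helper'
        return None

    def db_for_write(self, model, **hints):
        # Write history to public or tenant, depending on the app.
        if self._is_history(model):
            instance = hints.get('instance')
            if instance is not None and instance.app_name in PUBLIC_APPS:
                return 'public'
            return 'tenant'
        return None

    def allow_syncdb(self, db, model):
        # Create the history table in both the public and the tenant schema.
        if self._is_history(model):
            return db in ('public', 'tenant')
        return None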
As an afterthought, it is probably possible to do this also without a helper schema - by renaming the south_migrationhistory table in the tenant schema, and installing a view with that name in the schema that returns the above-mentioned union when queried, and has an "instead-of insert" trigger to write to the renamed table.
Fine; not so many people seem to have experience with, or to be concerned by, this quite specific problem. I have tried some things here and there, and I also got some support from the South mailing list that helped me understand a few points.
Basically, the solution I implemented is the following:
I have a quite normal migration file autogenerated via South's schemamigration, but I have changed the table name in the add_column and delete_column calls to schema.table_name. The schema is provided by importing the multi-tenant middleware.
The migration is then applied only if it is not being run against the public schema. It is actually not meant to be run standalone, or only with database and schema kwargs, but rather from a migration runner that is a new Django command.
The runner unfortunately has to call the migration externally, in order to go through the middleware fresh each time. One other trick is that we have to record the previous migration state, in order to fake South back to that state after each tenant migration. A sketch of such a migration follows, and then my runner snippet.
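A sketch of what such a schema-aware migration might look like (South-era API; get_current_schema is a hypothetical helper standing in for the middleware import):

from south.db import db
from south.v2 import SchemaMigration

from myproject.middleware import get_current_schema  # hypothetical

class Migration(SchemaMigration):

    def forwards(self, orm):
        schema = get_current_schema()
        if schema == 'public':
            return  # only meant to run against tenant schemas
        db.add_column('{0}.myapp_mymodel'.format(schema), 'new_field',
                      self.gf('django.db.models.fields.CharField')(max_length=100, null=True),
                      keep_default=False)

    def backwards(self, orm):
        schema = get_current_schema()
        if schema == 'public':
            return
        db.delete_column('{0}.myapp_mymodel'.format(schema), 'new_field')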
Here is my runner snippet:
from subprocess import call
import os

from django.core.management.base import BaseCommand
from south.models import MigrationHistory

from myapp.models import MyModel

class Command(BaseCommand):
    def handle(self, *args, **options):
        # The only allowed arg is the migration prefix, and it should
        # have a length of 4 (i.e. 0002).
        applied = MigrationHistory.objects.filter(app_name='myapp').latest('applied')
        current_version = applied.migration[:4]
        call_args = ['python', os.path.join('bin', 'manage.py'), 'migrate', 'myorderbird.app.backups']
        if len(args) == 1 and len(args[0]) == 4:
            call_args.append(args[0])
        obje_call_args = None
        for obje in MyModel.objects.all():
            if obje.schema_exists:
                # Fake the previous object's migration back to the current
                # version, so South's shared history is consistent again.
                if obje_call_args:
                    obje_call_args = obje_call_args[:4] + [current_version, '--fake'] + obje_call_args[len(obje_call_args) - 3:]
                    call(obje_call_args)
                # Migrate the object in this iteration of the loop.
                obje_call_args = list(call_args)
                obje_call_args.extend(['--database={}'.format(obje.db), '--schema={}'.format(obje.schema)])
                call(obje_call_args)