Django db migration concurrently

I have multiple Django app instances connected to the same MySQL db backend for high-availability purposes. My question is: is it a problem if the app instances try to migrate the db concurrently? If so, what's the best way to prevent it?
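One common way to avoid the race is to make sure only one instance applies migrations at a time, for example by taking a database-level advisory lock around the migrate call. Below is a minimal sketch of a custom management command doing that with MySQL's GET_LOCK; the command name, lock name, and timeout are arbitrary choices, not anything Django ships with.

```python
# Hypothetical management command: run `migrate` under a MySQL advisory lock
# so that only one app instance applies migrations at a time.
from django.core.management import call_command
from django.core.management.base import BaseCommand
from django.db import connection


class Command(BaseCommand):
    help = "Apply migrations while holding a MySQL advisory lock"

    def handle(self, *args, **options):
        with connection.cursor() as cursor:
            # Wait up to 600 seconds for the lock before giving up.
            cursor.execute("SELECT GET_LOCK('django_migrate', 600)")
            got_lock = cursor.fetchone()[0] == 1
            if not got_lock:
                self.stderr.write("Could not acquire migration lock; exiting.")
                return
            try:
                call_command("migrate", interactive=False)
            finally:
                cursor.execute("SELECT RELEASE_LOCK('django_migrate')")
```

Each instance can run this command instead of plain migrate at startup; whichever one gets the lock first applies the migrations, and for the others migrate is a no-op once the lock is released.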

Related

Auto-updating Django postgres database when changes are made in legacy database

I'm working on a Django project with Docker that uses several legacy Microsoft SQL Server databases. Since I didn't want to alter those legacy databases, I used inspectdb and a Django database router to create the models, and I then use that data to read and write a default Postgres database holding all the Django-specific tables.
Everything was working fine: each of my model objects has the initial data from the SQL Server db populated into the Postgres db. However, whenever anything is updated in the SQL Server db, the data is not updated in the Postgres db unless the Docker container is stopped and restarted. Is there a way to auto-update my Postgres db when changes are made in the connected SQL Server dbs, without stopping the container?
Would I need to use something like Celery or a cron job to schedule updates? I've used cron jobs before, but only to pull from an API. Is it possible to run Celery or a cron job against a database rather than an API? Or is there a simpler solution?
I'm also running with nginx and gunicorn on an Ubuntu server, using volumes to persist the db.
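One low-tech approach (a sketch only; the model names, fields, and the "legacy" database alias are assumptions about your project) is a custom management command that re-reads the legacy database through your router and upserts the rows into Postgres, scheduled from cron or a Celery beat task:

```python
# Hypothetical management command: re-sync rows from the legacy SQL Server
# database (alias "legacy" in DATABASES) into the default Postgres database.
from django.core.management.base import BaseCommand

from myapp.models import LegacyOrder, Order  # placeholder models


class Command(BaseCommand):
    help = "Copy changed rows from the legacy db into Postgres"

    def handle(self, *args, **options):
        # Read via the legacy connection; write to the default (Postgres) db.
        for row in LegacyOrder.objects.using("legacy").all():
            Order.objects.update_or_create(
                legacy_id=row.pk,
                defaults={"name": row.name, "total": row.total},
            )
        self.stdout.write("Legacy sync complete")
```

A crontab entry such as `*/15 * * * * cd /app && python manage.py sync_legacy` (or a Celery beat schedule calling the same code) would then keep Postgres reasonably fresh without restarting the container; how often you run it depends on how stale the data is allowed to get.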

How to migrate all the data of an SQLite database to CockroachDB (for a Django project)

I created a Django application and deployed it for testing purposes. I now have data I would not like to lose, but I want to use CockroachDB for storing the web app's data. The SQLite database is the one Django creates automatically while applying migrations. Is there a way to move all that data to my CockroachDB cluster? I will be using CockroachDB as my database going forward.
Thank you for your time
You can modify settings.py in your Django app to use a different database.
For example, you can use the dj_database_url module to set up a connection with CockroachDB.
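A settings.py along these lines would do it (a sketch; it assumes the django-cockroachdb backend is installed, and the connection details are placeholders for your cluster):

```python
# settings.py — a sketch assuming the django-cockroachdb backend is installed;
# the credentials and host below are placeholders for your CockroachDB cluster.
DATABASES = {
    "default": {
        "ENGINE": "django_cockroachdb",
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "change-me",
        "HOST": "my-cluster.example.com",
        "PORT": "26257",
        "OPTIONS": {"sslmode": "verify-full"},
    }
}
```

dj_database_url, as mentioned, can build such a dict from a DATABASE_URL environment variable instead (recent versions understand a cockroach:// scheme). For the existing data, the usual route is Django's fixtures: run `manage.py dumpdata` against the old SQLite settings, point DATABASES at CockroachDB, run `manage.py migrate`, then load the dump with `manage.py loaddata`.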

Changing an AWS RDS MySQL instance to Postgres

What is the best way to go about this? I have a mobile app that a project team developed, for which they set up the database as a MySQL RDS instance. However, for this new project with my own developers, we believe Postgres would better suit our needs, but I want everything on one DB instance (data between the mobile app and the new project will be shared). What is the best way to accomplish this?
You will need to create a new RDS instance to switch the engine type to Postgres.
Whilst transitioning you will need to have both running; to migrate the DB across, you will want to keep the data synchronised between the two. Luckily, AWS has the Database Migration Service (DMS) for exactly this.
Once the data is in sync, migrate your existing application over to the new Postgres instance, then remove the DMS setup and shut down the MySQL database.
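If you end up scripting the DMS piece, the task that keeps the two engines in sync can be created with boto3. The following is only a sketch: the ARNs and identifiers are placeholders, and it assumes the source/target endpoints and the replication instance already exist.

```python
# Minimal boto3 sketch: create a DMS replication task that does a full load of
# the MySQL source into the Postgres target and then replicates ongoing changes.
import json

import boto3

dms = boto3.client("dms")

dms.create_replication_task(
    ReplicationTaskIdentifier="mysql-to-postgres",
    SourceEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:SOURCE",
    TargetEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:TARGET",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:INSTANCE",
    MigrationType="full-load-and-cdc",
    TableMappings=json.dumps({
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-all",
            "object-locator": {"schema-name": "%", "table-name": "%"},
            "rule-action": "include",
        }]
    }),
)
```

With MigrationType set to full-load-and-cdc, the task first copies the existing data and then replicates ongoing changes, which is what lets you keep MySQL and Postgres in sync while you move the application across.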

Update SQLite database on disk

My Django application (a PoC, not a final product) uses, via a backend library, a read-only SQLite database. The SQLite database is part of the repo and is deployed to Heroku. This is working fine.
I now have the requirement to allow updates to this database via the Django admin interface. It is not a Django-managed database, so from Django's point of view it is just a binary file.
I could allow a FileField to handle this, overwriting the database (a rough sketch of that idea follows); I guess this would work on a self-managed server, but I am on Heroku and have the constraints imposed by Disk Backed Storage. My SQLite file is not my webapp's database, but the same limitations apply: I cannot write to the webapp's filesystem and get any guarantee that the new data will be visible to the running webapp.
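For reference, the FileField idea could look roughly like this on a server with a writable filesystem (model name, upload path, and the BACKEND_SQLITE_PATH setting are all hypothetical); as noted above, on Heroku the copy would not survive a dyno restart:

```python
# models.py — a hypothetical model whose uploaded file overwrites the bundled
# SQLite database on save; register it in admin.py to get the upload form.
# BACKEND_SQLITE_PATH is an assumed setting pointing at the read-only db file.
import shutil

from django.conf import settings
from django.db import models


class BackendDatabaseUpload(models.Model):
    uploaded_file = models.FileField(upload_to="db_uploads/")
    created = models.DateTimeField(auto_now_add=True)

    def save(self, *args, **kwargs):
        super().save(*args, **kwargs)
        # Replace the backend library's SQLite file with the uploaded one.
        shutil.copy(self.uploaded_file.path, settings.BACKEND_SQLITE_PATH)
```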
I can think of alternatives, all with drawbacks:
Put the SQLite database on another server (a "media" server) and access it remotely: this would severely impact performance. Besides, accessing SQLite databases over the network does not seem easy.
Create a deploy script for the customer to upload the database via the usual deploy mechanisms. Since the customer is not technically fit, and I cannot provide direct support, this is unfeasible.
Move out of Heroku to a self-managed server, so I can implement this quick-and-dirty upload without complications.
Do you have another suggestion?
PythonAnywhere.com
Deploy your app there and you can easily access all of your files and update them, and your SQLite3 database will be updated in real time without losing data.
Heroku wipes your SQLite3 database roughly every 24 hours (its dynos have an ephemeral filesystem and are restarted regularly), which is why it is not recommended for web apps that rely on SQLite3.

Deploy Django Application without Service Interruption / no Downtime

We have no continuous integration setup (yet), but we want to deploy very frequently, once a day or so.
We have a pretty standard Django application with a separate Postgres server. We use normal rented VMs (NO Amazon or Rackspace).
How can we minimize the downtime of our application? Zero downtime would be best. We thought about a setup with two identical application servers and two database servers, deploying one app/db server pair after the other.
The problem is keeping the data consistent. While one app/db server pair is being updated, the pair with the old code can serve users. But if users write to that db, we would lose the data when switching to the updated pair, especially when we push schema migrations.
How can we handle this? This must be a very common problem but I can't find good answers. How do you handle this problem?
In the case that you have no schema migrations, I'll give you a practical scenario:
Keep two versions of your Django processes (A and B), which you control with, let's say, supervisor. Keep an nginx process in front of your Django processes, which forwards all requests to A. So, you upload version B to the server, start Django process B with supervisor, then change your nginx conf file to point to B, and reload your nginx process.
In the case that you have schema migrations, things get complicated. Your options include:
You could consider using a NoSQL solution, like MongoDB (in this case you can keep a single DB instance).
Figure out how to record all write requests while the switchover is in progress, so that you can replay them later against your new database (a rough sketch of this idea follows).
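That second option could be as simple as a middleware that logs every non-read request while the switchover is in progress; this is only a rough sketch (the log path is arbitrary, and replaying the log against the new database is left to you):

```python
# Hypothetical Django middleware: append every write-style request to a log
# file so the writes can be replayed against the new database after switchover.
import json
import time


class WriteRequestRecorderMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if request.method not in ("GET", "HEAD", "OPTIONS"):
            record = {
                "ts": time.time(),
                "method": request.method,
                "path": request.get_full_path(),
                "body": request.body.decode("utf-8", errors="replace"),
            }
            with open("/var/log/app/write-requests.log", "a") as fh:
                fh.write(json.dumps(record) + "\n")
        return self.get_response(request)
```

You would only enable this middleware for the short window around the deploy, and you still have to make sure the replay is idempotent, which is why schema migrations are usually handled by making them backwards compatible instead.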