I'm working on a Django project with Docker that uses several different legacy Microsoft SQL Server databases. Since I didn't want to alter those legacy databases, I used inspectdb and a Django database router to create the models, and I then use that data to read and write a default Postgres database that holds all the Django-specific tables.
Everything was working fine: each of my model instances is populated into the Postgres db with the initial data from the SQL Server db. However, whenever anything is updated in the SQL Server db, the data is not updated in the Postgres db unless the Docker container is stopped and restarted. Is there a way to auto-update my Postgres db when changes are made in the connected SQL Server dbs without stopping the container?
Would I need to use something like Celery or a cron job to schedule updates? I've used cron jobs before, but only to pull from an API. Is it possible to run Celery or a cron job against a database rather than an API? Or is there a simpler solution?
I'm also running with nginx and Gunicorn on an Ubuntu server, using volumes to persist the db.
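Either Celery beat or cron would work here; both just need to periodically trigger a job (for example, a Django management command) that re-reads the legacy tables and upserts the rows into Postgres. The core diff step could be sketched like this, with rows represented as plain dicts keyed by primary key (the helper and field names are hypothetical; in Django the results would feed `bulk_create()` / `bulk_update()` on the default database):

```python
def diff_rows(source_rows, target_rows):
    """Split rows read from the legacy SQL Server db into those missing
    from Postgres and those whose values have changed since the last sync.

    Both arguments are dicts mapping primary key -> row dict. In a real
    management command, source_rows would come from
    Model.objects.using("legacy") and target_rows from the default db.
    """
    to_create, to_update = [], []
    for pk, row in source_rows.items():
        if pk not in target_rows:
            to_create.append(row)          # new row in the legacy db
        elif target_rows[pk] != row:
            to_update.append(row)          # existing row was changed
    return to_create, to_update
```

A cron entry (or a Celery beat schedule) would then just call the command every few minutes; how fresh the Postgres copy needs to be decides the interval.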
I created a Django application and deployed it for testing purposes. I now have data that I would not like to lose, but I want to use CockroachDB to store the web app's data. The SQLite database is the one Django creates automatically while applying migrations. Is there a way to move all that data to my CockroachDB cluster?
Thank you for your time
You can modify settings.py in your Django app to point at a different database. For example, you can use the dj_database_url module to set up the connection to CockroachDB.
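A sketch of what that settings change could look like (the DSN, user, and database name are placeholders; CockroachDB support also requires the django-cockroachdb backend to be installed):

```python
# settings.py -- point Django's default database at CockroachDB.
import dj_database_url

DATABASES = {
    "default": dj_database_url.config(
        # Placeholder connection string; substitute your cluster's
        # host, port, credentials, and certificate options.
        default="cockroachdb://user:password@localhost:26257/mydb",
    )
}
```

To move the existing rows, one option is Django's fixture machinery: run `python manage.py dumpdata > data.json` against the old SQLite settings, switch `DATABASES` over, run `migrate`, then `python manage.py loaddata data.json`.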
What is the best way to go about this? I have a mobile app that a project team developed, for which they set up the database as a MySQL RDS instance. However, for this new project with my own developers, we believe Postgres would better suit our needs, but I want everything on one DB instance (data between the mobile app and the new project will be shared). What is the best way to accomplish this?
You will need to create a new RDS instance to switch the engine type to Postgres.
While transitioning you will need to have both running, and to migrate the DB across you will want to keep the data synchronised between the two. Luckily, AWS has the Database Migration Service (DMS) for exactly this.
Migrate your existing application over to the Postgres instance first, then remove the DMS setup and shut down the MySQL database.
I have Celery set up with Django. I'm using RabbitMQ as my broker, and I'm trying to set up SQLAlchemy as my result backend with a SQLite database separate from the Django database. I have RabbitMQ, Django, and Celery all running without any issues. I put `CELERY_RESULT_BACKEND = 'db+sqlite:///celery_results.sqlite3'` in my settings.py, and on Celery worker startup the log output shows the backend configured correctly.
The problem is that my database isn't being created. Why is this happening?
Hopefully this will save someone some head scratching or ruining their configuration. Everything was set up correctly and running fine. The issue was that I didn't have any tasks running since I wanted as little going on as possible during setup. Once I ran a task the sqlite database & tables were created successfully. I was also able to write a script to confirm that the task results were being stored in the database.
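The confirmation script mentioned above could look something like this (the file name is taken from the `CELERY_RESULT_BACKEND` setting in the question; `celery_taskmeta` is the table the database result backend creates once the first result is stored):

```python
import sqlite3

def fetch_task_results(db_path):
    """Return (task_id, status) rows from the celery_taskmeta table that
    the SQLAlchemy result backend creates, or [] if no task has run yet
    (the table only appears after the first task result is stored)."""
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute("SELECT task_id, status FROM celery_taskmeta")
        return cur.fetchall()
    except sqlite3.OperationalError:
        # "no such table": no task result has been written yet.
        return []
    finally:
        conn.close()

if __name__ == "__main__":
    for task_id, status in fetch_task_results("celery_results.sqlite3"):
        print(task_id, status)
```

Running it before any task has executed simply prints nothing, which matches the behaviour described above: the backend is configured, but the tables are created lazily.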
My Django application (a PoC, not a final product) uses a read-only SQLite database through a backend library. The SQLite database is part of the repo and deployed to Heroku. This is working fine.
I now have the requirement to allow updates to this database via the Django admin interface. This is not a Django-managed database, so from Django's point of view it is just a binary file.
I could handle this with a FileField, overwriting the database; I guess this would work on a self-managed server, but I am on Heroku and have the constraints imposed by its Disk Backed Storage policy. My SQLite file is not my web app's database, but the same limitations apply: I cannot write to the web app's filesystem and get any guarantee that the new data will be visible to the running web app.
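On a self-managed server, the overwrite itself is straightforward to do safely. A sketch (paths and the helper name are hypothetical), using a temp file plus an atomic rename so readers never see a half-written database:

```python
import os
import tempfile

def replace_database(uploaded_bytes, db_path):
    """Atomically replace the SQLite file with newly uploaded contents.

    Writing to a temp file in the same directory and then os.replace()-ing
    it means concurrent readers see either the old file or the new one,
    never a partial write. (On Heroku the change would still vanish on the
    next dyno restart, which is exactly the limitation described above.)
    """
    dir_name = os.path.dirname(os.path.abspath(db_path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix=".sqlite3.tmp")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(uploaded_bytes)
        os.replace(tmp_path, db_path)  # atomic on POSIX filesystems
    except BaseException:
        if os.path.exists(tmp_path):
            os.remove(tmp_path)
        raise
```

A FileField upload handler in the admin could call this with the uploaded file's bytes; the atomicity only helps where the filesystem is durable, which is the crux of the Heroku constraint.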
I can think of alternatives, all with drawbacks:
Put the SQLite database in another server (a "media" server), and access it remotely: this will severely impact performance. Besides, accessing SQLite databases over the network does not seem easy.
Create a deploy script for the customer to upload the database via the usual deploy mechanisms. Since the customer is not technically fit, and I can not provide direct support, this is unfeasible.
Move off Heroku to a self-managed server, so I can implement this quick-and-dirty upload without complications.
Do you have another suggestion?
PythonAnywhere.com
Deploy your app there and you can easily access and update all of your files; your SQLite3 database will be updated in real time without losing data.
Heroku (herokuapp.com) wipes your SQLite3 database roughly every 24 hours when the dyno's ephemeral filesystem is reset, which is why it's not preferred for web apps backed by SQLite3.
I have multiple Django app instances connected to the same MySQL backend for high-availability purposes. My question is: is it a problem if the app instances try to run migrations concurrently? If so, what's the best way to prevent it?
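One common pattern (a hypothetical helper, sketched here rather than taken from any documented recipe) is to run `manage.py migrate` from a single place only, or to serialise it across instances with a MySQL named lock via the server's `GET_LOCK()` / `RELEASE_LOCK()` functions, so whichever instance grabs the lock first runs the migrations and the others wait:

```python
from contextlib import contextmanager

@contextmanager
def mysql_named_lock(cursor, name, timeout=60):
    """Serialise a critical section (e.g. running migrations) across app
    instances using MySQL's GET_LOCK/RELEASE_LOCK named-lock functions.

    `cursor` is any DB-API cursor on the shared MySQL server. GET_LOCK
    returns 1 on success and 0 if the timeout expires while another
    session holds the lock.
    """
    cursor.execute("SELECT GET_LOCK(%s, %s)", (name, timeout))
    acquired = cursor.fetchone()[0]
    if acquired != 1:
        raise TimeoutError(f"could not acquire lock {name!r}")
    try:
        yield
    finally:
        cursor.execute("SELECT RELEASE_LOCK(%s)", (name,))
```

With this, each instance's entrypoint could wrap its `call_command("migrate")` in `mysql_named_lock(cursor, "django_migrate")`; migrations are idempotent, so the instances that acquire the lock later find nothing left to apply.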