How to synchronize a local database with a remote database using Django? - django

I just want the local and remote databases to stay in sync.
Changes made to the local database should be reflected in the remote database automatically.
In short, how do I update multiple database servers concurrently?

If you don't want to solve this at the DB level, you may want to have a look at Django's built-in support for multiple databases.
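A minimal sketch of that setup in `settings.py` (the `remote` alias, database names, and hosts here are made-up examples, not Django defaults):

```python
# settings.py (sketch): declare both databases. The "remote" alias and
# the connection details are assumptions for illustration.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "app_local",
        "HOST": "localhost",
    },
    "remote": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "app_remote",
        "HOST": "db.example.com",
    },
}
```

With both aliases declared, you can write the same object to each database explicitly, e.g. `obj.save(using="default")` followed by `obj.save(using="remote")`, though keeping the two consistent on every write is still your application's responsibility.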

Related

Django multi-tenant architecture options: what is the influence on database performance?

I'm designing a website where data security is an issue.
I've read this book: https://books.agiliq.com/projects/django-multi-tenant/en/latest/index.html
I'm still thinking about the right database structure for users.
I'm hesitating between a shared database with isolated schemas, isolated databases in a shared app, or completely isolated tenants using Docker.
Since data security is a concern, I would like to avoid putting all the users in the same table, or in different schemas in the same database. However, I don't understand well whether I should give each user a separate database (create one database per user, SQLite for example, though I don't know how well that would work alongside Postgres). What is the best practice for this in terms of security?
I'm wondering how these options affect database speed compared to a shared database with a shared schema, which was the basic configuration in the Django course I attended.
I don't have good knowledge of databases, so your help on the performance question would be very appreciated!
Also, if I want to compute some statistics over tenants' data, how difficult is it to query completely isolated tenants (Docker containers or isolated databases), in particular if each user is a separate container or database?

Multi-database - each database writes to itself and reads from itself

I tried to Google as much as I could but couldn't find what I was looking for.
The idea is: there is one master database that handles the user authentication process. There are also many other databases, each of which keeps its information (its users, files, and so on) to itself: each one writes to itself and reads from itself, but the master can reach them all. If the master makes changes to, say, the database structure, all fields should change in every database, but the data in those databases should stay (though the master can change any of their data).
It's like multi-master replication, except that I don't want the other masters to be able to reach each other's databases; each should only write to itself.
Any tips?
The question is not clear, but if you want to use multiple databases with Django, look at the documentation: https://docs.djangoproject.com/en/2.1/topics/db/multi-db/
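One way to sketch the split you describe is a Django database router. This is only an illustration under assumptions: the alias `"master"` for the authentication database and per-tenant aliases such as `"tenant_1"` are invented names, and only the `auth` app is routed to the master here.

```python
# db_routers.py (sketch): send django.contrib.auth models to a "master"
# database and leave everything else to the default/tenant database.
class MasterAuthRouter:
    """Route auth reads and writes to the hypothetical "master" alias."""

    def db_for_read(self, model, **hints):
        if model._meta.app_label == "auth":
            return "master"
        return None  # None means: fall through to the next router / default

    def db_for_write(self, model, **hints):
        if model._meta.app_label == "auth":
            return "master"
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # Only the master database receives auth tables; schema changes to
        # other apps are applied to every (tenant) database, matching the
        # "master changes the structure everywhere" requirement.
        if app_label == "auth":
            return db == "master"
        return None
```

The router is activated with `DATABASE_ROUTERS = ["path.to.MasterAuthRouter"]` in settings; the "each database writes only to itself" part then comes from never routing one tenant's models to another tenant's alias.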

Redis -- how does it improve performance?

I'm relatively new to the world of web development and have only recently learned about memory hierarchies in computer systems. I recently came across Redis and am itching to try it out in a small web app. But before I do, I was wondering: how does Redis improve performance?
From what I've read so far, Redis is an "in-memory" data store. Does that mean that whenever a user requests data from the server, instead of fetching it from the database (given that the Redis data store is already populated with the needed data), the request can be fulfilled by accessing the data directly from the server's memory? To be specific: say I have a web app whose back-end server is hosted on AWS, and the database is hosted on mLab. Whenever a user requests data, instead of the server redirecting the query to mLab, can it now fetch the data directly without going to mLab? Also, by "in-memory", does that mean the data is stored in the RAM of my AWS server?
Finally, how is this different from a cache?
Thank you so much!!
Well, Redis is used as a cache; the difference from most traditional caches is that you also get other nice structures like hashes, sets, lists, TTLs on keys, HyperLogLogs, and so on, not only key:value pairs.
What you describe about Redis is right, but take into account that if you want to move your data from the mLab database to Redis, you have to design some process to keep Redis updated on every change that happens in your database. Every query from your application will then use Redis to get the data, but apart from that you will need a process to keep Redis in sync with changes to your database. So if all updates go through your application (and there are no other external parties updating your DB), every time your web app performs an update you have to update both the DB and Redis; otherwise you need a command/script that detects every update in the DB and updates Redis accordingly.
AWS also provides a managed Redis service, ElastiCache (https://aws.amazon.com/elasticache/?nc1=h_ls), so the AWS ECS instance where your application runs doesn't use its own RAM for this; it talks to the ElastiCache service, which can live on another physical machine.
Finally, Redis does store its data in memory, though it uses a dump file to save partial data in case of crashes, and it also offers a persistence mode.
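The read path described above is the classic cache-aside pattern. A minimal sketch in Python, where `cache` is assumed to be a `redis.Redis` client (or anything exposing `get`/`setex`) and `fetch_from_db` is a hypothetical function standing in for the mLab query:

```python
import json

def get_user(cache, user_id, fetch_from_db, ttl=300):
    """Cache-aside read: try Redis first, fall back to the database.

    `cache` is assumed to expose redis-py style get()/setex();
    `fetch_from_db` is a placeholder for the real database query.
    """
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)              # cache hit: no DB round trip
    record = fetch_from_db(user_id)            # cache miss: go to the database
    cache.setex(key, ttl, json.dumps(record))  # populate with an expiry (TTL)
    return record
```

The TTL is what makes this a cache rather than a second copy of the database: stale entries expire on their own, and the next read repopulates them, which partially addresses the synchronization problem described above.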

Temporarily storing statistics on slave in Django

My slave servers collect statistics and performance metrics about visits, but eventually they would have to be sent to the master DB.
I don't want to have a permanent database connection open with the master DB server, so they would have to be temporarily stored locally and shipped over in chunks at specific intervals.
Any suggestions for tools to do this with Django? I've come up with the idea of storing the records in a local SQLite DB and sending them to the main DB server every hour, for example, but maybe there are better options than SQLite out there. Also, I'm still not sure about pushing the data back to the master DB server at regular intervals: would you use a direct DB connection from within Django, or design a simple API and send the data over HTTPS?
I ended up using Redis, pickling the objects and putting them in a list. Main advantage: no SQLite migrations to maintain. It's a real pain, though, that Redis doesn't support atomic gets of more than one item (something like LPOP but for several elements) to retrieve them in batches...
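A common workaround for the batch-retrieval complaint is to wrap LRANGE and LTRIM in a redis-py pipeline, which by default runs as a MULTI/EXEC transaction, so the read and the trim execute atomically. A sketch, assuming `r` is a `redis.Redis` client and the key name is arbitrary:

```python
def pop_batch(r, key, n):
    """Atomically remove and return up to n items from the head of a list."""
    pipe = r.pipeline()         # MULTI/EXEC transaction by default in redis-py
    pipe.lrange(key, 0, n - 1)  # read the first n items
    pipe.ltrim(key, n, -1)      # drop those same items from the list
    items, _ = pipe.execute()   # both commands run with no writer in between
    return items
```

Because both commands are queued and executed in one transaction, no other client can push or pop between the read and the trim, which gives the "LPOP for several elements" behaviour the answer wishes for.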

Creating an Application Based Database

I am working on an application where I would like to store all my current user authentications (users that are currently logged in, with their login tokens) in an in-memory database. Currently I have an HSQLDB database on which I run DROP and CREATE TABLE commands in onApplicationStart to store the authentications, but I was wondering if there is a way I could just wipe the database when the application restarts (currently the data is kept until the server restarts).
Is there a way to create an in-memory database that is accessible only by the application that uses it, and that destroys itself when the application restarts?
Sounds like this is exactly what you need:
Networked In-Memory Databases With ColdFusion
Using An In-Memory Database With ColdFusion (Out Of The Box)
TRUNCATE TABLE <tablename> works in MySQL, maybe there is a similar function?
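The articles above cover the HSQLDB/ColdFusion specifics; the general idea, illustrated here in Python's sqlite3 rather than HSQLDB, is that a `:memory:` database is private to its connection and is destroyed with it, so it vanishes on its own when the application shuts down:

```python
import sqlite3

# A ":memory:" database never touches disk and is visible only to this
# connection, so closing it (or the process exiting) destroys all data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sessions (token TEXT PRIMARY KEY, username TEXT)")
conn.execute("INSERT INTO sessions VALUES ('abc123', 'alice')")

count = conn.execute("SELECT COUNT(*) FROM sessions").fetchone()[0]

conn.close()  # the whole database disappears with the connection
```

HSQLDB's `mem:` catalogs behave the same way at the JDBC level, which is what gives the "wipes itself on restart" property asked for, with no DROP/CREATE needed at startup.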