Where to put caching functions in Django?

Whenever I update the SQL database, I also update the Redis cache. Should I put all the Redis logic in models or should I have a new module for that?

Create a caching module to keep your cache logic together.
I prefer to keep my modules in a consistent structure, based on how I organise the models module.
So if models contains car.py and driver.py, I'd have those same files in my cache, forms & views modules. Makes it easier to find everything :)
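For instance, a minimal sketch of cache/car.py, assuming Django's cache framework is backed by Redis (e.g. via django-redis) and that Car has a name field:

# cache/car.py -- the key scheme, TTL and Car fields are assumptions
from django.core.cache import cache

CAR_KEY = "car:{pk}"
TTL = 60 * 15  # 15 minutes

def cache_car(car):
    # store a plain dict so callers don't need the ORM to read the cached value
    cache.set(CAR_KEY.format(pk=car.pk), {"pk": car.pk, "name": car.name}, TTL)

def get_car(pk):
    # returns the cached dict, or None on a cache miss
    return cache.get(CAR_KEY.format(pk=pk))

def invalidate_car(pk):
    # call this whenever the SQL row is updated or deleted
    cache.delete(CAR_KEY.format(pk=pk))

Your model (or view) code then calls cache_car/invalidate_car right after saving to the SQL database.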

Related

Can Django fixtures be used for production?

I have a Django application that reads different CSV files and saves them to the same model/table in the DB.
While fixtures are certainly used for setting up a test environment quickly, I used the fixture to configure the different CSV schemata that are subsequently parsed by the Django application.
So each of the data providers has its own distinct schema, which is a different row in the CsvSchema table.
During code review it came up that this is bad style because it leads to duplication of data. I, however, found it useful to pass such configurations via a fixture and treated it like a configuration file.
To further treat the fixture like a configuration file, I even put it inside the git repository, which is again something the reviewer disagrees with.
The reviewer also claimed that fixtures should be used only once in the lifetime of the application, while setting it up initially.
To me, the fixtures are just a tool that Django provides us. I can play around with the schema details in my development machine, and then dump them into a fixture, effectively using it as configuration. Am I playing too hard and fast with the rules here?
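For what it's worth, the round trip the question describes is just two management commands (the app and file names below are placeholders):

./manage.py dumpdata csvimport.CsvSchema --indent 2 > csvimport/fixtures/csv_schemas.json
./manage.py loaddata csv_schemas.json

loaddata picks the file up from any app's fixtures/ directory, so committing it to git keeps the schema configuration versioned alongside the code.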

shared DB across django projects

Our product has a RESTful API and a server-rendered app (the CMS). Both share the database, and both are written in Django.
The fields and models needed in the two are not mutually exclusive: some are particular to the API, some particular to the CMS, and some are common.
My question: if I run migrations in one of the repos, will they try to drop the fields that aren't present in that repo's models but are needed by the other? Will running the migrations individually in both repos keep the database up to date and not pose a problem?
The only other valid option IMHO (besides merging the projects) is turning off automation of Django migrations on the common models (Meta.managed = False) and taking table creation & versioning into your own hands. You can still write migration scripts using django.db.migrations, but the makemigrations command won't do anything for these tables.
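A minimal sketch of such a model (the model and table names are placeholders):

from django.db import models

class SharedRecord(models.Model):
    name = models.CharField(max_length=100)

    class Meta:
        managed = False             # makemigrations/migrate leave this table alone
        db_table = "shared_record"  # the table both projects point at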
This was solved by using a schema migration tool external to Django's own. We use yoyo migrations to migrate our schema now.
"Will running the migrations individually in both repos keep the database up to date and not pose a problem?"
Unfortunately, no. As you suspected, changes in one will attempt to override the other.
The easiest thing to do is merge the two projects into one so this problem goes away entirely.
If this isn't an option, can the code be organised in such a way that both projects share the same models.py files? You could do this by having the models.py files and migrations folders exist in only one project; the second project could have a symlink across to each models.py file it uses. The trick (and the difficult part) will be making sure you never create migrations for the app which uses the symlinks.
I think the best thing to do would be to have one repo that contains all the fields; that project will be responsible for applying the migrations.
In the other projects, you'll need a db router containing an allow_migrate method which returns False for your model classes.
Also, having different DB users with different permissions can prevent the tables from being altered.
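A minimal sketch of such a router (the app labels and module path are placeholders):

# myproject/routers.py
class NoMigrateRouter:
    SHARED_APPS = {"catalog", "accounts"}  # apps whose tables the other repo owns

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label in self.SHARED_APPS:
            return False  # never migrate these from this project
        return None       # no opinion: defer to the default behaviour

# settings.py
DATABASE_ROUTERS = ["myproject.routers.NoMigrateRouter"]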

Django design decision for multiple databases

I have a Django site; on some of the pages the data comes from a PostgreSQL database, while another set of pages is connected to a SQLite database. The tables come from two different sources, so I cannot merge them, but I need to serve both from one Django site. What is the best practice for this:
should I merge the two into one Django application (modifying models.py, views, ...), or should I put them into different Django applications with different models and views?
You can merge the two apps into one instance, but then you won't be able to use Django's default session and auth modules with the models of one of them.
A good solution is to merge them into one project (so the two apps can share some code and maybe some settings) but run it as two separate instances with two different settings files loaded.
Also: you can just merge the two databases, even if they use different engines. Django has built-in dumpdata and loaddata management commands; you can use them to move data from one database to another.
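If you keep both sources in one project, Django can talk to both engines at once; a minimal settings sketch (the database names, paths and credentials are placeholders):

# settings.py
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent.parent

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "main_db",
        "USER": "app",
        "PASSWORD": "secret",
        "HOST": "localhost",
    },
    "legacy": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": BASE_DIR / "legacy.sqlite3",
    },
}

A queryset can then target either database explicitly, e.g. Report.objects.using("legacy").all().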

Two Django projects with common models and business logic

I have two Django projects with different use cases. They are reached via different domains and hosted on two different servers, and each project has its own database.
Now, both projects have some models and some business logic in common. I don't want to duplicate the code and data, which would become chaotic going forward. I also want the models and code (business logic) to stay in sync when they are altered.
Can anyone guide me towards a pattern that can help me attain the required architecture: two separate projects with common models and business logic?
Thanks in advance.
I've done this before. You will have to move the shared models and business logic into a new Python package (better yet, a Django app that encapsulates these models) in a separate directory.
Add this directory to your Python path (the one that contains the package, not the package itself) and you should be able to use this code from within your projects.
The only downside is having to configure PYTHONPATH on your servers, or having to copy this package into your runtimes manually.
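One way to lay it out (all names hypothetical) is a pip-installable Django app that both projects install, which avoids fiddling with PYTHONPATH:

shared_core/              # separate repository for the shared code
    pyproject.toml        # makes the package pip-installable
    shared_core/
        __init__.py
        models.py         # the common models
        services.py       # the common business logic

Then, in each project's environment, run pip install -e ../shared_core (editable mode, so changes propagate) and add "shared_core" to INSTALLED_APPS in both settings files.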

Django: hint for isolating data among set of users

I want to build a Django application in which each "set of users" sees different data, as if they were using different DBs. Is there any best practice to achieve this?
E.g.
user A, after logging in, sees and manages his own data, with no possibility of seeing other users' data.
I could use Django's multi-db feature to define several DBs and create an automatic router that points to the correct DB according to the user of the current request (keeping auth_user on a common DB).
As a last resort, I could make a different application folder for each group (the only difference being the database name), but I'd prefer a smarter solution.
You could consider reusing one of the many per-object permission apps for Django.
Another possibility is to make different settings files. For example, settings_groupA.py:
from settings import *
DATABASES = ...
And then you can use management commands like syncdb or runserver with the --settings option:
--settings=SETTINGS The Python path to a settings module, e.g.
"myproject.settings.main". If this isn't provided, the
DJANGO_SETTINGS_MODULE environment variable will be
used.
Examples:
./manage.py syncdb --settings=settings_groupA
./manage.py runserver --settings=settings_groupA
The advantage of using one database per set of users is that you can keep your code simpler and get native support for django.contrib.admin with no hacks. But if you're going to need per-object permissions within a group anyway, then you might as well go straight for the first solution.
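And if you do go with the multi-db router idea from the question, here is a minimal sketch; how a user maps to a database alias, the middleware name and the common-apps list are all assumptions:

import threading

_local = threading.local()

class GroupDBMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # assumption: the user (or their profile) knows its group's DB alias
        _local.db_alias = getattr(request.user, "db_alias", "default")
        try:
            return self.get_response(request)
        finally:
            _local.db_alias = "default"

class GroupRouter:
    COMMON_APPS = {"auth", "contenttypes", "sessions"}  # stay on the shared DB

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.COMMON_APPS:
            return "default"
        return getattr(_local, "db_alias", "default")

    def db_for_write(self, model, **hints):
        return self.db_for_read(model, **hints)

Register GroupRouter in DATABASE_ROUTERS and the middleware in your middleware setting, and each group's requests transparently hit their own database while auth_user stays on the common one.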