What I want to do is fail over Redis with Django, but I cannot find a way to do it.
What I've set up is as follows:
I'm using Redis as a session backend.
I've set up two Redis servers in a master-slave relationship, so that when the master fails the slave automatically becomes the master (using Sentinel).
I set up settings.py like this:
CACHES = {
    'default': {
        'BACKEND': 'redis_cache.RedisCache',
        'LOCATION': [
            "127.0.0.1",
            "IPofSlave"
        ],
        'OPTIONS': {
            'PASSWORD': "xxxxxxxx",
            'DB': 0,
        }
    }
}
SESSION_ENGINE = 'django.contrib.sessions.backends.cache'
SESSION_CACHE_ALIAS = "default"
I want Django to use only the master normally, and switch automatically to the slave when it can't connect to the master.
How could I do this by editing settings.py, or should I take another approach?
I would probably go with something like https://github.com/KabbageInc/django-redis-sentinel/blob/master/README.md, which adds Sentinel support to the Django Redis plugin. There may be others more suitable; this one was at the top of the results in a Google search for Django Sentinel.
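For context, a plugin like that delegates master discovery to Sentinel instead of hard-coding hosts in LOCATION. The sketch below shows roughly how that works using redis-py's own Sentinel client; the sentinel host/port and the 'mymaster' service name are placeholders for your deployment:
from redis.sentinel import Sentinel

# Ask the Sentinels (not the Redis servers themselves) who the current master is.
sentinel = Sentinel([('sentinel-host', 26379)], socket_timeout=0.5)

# Connections obtained this way follow the master across a failover.
master = sentinel.master_for('mymaster', password='xxxxxxxx', db=0)
master.set('session-probe', 'ok')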
I have a small Django site which controls an astronomy dome and house automation. On startup the project loads 3 JSON files: relays, conditions and homeautomation. To avoid constant reading and writing to the Pi 4's SSD I load the JSON files into Redis (on startup in apps.py, see below). I already have Redis running in a Docker container as the project uses Celery.
My problem is that within a few minutes of loading the JSON into Redis, the data is cleared out of the cache.
I load each JSON file in the form of a dictionary (dict) in apps.py:
from django.core.cache import cache

cache.set("REDIS_ashtreeautomation_dict", dict, timeout=None)
and my CACHES setting is:
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://redis:6379",
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
            "SERIALIZER": "django_redis.serializers.json.JSONSerializer",
            "TIMEOUT": None
        }
    }
}
I don't need the data to persist if the containers go down, and I don't need DB functions. Caching these files is ideal, but I need them to 'stay alive' for the lifetime of the server.
Thank you.
Thank you, Kevin.
Moving TIMEOUT out of OPTIONS and up to the cache level solved the issue. Django only reads TIMEOUT at the top level of a cache definition, so inside OPTIONS it was ignored and the default timeout of 300 seconds applied, which is why the data vanished within a few minutes.
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://redis:6379",
        "TIMEOUT": None,
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
            "SERIALIZER": "django_redis.serializers.json.JSONSerializer",
        }
    }
}
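A quick way to confirm the fix is to write a key without a per-call timeout and check its TTL from redis-cli (the ":1:" key prefix assumes the default empty KEY_PREFIX and cache version 1):
from django.core.cache import cache

cache.set("probe", "x")  # no per-call timeout, so the cache-level TIMEOUT (None) applies
# In redis-cli:  TTL ":1:probe"  returns -1, i.e. the key has no expiry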
I am going to include some code to guard against Redis's long-term eviction policies (i.e. reload the JSON data if it ever disappears), as sketched below. I don't want to delve into the Redis Docker container.
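A minimal sketch of such a guard, assuming a load_json_files() helper (hypothetical name) that re-reads the files from disk:
from django.core.cache import cache

def get_automation_dict():
    # Fetch the cached dict; rebuild it from the JSON files if Redis evicted it.
    data = cache.get("REDIS_ashtreeautomation_dict")
    if data is None:
        data = load_json_files()  # hypothetical helper that re-reads the JSON files
        cache.set("REDIS_ashtreeautomation_dict", data, timeout=None)
    return data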
Thanks
Ian
I have two databases configured in my app:
DATABASES = {
    'default': { ... },
    'legacy': { ... },
}
The legacy database is only used in a particular part of the app (I've added it as a second database for convenience).
This works fine, except that when I try to run tests, Django tries to create a test database for the legacy database, causing an error:
Got an error creating the test database: (1044, "Access denied for user ... to database 'test_...'")
How can I tell Django not to create a test database for the second legacy database?
I thought setting the following would work:
DATABASES['legacy']['TEST'] = {
    'NAME': None,
    'CREATE_DB': False
}
but that doesn't seem to help.
This looks like a common issue with multiple databases and testing in Django. Here is how I generally deal with it:
import sys

DATABASES = {
    'default': { ... },
    'legacy': { ... },
}

# You can add any other kind of check here (not prod, DEBUG == True, etc.)
if "test" in sys.argv:
    del DATABASES["legacy"]
    # Or:
    # DATABASES = {"default": DATABASES["default"]}
This works great when you have only one settings file; you can easily adapt it for other cases.
If you are handling many databases, another option is to start from the ground up inside your test settings:
import sys

DATABASES = {
    'default': { ... },
    'legacy': { ... },
}

# You can add any other kind of check here (not prod, DEBUG == True, etc.)
if "test" in sys.argv:
    DATABASES = {"default": {}}
    DATABASES["default"]["ENGINE"] = "django.db.backends.sqlite3"
    # Etc. Add what you need here.
I've got Redis set up as my cache in Django, with the following setting:
CACHES = {
    'default': {
        'BACKEND': 'redis_cache.RedisCache',
        'LOCATION': 'localhost:6379',
        'OPTIONS': {
            'PICKLE_VERSION': 1,
        },
    },
}
And I'm experimenting with it (I'm new to Redis and want to understand it better). So I go into my Django shell and do:
from django.core.cache import cache
cache.set('asdf', 2)
cache.get('asdf') # Returns 2
And then I go into redis-cli, where I expect to see the value, but none of these show any values:
KEYS *
GET *
GET 'asdf'
What's up with that?
Redis has 16 databases by default. As @Bernhard says in his comment, you can see how many keys each one has with:
INFO KEYSPACE
Which in my case returned:
# Keyspace
db0:keys=1,expires=0,avg_ttl=0
db1:keys=2,expires=2,avg_ttl=504748260
And you can SELECT the database you want to inspect with:
SELECT 1
At which point, sure enough, I can see the keys I expected:
KEYS *
1) ":1:asdf"
2) ":1:django.contrib.sessions.cacheg2l0bo9z88z8bn4q2ep0andjgo8zrzzk"
I'm running into an issue where South creates the DB table for a new model as InnoDB when I migrate, but creates the table as MyISAM when another developer runs their own migration.
The problem with this is that all my other tables are MyISAM, so using the new tables leads to many foreign key constraint errors.
How can I explicitly make sure the table is created using MyISAM?
What could be causing the table to be created with a different storage engine in different environments?
To be sure that all migrations are always done using InnoDB, you should set the storage engine in the database definition directly, like this:
DATABASES = {
    'default': {
        ...
        'OPTIONS': { 'init_command': 'SET storage_engine=INNODB' },
    }
}
If you are using MySQL 5.7.x or above:
DATABASES = {
    'default': {
        ...
        'OPTIONS': { 'init_command': 'SET default_storage_engine=INNODB' },
    }
}
But you should know that this can have a performance hit, since the init_command runs on every new connection. So you may want to set this option only when running migrations.
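A rough sketch of doing that by inspecting the management command being run (the 'migrate' in sys.argv check is a heuristic, not an official Django hook):
import sys

DATABASES = {
    'default': {
        # ... the usual connection settings ...
    }
}

# Pay the per-connection init_command cost only when actually migrating.
if 'migrate' in sys.argv:
    DATABASES['default'].setdefault('OPTIONS', {})['init_command'] = (
        'SET default_storage_engine=INNODB'
    )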
If you use South, you can set STORAGE_ENGINE in your settings.
django < 1.2
# add to your settings file
DATABASE_STORAGE_ENGINE = 'INNODB' # django < 1.2
django >= 1.2
# add to your settings file
DATABASES = {
    'default': {
        ...
        'STORAGE_ENGINE': 'INNODB'
    }
}
I'm specifying the databases using a Python dictionary:
DATABASES = {
    'default': {
        'ENGINE': 'mysql',
        'NAME': 'testsqldb',
        'USER': '<username>',
        'PASSWORD': '<password>',
    },
    'dynamic_data': {
        'ENGINE': 'sqlite3',
        'NAME': '',
        'USER': '',
        'PASSWORD': ''
    },
}
How can I specify the name of my test database? I've been trying to use TEST_NAME = 'auto_tests' in settings.py. However, when I run python manage.py test <app_name> I get the following message:
Creating test database 'default'...
Got an error creating the test database: (1007, "Can't create database 'test_testsqldb'; database exists")
Type 'yes' if you would like to try deleting the test database 'test_testsqldb', or 'no' to cancel:
I'm expecting the system to create a separate database when running my tests, presumably called 'auto_tests_testsqldb'; however, it's still asking me about test_testsqldb.
Any advice is appreciated!
In Django 1.6 and below, TEST_NAME should be a key of one of your database dictionaries. But in Django 1.7 and above, you use a TEST key, which is a dictionary of settings for test databases.
You probably want:
DATABASES = {
    'default': {
        'ENGINE': 'mysql',
        'NAME': 'testsqldb',
        'USER': '<username>',
        'PASSWORD': '<password>',
        'TEST': {
            'NAME': 'auto_tests',
        }
    },
    'dynamic_data': {
        'ENGINE': 'sqlite3',
        'NAME': '',
        'USER': '',
        'PASSWORD': ''
    },
}
Alternatively, perhaps you want to use a different engine for your tests? In that case, I think you'll just have to create a separate settings file for testing. It can import from your standard settings module and override DATABASES.
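A minimal sketch of that approach, assuming your standard settings live in settings.py (the test_settings module name is a placeholder):
# test_settings.py
from settings import *  # start from the standard settings module

# Override only the databases for the test run.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ':memory:',
    }
}
You would then run the tests with python manage.py test <app_name> --settings=test_settings.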