Django: Why does the database log in on every query?

In my settings.py, I have a database configured. I have a page that issues a SQL query against the database defined in settings.py every 10 seconds.
I have been using it for a year now and never had any issues with it.
My database admin ran a login audit on our database. As it turns out, every single SQL query comes with its own login to the database. His audit took 5 minutes to run today, and that is because of my Django application logging in.
I was pretty surprised to find out that every query that is issued makes a separate login attempt to the database.
Is there any way to create a "session" for the backend database in settings.py? I really feel that the application should log in once and use that session to issue commands.
Did I miss a setting to do this?

Consider setting CONN_MAX_AGE in your DATABASES setting.
From the docs:
Persistent connections avoid the overhead of re-establishing a connection to the database in each request. They’re controlled by the CONN_MAX_AGE parameter which defines the maximum lifetime of a connection. It can be set independently for each database.
The default value is 0, preserving the historical behavior of closing the database connection at the end of each request. To enable persistent connections, set CONN_MAX_AGE to a positive integer of seconds. For unlimited persistent connections, set it to None.
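For example, a minimal sketch of how this might look in settings.py (the engine, database name, and credentials below are placeholders for your existing configuration):

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'mydb',
        'USER': 'myuser',
        'PASSWORD': 'secret',
        'HOST': 'dbhost',
        # Reuse each connection for up to 10 minutes instead of
        # opening a new one (and logging in) for every request.
        'CONN_MAX_AGE': 600,
    }
}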

Related

Allow Users to Specify SESSION_COOKIE_AGE in Django at Runtime

I'm specifying a default SESSION_COOKIE_AGE in the main settings.py file; however, I'd like admin users to be able to overwrite this value. This value should persist across server restarts.
My initial reaction was to store this in a single-row DB table and then write some middleware that would adjust the user's expiry date on each request to match the user-supplied SESSION_COOKIE_AGE; however, that seems like a lot of redundant DB queries. Then I went down a signals rabbit hole and am considering creating two signals (post_save and connection_created) to handle user updates to the DB-stored SESSION_COOKIE_AGE value and to ensure SESSION_COOKIE_AGE is refreshed when the server starts. And finally I see all of these projects like django-constance, etc. that dynamically update settings constants at runtime.
Any suggestions on the most reliable / least error-prone architecture for this?
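A minimal sketch of the middleware idea described above, assuming a hypothetical single-row SiteSetting model that stores the override (the model and field names here are illustrative, not part of the question):

# middleware.py -- sketch only; SiteSetting is a hypothetical single-row model
from django.conf import settings
from myapp.models import SiteSetting  # hypothetical model holding the override

class DynamicSessionAgeMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        override = SiteSetting.objects.values_list(
            'session_cookie_age', flat=True
        ).first()
        # Fall back to the value from settings.py if no override is stored.
        request.session.set_expiry(override or settings.SESSION_COOKIE_AGE)
        return self.get_response(request)

This does cost one extra query per request; caching the value (or using something like django-constance, as mentioned in the question) would avoid that.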

How to consolidate Django (different models) databases and store them in a centralized place

I have created a Django-based web page where employees of different vendor companies can log in and change their shift timing. (We currently control this job with a Linux script, but with a large user base of ~8k, doing it for every request is a difficult task.)
To resolve this I have created a Django site (6 separate models/DBs) and used the default SQLite DB.
Requirement:
The user is working on some application which needs to be controlled by the updated shift timing on the portal.
Question:
How do I consolidate or store the DB data in a centralized place, so that if tomorrow I have to reset the timing for all users in the portal to the default (General Shift), I can do so?
I have the ideas below, but I am not sure they are the best way to complete this work:
using the REST API to get the JSON data, or
manage.py dumpdata apple.CompanyName --indent 5 (where CompanyName is the model)
Any help/suggestion on this would be appreciated.
For the database you could use a hosted DB like Heroku Postgres if you're new to databases; otherwise you can run your own Postgres database on the server. As you mention there are ~8k users, it is not a good idea to use SQLite, as it is file based. To update the shift timing you can use the default Django admin. I am not sure about your model structure, but as long as you have the necessary validation logic, it can be updated from the admin at any time.
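As a sketch, pointing the project at a centralized Postgres database is a settings.py change along these lines (host, name, and credentials are placeholders):

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'shift_portal',
        'USER': 'portal_user',
        'PASSWORD': 'change-me',
        'HOST': 'central-db.example.com',
        'PORT': '5432',
    }
}

Existing SQLite data could then be moved over with manage.py dumpdata (as the question already suggests) followed by manage.py loaddata against the new database.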

Changing Django session engine without destroying existing sessions

I'm currently running a Django application with SESSION_ENGINE configured as django.contrib.sessions.backends.db. I'd like to change this to django.contrib.sessions.backends.cached_db for a performance boost.
Can I make this change without destroying the existing sessions?
Yes, you can make this change without logged-in users suddenly finding themselves logged out. That's because cached_db checks memcache first for the key and, if it cannot be found there, goes to the database. Making this change will therefore not cause a loss of session data. Fragment of code from cached_db:
def load(self):
    try:
        data = self._cache.get(self.cache_key)
    except Exception:
        # Some backends (e.g. memcache) raise an exception on invalid
        # cache keys. If this happens, reset the session. See #17810.
        data = None
    if data is None:
        # Duplicate DBStore.load, because we need to keep track
        # of the expiry date to set it properly in the cache.
However, please note that cached session backends are a bit overrated. Depending on the middleware you have, the session object may be updated very often, as often as every request, if only to change the expiry date. In that case you will find that the database is being written to all the time, which means the cached value has to be discarded too.
You should be able to. The cached_db backend is just a write-through cache in front of a database-backed, persistent db backend, which speeds up your read queries. It will not speed up your write queries, so you should try to find out how much you are reading and writing the session data.
Your Django SECRET_KEY setting determines your session key hashing parameters, and your session settings determine which cache is used for sessions and your session TTLs, so if you are not changing those variables, you should be good.
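The switch itself is just a settings change; a sketch (the cache backend and location shown here are placeholders for whatever cache you already run):

# settings.py -- sketch; cache backend/location depend on your setup
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.PyMemcacheCache',
        'LOCATION': '127.0.0.1:11211',
    }
}

# Was: 'django.contrib.sessions.backends.db'
SESSION_ENGINE = 'django.contrib.sessions.backends.cached_db'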

Is there a way to force Sitecore to sync MongoDB data with its SQL database?

I am setting up Sitecore xDB and am trying to test exactly what info gets through the system for authenticated and non-authenticated users. I would like to be able to make a change and see the results quickly in Sitecore. I found the setting to lower the session lifetime to 1 minute rather than 20. I have not found a way to force Sitecore to sync with Mongo on demand, or at least within 1-5 minutes rather than what appears to be about 20 minutes at the moment. Does that exist, or is "rebuilding" the database explained here the only existing process?
See this blog post by Martina Welander for this and more good info about xDB sessions: https://mhwelander.net/2016/08/24/whats-in-a-session-what-exactly-happens-during-a-session-and-how-does-the-xdb-know-who-you-are/
You just need a utility page that calls System.Web.HttpContext.Current.Session.Abandon(). You may also want to redirect the user to a page that doesn't exist.
Update to address comment
My understanding is that once an xDB session has expired, processing should take place quickly. In the Sitecore.Analytics.Processing.Services.config file, the BackgroundService agent is set to run on an interval of 15 seconds by default.
You may just be seeing cached reporting data. Try clearing the cache using the /sitecore/admin/cache.aspx page. You could also decrease the defaultCacheExpiration setting for the reporting cacheProvider in the Sitecore.Analytics.Reporting.config file. The default is 10 minutes.

Django Multiple Databases - One not always available

I am developing a Django application which will use multiple database backends. I would like to put an SQLite database on the machine running the Django application, and sync to a remote MySQL database. The tricky part is that this machine will not always have an internet connection, so the MySQL database is not always available. There will be multiple machines running the application, each with its own local SQLite DB, but all using the same remote MySQL DB.
I haven't written the code yet, but here is what I have in mind. Every time I run an insert or update I would like to write it to both databases, unless the remote DB is unavailable, in which case I will save the SQL statement in a table in the local database to run when the remote DB is available.
Can this be done with database routers, or do I need to implement this manually with each DB statement?
Note on PK: Not directly related, but sure to be asked. The primary key will be generated locally on each machine. In the MySQL DB there will be a field for this key, and a field with a unique identifier for each instance of the application, which together will provide a unique key.
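A sketch of what that key scheme might look like on the MySQL side (the model and field names are illustrative, not taken from the question):

# models.py -- sketch of the remote-side key scheme; names are assumptions
from django.db import models

class RemoteEntry(models.Model):
    local_pk = models.IntegerField()              # PK generated on the local machine
    machine_id = models.CharField(max_length=64)  # unique per application instance

    class Meta:
        # The pair (local_pk, machine_id) provides the unique key described above.
        unique_together = (('local_pk', 'machine_id'),)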
Suppose you have a model called Blog; then you can use the following to store it locally and remotely (assuming that you have configured access to the remote db).
blog = Blog('test')
blog.save()  # Assuming that sqlite is the default db
try:
    blog.save(using='mysql')
except NoInternetConnection:
    pass
Make sure you have defined and configured 'mysql' in settings.py and that you handle the cases when there is no Internet connection.
Side note: I am not quite sure why you actually would want to do this. If this is for backup purposes then I would use standard backup procedures. For more information about using multiple databases see: http://docs.djangoproject.com/en/dev/topics/db/multi-db/#using-raw-cursors-with-multiple-databases
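As a sketch, the corresponding DATABASES entry in settings.py might look like this (paths, host, and credentials are placeholders; the answer calls the second alias 'mysql', while the follow-up below uses 'remote'):

# settings.py -- sketch; connection details are placeholders
DATABASES = {
    'default': {  # local SQLite DB on each machine
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'local.sqlite3',
    },
    'mysql': {    # shared remote MySQL DB
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'central',
        'USER': 'app',
        'PASSWORD': 'change-me',
        'HOST': 'remote-host.example.com',
        'PORT': '3306',
    },
}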
I took DrDee's code and attached it to the post_save signal (+1 for the help).
from django.db import models
from django.dispatch import receiver
# Pending_Remote is imported from wherever it lives -- it must not be in 'myapp'
# (see the note below).

@receiver(models.signals.post_save)  # new signal decorator in Django 1.3
def save_to_remote(sender, instance, using, **kwargs):
    if using == 'default' and instance.__module__ == 'mymodel.myapp.models':
        try:
            instance.save(using='remote')
        except Exception:
            # Remote DB unavailable: record what still needs to be synced.
            pending_instance = Pending_Remote(
                pk_default=instance.pk,
                model_default=instance.__class__.__name__,
            )
            pending_instance.save()
This also saves a record of what was not saved to the remote database. Note that the Pending_Remote model must not be in 'myapp', otherwise saving it would itself match the check above and re-trigger this handler.
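For completeness, a sketch of the Pending_Remote model implied by the snippet above (field types are assumptions; only the field names come from the code):

# models.py in an app other than 'myapp' -- sketch only
from django.db import models

class Pending_Remote(models.Model):
    pk_default = models.IntegerField()                # PK of the row still to be synced
    model_default = models.CharField(max_length=100)  # class name of that row's model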