Django transactions: when is the DB connection opened? @transaction.commit_manually / @transaction.atomic

I want to know when the DB connection is opened.
For example, with the following logic:
Does the DB connection already exist and sit waiting during the external API call, or not?

@transaction.commit_manually
def do_something():
    # 1. is the db connection already open here?
    # api_call
    response = requests.get(~~~~)
    # 2. or is the db connection opened here, at the first query?
    aa = User.objects.get(id=1)

Django maintains DB connections internally and uses a persistent-connection approach. A connection is opened lazily when you hit a database query; if the previous connection has expired or been closed, Django opens a new one at that point and keeps it in memory. The same connection is then reused across multiple requests until it expires.
You can read more about it here: https://docs.djangoproject.com/en/4.0/ref/databases/
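To add a bit of detail (hedged, since the exact behaviour depends on your Django version and the CONN_MAX_AGE setting described on that page): the connection is created lazily at the first query, and you can observe this by inspecting the underlying DB-API object that Django keeps on django.db.connection. The sketch below deliberately leaves out the transaction decorator, because @transaction.atomic may itself touch the connection when the block is entered:

from django.db import connection
from django.contrib.auth.models import User   # assuming the stock User model

def do_something():
    # connection.connection is the underlying DB-API connection object;
    # it stays None until Django actually needs the database (it may already
    # be set if a persistent connection survives from an earlier request).
    print(connection.connection is None)   # typically True in a fresh worker

    # ... external API call happens here; no database involvement ...

    user = User.objects.get(id=1)          # first ORM query: Django opens (or reuses) a connection
    print(connection.connection is None)   # now False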

Related

Django: How to establish persistent connection to rabbitmq?

I am looking for a way to publish messages to a rabbitmq server from my django application. This is not for task offloading, so I don't want to use Celery. The purpose is to publish to the exchange using the django application and have a sister (non-django) application in the docker container consume from that queue.
This all seems very straightforward, however, I can't seem to publish to the exchange without establishing and closing a connection each time, even without explicitly calling for that to happen.
In an attempt to solve this, I have defined a class with a nested singleton class that maintains a connection to the rabbitmq server using Pika. The idea was that the nested singleton would be instantiated only once, declaring the connection at that time. Any time something is to be published to the queue, the singleton handles it.
import logging
import os

import pika

logger = logging.getLogger('django')

# rmq_username, rmq_password, rmq_host, rmq_port and rmq_vhost are assumed to be
# defined elsewhere (e.g. read from os.environ).

class PikaChannelSingleton:

    class __Singleton:
        channel = pika.adapters.blocking_connection.BlockingChannel

        def __init__(self):
            self.initialize_connection()

        def initialize_connection(self):
            logger.info('Attempting to establish RabbitMQ connection')
            credentials = pika.PlainCredentials(rmq_username, rmq_password)
            parameters = pika.ConnectionParameters(rmq_host, rmq_port, rmq_vhost, credentials, heartbeat=0)
            connection = pika.BlockingConnection(parameters)
            con_chan = connection.channel()
            con_chan.exchange_declare(exchange='xchng', exchange_type='topic', durable=True)
            self.channel = con_chan

        def send(self, routing_key, message):
            if self.channel.is_closed:
                PikaChannelSingleton.instance.initialize_connection()
            self.channel.basic_publish(exchange='xchng', routing_key=routing_key,
                                       body=message)

    instance = None

    def __init__(self, *args, **kwargs):
        if not PikaChannelSingleton.instance:
            logger.info('Creating channel singleton')
            PikaChannelSingleton.instance = PikaChannelSingleton.__Singleton()

    @staticmethod
    def send(routing_key, message):
        PikaChannelSingleton.instance.send(routing_key, message)

rmq_connection = PikaChannelSingleton()
I then import rmq_connection where needed in the django application. Everything works in toy applications and in the python repl, but in the django application a new connection is established every time the send function is called. The connection then immediately closes with the message 'client unexpectedly closed TCP connection'. The message does get published to the exchange correctly.
So I am sure there is something going on with django and how it handles processes and such. The question still remains, how do I post numerous messages to a queue without re-establishing a connection each time?
If I understand correctly, connections cannot be kept alive like that in a single-threaded context. As your Django app continues executing, the AMQP client never gets a chance to send heartbeats on the channel, so the connection will die.
You could use SelectConnection instead of BlockingConnection, though that is probably not easy to integrate in the context of Django.
A good compromise could be to simply collect messages in your singleton, then send them all at once over a short-lived BlockingConnection at the very end of your Django request.
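A rough sketch of that compromise, assuming the same pika credentials and exchange as in the question (rmq_username and friends are still assumed to be defined elsewhere) and an old-style Django middleware hook; the buffer and middleware names are made up for illustration:

import pika

class PikaMessageBuffer:
    """Collect messages during the request and publish them in one short-lived connection."""
    def __init__(self):
        self.messages = []

    def send(self, routing_key, message):
        self.messages.append((routing_key, message))

    def flush(self):
        if not self.messages:
            return
        credentials = pika.PlainCredentials(rmq_username, rmq_password)
        parameters = pika.ConnectionParameters(rmq_host, rmq_port, rmq_vhost, credentials)
        connection = pika.BlockingConnection(parameters)
        channel = connection.channel()
        channel.exchange_declare(exchange='xchng', exchange_type='topic', durable=True)
        for routing_key, message in self.messages:
            channel.basic_publish(exchange='xchng', routing_key=routing_key, body=message)
        connection.close()
        self.messages = []

rmq_buffer = PikaMessageBuffer()

class FlushRabbitMessagesMiddleware(object):
    """Old-style middleware: publish everything collected during the request."""
    def process_response(self, request, response):
        rmq_buffer.flush()
        return response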

SQLAlchemy: Keep the session's connection when rolling back

I would like to implement a MySQL style named lock like in http://arr.gr/blog/2016/05/mysql-named-locks-in-python-context-managers/ :
Using sqlalchemy Session, I directly execute a GET_LOCK:
lock = session.execute("SELECT GET_LOCK('TEST', 5)")
Then if my lock is OK, I do what I want, especially some stuff with my database using the session. After that I release the lock:
session.execute("SELECT RELEASE_LOCK('TEST)")
My question is the following:
How can I be sure that the session connection to the database for the release is the same as the one at the beginning ?
The sqlalchemy's documentation says:
When the transactional state is completed after a rollback or commit,
the Session releases all Transaction and Connection resources [...]
http://docs.sqlalchemy.org/en/latest/orm/session_transaction.html
I'm in the context of a web application, so every time the session is committed or rolled back, the next queries aren't guaranteed to be on the same connection. For most of my operations that isn't a problem, but in the case of a MySQL named lock, 'GET_LOCK' and 'RELEASE_LOCK' have to be requested on the same connection.
The only way I found is to have a specific session for the locking : it will keep its connection only for the lock. But is there a way not to use a connection just for this purpose ?
Edit 2017-06-22:
This topic is about the same subject: SQLAlchemy session and connection relationship. But if a scoped_session is used:
Given the extract from the documentation above, after a commit or a rollback the connection used for future queries with this session may be different, right?
Will a single, unique session be used within the thread?
In case it helps someone else: in the end I used a scoped_session with an explicit bind when getting the session for the first time in a request:
# Initialization (engine is assumed to be created elsewhere):
from sqlalchemy.orm import scoped_session, sessionmaker

Session = scoped_session(sessionmaker(bind=engine))
# ...
# Before processing a request:
connection = engine.connect()
Session(bind=connection)
# ...
# After the processing:
Session.remove()
connection.close()
With this configuration, if during my request processing I use GET_LOCK/RELEASE_LOCK, they will apply on the same connection.
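Building on that, a small context manager can make the GET_LOCK/RELEASE_LOCK pairing explicit. This is only a sketch, assuming the request-bound scoped_session above and SQLAlchemy's text() construct; the named_lock helper is made up for illustration:

from contextlib import contextmanager
from sqlalchemy import text

@contextmanager
def named_lock(session, name, timeout=5):
    # Both statements go through the same session, which (with the setup above)
    # stays bound to a single connection for the whole request.
    # GET_LOCK returns 1 on success and 0 on timeout.
    acquired = session.execute(
        text("SELECT GET_LOCK(:name, :timeout)"),
        {"name": name, "timeout": timeout},
    ).scalar()
    if not acquired:
        raise RuntimeError("could not acquire named lock %r" % name)
    try:
        yield
    finally:
        session.execute(text("SELECT RELEASE_LOCK(:name)"), {"name": name})

# Usage inside a request:
# with named_lock(Session(), 'TEST'):
#     ...  # work that needs the lock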

Storing and accessing request-response wide object

I need to store a created/open LDAP connection, so multiple models, views and so on can reuse a single connection rather than creating a new one each time. This connection should be open when first required during a request and closed when sending a response (done generating the page). The connection should not be shared between different requests/responses.
What is the way to do it? Where to store the connection and how to ensure it is eventually closed?
A bit more info: as an additional information source, I use LDAP connections. The LDAP data contains details I cannot store in the database (for redundancy/consistency reasons), e.g. MS Exchange mailing groups. I might need some LDAP data at multiple points, and different objects/instances should be able to access it during response generation.
One way to store the connection resource so that it can be shared across your components is to use thread local storage.
For example, in myldap.py:
import threading

_local = threading.local()

def get_ldap_connection():
    # create_ldap_connection() is assumed to be defined elsewhere.
    if not hasattr(_local, 'ldap_connection') or _local.ldap_connection is None:
        _local.ldap_connection = create_ldap_connection()
    return _local.ldap_connection

def close_ldap_connection():
    if hasattr(_local, 'ldap_connection') and _local.ldap_connection is not None:
        # destroy_ldap_connection() is a placeholder for however your LDAP
        # library closes a connection (e.g. unbind()).
        destroy_ldap_connection(_local.ldap_connection)
        _local.ldap_connection = None
So the first time myldap.get_ldap_connection is called from a specific thread it will open the connection. Subsequent calls from the same thread will reuse the connection.
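For example, any view or model method can then grab the shared connection without caring whether it already exists. This is purely illustrative; ldap_search_groups() stands in for whatever lookup your LDAP client library actually exposes:

from django.shortcuts import render
from myldap import get_ldap_connection

def mailing_groups(request):
    conn = get_ldap_connection()   # opened on first use in this thread, reused afterwards
    groups = ldap_search_groups(conn, request.user.username)   # hypothetical helper
    return render(request, 'groups.html', {'groups': groups})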
To ensure the connection is closed when you have finished working, you could implement a Django middleware component. Amongst other things, this allows you to specify a hook that gets invoked after the view has returned its response object.
The middleware can then invoke myldap.close_ldap_connection() like this:
import myldap

class CloseLdapMiddleware(object):
    def process_response(self, request, response):
        myldap.close_ldap_connection()
        return response
Finally you will need to add your middleware in settings.py MIDDLEWARE_CLASSES:
MIDDLEWARE_CLASSES = [
    ...
    'path.to.CloseLdapMiddleware',
    ...
]

Suggested way to implement redis connection management in Django

I am trying to move some of the messaging system to redis. I have a question regarding connection management towards redis from Django. The text below is taken from Quora:
When talking to Redis from Django (or indeed any other web framework, I imagine) an interesting challenge is deciding when to connect and disconnect.
If you make a new connection for every query to Redis, that's a ton of unnecessary overhead considering a single page request might make hundreds of Redis requests.
If you keep one connection open in the thread / process, you end up with loads of unclosed connections which can lead to problems. I've also seen the Redis client library throw the occasional timeout error, which is obviously bad.
The best result I've had has been from opening a single Redis connection at the start of the request, then closing it at the end - which can be achieved with Django middleware. It feels a bit dirty though having to add a piece of middleware just to get this behaviour.
Has anybody had a chance to create such a redis middleware? I am always in favor of not reinventing the wheel, but I didn't find anything on Google related to this topic.
I implemented the middleware:
import redis
from redis_sessions import settings

# Avoid a new redis connection on each request
if settings.SESSION_REDIS_URL is not None:
    redis_server = redis.StrictRedis.from_url(settings.SESSION_REDIS_URL)
elif settings.SESSION_REDIS_UNIX_DOMAIN_SOCKET_PATH is None:
    redis_server = redis.StrictRedis(
        host=settings.SESSION_REDIS_HOST,
        port=settings.SESSION_REDIS_PORT,
        db=settings.SESSION_REDIS_DB,
        password=settings.SESSION_REDIS_PASSWORD
    )
else:
    redis_server = redis.StrictRedis(
        unix_socket_path=settings.SESSION_REDIS_UNIX_DOMAIN_SOCKET_PATH,
        db=settings.SESSION_REDIS_DB,
        password=settings.SESSION_REDIS_PASSWORD,
    )

class ReddisMiddleWare(object):
    def process_request(self, request):
        request.redisserver = redis_server
Then in the view I just use request.redisserver.get(key).
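It is worth noting that redis-py clients keep an internal connection pool, so a single module-level client like the one above already reuses connections across requests rather than reconnecting each time. If you want to make the pool explicit, a sketch (host/port/db values are placeholders) would be:

import redis

# One pool for the whole process; commands borrow a connection from the pool
# and return it when finished, so there is no per-request connect/disconnect.
pool = redis.ConnectionPool(host='localhost', port=6379, db=0)
redis_server = redis.StrictRedis(connection_pool=pool)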

MongoDB: when to close connections and when to make them persistent

I am writing a Django app using MongoDB. For a simple GET request I need to fetch results from the database, for which I am making a connection in the request handler. The DB operation for this request isn't heavy. Should I close the connection in that handler itself? Here is the code snippet.
def search(request):
    dbConnection = Connection('hostname', int('port-no'))
    # ... make a small query to the db (not a heavy operation) ...
    dbConnection.close()
    return HTTPResponse(result)
Is this code doing a suitable job of connecting and closing connections? What I want to know is whether it is fast in terms of performance; I want this "search" handler to work fast. If this is not the way to go, can someone please explain when and how we should close connections, and when to make them persistent, in Mongo?
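For what it's worth, the usual PyMongo pattern is the persistent option you mention: create one client at module level and let its built-in connection pool handle reuse, rather than connecting and closing inside every handler. A sketch using the modern MongoClient API; the host, port, database and collection names are placeholders:

from django.http import HttpResponse
from pymongo import MongoClient

# Created once at import time; MongoClient keeps its own connection pool, so
# each request reuses pooled sockets instead of reconnecting and closing.
client = MongoClient('hostname', 27017)
db = client['mydatabase']

def search(request):
    result = db.mycollection.find_one({'field': 'value'})   # small query on a pooled connection
    return HttpResponse(result)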