Persistent MongoDB Connection With Django - django

I am currently using (or hoping to use) both PostgreSQL and MongoDB for my Django project. When it comes to MongoDB I've been looking at PyMongo to make the connection, and add/edit info using Python. Most likely I will be calling PyMongo in my views.py to read/insert/edit data from MongoDB.
For the sake of this question, let's say I have a view called worksheet, which queries some data from MongoDB relevant to the current user. However, I would like to maintain a persistent connection to the database so that I do not have to make a new connection every time a user visits the worksheet view.
Is there any way I could make a persistent connection so I am not wasting time/resources making new connections to MongoDB every time? Thanks.

I wanted the exact same thing and this is how I did it.
from pymongo import MongoClient
from threading import local
from django.conf import settings

_mongo_client = local()

def mongo_client():
    client = getattr(_mongo_client, 'client', None)
    if client is None:
        client = MongoClient(settings.MONGODB_URI)
        _mongo_client.client = client
    return client
And then in your views you use the client like this.
mongo_client().db.orders.find(...)
This is what I have been doing and it works; my only concern would be whether the MongoDB client correctly re-establishes lost connections... but I think it does, because I have not had any issues.
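For what it's worth, PyMongo's MongoClient is thread-safe and maintains its own connection pool, so even a single module-level client would be shared safely; the thread-local wrapper above mainly makes the lazy creation explicit. The caching step itself can be sketched generically — here the hypothetical `factory` argument stands in for `MongoClient(settings.MONGODB_URI)` so the sketch runs anywhere:

```python
from threading import local

_cache = local()

def get_cached(factory):
    """Return a per-thread cached object, creating it at most once per
    thread via factory() -- the same lazy pattern as mongo_client() above."""
    obj = getattr(_cache, "obj", None)
    if obj is None:
        obj = factory()
        _cache.obj = obj
    return obj
```

Calling get_cached twice in the same thread returns the identical object, so the (stand-in) client is only ever built once per thread.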

Related

Flask-PyMongo query DB

I have a remote MongoDB with scraped data that I want to display via webpage in Flask, but it seems to be running into issues. I'm able to add to the DB without issue, but displaying data from the DB seems like an impossible feat. I'm at a loss after repeated research. One common error is that the 'Cursor' object is not callable
Code:
from flask import Flask, render_template
from flask_pymongo import PyMongo

app = Flask(__name__)
app.config["MONGO_URI"] = 'mongodb+srv://example:example@cluster0-zh34t.mongodb.net/test?retryWrites=true'
mongo = PyMongo(app)

@app.route("/")
def index():
    doc = mongo.db.collection.find_one({"_id": 0})
    return doc
Cursor was not the real issue here. Using find_one instead of find returns a single document as a dictionary that you can then use as expected. My issue, which is now resolved, was due to the MONGO_URI specified. Because of how flask_pymongo automatically identifies your DB based on the URI, I had "Test" as opposed to my actual DB. "Test" didn't exist, although MongoDB Atlas provided the path, so I ran into all kinds of issues. If you experience this type of issue, be sure to triple-check your URI.
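If you do need find (multiple documents), the cursor has to be materialized into something serializable before a Flask view can return it. A minimal sketch of that conversion — plain dicts stand in for real MongoDB documents here, and dropping `_id` sidesteps the non-JSON-serializable ObjectId:

```python
def docs_to_list(cursor):
    """Materialize a cursor-like iterable of documents into a list of
    plain dicts, dropping the non-JSON-serializable _id field."""
    return [{k: v for k, v in doc.items() if k != "_id"} for doc in cursor]

# Stand-in for mongo.db.collection.find(...): any iterable of dicts works.
fake_cursor = iter([{"_id": 1, "name": "a"}, {"_id": 2, "name": "b"}])
rows = docs_to_list(fake_cursor)
```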

How to use Records with Flask?

My Flask app should execute some rather complicated sql queries, parse them and display results. Now I'm using SQLAlchemy (as described here), but I want to switch to Records, a simple wrapper over SQLAlchemy (it has everything I need out of the box). I haven't found any examples of integrating Records into Flask.
Can you provide an example of simple Flask app with vanilla Records integration?
Records handles the SQLAlchemy connection lifecycle itself, so I don't think it is a good idea to integrate them too tightly. You can share the config by passing the database URL to records.Database.
import records
from flask import Flask

app = Flask(__name__)
app.config['DATABASE_URL'] = 'postgres://...'
records_db = records.Database(app.config['DATABASE_URL'])
If you use Flask-SQLAlchemy, the config key is SQLALCHEMY_DATABASE_URI.
Add the records_db to Flask's g inside a before_request handler (g is only available during a request) to access it from any file with a Flask context, without messing with circular imports.
from flask import g

@app.before_request
def before_request():
    g.records = records_db
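The underlying idea is just "create one database object at startup and reuse it from every handler". A runnable sketch of that pattern, with the standard library's sqlite3 standing in for records.Database (the table and query are made up for illustration):

```python
import sqlite3

# Stand-in for records.Database('postgres://...'): one connection object
# created at startup and reused by every request handler.
shared_db = sqlite3.connect(":memory:", check_same_thread=False)
shared_db.execute("CREATE TABLE orders (item TEXT)")
shared_db.execute("INSERT INTO orders VALUES ('book')")

def orders_view():
    # In the real app this would be records_db.query('SELECT ...').all()
    return [row[0] for row in shared_db.execute("SELECT item FROM orders")]
```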

Where to initialize MongoDB connection in Django projects?

I wonder where I should initialize my MongoDB connection in my Django projects.
Currently I am initializing the client and db before my view functions in views.py:
import pymongo
from django.conf import settings
client = pymongo.MongoClient(settings.MONGO_URI)
db = client.get_default_database()
def some_view(request):
    pass
However I also need to use MongoDB in my models.py in conjunction with Django signals. What do you suggest?
Maybe settings.py? Or even root __init__.py? Then you can import client and db everywhere you need it.
I've decided to use project/mongodb.py (same folder as settings.py)
import pymongo
from django.conf import settings
client = pymongo.MongoClient(settings.MONGO_URI)
mongodb = client.get_default_database()
I am using two different settings files for local and production. This approach therefore makes it possible to use environment-dependent settings while letting me access the mongodb variable from anywhere in the project.
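The reason this behaves as a process-wide singleton is that Python caches imported modules in sys.modules, so every `from project.mongodb import mongodb` — in views, models, or signal handlers — sees the same client. A self-contained sketch of that property, where a synthetic module stands in for project/mongodb.py and a plain object() for the client:

```python
import sys
import types

# Simulate project/mongodb.py: a module that builds one client at import time.
fake_module = types.ModuleType("project_mongodb")
fake_module.mongodb = object()  # stands in for client.get_default_database()
sys.modules["project_mongodb"] = fake_module

import project_mongodb as first_import
import project_mongodb as second_import

# Both imports hit the sys.modules cache, so the client object is shared.
shared = first_import.mongodb is second_import.mongodb
```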

psycopg2 with flask, when to close connection

I'm trying to build a simple web application that will be querying a postgres db and inserting/deleting data. Since it is a very simple application, I'm not using an ORM layer like sqlalchemy. Instead I'd like to use psycopg directly. Now, I'm wondering, when would be the best time to close cursors and connections? I'm having trouble getting the bigger picture of when the connection is idling with respect to access to the web app.
Thanks!
Maybe the official documentation can be useful:
@app.before_request
def before_request():
    g.db = connect_db()

@app.teardown_request
def teardown_request(exception):
    g.db.close()
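The same open-per-request, close-on-teardown lifecycle can also be written with context managers inside a single handler. A runnable sketch with sqlite3 standing in for psycopg2.connect(dsn) — both follow the DB-API, though note that in psycopg2 `with conn:` commits rather than closes, which is why closing() is used explicitly here:

```python
import sqlite3
from contextlib import closing

def handle_request():
    """One request's worth of DB work: open, query, and always close.
    sqlite3 stands in for psycopg2.connect(dsn) (same DB-API shape);
    closing() guarantees cleanup even if the handler raises."""
    with closing(sqlite3.connect(":memory:")) as conn:
        with closing(conn.cursor()) as cur:
            cur.execute("SELECT 1 + 1")
            return cur.fetchone()[0]
```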

How do I allow clients to create dynamic databases and connections as a multitenant database setup with Django on Appengine?

Our current set up has Django hosted on Google's appengine with a MySQL database on Google's Cloud SQL.
The users (clients) are typically small businesses who we give a subdomain to for a multi-tenant database structure (1 database for each client).
As for determining which request should hit which database, there is an existing middleware which strips the request path to get the subdomain and thus returns the correlated database alias defined in settings.py:
from django.conf import settings
import threading

currentCompanyConnection = threading.local()

class DatabaseMiddleware(object):
    def process_request(self, request):
        url = request.build_absolute_uri()
        if url.find("http://") < 0:
            company = url.split("https://")[1].split(".")[0]
        else:
            company = url.split("http://")[1].split(".")[0]
        global currentCompanyConnection
        if company in settings.DATABASES:
            currentCompanyConnection.company = company
        else:
            currentCompanyConnection.company = 'default'
        request.currentCompany = str(currentCompanyConnection.company)
        return None

class DBRouter(object):
    def read_and_write(self, model, **hints):
        return currentCompanyConnection.company

    db_for_read = read_and_write
    db_for_write = read_and_write
However, to offer freemium self-service, a new database must be generated on the fly and imported into Django's settings dynamically for each user who signs up.
The last part is something I can't seem to figure out, since each time I change settings.py I must redeploy to App Engine. Aside from that, I'm not sure how to create a new database with pre-defined tables in Google's Cloud SQL from our web application.
Thanks for your help! I love resolving challenges from work, but this is something I simply haven't done before =O
You can't modify your source files once deployed. You can modify stuff in the blobstore or datastore.
I'd recommend storing the settings as structured data in the datastore, then have your settings.py read the data from the datastore and store them as python objects in settings.py that are accessible from other code. This should allow you to configure django to connect to multiple databases.
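Concretely, settings.py could fetch the tenant records at startup and expand them into the DATABASES dict whose keys the router returns as aliases. A hedged sketch — the row fields and the build_databases helper are hypothetical illustrations, not an App Engine API:

```python
def build_databases(tenant_rows, default):
    """Expand tenant records (e.g. fetched from the datastore) into a
    Django-style DATABASES dict; the keys become the aliases that the
    DBRouter returns."""
    databases = {"default": default}
    for row in tenant_rows:
        databases[row["alias"]] = {
            "ENGINE": "django.db.backends.mysql",
            "NAME": row["name"],
            "HOST": row["host"],
            "USER": row["user"],
            "PASSWORD": row["password"],
        }
    return databases

# Hypothetical tenant record as it might be stored per subdomain.
tenants = [{"alias": "acme", "name": "acme_db", "host": "10.0.0.1",
            "user": "acme", "password": "secret"}]
DATABASES = build_databases(tenants, {"ENGINE": "django.db.backends.mysql",
                                      "NAME": "core"})
```

A new sign-up then only needs a new stored record plus a Cloud SQL provisioning call, with no redeploy of settings.py.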
I'm not too familiar with CloudSQL, but I think you may still have a challenge of starting multiple CloudSQL instances and routing the appropriate traffic to the appropriate instances.