Flask-PyMongo query DB - flask

I have a remote MongoDB containing scraped data that I want to display on a webpage with Flask, but I keep running into issues. I can add to the DB without a problem, but displaying data from it seems like an impossible feat, and I'm at a loss after repeated research. One common error is that the 'Cursor' object is not callable.
Code:
from flask import Flask, render_template
from flask_pymongo import PyMongo

app = Flask(__name__)
app.config["MONGO_URI"] = 'mongodb+srv://example:example@cluster0-zh34t.mongodb.net/test?retryWrites=true'
mongo = PyMongo(app)

@app.route("/")
def index():
    doc = mongo.db.collection.find_one({"_id": 0})
    return doc

Cursor was not the real issue here. Using find_one instead of find returns the document as a dictionary that you can then use as expected. My issue, which is now resolved, was due to the MONGO_URI I specified. Because flask_pymongo automatically identifies your DB based on the URI, I ended up pointed at "test" instead of my actual DB. "test" didn't exist, although MongoDB Atlas provided that path, so I ran into all kinds of issues. If you experience this type of issue, be sure to triple-check your URI.
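To make the failure mode concrete: flask_pymongo takes the default database from the path component of MONGO_URI. A small stdlib sketch (the helper name is mine, not flask_pymongo API) that shows which database a given URI implies:

```python
from urllib.parse import urlsplit

def database_from_uri(uri):
    """Return the database name flask_pymongo would infer from MONGO_URI.

    flask_pymongo uses the path component of the URI as the default
    database, so a URI ending in /test points at a database named
    "test" whether or not that database exists in your cluster.
    """
    return urlsplit(uri).path.lstrip("/")
```

For example, database_from_uri("mongodb+srv://user:pass@cluster0-zh34t.mongodb.net/test?retryWrites=true") yields "test" — so if your data actually lives in another database, every query through mongo.db silently hits the wrong (empty) one.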

Related

Integrate redis with flask application using redis-py

I want to use redis-py directly with Flask, instead of a wrapper library (e.g. flask-redis, flask-and-redis) or other Flask plugins. How do I initialize the Redis client in the create_app() factory function of a Flask application? Or how do I properly initialize redis.StrictRedis on its own (maybe it isn't compatible with other Flask Redis plugins)? I ask because some operations related to token persistence in the route handlers of other modules use this redis.StrictRedis object.
Any advice, please?
Interesting question. I think using the wrappers may make things easier, but if you just want to use Redis as a cache, you can very easily import redis and instantiate the client alongside the Flask app.
import redis
from flask import Flask

# make redis
redis_cache = redis.StrictRedis()

# make flask app
app = Flask(__name__)

# business logic
@app.route('/<string:item>')
def index(item):
    # if cache hit then get from redis
    if redis_cache.exists(item):
        value = redis_cache.get(item)
    # cache miss
    else:
        value = 'Not in cache'
    return value
This is the simplest way.
You can also follow this link to create the setup function and add the Redis instantiation in the create_app function.
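For the create_app() route specifically, a common shape is a small extension-style holder created empty at import time and bound to a real client inside the factory; other modules import the holder rather than the app, which avoids circular imports. A minimal sketch (the class and method names are illustrative, not flask-redis API):

```python
class RedisHolder:
    """Extension-style wrapper: created empty at import time, given a
    real client inside create_app(), used elsewhere via attribute
    delegation (holder.get(...), holder.set(...), etc.)."""

    def __init__(self):
        self._client = None

    def init_app(self, app, client_factory):
        # client_factory would typically be something like
        # lambda: redis.StrictRedis.from_url(app.config["REDIS_URL"])
        self._client = client_factory()

    def __getattr__(self, name):
        # called only for attributes not found on the holder itself;
        # delegate them to the underlying redis client
        if self._client is None:
            raise RuntimeError("init_app() was never called")
        return getattr(self._client, name)
```

Other modules would then do something like from extensions import redis_client and call redis_client.get(...) as usual, with no import of the app module.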

Persistent MongoDB Connection With Django

I am currently using (or hoping to use) both PostgreSQL and MongoDB for my Django project. When it comes to MongoDB I've been looking at PyMongo to make the connection, and add/edit info using Python. Most likely I will be calling PyMongo in my views.py to read/insert/edit data from MongoDB.
For the sake of this question, let's say I have a view called worksheet, which queries some data from MongoDB relevant to the current user. However, I would like to maintain a persistent connection to the database so that I do not have to make a new connection every time a user visits the worksheet view.
Is there any way I could make a persistent connection so I am not wasting time/resources making new connections to MongoDB every time? Thanks.
I wanted the exact same thing and this is how I did it.
from pymongo import MongoClient
from threading import local
from django.conf import settings

_mongo_client = local()

def mongo_client():
    client = getattr(_mongo_client, 'client', None)
    if client is None:
        client = MongoClient(settings.MONGODB_URI)
        _mongo_client.client = client
    return client
And then in your views you use the client like this.
mongo_client().db.orders.find(...)
This is what I have been doing and it works. My only concern would be whether the MongoDB client correctly re-establishes lost connections, but I think it does, because I have not had any issues.
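Worth noting: MongoClient is thread-safe, maintains its own connection pool, and re-establishes dropped connections automatically, so a single process-wide instance is also a valid (and simpler) alternative to one client per thread. A double-checked lazy-singleton sketch of that alternative (pure stdlib; the class name is mine):

```python
import threading

class LazySingleton:
    """Create one shared instance per process, lazily and thread-safely.

    With factory=lambda: MongoClient(settings.MONGODB_URI) this yields a
    single shared client; MongoClient's internal pool then handles the
    per-request connections.
    """

    def __init__(self, factory):
        self._factory = factory
        self._instance = None
        self._lock = threading.Lock()

    def get(self):
        if self._instance is None:          # fast path, no lock taken
            with self._lock:
                if self._instance is None:  # re-check under the lock
                    self._instance = self._factory()
        return self._instance
```
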

Services and flask_sqlalchemy inside a flask application

I'm building a large, complex Flask application. It has lots of route functions (which need to do database calls) and lots of services (which also need to do database calls).
I'm using flask_sqlalchemy to do the database calls in the route functions as normal with model syntax (User.query).
I would also like to make database calls in the services. However, I can't do this without setting an app context:
with app.app_context():
    User.query
which requires importing the app, which leads to circular imports (because the app imports the routes, which import the services, which have to import the app).
QUESTION 1: Do you know a way to get round this circular import problem?
Another approach could be to use flask_sqlalchemy in the routes and plain sqlalchemy in the services. This means nothing needs to be shared except the database URL and... the models.
QUESTION 2: Do you know a way to use the same model files with both flask_sqlalchemy and normal sqlalchemy?
I've tried this: https://dev.to/nestedsoftware/flask-and-sqlalchemy-without-the-flask-sqlalchemy-extension-3cf8
However, it breaks my relationships.
I am following the flask app pattern shown here: https://github.com/sloria/cookiecutter-flask/blob/master/%7B%7Bcookiecutter.app_name%7D%7D/README.rst (application factory)
Thanks,
Louise

How to use Records with Flask?

My Flask app should execute some rather complicated SQL queries, parse the results, and display them. Right now I'm using SQLAlchemy (as described here), but I want to switch to Records, a simple wrapper over SQLAlchemy (it has everything I need out of the box). I haven't found any examples of integrating Records into Flask.
Can you provide an example of simple Flask app with vanilla Records integration?
Records handles the SQLAlchemy connection lifecycle itself, so I don't think it is a good idea to integrate them tightly. You can share the config by passing the database URL to records.Database.
import records
from flask import Flask

app = Flask(__name__)
app.config['DATABASE_URL'] = 'postgres://...'
records_db = records.Database(app.config['DATABASE_URL'])
If you use Flask-SQLAlchemy, the config key is SQLALCHEMY_DATABASE_URI.
Add records_db to Flask's g object to access it from any file that runs within a request context, without running into circular imports. Note that g is request-scoped, so set it in a before_request handler rather than at import time:
from flask import g

@app.before_request
def attach_records():
    g.records = records_db

psycopg2 with flask, when to close connection

I'm trying to build a simple web application that will query a Postgres DB and insert/delete data. Since it is a very simple application, I'm not using an ORM layer like SQLAlchemy; instead I'd like to use psycopg2 directly. Now I'm wondering: when is the best time to close cursors and connections? I'm having trouble getting the bigger picture of when the connection is idle with respect to requests hitting the web app.
Thanks!
Maybe the official documentation can be useful:
from flask import g

@app.before_request
def before_request():
    # connect_db() would wrap psycopg2.connect(...) with your own DSN
    g.db = connect_db()

@app.teardown_request
def teardown_request(exception):
    g.db.close()
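Putting the before_request/teardown_request pattern together as a complete app (sqlite3 stands in for psycopg2 below so the sketch needs no running server; both follow the DB-API 2.0 interface, so the open/close flow is identical):

```python
import sqlite3
from flask import Flask, g

app = Flask(__name__)

def connect_db():
    # stand-in for psycopg2.connect("dbname=... user=...")
    return sqlite3.connect(":memory:")

@app.before_request
def before_request():
    # open one connection per request, before the view runs
    g.db = connect_db()

@app.route("/")
def index():
    cur = g.db.cursor()
    cur.execute("SELECT 1")
    (value,) = cur.fetchone()
    cur.close()
    return str(value)

@app.teardown_request
def teardown_request(exception):
    # runs after every request, even if the view raised
    db = g.pop("db", None)
    if db is not None:
        db.close()
```

The connection thus lives exactly as long as one request; between requests nothing is held open, which answers the "when is it idling" question for this simple setup.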