I am currently using CouchDB as the backend for my Flask application.
I have some huge master files that I am not able to put into CouchDB (CouchDB throws an OS error when I build a view function for these lookup files... that's a whole different problem that I want to get to at a later point).
These lookup files have no relation to the datasets already in CouchDB, and I was wondering if I can shift only these lookup files to either a MySQL or a PostgreSQL database.
Is it possible for one Flask app to handle multiple databases (I would use the CouchDB manager and Flask-SQLAlchemy here, for instance)?
Regards,
Galeej
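For what it's worth, a minimal sketch of one Flask app wired to both stores, assuming Flask-SQLAlchemy and the python-couchdb client; the connection URLs and the Lookup model are illustrative only:

```python
# Sketch only: one Flask app talking to CouchDB and a SQL database at once.
import couchdb
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "postgresql://user:pass@localhost/lookups"

db = SQLAlchemy(app)                              # relational side for the lookup files
couch = couchdb.Server("http://localhost:5984/")  # existing CouchDB data stays put
datasets = couch["datasets"]


class Lookup(db.Model):
    """Hypothetical table holding one row per lookup entry."""
    id = db.Column(db.Integer, primary_key=True)
    key = db.Column(db.String(128), index=True, unique=True)
    value = db.Column(db.Text)
```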
Related
I created a Django application and deployed it for testing purposes. I now have data which I would not like to lose, but I want to use CockroachDB for storing the web-app data. The SQLite database is the one Django creates automatically while applying migrations. Is there a way to move all that data to my CockroachDB cluster? I will be using CockroachDB as my database from now on.
Thank you for your time
You can modify the settings.py in your Django app to use a different database.
For example, you can use the dj_database_url module to set up a connection with CockroachDB.
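A rough sketch of what that settings change could look like, assuming the django-cockroachdb backend is installed and that your dj-database-url version understands the cockroachdb:// scheme (the URL below is only a placeholder):

```python
# settings.py (sketch) -- point Django at the CockroachDB cluster.
import dj_database_url

DATABASES = {
    "default": dj_database_url.config(
        # If your dj-database-url does not map the cockroachdb:// scheme,
        # set ENGINE to "django_cockroachdb" in a plain DATABASES dict instead.
        default="cockroachdb://user:password@host:26257/appdb?sslmode=require",
    )
}
```

The data already sitting in SQLite can then be carried over with Django's built-in dumpdata and loaddata management commands.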
My Django application (a PoC, not a final product) uses a SQLite database, read-only, through a backend library. The SQLite database is part of the repo and deployed to Heroku. This is working fine.
I have the requirement to allow updates to this database via the Django admin interface. This is not a Django-managed database, so from Django's point of view it is just a binary file.
I could allow a FileField to handle this, overwriting the database; I guess this would work on a self-managed server, but I am on Heroku and have the constraints imposed by Disk Backed Storage. My SQLite database is not my web app's database, but the same limitations apply: I cannot write to the web app's filesystem and get any guarantee that the new data will be visible to the running web app.
I can think of alternatives, all with drawbacks:
Put the SQLite database on another server (a "media" server) and access it remotely: this would severely impact performance. Besides, accessing SQLite databases over the network does not seem easy.
Create a deploy script for the customer to upload the database via the usual deploy mechanisms. Since the customer is not technically inclined, and I cannot provide direct support, this is unfeasible.
Move out of Heroku to a self-managed server, so I can implement this quick-and-dirty upload without complications.
Do you have another suggestion?
PythonAnywhere.com
Deploy your app there and you can easily access all of your files and update them; your SQLite database will be updated in real time without losing data.
Heroku (herokuapp.com) resets its ephemeral filesystem, and with it your SQLite database, at least once every 24 hours, which is why it is not recommended for web apps that rely on SQLite.
I am building a small financial web app with Django. The app requires that the database has a complete history of prices, regardless of whether someone is currently using the app. These prices are freely available online.
The way I am currently handling this is by simultaneously running a separate Python script (outside of Django) which downloads the price data and records it in the Django database using the sqlite3 module.
My plan for deployment is to run the app on an AWS EC2 instance, change the permissions of the folder where the db file resides, and separately run the download script.
Is this a good way to deploy this sort of app? What are the downsides?
Is there a better way to handle the asynchronous downloads and the deployment? (PythonAnywhere?)
You can write the daemon code following this approach, pushing data to the DB as soon as you get it from the Internet. Since your daemon would be running independently of Django, you'd need to take care of data-synchronisation issues as well. One possible solution is to use a DateTimeField in your Django model with auto_now_add=True, which will tell you when each row was entered in the DB. Hope this helps you or someone else looking for a similar answer.
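A minimal sketch of that idea; the Price model, the app name ("prices") and the project name ("mysite") are assumptions, not taken from the question:

```python
# models.py -- hypothetical Price model.
from django.db import models


class Price(models.Model):
    symbol = models.CharField(max_length=16)
    value = models.DecimalField(max_digits=12, decimal_places=4)
    # auto_now_add stamps each row when it is first saved, so you can
    # tell how fresh every downloaded price is.
    recorded_at = models.DateTimeField(auto_now_add=True)
```

The standalone script then bootstraps Django itself before touching the models, so it shares the exact same database configuration as the web app:

```python
# fetch_prices.py -- runs outside the web process, e.g. from cron or a loop.
import os

import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")  # assumed project
django.setup()

from prices.models import Price  # assumed app name


def store(symbol, value):
    # recorded_at is filled in automatically via auto_now_add=True.
    Price.objects.create(symbol=symbol, value=value)
```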
I have a Django site; on some of the pages the data comes from a PostgreSQL database, while another set of pages is connected to a SQLite database. The tables come from two different sources, so I cannot merge them, but I need to serve them from one Django site. What is the best practice for this:
should I merge the two into one Django application, modifying models.py, views and so on, or should I put them into different Django applications with different models and views?
You can merge the two apps into one instance, but then you won't be able to use Django's default session and auth modules with the models of one of them.
A good solution is to merge them into one project (so the two apps can share some code and maybe some settings) but run it as two separate instances with two different settings files loaded.
Also: you can simply merge the two databases, even if they use different engines. Django has built-in dumpdata and loaddata management commands; you can use them to move data from one database to another.
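If you keep both databases in one project, a sketch of what the wiring could look like; the engine names are Django's own, while the connection details and the "legacy" alias are made up:

```python
# settings.py (sketch) -- one project, two databases.
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent.parent

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "maindb",
        "USER": "postgres",
        "PASSWORD": "secret",
        "HOST": "localhost",
    },
    "legacy": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": BASE_DIR / "legacy.sqlite3",
    },
}

# Queries against the second database go through .using("legacy") or a
# database router. To merge the data instead, the built-in commands work
# across databases, e.g.:
#   python manage.py dumpdata myapp --database=legacy --output=myapp.json
#   python manage.py loaddata myapp.json   # loads into "default"
```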
I'm trying to develop a Django website with Heroku. Having no previous experience with databases (except the sqlite3 one from the tutorial), it seems to me a good idea to have the following file structure:
Projects
'-MySite
  |-MySite
  '-MyDB
I'm finding it hard to figure out how to do it, with psql commands preferring to put the databases in some obscure directory instead. Perhaps it's not such a good idea?
Eventually I want to be able to test and develop my site (it'll be just a blog for a while, I'm still learning) locally (i.e. add a post, play with the CSS) and sync with the Heroku repository, but I also want to be able to add posts via the website itself occasionally.
The underlying data files (MyDB) have nothing to do with your project files and should not live under your project directory.
EDIT
Added two ways to sync your local database with the database on the Heroku server.
1) export-import
This is the simplest way; do the following steps every now and then (a small script sketching them follows the list):
make an export on the Heroku server by using the pg_dump utility
download the dump file
import the dump into your local database by using the psql utility
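Here is that sketch, assuming the Heroku database URL is available in an environment variable and that the pg_dump and psql utilities are installed locally; the variable and database names are placeholders:

```python
# sync_db.py -- export-import sketch: Heroku Postgres -> local Postgres.
import os
import subprocess

HEROKU_DB_URL = os.environ["HEROKU_DATABASE_URL"]  # assumed to be exported beforehand
LOCAL_DB = "mysite_dev"                            # hypothetical local database name

# 1) export on the Heroku side with pg_dump
subprocess.run(
    ["pg_dump", "--no-owner", "--dbname", HEROKU_DB_URL, "--file", "heroku_dump.sql"],
    check=True,
)

# 2) the dump file is now on local disk
# 3) import it into the local database with psql
subprocess.run(
    ["psql", "--dbname", LOCAL_DB, "--file", "heroku_dump.sql"],
    check=True,
)
```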
2) replication
A more sophisticated way of keeping your local db in sync all the time is replication. It is used in professional environments and it is probably overkill for you at the moment. You can read more about it here: http://www.postgresql.org/docs/9.1/static/high-availability.html