Handle multiple remote databases with PouchDB - Ionic 2

I'm trying to build an Ionic 2 application that syncs data from Cloudant, and I want to be able to sync data from multiple remote databases with PouchDB.
Any solutions?


What is the best way to replicate data from Oracle Goldengate On premise to AWS (SQL or NOSQL)?
I was just checking this for Azure. My company is looking for solutions for moving data to the cloud, with these requirements:
Minimal impact on on-prem legacy/third-party systems.
No Oracle DB instances on the cloud side.
Minimum "hops" for the data between source and destination.
PaaS over IaaS solutions.
Out-of-the-box features over native code and in-house development.
Oracle Server 12c or above.
Some custom filtering solution.
Some custom transformations.
(Filtering can be done in GoldenGate, in NiFi, in Azure mapping, or in ksqlDB.)
Solutions are divided into two cases:
If the solution is allowed to touch/read the log files of the Oracle server, you can use Azure ADF, Azure Synapse, K2View, Apache NiFi, or the Oracle CDC adapter for Big Data (check versions) to move data directly to the cloud, buffered by Kafka. However, the records inside Kafka will be in a special-schema JSON format (a consumer sketch follows this answer).
If you must use a GoldenGate trail file as the input to your sync/ETL pipeline, you can:
use a custom data provider that translates the trail file into a flowfile for NiFi (you need to write it yourself; see the small two-star project on GitHub for a direction), or
use the GitHub project combining GoldenGate for Big Data and Kafka over Kafka Connect, to also get translated SQL DML and DDL statements, which makes the solution much more readable.
Other solutions are corner cases, but I hope this gives you what you needed.
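For illustration, a minimal Python sketch of consuming those special-schema JSON records from Kafka with kafka-python. The topic name and broker address are assumptions, and the field names reflect what GoldenGate for Big Data's JSON formatter typically emits ("table", "op_type", "before", "after"); treat this as a direction, not a definitive implementation.

    import json
    from kafka import KafkaConsumer  # pip install kafka-python

    consumer = KafkaConsumer(
        "ogg.HR.EMPLOYEES",                 # hypothetical per-table topic
        bootstrap_servers=["broker:9092"],  # assumed broker address
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    for message in consumer:
        record = message.value
        op = record.get("op_type")          # typically "I"=insert, "U"=update, "D"=delete
        if op == "D":
            row = record.get("before", {})  # column values of the deleted row
        else:
            row = record.get("after", {})   # new column values for inserts/updates
        print(record.get("table"), op, row)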
In my company's case we have Oracle as a source db and Snowflake as a target db. We've built the following processing sequence:
On-premise OGG Extract works with the on-premise Oracle DB.
A data pump process sends the trails to another host.
On this host, an OGG for Big Data Replicat processes the trails and sends the result as JSON to an AWS S3 bucket.
Since Snowflake can handle JSON as a source of data and works with S3 buckets, it loads the JSON files into staging tables where further processing takes place (a loading sketch follows the link below).
You can read more about this approach here: https://www.snowflake.com/blog/continuous-data-replication-into-snowflake-with-oracle-goldengate/
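As a concrete illustration of that last step, a minimal sketch using the Snowflake Python connector; the connection parameters, stage, and table here are all hypothetical, assuming an external stage already points at the S3 bucket and the staging table has a single VARIANT column.

    import snowflake.connector  # pip install snowflake-connector-python

    conn = snowflake.connector.connect(
        user="LOADER", password="...", account="myorg-myaccount",
        warehouse="LOAD_WH", database="RAW", schema="OGG",
    )
    cur = conn.cursor()
    # COPY INTO pulls the JSON files produced by the Replicat from S3 into a
    # staging table, one VARIANT value per JSON record.
    cur.execute("""
        COPY INTO STAGING_EVENTS (payload)
        FROM @OGG_STAGE
        FILE_FORMAT = (TYPE = 'JSON')
    """)
    cur.close()
    conn.close()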

Using two different databases in Flask

I am currently using CouchDB as the backend for my Flask application.
I have some huge master files that I am not able to put into CouchDB (CouchDB throws an OS error when I build a view function for these lookup files... that's a whole different bag that I want to get to at a later point).
These lookup files do not have any relation to the datasets in CouchDB, and I was wondering if I can move only these lookup files to either a MySQL or a PostgreSQL database.
Is it possible for one Flask app to handle multiple databases (I would use a CouchDB manager and Flask-SQLAlchemy here, for instance)?
Regards,
Galeej
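For what it's worth, one Flask app can hold both connections side by side; they don't interact. A minimal sketch, assuming the lookup tables move to PostgreSQL while the main datasets stay in CouchDB; the database names and the Lookup model are hypothetical, and the plain couchdb client stands in for whichever CouchDB manager you use.

    import couchdb                       # pip install CouchDB
    from flask import Flask
    from flask_sqlalchemy import SQLAlchemy

    app = Flask(__name__)
    app.config["SQLALCHEMY_DATABASE_URI"] = "postgresql://user:pass@localhost/lookups"
    db = SQLAlchemy(app)                 # handles only the relational lookup tables

    couch = couchdb.Server("http://localhost:5984/")
    main_db = couch["appdata"]           # the existing CouchDB datasets, untouched

    class Lookup(db.Model):
        id = db.Column(db.Integer, primary_key=True)
        key = db.Column(db.String(128), index=True, unique=True)
        value = db.Column(db.Text)

    @app.route("/lookup/<key>")
    def lookup(key):
        row = Lookup.query.filter_by(key=key).first_or_404()
        return {"key": row.key, "value": row.value}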

How to point to a different database in Oracle APEX 5.x?

I have Oracle APEX configured on my laptop, pointing to an Oracle Express DB that is also on my laptop.
I want to point to a different database on another server (specifically, an Oracle eBusiness Suite database). How can this be achieved?
Does all the data live in the other database? Or does most of the data live in your local database and you just need to pull a bit of data from the other database?
If you are building applications that interact primarily with the data in the Oracle eBusiness Suite database, you'd realistically want to install APEX (if it is not already installed there) in the Oracle eBusiness Suite database and build your APEX application there. If you are building applications that interact primarily with data in your local database and you just need to pull a bit of data from the eBusiness Suite database, you can create a database link in your local database that connects to the remote database and reference objects over the database link.
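For the second case, a minimal sketch of creating and using a database link from the local database, shown here via python-oracledb; the link name, credentials, connect string, and remote table are all hypothetical.

    import oracledb  # pip install oracledb

    conn = oracledb.connect(user="apex_app", password="...", dsn="localhost/XEPDB1")
    cur = conn.cursor()

    # One-time setup: a link in the local database that connects to the
    # eBusiness Suite database on the other server.
    cur.execute("""
        CREATE DATABASE LINK ebs_link
        CONNECT TO apps IDENTIFIED BY apps_password
        USING 'ebs-host:1521/EBSDB'
    """)

    # Remote objects are then referenced with @link_name, from SQL or from APEX.
    cur.execute("SELECT COUNT(*) FROM some_ebs_table@ebs_link")
    print(cur.fetchone())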

Update SQLite database on disk

My Django application (a PoC, not a final product) uses, through a backend library, a read-only SQLite database. The SQLite database is part of the repo and deployed to Heroku. This is working fine.
I have the requirement to allow updates to this database via the Django admin interface. This is not a Django-managed database, so from Django's point of view it is just a binary file.
I could allow a FileField to handle this, overwriting the database (see the sketch at the end of this thread); I guess this would work on a self-managed server, but I am on Heroku and bound by the constraints imposed by Disk Backed Storage. My SQLite database is not my webapp's database, but the limitations apply all the same: I cannot write to the webapp's filesystem and get any guarantee that the new data will be visible to the running webapp.
I can think of alternatives, all with drawbacks:
Put the SQLite database on another server (a "media" server) and access it remotely: this would severely impact performance. Besides, accessing SQLite databases over the network does not seem easy.
Create a deploy script for the customer, to upload the database via the usual deploy mechanisms. Since the customer is not technically fit, and I cannot provide direct support, this is unfeasible.
Move off Heroku to a self-managed server, so I can implement this quick-and-dirty upload without complications.
Do you have another suggestion?
PythonAnywhere.com:
deploy your app there and you can easily access all of your files and update them, and your SQLite database will be updated in real time without losing data.
herokuapp.com erases your SQLite database every 24 hours (its filesystem is ephemeral and dynos restart daily), which is why it is not preferred for web apps that rely on SQLite.
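For reference, a minimal sketch of the FileField idea from the question, viable only on a host with a persistent, writable disk (such as the self-managed or PythonAnywhere options above, not Heroku); the target path, view name, and URL wiring are hypothetical.

    from pathlib import Path

    from django import forms
    from django.contrib.admin.views.decorators import staff_member_required
    from django.http import HttpResponse

    LOOKUP_DB = Path("/srv/app/data/lookup.sqlite3")  # assumed location of the file

    class DatabaseUploadForm(forms.Form):
        database = forms.FileField()

    @staff_member_required
    def upload_database(request):
        form = DatabaseUploadForm(request.POST or None, request.FILES or None)
        if request.method == "POST" and form.is_valid():
            # Write to a temporary file first, then swap it in atomically so a
            # concurrent reader never sees a half-written database.
            tmp = LOOKUP_DB.with_suffix(".tmp")
            with tmp.open("wb") as out:
                for chunk in form.cleaned_data["database"].chunks():
                    out.write(chunk)
            tmp.replace(LOOKUP_DB)
            return HttpResponse("Database updated.")
        return HttpResponse("POST a file in the 'database' field.", status=400)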

Sitecore agents on instances sharing a DB

Our production Content Delivery environment has two web servers and one DB server that the two web servers share.
I know that there are a lot of DB-related background tasks/agents that run in out-of-the-box Sitecore and do things to the DB, like cleaning up tables, etc. Is it OK to have both web servers doing these tasks? Or are there tasks that should be turned off on the second server so that both aren't trying to do the same thing on the same DB? I don't see anything about this specifically in their Scaling Guide. Thanks.
As far as I'm aware, there are not any issues with this setup - I have sites running like this with no problems. As long as both CD web servers share only the Web and Core databases, you should be fine.
Section 3.1 (Configuring a Publishing Target) in the Scaling Guide shows this setup in a diagram, where the Core and Pub databases are shared between the two CD boxes.
The Pub database is just another Web database that is configured as a publishing target.