Camunda User Profile Management with Docker

When the Camunda Docker container is removed and re-created, I lose all the users and groups from the Camunda application. I tried to manage the user profiles outside of Docker through a shared volume, but it did not work. Is there any solution to keep Camunda users and groups outside of the container, or any option to export and import profiles systematically?

Camunda uses a relational database for persistence; your users and groups are stored in that DB as well.
If you keep losing all of that data, you are most likely running against an in-memory or temp-mounted file-system datasource (the default H2 database).
Set up a PostgreSQL or MySQL database in its own Docker container and make sure its data directory is persisted, for example on a named volume. If you then reconnect the Camunda container to that datasource, all of your stored data will still be there.
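As a rough illustration, a minimal docker-compose sketch along these lines could look as follows. It assumes the official camunda/camunda-bpm-platform image and its documented DB_* environment variables; image tags, names and credentials are placeholders to adapt to your setup.

```yaml
# Sketch only: Camunda connected to a persistent PostgreSQL container.
# Image tags, environment variables and credentials are assumptions --
# check the documentation of the Camunda image you actually use.
version: "3.7"
services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_DB: camunda
      POSTGRES_USER: camunda
      POSTGRES_PASSWORD: camunda
    volumes:
      - camunda-db-data:/var/lib/postgresql/data   # survives container removal

  camunda:
    image: camunda/camunda-bpm-platform:latest
    depends_on:
      - postgres
    environment:
      DB_DRIVER: org.postgresql.Driver
      DB_URL: jdbc:postgresql://postgres:5432/camunda
      DB_USERNAME: camunda
      DB_PASSWORD: camunda
    ports:
      - "8080:8080"

volumes:
  camunda-db-data:
```

With the database files on a named volume, removing and re-creating the Camunda container no longer affects users and groups; they are read back from PostgreSQL on startup.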

Related

Auto-updating Django postgres database when changes are made in legacy database

I'm working on a Django project with Docker that uses several legacy Microsoft SQL Server databases. Since I didn't want to alter those legacy databases, I used inspectdb and a Django database router to create models, and I then use that data to read and write a default Postgres database with all the Django-specific tables.
Everything was working fine: each of my model instances had its initial data from the SQL Server DB populated into the Postgres DB. However, whenever anything is updated in the SQL Server DB, the data is not updated in the Postgres DB unless the Docker container is stopped and restarted. I'm wondering: is there a way to auto-update my Postgres DB when changes are made in the connected SQL Server DBs without stopping the container?
Would I need to use something like Celery or a cron job to schedule updates? I've used cron jobs before, but only to pull from an API. Is it possible to run Celery or a cron job against a database rather than an API? Or is there a simpler solution?
I'm also running with nginx and gunicorn on an Ubuntu server, using volumes to persist the DB.
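To illustrate the kind of scheduled sync the question is asking about, here is a minimal sketch of a Django management command that copies rows from the legacy database into Postgres; it could be invoked from cron or wrapped in a Celery beat task. The model names, the "legacy" database alias, and the copied fields are hypothetical placeholders, not the asker's actual setup.

```python
# Hypothetical sketch: a Django management command that re-syncs rows from a
# legacy (read-only) database into the default Postgres database.  Model names,
# the "legacy" database alias, and the field names are assumptions.
from django.core.management.base import BaseCommand

from myapp.models import LegacyRecord, Record  # hypothetical models


class Command(BaseCommand):
    help = "Copy current rows from the legacy SQL Server DB into Postgres."

    def handle(self, *args, **options):
        # Read through the router / explicit alias pointing at SQL Server...
        for legacy in LegacyRecord.objects.using("legacy").all():
            # ...and upsert into the default Postgres database.
            Record.objects.update_or_create(
                external_id=legacy.pk,
                defaults={"name": legacy.name, "value": legacy.value},
            )
        self.stdout.write(self.style.SUCCESS("Legacy data synced."))
```

A cron entry such as `*/15 * * * * python manage.py sync_legacy` (or an equivalent Celery beat schedule) would then keep Postgres reasonably current without restarting the container.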

Changing an AWS RDS MySQL instance to Postgres

What is the best way to go about this? I have a mobile app that a project team developed, and they set up its database as a MySQL instance. However, with this new project I have with my own developers, we believe Postgres would better suit our needs - but I want everything on one DB instance (data between the mobile app and the new project will be shared). What is the best way to accomplish this?
You will need to create a new RDS instance to switch the engine type to Postgres.
While transitioning you will need to have both instances running, and to migrate the DB across you will want to keep the data synchronised between the two. Luckily, AWS has the Database Migration Service (DMS) for this.
You should migrate your existing application to use the new Postgres instance first, then remove the DMS setup and shut down the MySQL database.
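As a rough sketch of what the DMS piece can look like when scripted (it can equally be set up from the console), the boto3 call below creates a full-load-plus-CDC replication task. The region, ARNs and table-mapping rules are placeholders for resources you would create beforehand (source/target endpoints and a replication instance).

```python
# Sketch only: creating a DMS replication task with boto3.  The region, ARNs
# and table-mapping rules are placeholders, not real resources.
import json

import boto3

dms = boto3.client("dms", region_name="us-east-1")  # assumed region

table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-all",
            "object-locator": {"schema-name": "%", "table-name": "%"},
            "rule-action": "include",
        }
    ]
}

response = dms.create_replication_task(
    ReplicationTaskIdentifier="mysql-to-postgres",
    SourceEndpointArn="arn:aws:dms:...:endpoint:SOURCE",   # MySQL endpoint
    TargetEndpointArn="arn:aws:dms:...:endpoint:TARGET",   # Postgres endpoint
    ReplicationInstanceArn="arn:aws:dms:...:rep:INSTANCE",
    MigrationType="full-load-and-cdc",  # initial copy + ongoing replication
    TableMappings=json.dumps(table_mappings),
)
print(response["ReplicationTask"]["Status"])
```

Once the ongoing replication has caught up, you can point the application at Postgres, then retire the task and the MySQL instance as described above.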

IBM Blockchain - Car Lease Demo state database location?

I was working on the IBM blockchain examples and deployed the car-lease-demo sample on a Linux system. I am not able to understand how the database is stored. I read that the database should be located at "/var/hyperledger/production", but I could not find any such location.
Can anyone explain to me how the data is stored, how Hyperledger Fabric uses the database to store key-value pairs, and where the DB with all the data is located?
Also, I would like to know whether we can use a different DB configuration, such as NoSQL databases like Neo4j or MongoDB?
The default implementation uses LevelDB as the backing store for data, and it is present on every peer node. You can enter the peer's Docker container and see it for yourself.
Yes, you can change the default DB to another NoSQL DB. Here is an example of setting up CouchDB with Hyperledger Fabric.
As you can see, CouchDB is hosted in a separate container linked to the peer node via an open port (look at the Docker compose file for the connection details). You can do the same for any other supported NoSQL DB and use the corresponding PUT and GET APIs in chaincode to access it. But you will have to make sure that the data is replicated across all the DBs in time to maintain the consistency of the blockchain network.
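For illustration, a docker-compose excerpt along the lines the answer describes might look like this. The image tags, service names and CORE_LEDGER_STATE_* variables follow typical Fabric samples and should be checked against the Fabric release you actually run.

```yaml
# Sketch based on the Fabric samples: a peer using CouchDB as its state database.
# Image tags, service names and credentials are assumptions -- align them with
# your Fabric version and network definition.
version: "2"
services:
  couchdb:
    image: couchdb:3.1
    environment:
      - COUCHDB_USER=admin
      - COUCHDB_PASSWORD=adminpw
    ports:
      - "5984:5984"

  peer0.org1.example.com:
    image: hyperledger/fabric-peer:2.2
    environment:
      - CORE_LEDGER_STATE_STATEDATABASE=CouchDB
      - CORE_LEDGER_STATE_COUCHDBCONFIG_COUCHDBADDRESS=couchdb:5984
      - CORE_LEDGER_STATE_COUCHDBCONFIG_USERNAME=admin
      - CORE_LEDGER_STATE_COUCHDBCONFIG_PASSWORD=adminpw
    depends_on:
      - couchdb
```

Chaincode keeps using the usual PutState/GetState calls; only the peer's state database backend changes.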

Update SQLite database on disk

My Django application (a PoC, not a final product) uses, together with a backend library, a SQLite database in read-only mode. The SQLite database is part of the repo and deployed to Heroku. This is working fine.
I have the requirement to allow updates to this database via the Django admin interface. This is not a Django managed database, so from Django's point of view just a binary file.
I could allow a FileField to handle this, overwriting the database; I guess this would work on a self-managed server, but I am on Heroku and bound by the constraints of its disk-backed (ephemeral) storage. My SQLite file is not my web app's database, but the same limitations apply: I cannot write to the web app's filesystem and get any guarantee that the new data will be visible to the running web app.
I can think of alternatives, all with drawbacks:
Put the SQLite database in another server (a "media" server), and access it remotely: this will severely impact performance. Besides, accessing SQLite databases over the network does not seem easy.
Create a deploy script for the customer to upload the database via the usual deploy mechanisms. Since the customer is not technically skilled and I cannot provide direct support, this is not feasible.
Move out of Heroku to a self-managed server, so I can implement this quick-and-dirty upload without complications.
Do you have another suggestion?
PythonAnywhere.com
If you deploy your app there, you can easily access and update all of your files, and your SQLite3 database will be updated in real time without losing data.
Heroku (herokuapp.com), on the other hand, resets its ephemeral filesystem at least every 24 hours, wiping any changes to your SQLite3 database; that is why it is not a good fit for web apps that write to SQLite3.

Good way to deploy a django app with an asynchronous script running outside of the app

I am building a small financial web app with django. The app requires that the database has a complete history of prices, regardless of whether someone is currently using the app. These prices are freely available online.
The way I am currently handling this is by simultaneously running a separate Python script (outside of Django) that downloads the price data and records it in the Django database using the sqlite3 module.
My plan for deployment is to run the app on an AWS EC2 instance, change the permissions of the folder where the db file resides, and separately run the download script.
Is this a good way to deploy this sort of app? What are the downsides?
Is there a better way to handle the asynchronous downloads and the deployment? (PythonAnywhere?)
You can write the daemon code and follow this approach to push data to the DB as soon as you get it from the Internet. Since your daemon would run independently of Django, you'd also need to take care of data-synchronisation issues. One possible solution is to use a DateTimeField with auto_now_add=True in your Django model, which records when each row was entered into the DB. Hope this helps you or someone else looking for a similar answer.
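To make the suggestion concrete, here is a minimal sketch: a Django model with an auto_now_add timestamp, and a standalone daemon loop that writes prices through the Django ORM instead of the raw sqlite3 module. The Price model, the fetch_latest_prices() helper and the poll interval are hypothetical placeholders.

```python
# Hypothetical model (prices/models.py): the auto_now_add field records when
# each price row was inserted, which helps reconcile daemon and app views.
from django.db import models


class Price(models.Model):
    symbol = models.CharField(max_length=16)
    value = models.DecimalField(max_digits=12, decimal_places=4)
    recorded_at = models.DateTimeField(auto_now_add=True)
```

```python
# Hypothetical daemon (price_daemon.py): run separately from the web process,
# e.g. under systemd or supervisord.  It bootstraps Django so it can reuse the
# ORM instead of writing to the SQLite file with the raw sqlite3 module.
import os
import time

import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")  # assumed
django.setup()

from prices.models import Price                 # noqa: E402  (import after setup)
from prices.feeds import fetch_latest_prices    # hypothetical helper  # noqa: E402


def main(poll_seconds: int = 300) -> None:
    while True:
        for symbol, value in fetch_latest_prices():
            Price.objects.create(symbol=symbol, value=value)
        time.sleep(poll_seconds)


if __name__ == "__main__":
    main()
```

Going through the ORM from both processes keeps the SQLite access consistent; you still need the web app and the daemon to share read/write access to the same database file on disk.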