I need to use a custom database instead of the one Boxfuse creates when deploying my application to AWS. I created an AWS RDS database, but after deployment the app won't connect to it; instead it uses a database that Boxfuse creates while deploying. I use Flyway for migrations.
I tried putting the URL, username, and password into a boxfuse.yml configuration file, but nothing happens.
I read the Boxfuse documentation, but couldn't find a solution to my problem.
Is there an easy way to solve this?
To use your own database and prevent Boxfuse from provisioning one for you, the easiest approach is to recreate your app with -db.type=none (see https://cloudcaptain.sh/docs/payloads/springboot#databases).
Alternatively, if you want to use both the Boxfuse-provisioned database and your own, you have to manually define a second DataSource bean with the correct connection parameters, which you can then pass to Flyway or any other library that requires it.
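For the second-DataSource route, a minimal sketch might look like the following. The class name, JDBC URL, and credentials are placeholders, and it assumes Spring with flyway-core on the classpath:

```java
// Hypothetical sketch: a second DataSource pointing at your own RDS
// instance, alongside the Boxfuse-provisioned one. URL and credentials
// below are placeholders, not values from the question.
import javax.sql.DataSource;

import org.flywaydb.core.Flyway;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class ExternalDatabaseConfig {

    @Bean
    public DataSource externalDataSource() {
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setUrl("jdbc:mysql://my-rds-endpoint:3306/mydb"); // placeholder endpoint
        ds.setUsername("dbuser");
        ds.setPassword("secret");
        return ds;
    }

    // Run Flyway migrations against the external database on startup.
    @Bean(initMethod = "migrate")
    public Flyway flyway(DataSource externalDataSource) {
        return Flyway.configure()
                     .dataSource(externalDataSource)
                     .load();
    }
}
```

Note that if Spring Boot's Flyway auto-configuration is active you may need to disable it, or it will target the default DataSource instead.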
When we need to change a resource in AWS such as Cognito, for example adding Lambdas for login or changing custom attributes etc., is there a standard way to migrate these changes? One approach we are considering is a Spring Boot service that we would hit with a command to perform a migration defined in Java code. This doesn't feel like the correct way to do it, so I'm looking for advice.
I currently have a server build process that uses Terraform and deploys a server all from code.
I'm looking for a web UI with forms where I could either populate specific fields manually or issue API GET requests against vCenter (or wherever the server is being built) to populate them. The populated fields would be stored as the variables.tf file, and when someone hits submit, it would run terraform apply to build the server based on those variables. My guess is the Terraform binaries would have to live on that host so the command could run in the background.
It doesn't have to be a super fancy web page, just something that I could potentially make look good for Director-level folks.
Also, I don't want to use Terraform Enterprise yet. I've looked into a couple of open-source projects (Atlantis and Terrahub), but neither seems to be what I'm looking for.
I'm far from a web developer so any help would be awesome.
You can try SLD (Stack-Lifecycle-Deployment).
I think it has everything you need: it is very intuitive, with a web interface and a REST API to integrate it easily with the rest of your applications.
I'm currently working on an application with a MySQL back end hosted on GCP. The code is Node.js, and we use CircleCI and Sequelize. Right now, after deploying my scripts, if there are any schema changes I go into the database and add those fields manually with SQL scripts. We create migrations in our code and want to apply the new fields programmatically; at the moment I need to go into the container root and manually run npx sequelize db:migrate. Is there a way to automate that?
For background, I come from the database side, not the developer side, so please be gentle if my question seems to have an obvious answer.
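One common pattern (a sketch under assumptions, not from the question: the job name, step placement, and --env value are illustrative) is to add the migration command as a step in the CircleCI job that runs after the deploy succeeds, so it no longer has to be run by hand inside the container:

```yaml
# Sketch of a CircleCI job step (job layout assumed) that runs the
# Sequelize migrations automatically after deployment.
- run:
    name: Run Sequelize migrations
    command: npx sequelize db:migrate --env production
```

The step needs network access to the GCP MySQL instance and the same database credentials the app uses, typically supplied via CircleCI environment variables.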
When I deploy a Django project (I've tried two of them) to Azure App Service (Linux), I always get the error OperationalError: database is locked when trying to log in. Does anyone have an idea or a workaround to resolve the problem without changing to another database? I changed the default timeout as mentioned in the official Django documentation (https://docs.djangoproject.com/en/2.2/ref/databases/#sqlite-notes), but the problem remains. I want to keep using the SQLite database!
Thanks for your help.
App Service locks the db.sqlite3 file, preventing both reads and writes. This behavior doesn't affect an external database:
https://learn.microsoft.com/en-us/azure/devops/pipelines/languages/python-webapp?view=azure-devops#considerations-for-django
https://vscode-eastus.azurewebsites.net/docs/python/tutorial-deploy-app-service-on-linux
Test App: https://github.com/itsimplified/slick-crud-app
However, you should be able to make the web app work by moving the SQLite database to Azure Storage.
Please follow the steps below:
Mount your Azure Storage account to the web app.
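For example, the mount can be created with the Azure CLI. This is a sketch: the resource group, app name, storage account, share name, key, and mount path below are all placeholders:

```shell
# Mount an Azure Files share into the web app at /sqlite.
# Every name and the access key here are placeholders.
az webapp config storage-account add \
  --resource-group my-rg \
  --name my-django-app \
  --custom-id sqlite-mount \
  --storage-type AzureFiles \
  --account-name mystorageacct \
  --share-name sqlite-share \
  --access-key "<storage-account-key>" \
  --mount-path /sqlite
```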
Make the necessary changes to the path of the DB file in the application.
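For example, in the Django settings the SQLite file can be pointed at the mounted path. A minimal sketch, assuming the storage was mounted at /sqlite as in the previous step:

```python
# settings.py (relevant fragment) -- point Django's SQLite database
# at the Azure Storage mount. "/sqlite" is the assumed mount path.
import os

MOUNT_PATH = "/sqlite"

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": os.path.join(MOUNT_PATH, "db.sqlite3"),
    }
}
```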
You should then also be able to see the DB file in the storage account.
I'm currently running a Django app on the Azure server. I have a MySQL database to access using SSL. The SSL certificates I need to access the server are physically in the repo and I got my Django settings file to point to these using a relative path.
I have Azure set up to do continuous deployment from BitBucket. Problem is, at the end of the deployment, it will copy over all the files EXCEPT for the .pem files that I need.
I have to manually copy over the certificates everytime I push a commit. The files are in static/certs/*.pem
Is there something wrong with Azure? Or BitBucket? Or is there a better way of doing this?
I figured it out. Anything put manually inside the static folder gets cleaned out by Azure during deployment.
Just don't put anything inside the static folder