Liquibase in multiple modules on same database

We have two modules, app and admin. The modules are coupled and share one database.
Is there a way to have Liquibase files in both modules for the same database?

Liquibase needs a table (DATABASECHANGELOG) to track the changes made on the schema it manages.
You can split your application across two different schemas and have independent changeset management,
OR
keep a single schema and manually manage your two changelogs. In that case you have to take care that changeset ids never collide, for example by giving each module's changesets a prefix, as in the sketch below.
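A minimal sketch of the prefixing idea, assuming XML changelogs; the prefixes, table name and authors are made up:

<!-- app module: db/changelog/app-changelog.xml -->
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd">
    <!-- every id carries an "app-" prefix so it can never clash with admin's -->
    <changeSet id="app-001" author="app-team">
        <createTable tableName="app_settings">
            <column name="id" type="bigint" autoIncrement="true">
                <constraints primaryKey="true" nullable="false"/>
            </column>
        </createTable>
    </changeSet>
</databaseChangeLog>

<!-- admin module: db/changelog/admin-changelog.xml uses id="admin-001" etc. -->

Strictly speaking, Liquibase identifies a changeset by the id/author/changelog-path triple, so distinct paths already avoid collisions; the prefix is a safety net that also keeps the shared DATABASECHANGELOG table readable.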

Yes, this is completely possible. You don't specify how you are running Liquibase or how your source code is arranged (or many details at all), but if you had the source for both modules in a single GitHub repository and you were on Linux, you could keep the changelog in one place and add a symlink so that the file also appears in the other place. If you were building standalone jars with Maven, you could configure the POMs to package the changelog into both jars. As long as the relative path is the same, whenever you trigger a database schema update from either jar, each application compares the changelog it has with the schema of the database it is working with, consults the DATABASECHANGELOG table, and comes to the same conclusion as to which changesets need to be deployed.
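A sketch of the symlink step, with a made-up layout in which each module expects its changelog at db/changelog.xml:

# run from the repository root; all paths are hypothetical
ln -s ../../app/db/changelog.xml admin/db/changelog.xml

The symlink target is resolved relative to the link's own directory (admin/db), which is why it climbs two levels before descending into app.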

Related

How to handle migrations for 3rd-party django apps in a deployment pipeline?

I'm using a 3rd party dependency, modeltranslation-lokalise, which requires me to run python manage.py makemigrations. This generates a migration file that sits within the package directory, and is therefore not source-controlled.
All other migration files are kept in source control, and I would like to do the same for these, as it lets us track changes to the database in the environments we use, ensuring they can be easily replicated and that operations are not duplicated on the same database (e.g. by having the same operation exist in differently-named migrations).
My question: how are these 3rd party migrations meant to be handled when building and deploying to production (e.g. via a Dockerfile in a build pipeline)?
My guess would be to find a way to reliably extract the migration files from the package and add them to source control.
The alternative would be to run makemigrations within the Dockerfile that builds the deployed image; however, that could result in the same migration being run twice on the database should the name of that migration change (e.g. if there's a timestamp in the name). So I'm thinking it's best to avoid that.
Note: this question is similar, but it considers the use of the third-party package optional and doesn't seem to consider the above alternatives.
I solved this using the answer to this question:
Django migration file in an other app?
By using the MIGRATION_MODULES setting, I was able to point Django at a package in my source directory, so the generated migrations land there and can be committed to source control.
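A minimal sketch of that setting, assuming the app label is modeltranslation_lokalise and a hypothetical package third_party_migrations inside the project:

# settings.py
# Route the 3rd-party app's migrations into our own source-controlled tree.
# The target package must exist and contain an __init__.py file.
MIGRATION_MODULES = {
    "modeltranslation_lokalise": "third_party_migrations.modeltranslation_lokalise",
}

After this, python manage.py makemigrations writes the generated files under third_party_migrations/, where they can be committed like any other migration.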

Best way to update Doctrine migrations tables

Overview
I have an application that uses Doctrine migrations. The application is a code base used to manage multiple businesses, all of which run the same code in their own unique environments, i.e. each code base is identical other than configuration.
Mistakes we made
One of the mistakes the dev team made was to include core migrations in the customer folders, i.e. with each new application we have to drop in the migration files or run migrations:diff to get going, which I feel is inefficient and can lead to a mess.
What I would prefer is to have core migrations as part of the core code since it rarely changes and custom migrations on the client app side.
What I want to do
I want to move all core structure migrations to our code files
I want to have a single custom migration to drop in customised data in the client migration folder.
The problem
The problem I face is pretty much how to reorganize the migrations without breaking databases on existing client applications.
I have thought of two solutions:
Solution 1:
Add blank migrations as placeholders for the new migrations I want.
Commit these to the repo and deploy to our environments.
They will be run, nothing will be changed, and the migrations table will record them as having been executed.
Next, update the blank migrations to contain the actual code I want, and empty all other migration files. Commit this to the environments.
Finally, remove the unwanted migration files and the unwanted database migration records.
Solution 2:
Change where migrations are recorded in the db to a new table
Remove all migration files and add blank migrations for the new ones I want
Commit this to the repo, let the migrations run, and record them as executed in the new table.
Add the migration code.
Now all new applications will have the updated migration files, and the old apps will pick up the new migration files as well...
Question:
Am I re-inventing the wheel? Is there a standard way to do this, as I am certain I am not the first to bump into this problem?
So for anyone who finds themselves in a similar position where they need to tidy up a mess of doctrine migrations, this should serve as a simple pattern to follow.
In our development environment we use continuous integration/git/kubernetes etc. and the following process works well with our environment.
The steps:
Update the migrations table name; you can do this in the configs quite easily:
'table_storage' => [
    'table_name' => 'migration_version',
    'version_column_name' => 'version_timestamp',
],
Next, delete your old migrations (delete the files) and run migrations:diff to generate a new one which will be a combination of all your changes.
Now comment out the code in the new file so that it's essentially an empty migration file.
On local, delete the old migrations table and run your build process which will add the new migration to the new table.
Commit to develop/staging/live etc. and repeat the process.
Now that the db in all your environments has the new migration recorded, you can uncomment the code (see the sketch below); it will not be executed when you commit the file, since its version already exists in your migrations table.
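For illustration, the temporarily emptied migration might look like this (the class name follows Doctrine's generated pattern; the SQL is made up):

<?php
declare(strict_types=1);

namespace DoctrineMigrations;

use Doctrine\DBAL\Schema\Schema;
use Doctrine\Migrations\AbstractMigration;

// Generated by migrations:diff, then emptied until every environment
// has recorded this version in the new migrations table.
final class Version20230101000000 extends AbstractMigration
{
    public function up(Schema $schema): void
    {
        // uncomment once all environments have recorded this version:
        // $this->addSql('CREATE TABLE example (id INT NOT NULL, PRIMARY KEY (id))');
    }

    public function down(Schema $schema): void
    {
        // $this->addSql('DROP TABLE example');
    }
}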
Hope this helps someone!

I am using the Flyway database migration tool. Can I archive artifacts for what was migrated, or do they have to stay under the current folder?

I am using the flyway database migration tool. Let's assume I have 100 sql scripts under a folder and I migrated them by applying them to a server. Later I added 50 more sql scripts. Can I move those 100 old sql scripts away and archive them somewhere (Artifactory or a remote share)? That way I would only have the 50 sql scripts which are new and needed for the current migration.
Is this possible? Or do all sql scripts have to be present under the directory?
It would be useful if you could edit your question to state the reason why you want to only include the 50 scripts for the current migration, as this would help me recommend an approach.
There are two ways that I know of that you could use; example commands for both follow below.
Create subfolders for each release, rather than have one single folder with an unmanageable list of scripts. This approach means that all migrations remain in the project, which has advantages such as being able to rebuild a test database from scratch (using flyway clean migrate) and also being able to reuse the project later on to repeat earlier deployments. If you do it this way, you will need to reference each subfolder using the flyway.locations parameter.
Remove the sql scripts that have already been run, as you have suggested. If you do this, you will need to run flyway baseline on any targets. This command marks a baseline version on each target so that flyway no longer looks for migrations below that version, and the missing migration files won't confuse it.
My personal preference is the first option. If you can't use this for whatever reason, we'd be interested to understand why.
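A sketch of both approaches, with hypothetical folder names and version numbers:

# option 1: one subfolder per release, all listed via flyway.locations
flyway -locations=filesystem:sql/release1,filesystem:sql/release2 migrate

# option 2: archive the old scripts, then mark version 100 as the baseline
# on each target so flyway stops looking for the removed files
flyway -baselineVersion=100 baseline
flyway migrate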

Migrations Plugin for CakePHP

I have a few questions about this plugin.
1- What does it do?
Is it for exchanging databases between teams, or changing their schema, or creating tables based on models, or something else?
2- If it is not meant to create tables based on models, where can I find a script that does this?
3- Can it work under Windows?
thanks
The Migrations plugin allows versioning of your db changes, much like what is available in other PHP frameworks and in Rails.
You essentially start with your original schema and create the initial migration. Each time you make a change you generate a 'diff' that gets stored in the filesystem.
When you run a migration, the database is updated with the changes you made. Think of deployment to a staging or production server, where you want the structure to stay in step with the code as it is moved from environment to environment (example commands below).
We are starting to look at this plugin so we can automate our deployments, as the DB changes are done manually right now.
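Assuming the CakeDC Migrations plugin on CakePHP 2.x, the cycle looks roughly like this (it runs on Windows too, via Console\cake.bat):

# snapshot the current schema as the first migration
Console/cake Migrations.migration generate

# ...alter the schema, then generate a 'diff' migration...
Console/cake Migrations.migration generate

# apply all pending migrations in another environment
Console/cake Migrations.migration run all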

Designing a full database migration stack

During development, I like the idea of frameworks like Entity Framework 4.3 Migrations (although I need it to work with sql scripts instead of Migration classes) keeping all developer databases up to date. Someone does an update to get the latest source, tries to run the application, and gets an error that they need to update their database to the latest migration (or the migration happens automatically). Migration files are timestamped so devs don't have to worry about naming two files the same or about the sequence the files need to be executed in.
When I get ready to build a WebDeploy deployment package, I want the package to contain the scripts required to move the production database to the latest db version. So somehow, MSBuild or WebDeploy need to make a decision about which scripts must be packaged. For deployment I don't want the application to try to update itself like what EF offers. I want to either hand the package to the IT department or do an auto-deploy through a deployment server.
Some of my questions are:
Can EF 4.3 work with sql scripts instead of DBMigration classes for my development needs (I'm already using it for my ORM so it would be nice if it could)?
Does MSBuild or WebDeploy understand the concept of database migrations (e.g. does it recognize the EF 4.3 __MigrationHistory table), or do I have to make sure to give it just the scripts it needs to run to bring my prod db to the latest migration? Manually deciding which scripts should be packaged is not something I want to do, so is there a MS WebDeploy extension that understands migrations?
Are my concerns and ideas valid? I'm just researching this stuff so I don't really know.
I think that your concerns are valid. During development anything that keeps the developer machines in sync will do. When deploying, more control is needed.
In my project we've decided to only use code-based migrations, to ensure that all databases are migrated in the same order with the same steps. Automatic migration and db creation are disabled by a custom initialization strategy that only checks that the db exists and is valid (see the sketch below).
I've written more about the details in Prevent EF Migrations from Creating or Changing the Database. I've also looked a bit into merge conflicts which will eventually happen with multiple developers.
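A sketch of such an initialization strategy; the context name is hypothetical and the checks are reduced to the essentials:

using System;
using System.Data.Entity;

// EF 4.3+: validate the database instead of creating or migrating it
public class ValidateOnlyInitializer : IDatabaseInitializer<MyContext>
{
    public void InitializeDatabase(MyContext context)
    {
        if (!context.Database.Exists())
            throw new InvalidOperationException("Database is missing.");
        if (!context.Database.CompatibleWithModel(throwIfNoMetadata: false))
            throw new InvalidOperationException("Database does not match the current model.");
    }
}

// registered once at startup:
// Database.SetInitializer(new ValidateOnlyInitializer());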
You can run SQL in the migration classes by calling the Sql method, or generate SQL from a migration by using the -Script parameter with the Update-Database command.
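For example (migration and table names are made up):

// inside a generated DbMigration class
public override void Up()
{
    // raw SQL executed as part of this migration step
    Sql("UPDATE Customers SET Active = 1");
}

# Package Manager Console: emit a SQL script instead of updating the db
Update-Database -Script -SourceMigration InitialCreate -TargetMigration AddCustomerActiveFlag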
No. They were looking at adding support for WebDeploy but apparently decided against it before RTM. They did, however, release a command-line tool, migrate.exe (and a PowerShell script, obviously), which you could call from either.
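The command-line tool can be wired into a build or deploy step, e.g. (assembly and config paths are hypothetical):

rem apply pending migrations from a deployment script
migrate.exe MyApp.dll /startupConfigurationFile=..\MyApp\Web.config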
What I do in my projects is create a startup module in my applications and run any migrations that haven't been deployed automatically - Triggering EF migration at application startup by code. It's not perfect - developers have to run the app after getting latest before changing the db - but it works for both getting latest and deployment.
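The startup hook itself is small; a sketch, where Configuration is the migrations configuration class that EF generates:

using System.Data.Entity.Migrations;

// run once at application startup, e.g. from Application_Start
var migrator = new DbMigrator(new Configuration());
migrator.Update();   // no-op when the db is already at the latest migration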