Is there a way to apply all flyway schema versions to a new database to bring it up to date? - database-migration

I use flyway for DB schema migrations.
But now I also want to make it possible to dynamically create a new database (for testing), update it to the latest schema, and fill it with test data.
Is it possible to have flyway baseline a new DB and apply ALL schema version scripts sequentially so the DB is updated to the latest state?
I could not find any examples of this. I don't want to have a separate process or scripts for creating a new DB with the right schema.
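The behaviour being asked about can be illustrated generically: a versioned-migration tool keeps a history table and replays every pending script in order, so pointing it at an empty database brings it all the way up to date in one run. Below is a minimal sketch of that idea (not Flyway itself; all table names and scripts are hypothetical) using Python's built-in sqlite3:

```python
import sqlite3

# Hypothetical versioned scripts, analogous to Flyway's V1__, V2__ files.
MIGRATIONS = {
    "1": "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)",
    "2": "ALTER TABLE users ADD COLUMN email TEXT",
    "3": "CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER)",
}

def migrate(conn):
    """Apply every pending migration in version order, recording each one."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_history (version TEXT PRIMARY KEY)"
    )
    applied = {v for (v,) in conn.execute("SELECT version FROM schema_history")}
    for version in sorted(MIGRATIONS, key=int):
        if version not in applied:
            conn.execute(MIGRATIONS[version])
            conn.execute("INSERT INTO schema_history VALUES (?)", (version,))
    conn.commit()

# A brand-new in-memory database reaches the latest schema in one call.
conn = sqlite3.connect(":memory:")
migrate(conn)
```

Because the history table starts empty on a fresh database, every script is "pending" and gets applied; re-running the same call is a no-op.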

Related

I want to run migration from PostgreSQL to sqlite3

Currently I am using a PostgreSQL database in my project, but I also want to use SQLite for localhost development. When I run the migrate command I get errors, because SQLite does not support the array field. I want to convert the array field to a JSONField and run makemigrations, but the old migrations are still present, so I want to write custom logic in the migrations: use the old migrations when the database is PostgreSQL and the new ones when it is SQLite.
I don't want to create new migrations and a new migration table every time I switch databases.
SQLite is more of a flat-file system. I think the original idea is that you can store a small amount of data on a device and update the main database, or fetch info from a database, when the device is 'idle' as a background process. Some people may vote this comment down, but essentially SQLite is 'light' and a flat file, and those considerations should be taken into account. BTW, I see that there is MySQL for Android, but I have not tried it out.
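The "old migrations for PostgreSQL, new ones for SQLite" logic from the question is usually done by branching on the database vendor inside a migration (in Django, via the connection's vendor string). A minimal vendor-dispatch sketch in plain Python, with the Django objects replaced by hypothetical stand-ins:

```python
# Hypothetical sketch: choose vendor-specific DDL so one logical migration
# works on both PostgreSQL and SQLite. In a real Django migration the vendor
# string would come from schema_editor.connection.vendor.
DDL_BY_VENDOR = {
    # PostgreSQL keeps the native array column.
    "postgresql": "ALTER TABLE items ADD COLUMN tags TEXT[]",
    # SQLite has no array type, so fall back to a plain TEXT column
    # holding JSON-encoded data.
    "sqlite": "ALTER TABLE items ADD COLUMN tags TEXT",
}

def ddl_for(vendor):
    """Return the DDL for this vendor, defaulting to the SQLite fallback."""
    return DDL_BY_VENDOR.get(vendor, DDL_BY_VENDOR["sqlite"])
```

The point of the dispatch is that both databases share one migration history; only the statement executed differs per backend.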

Database migration in loopback4

I wanted to create database tables from the model definitions in LoopBack 4. How can I do that using LoopBack 4's auto-update functionality?
You can use the npm run migrate script in a LoopBack 4 application. See https://loopback.io/doc/en/lb4/Database-migrations.html for details. Note that there is an option to drop existing schemas before creating new ones.
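For reference, the typical invocation in a scaffolded LoopBack 4 project looks like the following (a CLI sketch; the exact scripts come from the generated package.json):

```shell
# Build first so the compiled models are current, then run auto-update.
npm run build
npm run migrate

# Or drop existing schemas and recreate them from scratch.
npm run migrate -- --rebuild
```

The --rebuild flag is the "drop existing schemas" option mentioned above, so use it only on databases whose data you can afford to lose.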

Is there a way to change schema with AWS Amplify?

I have an existing AWS Amplify schema with data deployed to DynamoDB tables.
I want to change the AWS Amplify schema.
When I change the schema, how do I include the data in my old tables and migrate them to the new tables created by AWS Amplify?
The answer to this depends on how much you are changing your schema. If you are just adding new attributes to your models or removing attributes, you won't need to do anything. If you are renaming models or creating new ones, it gets trickier. My advice would be to add all the new schema models you want without removing the old ones, then write a few migration scripts against DynamoDB directly to migrate your data. Once all of the old data is migrated, you can delete your old models.
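A migration script of the kind described is mostly a scan-transform-write loop. Here is a minimal sketch of the transform step (table and attribute names are hypothetical; in a real script the items would come from a boto3 scan of the old table and be batch-written to the new one):

```python
def migrate_item(old_item):
    """Map an item from the hypothetical old model to the new one:
    the 'fullName' attribute is renamed to 'name'; everything else copies over."""
    new_item = dict(old_item)
    new_item["name"] = new_item.pop("fullName")
    return new_item

# In a real script: for each page of a boto3 scan on the old table,
# transform every item and batch-write the results to the new table.
old_items = [{"id": "1", "fullName": "Ada"}, {"id": "2", "fullName": "Grace"}]
new_items = [migrate_item(i) for i in old_items]
```

Keeping the transform as a pure function makes it easy to test against sample items before pointing the script at the live tables.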

Updating old changesets - Liquibase

We use Liquibase for source-controlling the database. Initially we started with Postgres and created the changesets with column datatypes that are specific to Postgres.
For example, we have a changeset which creates a table with fields of type 'JSON'. Now we want to move to another database, and when we run the changeset against it, it fails to create the table. I tried adding failOnError="false", but the later changesets then failed because the table does not exist.
Could you please suggest how to refactor the old changeset to make it compatible with the other database as well?
You can make your changesets database-specific: re-create the changeset so that it works in the new DB and add a "dbms" attribute naming the new DB to it, while adding the same attribute, but with the old DB, to the old changeset.
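As a sketch, a pair of changesets restricted by the dbms attribute might look like this (table, column, and author names are hypothetical, and Oracle stands in for "the other database"):

```xml
<!-- Original changeset: now runs only on PostgreSQL. -->
<changeSet id="1" author="dev" dbms="postgresql">
    <createTable tableName="events">
        <column name="payload" type="JSON"/>
    </createTable>
</changeSet>

<!-- Re-created changeset for the new target, with a datatype it supports. -->
<changeSet id="1-oracle" author="dev" dbms="oracle">
    <createTable tableName="events">
        <column name="payload" type="CLOB"/>
    </createTable>
</changeSet>
```

Each changeset is skipped (not failed) on databases that don't match its dbms value, so later changesets that depend on the table still find it.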

AWS Data Migration Service (DMS) not moving identity, foreign keys, default values, indexes

I was able to clone one of my SQL Server databases using DMS. It copied clustered indexes, primary key definitions, etc., along with the data.
However, it did not move/copy the other constraints (identity, foreign key definitions, default values) or any other indexes.
I have generated/scripted out the indexes, default constraints and foreign keys, and executed them successfully. But is there a way to turn IDENTITY back on for the respective columns?
Figured out there is no way I can do this with AWS DMS, as it does not import secondary/foreign keys, indexes, or identity columns. You need to do it manually yourself by generating a script from SSMS or writing your own script.
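For the identity columns specifically: SQL Server has no ALTER COLUMN that simply switches IDENTITY on for an existing column, so the usual manual fix is to rebuild the table. A sketch of that approach (all table and column names are hypothetical):

```sql
-- Create a copy of the table with IDENTITY defined on the key column.
CREATE TABLE dbo.Orders_new (
    OrderId INT IDENTITY(1,1) PRIMARY KEY,
    CustomerId INT NOT NULL
);

-- Preserve the existing key values while copying the data across.
SET IDENTITY_INSERT dbo.Orders_new ON;
INSERT INTO dbo.Orders_new (OrderId, CustomerId)
SELECT OrderId, CustomerId FROM dbo.Orders;
SET IDENTITY_INSERT dbo.Orders_new OFF;

-- Swap the tables; drop Orders_old once verified.
EXEC sp_rename 'dbo.Orders', 'Orders_old';
EXEC sp_rename 'dbo.Orders_new', 'Orders';
```

Foreign keys referencing the original table would need to be dropped and re-created against the renamed table as well.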
Check this FAQ from Amazon:
Q. Does AWS Database Migration Service migrate the database schema for me?
To quickly migrate a database schema to your target instance you can rely on the Basic Schema Copy feature of AWS Database Migration Service. Basic Schema Copy will automatically create tables and primary keys in the target instance if the target does not already contain tables with the same names. Basic Schema Copy is great for doing a test migration, or when you are migrating databases heterogeneously e.g. Oracle to MySQL or SQL Server to Oracle. Basic Schema Copy will not migrate secondary indexes, foreign keys or stored procedures. When you need to use a more customizable schema migration process (e.g. when you are migrating your production database and need to move your stored procedures and secondary database objects), you can use the AWS Schema Conversion Tool for heterogeneous migrations, or use the schema export tools native to the source engine, if you are doing homogenous migrations like (1) SQL Server Management Studio's Import and Export Wizard, (2) Oracle's SQL Developer Database Export tool or script the export using the dbms_metadata package, (3) MySQL's Workbench Migration Wizard.