Is there a way to change schema with AWS Amplify? - amazon-web-services

I have an existing AWS Amplify schema with data deployed to DynamoDB tables.
I want to change the AWS Amplify schema.
When I change the schema, how do I include the data in my old tables and migrate them to the new tables created by AWS Amplify?

The answer to this depends on how much you are changing your schema. If you are just adding new attributes to your models or taking attributes away, you won't need to do anything. If you are renaming models or creating new ones, it gets trickier. My advice would be to add all the new schema models you want without removing the old ones, then write a few migration scripts that use DynamoDB directly to migrate your data. Once all of the old data is migrated, you can delete your old models.
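A migration script of the kind described can be sketched with boto3. This is only a sketch: the table names, and the `transform()` logic, are placeholders for your own schema change, not anything Amplify generates.

```python
def transform(item):
    """Placeholder: reshape an old item into the new model's shape.
    Adjust per your schema change (rename keys, fill defaults, ...)."""
    new_item = dict(item)
    new_item.setdefault("status", "ACTIVE")  # example: a new attribute with a default
    return new_item

def migrate(old_table_name, new_table_name):
    # boto3 is imported lazily so transform() stays testable without it.
    import boto3
    dynamodb = boto3.resource("dynamodb")
    old_table = dynamodb.Table(old_table_name)
    new_table = dynamodb.Table(new_table_name)

    scan_kwargs = {}
    # batch_writer() buffers and retries BatchWriteItem calls for us.
    with new_table.batch_writer() as writer:
        while True:
            page = old_table.scan(**scan_kwargs)
            for item in page["Items"]:
                writer.put_item(Item=transform(item))
            if "LastEvaluatedKey" not in page:
                break  # no more pages to scan
            scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]

if __name__ == "__main__":
    # Hypothetical Amplify-generated table names; check yours in the console.
    migrate("OldPosts-dev", "NewPosts-dev")
```

For large tables you would want a parallel scan and error handling, but the shape of the job is the same: scan the old table page by page, transform each item, and batch-write it into the new table.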

Related

Is there a way to apply all flyway schema versions to a new database to bring it up to date?

I use flyway for DB schema migrations.
But now I also want to make it possible to dynamically create a new database (for testing), update it to the latest schema, and fill it with test data.
Is it possible to have flyway baseline a new DB and apply ALL schema version scripts sequentially so the DB is updated to the latest state?
I could not find any examples of this. I don't want to have a separate process or scripts for creating a new DB with the right schema.

AWS DMS - Migrate - only schema

We have noticed that if a table is empty in SQL Server, the empty table does not come across via DMS; it only starts to show up after a record has been inserted.
Just checking, is there a way to get the schema only from DMS?
Thanks
You can use the AWS Schema Conversion Tool for moving DB objects and schema. It's a free tool from AWS and can be installed on an on-premises server or on EC2. It gives a good assessment report before you actually migrate the DB schema and other DB objects: it shows how many tables, stored procedures, functions, etc. can be migrated directly, and shows possible solutions for the rest.

How to manage schema changes to a BigQuery table via Terraform

We currently create a BigQuery table with a pre-defined schema using the following Terraform resource:
https://www.terraform.io/docs/providers/google/r/bigquery_table.html
The dev team decided to modify the schema by adding another column, so we are planning to change the schema in the above Terraform script to enable this.
What would be the best way to manage such schema migrations in production environments, given that we would be expected to retain the table data while the schema migration is performed?
It seems you cannot modify the schema of a table and retain its data using Terraform alone. Instead, you can use the bq command-line tool for this: https://cloud.google.com/bigquery/docs/managing-table-schemas#bq.
Looks like there has since been a fix for it:
https://github.com/hashicorp/terraform-provider-google/issues/8503
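The same in-place, data-retaining update can also be done programmatically. A minimal sketch using the google-cloud-bigquery client library (the table id and column names are placeholders); BigQuery only allows additive, nullable changes in place, so the sketch checks that first:

```python
def is_additive(old_fields, new_fields):
    """Pure check: every existing (name, type) pair must survive unchanged,
    since BigQuery only allows adding columns to an existing table in place."""
    return set(old_fields) <= set(new_fields)

def apply_schema(table_id, new_fields):
    # Imported lazily so is_additive() is testable without the library.
    from google.cloud import bigquery
    client = bigquery.Client()
    table = client.get_table(table_id)  # e.g. "myproject.mydataset.mytable"
    old_fields = [(f.name, f.field_type) for f in table.schema]
    if not is_additive(old_fields, new_fields):
        raise ValueError("non-additive change: recreate and migrate instead")
    table.schema = [bigquery.SchemaField(name, field_type, mode="NULLABLE")
                    for name, field_type in new_fields]
    client.update_table(table, ["schema"])  # existing rows are retained
```

This mirrors what the bq CLI does under the hood; existing rows get NULL in the newly added column.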

DynamoDB schema updates with AWS Amplify

According to the AWS Amplify documentation:
"objects annotated with #model are stored in Amazon DynamoDB";
"a single #model directive configures ... an Amazon DynamoDB table"; and
one can "push updated changes with amplify push".
It seems clear that amplify push creates a DynamoDB table for each #model.
My questions relate to schema updates:
I imagine that adding/removing a model or adding/removing a field in a model works by updating the schema document and then running amplify push. Is that right?
How does one rename a model or a field? How would amplify push know to rename vs. drop the old and add the new?
How does one implement a migration that requires some business logic, e.g., to update the contents of existing rows? Doing this without Amplify has already been addressed but it is unclear whether that would conflict with something that amplify push might try to do.
DynamoDB is schema-less and doesn't care about your application schema, as long as you don't try to change its hash key or range key.
Therefore, nothing really happens on the datastore side. If you drop a key and add a new one in your schema, your application will start to read and write data using the new key. The old key will simply be ignored from now on, but existing data will be kept in the datastore.
If you want to rename a key, you will have to migrate the data yourself with a mass update on the table. There are many ways to do it, the simplest being to scan the table and perform updates on the items found.
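The scan-and-update approach can be sketched with boto3. The table name, the hash key "id", and the attribute names are hypothetical; adjust them to your Amplify-generated table.

```python
def rename_in_item(item, old_key, new_key):
    """Pure helper: return a copy of the item with old_key moved to new_key."""
    out = {k: v for k, v in item.items() if k != old_key}
    if old_key in item:
        out[new_key] = item[old_key]
    return out

def rename_attribute(table_name, old_key, new_key):
    # Lazy import keeps rename_in_item() testable without boto3 installed.
    import boto3
    table = boto3.resource("dynamodb").Table(table_name)
    kwargs = {}
    while True:
        page = table.scan(**kwargs)
        for item in page["Items"]:
            if old_key not in item:
                continue  # already migrated or never had the attribute
            # Copy the value to the new attribute and drop the old one.
            table.update_item(
                Key={"id": item["id"]},  # assumes a simple hash key "id"
                UpdateExpression="SET #new = :v REMOVE #old",
                ExpressionAttributeNames={"#new": new_key, "#old": old_key},
                ExpressionAttributeValues={":v": item[old_key]},
            )
        if "LastEvaluatedKey" not in page:
            break
        kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]
```

Because the update is idempotent (items without the old key are skipped), the script can be re-run safely if it is interrupted partway through the scan.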
Have you tried compiling the schema with this:
amplify api gql-compile
Try running
amplify codegen models
before doing the
amplify push

Syncing the DB with existing tables through Django for tables in a particular schema, updating a few columns through the admin, and handling the rest automatically

I am doing a POC in Django, and I was trying to create the admin console module for inserting, updating, and deleting records through the Django admin console via models; that part was working fine.
I have 2 questions.
1. I need model objects for existing tables which live in a particular schema, say schema1.table1. As of now I was doing the POC against the public schema. Can this be done against a fixed, defined schema, and if yes, how? Any reference would be very helpful.
2. I also want to update only a few columns in the table through the console, with the rest of the columns handled automatically, like current timestamp and created date. Is this possible through the default Django admin, and if yes, kindly share any reference.
Steps for 1:
What I have done so far is create a class in models.py with the attributes author, title, body, and timeofpost.
Then I used sqlmigrate after makemigrations app to create the table, and after migrating I have been using the Django admin console to insert and update records in that table. But this was only for the POC.
Now I need to do the same for existing tables, so that I can insert or update records in those tables through the admin console.
Also, the tables are being created in the public schema by default. But I am using Postgres, and the existing tables live in different schemas; I want to insert, update, and delete rows in those existing tables.
I am stuck here because I don't know how to configure a model against existing database tables (in schemas other than public) so that I can interact with them through the Django admin console.
Steps for 2:
I also want the user to give input for only some of the columns. In this case, for example, the time of creation should not be an input from the user; it should be taken care of automatically when the record is created or updated.
Thanks
In order for Django to "interact" with an existing database you need to create a model for it, which can be done automatically as shown here. This assumes that your "external" database isn't going to change often, because you'll have to keep your models in sync, which is tricky - there are other approaches if you need that.
As for working with multiple database schemas - is there a reason you can't put your POC table in the same database as the others? Django supports multiple databases, but it will be harder to setup. See here.
Finally, it sounds like you are interested in setting the Django default field attribute. For an example of current time see here.