Make Evolve (DB Migration) Erase include all schemas - visual-studio-2017

I'm using Evolve, which is based on Flyway, for database migrations. When I create the database from scratch, my migration scripts create schemas named rls and legacy, a function and a procedure in rls, and a table in legacy.
When I set the Evolve command to erase so that I can recreate the database from scratch, it first fails because I have a couple of security policies that depend on the tables, which I guess Evolve can't detect. So I drop those myself and build my project again to run the Erase command, and the output window shows:
4> Executing Erase...
4> Evolve initialized.
4> Successfully erased schema dbo.
4> Erase schema(s) completed: 1 erased, 0 skipped.
Notice it only mentions the dbo schema and doesn't even list the others as skipped. All of the objects in the dbo schema of my database are gone, but those in rls and legacy are still there, including the schemas themselves. So if I now switch my command back to migrate to create the database from scratch, it's going to blow up when it tries to create those objects because they already exist. According to the documentation, it's supposed to
[erase] all the database objects (tables, views...) of the schemas
created by Evolve or found empty.
Why isn't it erasing them?

You need to include the schema names in the call arguments like this:
.\evolve.exe -c "Host=10.10.10.10;Port=5432;Database=a;Username=a;Password=v;Enlist=true" -l .\db -s dbo -s rls -s legacy --metadata-table-schema migrations erase postgresql
Please note that Evolve is only able to erase schemas which were created by Evolve (or found empty). Otherwise, you get this error:
This schema was not empty when Evolve first started migrations.
So you probably need to remove the CREATE SCHEMA statements from your migration scripts and instead pass the schema names to Evolve; Evolve will then create them for you.

If Evolve marked a schema as empty when it first ran, then it can erase it:
$ dotnet evolve migrate '@args.txt'
Executing Migrate...
Evolve initialized.
Mark schema public as empty. <----------------------------- EMPTY
No metadata found.
Successfully applied migration V1_1_1__CreateUser.sql in 244 ms.
Successfully applied migration V1_1_2__CreateProduct.sql in 67 ms.
$ dotnet evolve erase '@args.txt'
Evolve initialized.
Executing Erase...
Successfully erased schema public.
Erase schema(s) completed: 1 erased, 0 skipped.
Note: delete the database manually if Evolve did not mark its public schema as empty.

Related

Avoiding InconsistentMigrationHistory with explicit inserts in migrations

I have a little problem with the order of my migrations. In my database there is a "Products" model whose migration is one of the first in the history list; call it 001_products. After this migration, others are executed that insert rows into this same table (inserts necessary for the basic operation of the application); call that migration 002_inserts_products.
The problem appeared when modifying the "Products" model; call it the 003_modify_products migration. The migration was applied after the inserts and made the tests fail (tests that generate a test database by running all the migrations), which followed this order:
001_products
002_inserts_products
003_modify_products
The solution then was to add a dependency on migrations that made inserts in "Products" with respect to the subsequent migration that modified that table. That is, make 002_inserts_products dependent on 003_modify_products.
However, this, which worked in the tests and locally (where the modification to "Products" had already been applied), does not work in production, since there the migration that has not yet been applied is the one that modifies the "Products" model.
That is, the state in production is:
[X] 001_products
[X] 002_inserts_products
[ ] 003_modify_products
When trying to do the new migration, the error that appears is:
django.db.migrations.exceptions.InconsistentMigrationHistory: Migration 002_inserts_products is applied before its dependency 003_modify_products on database 'default'.
The question is how to set the migrations to work both in test and in production (that is, in the context that the previous migrations have already been done)?
Unfortunately, you're trying to find a solution for a problem you created yourself by modifying an older migration to make it dependent on a newer migration to get around your tests failing.
The proper solution would be to do the following:
Remove 002_inserts_products's dependency on 003_modify_products, and return it to its original state.
Add 004_update_products to update any products inserted via 002_inserts_products so that they work with the table modifications in 003_modify_products.
Update your tests to accommodate the changes made in 003_modify_products.
It's never a good idea to change the expected ordering of migrations that have already run, because while it might work in your local environment, it's very likely to blow up when you're deploying to a server on which none of those migrations have run.
Also remember that tests failing is not always indicative of something you've done wrong -- tests, especially database tests, are not necessarily future-proof. It's totally normal to have to update tests due to schema changes.

How can I remove a Django migration file?

I made a migration, and realised that I made an error (by adding a default value), I then made a new migration which allows for null.
I don't want my colleagues to run the first migration which adds a default value to thousands of records. How can I delete that migration without breaking the current migrations (usually if you just delete a migration you get a heap of errors which are a pain to fix).
I'd assume you could use a command for this, something like:
django manage.py deletemigration <migration_id>
Squash
You can do a ./manage.py squashmigrations; since one of your migrations effectively cancels out the other, the end result will be the field being nullable. Your colleagues will not have to go through the step of adding a default value.
Squashing is the act of reducing an existing set of many migrations
down to one (or sometimes a few) migrations which still represent the
same changes.
Edit the migration file
You can edit the migration file by hand to remove the changes to the column. A migration can actually have an empty list of operations:
class Migration(migrations.Migration):

    dependencies = [
        (some stuff here),
    ]

    operations = []

How to change Sitecore Template field Item Id without data loss?

I recently noticed there is a difference in the Item ID of a Sitecore template field between two environments (Source and Target). Because of this, any change to the field value on data items using the template is not reflected in the Target Sitecore database.
Hence, we manually copy the values from Source to Target, which takes a lot of time to sync the two environments. Any idea how to change the template field Item ID in Sitecore without data loss in the Target instance?
Thanks
The template fields have most likely been created manually on the different servers, as @AdrianIorgu has suggested. I am going to suggest that you don't worry about merging fields and tools.
What you really care about is the content on the PRODUCTION instance of your site (assuming that this is Target). Content in any other environment should be regarded as throwaway.
With that in mind, create a package of the template from your PRODUCTION instance and then install it in the other environments, deleting the duplicate field from the Source instance. The GUIDs of the field should then match across all environments. Check this into your source control (using TDS or Unicorn or whatever). You can then correctly update any standard values, and those will be reflected across the servers when you deploy again.
If your other environments (dev/qa/pre-prod) result in data loss for that field then don't worry about it, restore a backup from PROD.
Most likely that happened because the field or the template was added manually on the second environment, without migrating the items using packages, serialization or a third-party tool like TDS or Unicorn.
As @SitecoreClimber mentioned above, you can use Razl to sync the two environments and see the differences, but I don't think you will be able to change the field's GUID to make the two environments consistent without any data loss. Depending on the volume of your data, fixing this can be tricky.
What I would do:
make sure the target instance has the right template by installing a package with the correct template from source (with a MERGE-MERGE operation), which will end up having a duplicate field name
write a SQL query to get a list of all the items that have a value for that field and copy the value to the new field
Warning: this SQL query below is just a sample to get you started, make sure you extend and test this properly before running on a CD instance
use YOUR_DATABASE
begin tran
declare @oldFieldId nvarchar(100), @newFieldId nvarchar(100), @previousValue nvarchar(100), @newValue nvarchar(100)
set @oldFieldId = '75577384-3C97-45DA-A847-81B00500E250' -- old field ID
set @newFieldId = 'A2F96461-DE33-4CC6-B758-D5183676509B' -- new field ID
/* versionedFields */
select ItemId, FieldId, Value
from [dbo].[VersionedFields] f with (nolock)
where f.FieldId = @oldFieldId
For this kind of task I suggest you use Sitecore Razl.
It's a tool for comparing and merging sitecore databases.
Razl allows developers to have a complete side-by-side comparison between two Sitecore databases, highlighting items that are missing or not up to date. Razl also gives developers the ability to simply move an item from one database to another.
Whether it's finding that one missing template, moving your entire database or just one item, Razl allows you to do it seamlessly and worry free.
It's not a free tool, you can check here how you can buy it:
https://www.razl.net/purchase.aspx

Upgrading to Django 1.7: table prefix using `class_prepared`

I'm at a new job. One of my predecessors used the class_prepared signal to apply a prefix to all the table names, so that django_content_type becomes oursite_django_content_type. I think this was unnecessary and ill-advised (I looked at doing this at a previous job and decided against it), as the documentation says that class_prepared is an implementation detail. Well -- now it's a problem.
The site seems to work okay (I haven't thoroughly tested it yet), but I can't run our unit tests. This is because migrations are a core feature now and the contenttypes migration defines the db_table name.
I dropped in some debugging statements, and this is the sequence of events:
class_prepared is run for contenttypes -- the db_table for the class is altered to oursite_django_content_type.
The migration is run and it creates the django_content_type table.
Tests are run and they can't find oursite_django_content_type.
I think I'm going to have to migrate the tables to their default values, but -- Does anyone have a suggestion so that I might put this off so that my successor will have to deal with it?

EF 4.3 Database Migrations - Is there a way to ignore errors?

Is there a way to ignore errors when running a manual migration?
We have client databases in different states and I need to get them all updated to the latest version.
The reason I ask about ignoring errors is because I just want to code my migration like this
public override void Up()
{
    AddColumn("ClientUser", "LastSyncTime", c => c.Guid());
    AddColumn("ClientUser", "FileTransferToken", c => c.Guid());
    AddColumn("ClientUser", "DateFileTransferTokenIssued", c => c.DateTime());
}
but naturally and expectedly it will throw an exception where the column already exists.
No. That is not a supported use case for EF Migrations. A migration drives a database from one defined state to another defined state. If you have databases in different states, you need multiple migrations, each covering just part of the transition.
If you want to start using migrations in an existing project with multiple databases, you should first move all your databases to the same state without migrations, and treat that as the initial state, after which all changes are handled only through migrations. Otherwise you will have a lot of problems.
This doesn't answer your specific question, but it may be an answer for your problem.
Use the Database Project in VS 2010 to create a schema of your target database.
You can use this "Gold Standard" schema to compare your other databases that are in different states and produce a delta script to take it from its current schema to the target schema.
Once you are at a known state across your databases, switch to database migrations for schema changes moving forward.
Keith