Cross-database in-memory unit testing

Objective: unit-test the DAO layer using an in-memory database (HSQL/Derby?).
My end database is Sybase, so the queries use cross-database joins.
Example query:
select table1.col1, table2.col2
from db1..table1, db2..table2
where table1.col1=table2.col2
Obviously, table1 is a table in database db1 and table2 is a table in database db2.
Is there any way to test this out?

If you use a db-specific feature (cross-db joins), then you have two options to prevent failure on another db:
Teach the other db to understand the query. For example, you can register new functions in H2. This probably won't work in your case, and this way you don't really test anything; you just prevent failure and/or allow your application to work on the other db.
Don't test it on the other db. Face it: even if your queries run on H2, you still have to run them on Sybase. Otherwise you simply don't know whether they work.
So one way or another you need to run your tests on Sybase. If you don't plan to support another database, then what's the point of fighting to make this specific test work on H2? It would be completely irrelevant to your production code.
And answering your question: is there any way to test this out?
On Sybase? Yes: just run your query and clear your database afterwards. You can use Vagrant or a dedicated db.
On H2/HSQLDB? Probably not.
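
To illustrate the "run it on Sybase and clean up after" approach, here is a minimal sketch. It is written in Python with pyodbc for brevity (the DSN and the seeded rows are assumptions; the same pattern applies in a JUnit test against a Sybase datasource):

import unittest
import pyodbc  # assumed ODBC driver; any Sybase driver works the same way

class CrossDbJoinTest(unittest.TestCase):
    def setUp(self):
        # "sybase_test" is a hypothetical DSN pointing at a dedicated test server
        self.conn = pyodbc.connect("DSN=sybase_test", autocommit=True)
        self.conn.execute("insert into db1..table1 (col1) values (1)")
        self.conn.execute("insert into db2..table2 (col2) values (1)")

    def test_cross_db_join(self):
        rows = self.conn.execute(
            "select table1.col1, table2.col2 "
            "from db1..table1, db2..table2 "
            "where table1.col1 = table2.col2").fetchall()
        self.assertEqual(len(rows), 1)

    def tearDown(self):
        # clear the database so the next run starts from a known state
        self.conn.execute("delete from db1..table1")
        self.conn.execute("delete from db2..table2")
        self.conn.close()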

Related

Django didn't save data to Postgres during test case execution

I have a webapp with a lot of heavy calculation logic. To stabilize it and spend less time on regression tests, I decided to create some unit test cases in Django.
My goal: don't create a new test DB (I use PostgreSQL, which is important) for every test run, and keep the DB with all the test data after the test cases finish. I need the test data for analysis and bug hunting.
I created a custom TestRunner, inherited from DiscoverRunner, with a few parameters changed:
keepdb = True
debug_mode = True
verbosity = 2
This works: I reuse one DB for all test cases.
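
A minimal sketch of such a runner (class and module names are illustrative):

from django.test.runner import DiscoverRunner

class KeepDBRunner(DiscoverRunner):
    """DiscoverRunner that keeps the test DB and runs in debug mode."""
    def __init__(self, **kwargs):
        kwargs.setdefault("keepdb", True)
        kwargs.setdefault("debug_mode", True)
        kwargs.setdefault("verbosity", 2)
        super().__init__(**kwargs)

# settings.py: TEST_RUNNER = "myproject.test_runner.KeepDBRunner"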
My problem is that I don't see the test data created during the test cases in my test DB. Whether some test cases failed or all passed, nothing changes.
Interesting facts for me:
During the test cases, queries to the DB for the new test data return the right results.
During the test cases I create new objects, and the sequences in the DB advance as if the test data existed in the DB.
What am I doing wrong?
You might be confusing database and data in the database.
I would guess that the keepdb flag only keeps the database (schema etc.).
Some test case classes remove the data at the end of the run, e.g. TransactionTestCase:
TestCase running on a database that does not support rollback (e.g. MySQL with the MyISAM storage engine), and all instances of TransactionTestCase, will roll back at the end of the test by deleting all data from the test database.
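A small sketch of the difference (Thing is a hypothetical model; neither test leaves rows behind, just for different reasons):

from django.test import TestCase, TransactionTestCase
from myapp.models import Thing  # hypothetical model

class WrappedInTransaction(TestCase):
    # TestCase wraps the test in a transaction and rolls it back
    def test_create(self):
        Thing.objects.create(name="rolled back at teardown")

class FlushedAfterwards(TransactionTestCase):
    # TransactionTestCase commits for real, then deletes all data at the end
    def test_create(self):
        Thing.objects.create(name="committed, then flushed at teardown")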

How to query data from the real database created by the pytest-django db fixture

I write db-related tests for my Django project using pytest. I use the db fixture to create records in the database. The tests work fine, but how do I select these records from the real Postgres db? When I try to select them via pgAdmin I get an empty list. Can you please clarify how the db fixture works and where it stores records?
pytest.mark.django_db wraps the whole test in a transaction (which is rolled back at the end of the test), so the data is not visible from pgAdmin.
You can try to use @pytest.mark.django_db(transaction=True) to enable testing transactions in your tests. However, it is slower and flushes the database after the test.
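A quick sketch of the difference (Thing is a hypothetical model):

import pytest
from myapp.models import Thing  # hypothetical model

@pytest.mark.django_db  # runs inside a transaction, rolled back afterwards
def test_rolled_back():
    Thing.objects.create(name="never visible from pgAdmin")

@pytest.mark.django_db(transaction=True)  # real commits, tables flushed afterwards
def test_committed_then_flushed():
    Thing.objects.create(name="visible while the test runs, flushed at the end")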
When running tests, Django creates a second, temporary database and runs the tests against it, so they won't affect your production/development database and you'll always get the same, predictable outcome.
On top of that, depending on how each test is configured, Django can wrap the test inside a transaction that is rolled back after that test has finished, so tests won't affect each other.
If you want to see this data in your real database after the tests have finished, you have to configure the tests whose data you want to persist so that they are not wrapped in a transaction. As you're using pytest-django, there are two possible ways to handle that, depending on which "style" of test suite you're using:
If you're using TestCase or TransactionTestCase from django.test, then switch to SimpleTestCase. Just be aware that together with the transaction rollback, you'll lose some other database-related features.
If you're using pytest.mark.django_db, simply remove it.
As for the separate database, you can either just connect to it during the tests or add --reuse-db to the command line when running the tests, so pytest-django won't destroy it after testing has completed. This separate database just has a test_ prefix added to your regular database name.

Am I required to use Django data migrations?

I migrated a database that was manipulated via SQL to use Django's migration module: https://docs.djangoproject.com/en/3.0/topics/migrations/.
Initially I thought of using migrations only for changes in the model, such as changed columns or new tables, and performing deletes, inserts, and updates through SQL, as they are constant and writing a migration for each case would be a little impractical.
Could using the migration module without using data migrations cause me some kind of problem or inconsistency in the future?
You can think about a data migration if you made a change that also requires a manual fix to the data.
Maybe you decided to normalize something in the database, e.g. to split a name column into a first name and a last name. If you have only one instance of the application with one database and you are the only developer, then you also won't write a data migration. But if you want to roll the change out on a live production site with 24x7 traffic, or you cooperate with other developers, then you'll probably prepare a data migration for their databases, or thoroughly test the migration on a copy of live data to make sure the update works correctly on the production site, without issues and with minimal downtime. If you don't write a data migration and had no immediate problem, then that's OK, and no worse than a possibly ill-conceived data migration.
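For the name-splitting example, such a data migration might look roughly like this (app, model, field, and migration names are illustrative):

from django.db import migrations

def split_names(apps, schema_editor):
    # use the historical model, not a direct import of the current one
    Person = apps.get_model("people", "Person")
    for person in Person.objects.all():
        first, _, last = person.name.partition(" ")
        person.first_name = first
        person.last_name = last
        person.save(update_fields=["first_name", "last_name"])

class Migration(migrations.Migration):
    dependencies = [("people", "0007_add_first_and_last_name")]
    operations = [
        # noop reverse lets you migrate backwards without undoing the split
        migrations.RunPython(split_names, migrations.RunPython.noop),
    ]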

Push from one SQL Server to another autonomously

I have an application that requires me to pull certain information from DB#1 and push it to DB#2 every time a certain entry in a table from DB#1 is updated. The polling rate doesn't need to be extremely fast, but it probably shouldn't be any slower than 1 second.
I was planning on writing a small service using the C++ Connector library, but I am worried about putting too much load on DB#1. Is there a more efficient way of doing this, such as built-in functionality within a SQL script?
There are many ways to accomplish this, so other factors you prefer may drive the approach.
If the SQL Server databases are on the same server instance:
A trigger on the DB1 tables that pushes to the DB2 tables
A stored procedure (in DB1 or DB2) that uses MERGE to identify changes and sync them to DB2, with a SQL Agent job calling the procedure on your schedule
Change Tracking enabled on the database and the desired tables, with a stored proc + SQL job sending the changes without any queries on the source tables
If they are on different instances or servers (these also work on the same instance):
SSIS package to identify changes and push them to DB2 (bonus: can work with Change Data Capture)
Merge Replication to synchronize changes
AlwaysOn Availability Groups to synchronize entire dbs
Microsoft Sync Framework
Knowing nothing about your preferences or comfort levels, I would probably start with Merge Replication: it can be a bit tricky and tedious to set up, but it performs very well.
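
That said, if you do end up with the small polling service you described, keying the poll off a rowversion column keeps the load on DB#1 very light, since each poll is a single indexed range query. A rough sketch in Python with pyodbc (table, column, and DSN names are assumptions; the same pattern applies from the C++ connector):

import time
import pyodbc  # assumed driver; the pattern is identical from C++

src = pyodbc.connect("DSN=db1", autocommit=True)  # illustrative DSNs
dst = pyodbc.connect("DSN=db2")

last_seen = 0  # highest rowversion value synced so far

while True:
    # rowversion increases on every insert/update, so only changed rows come back
    rows = src.execute(
        "select id, payload, cast(row_ver as bigint) as v "
        "from dbo.watched where cast(row_ver as bigint) > ?", last_seen).fetchall()
    for r in rows:
        dst.execute(
            "merge dbo.mirror as t "
            "using (select ? as id, ? as payload) as s on t.id = s.id "
            "when matched then update set payload = s.payload "
            "when not matched then insert (id, payload) values (s.id, s.payload);",
            r.id, r.payload)
        last_seen = max(last_seen, r.v)
    dst.commit()
    time.sleep(1)  # the acceptable polling interval from the question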
You can create a trigger in DB1 and db links between DB1 and DB2, so the trigger fires natively within DB1 and transfers the data directly to DB2.

ColdFusion: Move data from one datasource to another

I need to move a series of tables from one datasource to another. Our hosting company doesn't give shared passwords amongst the databases so I can't write a SQL script to handle it.
The best option is just to write a little ColdFusion script that takes care of it.
Ordinarily I would do something like:
SELECT * INTO database.table FROM database.table
The only problem with this is that cfquery doesn't allow you to use two datasources in the same query.
I don't think I could use QoQ either, because you can't tell it to use the second datasource; you have to give it a dbType of 'Query'.
Can anyone think of any intelligent ways of getting this done? Or is the only option to just loop over each line in the first query adding them individually to the second?
My problem with that is that it will take much longer. We have a lot of tables to move.
Ok, so you don't have a shared password between the databases, but you do seem to have the passwords for each individual database (since you have datasources set up). So, can you create a linked server definition from database 1 to database 2? User credentials can be saved against the linked server, so they don't have to be the same as the source DB. Once that's set up, you can definitely move data between the two DBs.
We use this all the time to sync data from our live database into our test environment. I can provide more specific SQL if this would work for you.
You CAN access two databases in the same query, but not two datasources.
I wrote something a few years ago called "DataSynch" for just this sort of thing.
http://www.bryantwebconsulting.com/blog/index.cfm/2006/9/20/database_synchronization
Everything you need for this to work is included in my free "com.sebtools" package:
http://sebtools.riaforge.org/
I haven't actually used this in a few years, but I can't think of any reason why it wouldn't still work.
Henry, why do any of this? Why not just use SQL Server Management Studio to move the selected tables over using the "import data" function? (Right-click on your DB and choose "import", then use the native client and the permissions for the "other" database to specify the tables.) Your Management Studio will need access to both DBs, but the db servers themselves do not need access to each other; Management Studio serves as a conduit.