Microsoft Fakes (Shims and/or Stubs) on a C# method with SQL code - unit-testing

I am trying to learn a bit more about unit testing, using the out-of-the-box functionality (I believe it is MSTest.exe) and Microsoft Fakes (stubs and shims).
I am using Visual Studio 2012 Ultimate and the .NET 4.5 Framework.
Given the following code, which calls a stored procedure (SQL Server) that returns a single output value (for simplicity):
public string GetSomeDatabaseValue()
{
    string someValue = String.Empty;
    SqlParameter parameter = new SqlParameter();
    parameter.ParameterName = "@SomeParameter";
    parameter.Direction = ParameterDirection.Output;
    parameter.SqlDbType = SqlDbType.NVarChar;
    parameter.Size = 50;
    try
    {
        using (SqlConnection connection = new SqlConnection(ConfigurationManager.ConnectionStrings["connection"].ConnectionString))
        {
            using (SqlCommand command = new SqlCommand())
            {
                command.Connection = connection;
                command.CommandType = System.Data.CommandType.StoredProcedure;
                command.CommandText = "SomeStoredProcedure";
                command.Parameters.Add(parameter);
                connection.Open();
                command.ExecuteNonQuery();
                if (command.Parameters["@SomeParameter"] != null)
                {
                    someValue = Convert.ToString(command.Parameters["@SomeParameter"].Value);
                }
            }
        }
    }
    catch (SqlException)
    {
        throw;
    }
    return someValue;
}
Can this be tested using shims and/or stubs so that the output value can be set to a specific value?
If so, how?
Should I even be unit testing this?
I have followed this tutorial and managed to understand it and adapt it to the day-of-the-week example.
I'm waiting for the VS2012 database unit test functionality to become available (or be reinstated) by the end of 2012, as a Microsoft employee has commented, so that the database can be tested in isolation.

Microsoft Fakes is not an appropriate tool for testing this code. Write an integration test instead. In that test, use a local instance of SQL Server, explicitly create the data that the stored procedure expects to find in the database, call the stored procedure, and verify its result. Roll back the transaction or manually delete the data afterwards so that it does not affect other tests.
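If it helps to see what that looks like in practice, here is a minimal sketch of such an integration test using MSTest and a TransactionScope for cleanup; the seeding SQL, the table name and the expected value are placeholders invented around the code above:

using System;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;
using System.Transactions;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class SomeStoredProcedureIntegrationTests
{
    [TestMethod]
    public void SomeStoredProcedure_ReturnsExpectedOutputValue()
    {
        // Everything inside the scope is rolled back because Complete() is never called.
        using (new TransactionScope())
        using (SqlConnection connection = new SqlConnection(ConfigurationManager.ConnectionStrings["connection"].ConnectionString))
        {
            connection.Open();

            // Arrange: seed the rows the stored procedure expects (placeholder SQL).
            using (SqlCommand seed = new SqlCommand("INSERT INTO SomeTable (SomeColumn) VALUES ('expected value')", connection))
            {
                seed.ExecuteNonQuery();
            }

            // Act: call the stored procedure and capture its output parameter.
            using (SqlCommand command = new SqlCommand("SomeStoredProcedure", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                SqlParameter output = command.Parameters.Add("@SomeParameter", SqlDbType.NVarChar, 50);
                output.Direction = ParameterDirection.Output;
                command.ExecuteNonQuery();

                // Assert: verify the value the procedure wrote to the output parameter.
                Assert.AreEqual("expected value", Convert.ToString(output.Value));
            }
        }
    }
}

Because the scope is disposed without Complete(), the seeded rows never persist, so the test can be run repeatedly against the same local database.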

Related

NHibernate InMemory tests

I'm trying to use in-memory tests with NHibernate, and I succeeded in doing so in this little project:
https://github.com/demojag/NHibernateInMemoryTest
As you can see from the map of the object, I had to comment out this line:
//SchemaAction.None(); // if left in, the test fails: this option hides the schema exportation.
That comment is just a guess I've made, because so far I haven't found any serious documentation about schema actions.
I'm doing these tests because I have an existing situation I would like to test in memory, but all the entity maps have the option SchemaActions.None(), and when I try to execute the in-memory test I get a lot of "no such table" errors.
I would like to know whether there is a way to keep the schema action set to none and export the schema anyway (I know that could be an encapsulation violation, so it might not make a lot of sense).
I would like to leave this option set to none because this is a "database first" application and I can't take the risk of dropping and recreating the database every time the configuration is built; but I guess that if in the configuration I don't specify the "exposeConfiguration" instruction and SchemaExport, I should be pretty safe.
Thank you in advance,
Giuseppe.
You should be able to override any and all settings in the HBM or Fluent NHibernate mappings via the NHibernate.Cfg.Configuration.BeforeBindMapping event, which gives you programmatic runtime access to NHibernate's internal model for a mapping. See the example below, which sets up a BeforeBindMapping event handler that overrides the SchemaAction specified in the mapping to whatever you want.
public NHibernate.Cfg.Configuration BuildConfiguration()
{
    var configuration = new NHibernate.Cfg.Configuration();
    configuration.BeforeBindMapping += OnBeforeBindMapping;
    // A bunch of other stuff...
    return configuration;
}

private void OnBeforeBindMapping(object sender, NHibernate.Cfg.BindMappingEventArgs bindMappingEventArgs)
{
    // Set all mappings to use the fully qualified namespace to avoid class name collision
    bindMappingEventArgs.Mapping.autoimport = false;

    // Override the schema action to "all"
    foreach (var item in bindMappingEventArgs.Mapping.Items)
    {
        var hbmClass = item as NHibernate.Cfg.MappingSchema.HbmClass;
        if (hbmClass != null)
        {
            hbmClass.schemaaction = "all";
        }
    }
}
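If the end goal is the in-memory test from the question, two details matter: the handler has to be attached before the mappings are added (BeforeBindMapping fires as each mapping is bound), and the schema has to be exported onto the same open connection the test session uses, because an in-memory SQLite database only exists for the lifetime of its connection. A rough sketch, where the SQLite connection string and the exact SchemaExport overload are assumptions that may vary with your NHibernate version:

public ISession OpenInMemoryTestSession()
{
    var configuration = new NHibernate.Cfg.Configuration();
    configuration.BeforeBindMapping += OnBeforeBindMapping;   // must be attached before mappings are added
    configuration.SetProperty(NHibernate.Cfg.Environment.ConnectionString, "Data Source=:memory:;Version=3;New=True;");
    // ... add the same mappings and properties the real configuration uses ...

    var sessionFactory = configuration.BuildSessionFactory();
    var session = sessionFactory.OpenSession();

    // Export the schema onto the session's own connection so the tables
    // actually exist in the in-memory database this session is using.
    new NHibernate.Tool.hbm2ddl.SchemaExport(configuration)
        .Execute(false, true, false, session.Connection, null);

    return session;
}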

SharePoint 2013 query very slow

We set up a new SharePoint 2013 server to test how it would work as document storage.
The problem is that it is very slow, and I don't know why.
I adapted this from MSDN:
ClientContext _ctx;

private void btnConnect_Click(object sender, RoutedEventArgs e)
{
    try
    {
        _ctx = new ClientContext("http://testSP1");
        Web web = _ctx.Web;

        Stopwatch w = new Stopwatch();
        w.Start();
        List list = _ctx.Web.Lists.GetByTitle("Test");
        Debug.WriteLine(w.ElapsedMilliseconds); // 24 first time, 0 second time

        w.Restart();
        CamlQuery q = CamlQuery.CreateAllItemsQuery(10);
        ListItemCollection items = list.GetItems(q);
        _ctx.Load(items);
        _ctx.ExecuteQuery();
        Debug.WriteLine(w.ElapsedMilliseconds); // 1800 first time, 900 second time
    }
    catch (Exception)
    {
        throw;
    }
}
There aren't very many documents in the Test list: just 3 folders and 1 Word file.
Any suggestions/ideas why it is this slow?
Storing unstructured content (Word docs, PDFs, anything except metadata) in SharePoint's SQL content database is going to result in slower upload and retrieval than if the files are stored on the file system. That's why Microsoft created the Remote BLOB (Binary Large Object) Storage interface to enable files to be managed in SharePoint but live on the file system or in the cloud. The bigger the files, the greater the performance hit.
There are several third-party solutions that leverage this interface, including my company's offering, Metalogix StoragePoint. You can reach out to me at trossi@metalogix.com if you would like to learn more, or visit http://www.metalogix.com/Products/StoragePoint/StoragePoint-BLOB-Offloading.aspx

How to integration test an object with database queries

How can I write unit/integration tests that talk to a database? For example:
public int GetAppLockCount(DbConnection connection)
{
    string query :=
        "SELECT"+CRLF+
        " tl.resource_type AS ResourceType,"+CRLF+
        " tl.resource_description AS ResourceName,"+CRLF+
        " tl.request_session_id AS spid"+CRLF+
        "FROM sys.dm_tran_locks tl"+CRLF+
        "WHERE tl.resource_type = 'APPLICATION'"+CRLF+
        "AND tl.resource_database_id = ("+CRLF+
        " SELECT dbid"+CRLF+
        " FROM master.dbo.sysprocesses"+CRLF+
        " WHERE spid = @@spid)";

    IRecordset rdr = Connection.Execute(query);

    int nCount = 0;
    while not rdr.EOF do
    {
        nCount := nCount+1;
        rdr.Next;
    }
    return nCount;
}
In this case I am trying to exorcise this code of bugs (the IRecordset returns an empty recordset).
[UnitTest]
void TestGetLockCountShouldAlwaysSucceed();
{
    DbConnection conn = GetConnectionForUnit_IMean_IntegrationTest();
    GetAppLockCount(conn);
    CheckTrue(True, "We should reach here, whether there are app locks or not");
}
Now all I need is a way to connect to some database when running a unit/integration test.
Do people store connection strings somewhere for the test runner to find? An .ini, .xml or .config file?
Note: Language/framework agnostic. The code intentionally contains elements from:
C#
Delphi
ADO.NET
ADO
NUnit
DUnit
in order to drive that point home.
Now all I need is a way to connect to some database when running a unit/integration test.
Either use an existing database or an in-memory database. I've tried both, and currently use an existing database that is splatted and rebuilt using Liquibase scripts in an Ant file.
Advantages of in-memory: no dependencies on other applications.
Disadvantages: not quite as real, and it can take time to start up.
Advantages of a real database: it can be identical to the real world.
Disadvantages: requires access to a third-party machine, and more work setting up a new user (i.e. creating a new database).
Do people store connection strings somewhere for the test runner to find? An .ini, .xml or .config file?
Yep. In C# I used a .config file; in Java, a .props file. With in-memory you can check this into version control, as it will be the same for each person; with a real database running somewhere, it will need to be different for each user.
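For example (C#): the connection string lives in the test project's app.config, and the helper the tests call simply reads it through ConfigurationManager. The "IntegrationTest" name and the server/database values below are made up:

<!-- app.config of the test assembly -->
<configuration>
  <connectionStrings>
    <add name="IntegrationTest"
         connectionString="Data Source=localhost;Initial Catalog=MyAppTests;Integrated Security=True"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>

using System.Configuration;
using System.Data.Common;
using System.Data.SqlClient;

static DbConnection GetConnectionForUnit_IMean_IntegrationTest()
{
    // Reads the "IntegrationTest" entry defined in app.config above.
    var settings = ConfigurationManager.ConnectionStrings["IntegrationTest"];
    var connection = new SqlConnection(settings.ConnectionString);
    connection.Open();
    return connection;
}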
You will also need to consider seed data. In Java I've used DbUnit in the past; not the most readable, but it works. Now I use a Ruby ActiveRecord task.
How do you start this? First, can you rebuild your database? You need to be able to automate this before going too far down this road.
Next, you should build up a blank local database for your tests. I go with one per developer; some other teams share one but don't commit. In a .NET/MS SQL world I think in-memory would be quite easy to do.
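As a rough sketch of automating that rebuild, assuming the schema lives in a version-controlled script and NUnit 3 is the runner (the fixture name, the script path and the naive GO splitting are all placeholders):

using System;
using System.Configuration;
using System.Data.SqlClient;
using System.IO;
using NUnit.Framework;

[SetUpFixture]
public class RebuildTestDatabase
{
    [OneTimeSetUp]
    public void RunSchemaScript()
    {
        string connectionString = ConfigurationManager.ConnectionStrings["IntegrationTest"].ConnectionString;
        string script = File.ReadAllText(@"schema\rebuild.sql");

        // Naive batch splitting on GO; real projects tend to use a migration
        // tool (Liquibase, DbUp, ...) or SMO for this instead.
        foreach (string batch in script.Split(new[] { Environment.NewLine + "GO" }, StringSplitOptions.RemoveEmptyEntries))
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(batch, connection))
            {
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }
}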

How to retrieve data from CRM 2011 by using webservice and SSIS

Goal:
Retrieve data from Dynamics CRM 2011 into my SQL Server R2 database by using the web service through Integration Services (SSIS). The web service call needs to be located inside the SSIS package. The data is going to be used for a data warehouse.
Problem:
How do I do it?
We only write to Dynamics, so I can't address the specific method name, but the general idea below should get you started.
Assumptions
Two variables have been defined in your package and they are passed to the script component as ReadOnlyVariables: CrmOrganizationName, CrmWebServiceUrl.
A script component has been added to the dataflow as a Source component. On the Inputs and Outputs tab, an appropriate number of columns have been added to Output 0 (or whatever you define your output collection as) with appropriate data types.
Inside the script, add a web reference to your CRM instance. This code assumes it's called CrmSdk.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Windows.Forms;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
// web reference
using CrmSdk;

[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    public override void CreateNewOutputRows()
    {
        // Get a reference to the CRM SDK
        CrmSdk.CrmService CrmService = new CrmSdk.CrmService();

        // An authentication token is required because CRM requires an OrganizationName
        // to identify the Organization to be used
        CrmSdk.CrmAuthenticationToken token = new CrmSdk.CrmAuthenticationToken();
        token.AuthenticationType = 0;
        token.OrganizationName = this.Variables.CrmOrganizationName;
        CrmService.CrmAuthenticationTokenValue = token;

        // Use default credentials
        CrmService.Credentials = System.Net.CredentialCache.DefaultCredentials;

        // Get the web service url from the config file
        CrmService.Url = this.Variables.CrmWebServiceUrl;

        //////////////////////////////////////////////////
        // This code is approximate.
        // Use the appropriate service call to retrieve
        // data and then enumerate through it. For each
        // row encountered, call the AddRow() method for
        // your buffer and then populate its fields. Be
        // wary of NULLs.
        //////////////////////////////////////////////////
        foreach (CrmSdk.entity person in CrmService.Get())
        {
            Output0Buffer.AddRow();
            Output0Buffer.FirstName = person.FirstName;
            Output0Buffer.LastName = person.LastName;
        }
    }
}
Caveats
There is no error handling, no checks for nulls, nothing elegant. The service should probably have been defined with a using statement, and so on, but this should provide an appropriate starting point for understanding how to consume a web service and load data into the pipeline.
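For what it's worth, the using pattern mentioned above would look roughly like this; the proxy generated from the web reference derives from SoapHttpClientProtocol and is therefore disposable, and everything else stays as approximate as the code above:

// the same setup as CreateNewOutputRows(), with the proxy disposed deterministically
using (CrmSdk.CrmService crmService = new CrmSdk.CrmService())
{
    CrmSdk.CrmAuthenticationToken token = new CrmSdk.CrmAuthenticationToken();
    token.AuthenticationType = 0;
    token.OrganizationName = this.Variables.CrmOrganizationName;

    crmService.CrmAuthenticationTokenValue = token;
    crmService.Credentials = System.Net.CredentialCache.DefaultCredentials;
    crmService.Url = this.Variables.CrmWebServiceUrl;

    // ... retrieve, enumerate and AddRow() exactly as above ...
} // Dispose() releases the underlying resources even if an exception is thrown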
The easiest solution for your requirement is to use a third-party library for SSIS. The commercial COZYROC SSIS+ library includes Dynamics CRM adapters, which support all deployment models: Premise, Live, Hosted, Federation, Office 365.

Using MbUnit3's [Rollback] for unit testing NHibernate interactions with SQLite

Background:
My team is dedicated to ensuring that straight from checkout, our code compiles and unit tests run successfully. To facilitate this and test some of our NHibernate mappings, we've added a SQLite DB to our repository which is a mirror of our production SQL Server 2005 database. We're using the latest versions of: MbUnit3 (part of Gallio), System.Data.SQLite and NHibernate.
Problem:
I've discovered that the following unit test does not work with SQLite, despite executing without trouble against SQL Server 2005.
[Test]
[Rollback]
public void CompleteCanPersistNode()
{
    // returns a Configuration for either SQLite or SQL Server 2005,
    // depending on how the project is configured.
    Configuration config = GetDbConfig();

    ISessionFactory sessionFactory = config.BuildSessionFactory();
    ISession session = sessionFactory.OpenSession();

    Node node = new Node();
    node.Name = "Test Node";
    node.PhysicalNodeType = session.Get<NodeType>(1);
    // SQLite fails with the exception below after the next line is called.
    node.NodeLocation = session.Get<NodeLocation>(2);

    session.Save(node);
    session.Flush();

    Assert.AreNotEqual(-1, node.NodeID);
    Assert.IsNotNull(session.Get<Node>(node.NodeID));
}
The exception I'm getting (ONLY when working with SQLite) follows:
NHibernate.ADOException: cannot open connection --->
System.Data.SQLite.SQLiteException:
The database file is locked database is locked
at System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
at System.Data.SQLite.SQLiteDataReader.NextResult()
at System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
at System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
at System.Data.SQLite.SQLiteCommand.ExecuteNonQuery()
at System.Data.SQLite.SQLiteTransaction..ctor(SQLiteConnection connection, Boolean deferredLock)
at System.Data.SQLite.SQLiteConnection.BeginDbTransaction(IsolationLevel isolationLevel)
at System.Data.SQLite.SQLiteConnection.BeginTransaction()
at System.Data.SQLite.SQLiteEnlistment..ctor(SQLiteConnection cnn, Transaction scope)
at System.Data.SQLite.SQLiteConnection.EnlistTransaction(Transaction transaction)
at System.Data.SQLite.SQLiteConnection.Open()
at NHibernate.Connection.DriverConnectionProvider.GetConnection()
at NHibernate.Impl.SessionFactoryImpl.OpenConnection()
--- End of inner exception stack trace ---
at NHibernate.Impl.SessionFactoryImpl.OpenConnection()
at NHibernate.AdoNet.ConnectionManager.GetConnection()
at NHibernate.AdoNet.AbstractBatcher.Prepare(IDbCommand cmd)
at NHibernate.AdoNet.AbstractBatcher.ExecuteReader(IDbCommand cmd)
at NHibernate.Loader.Loader.GetResultSet(IDbCommand st, Boolean autoDiscoverTypes, Boolean callable, RowSelection selection, ISessionImplementor session)
at NHibernate.Loader.Loader.DoQuery(ISessionImplementor session, QueryParameters queryParameters, Boolean returnProxies)
at NHibernate.Loader.Loader.DoQueryAndInitializeNonLazyCollections(ISessionImplementor session, QueryParameters queryParameters, Boolean returnProxies)
at NHibernate.Loader.Loader.LoadEntity(ISessionImplementor session, Object id, IType identifierType, Object optionalObject, String optionalEntityName, Object optionalIdentifier, IEntityPersister persister)
at NHibernate.Loader.Entity.AbstractEntityLoader.Load(ISessionImplementor session, Object id, Object optionalObject, Object optionalId)
at NHibernate.Loader.Entity.AbstractEntityLoader.Load(Object id, Object optionalObject, ISessionImplementor session)
at NHibernate.Persister.Entity.AbstractEntityPersister.Load(Object id, Object optionalObject, LockMode lockMode, ISessionImplementor session)
at NHibernate.Event.Default.DefaultLoadEventListener.LoadFromDatasource(LoadEvent event, IEntityPersister persister, EntityKey keyToLoad, LoadType options)
at NHibernate.Event.Default.DefaultLoadEventListener.DoLoad(LoadEvent event, IEntityPersister persister, EntityKey keyToLoad, LoadType options)
at NHibernate.Event.Default.DefaultLoadEventListener.Load(LoadEvent event, IEntityPersister persister, EntityKey keyToLoad, LoadType options)
at NHibernate.Event.Default.DefaultLoadEventListener.ProxyOrLoad(LoadEvent event, IEntityPersister persister, EntityKey keyToLoad, LoadType options)
at NHibernate.Event.Default.DefaultLoadEventListener.OnLoad(LoadEvent event, LoadType loadType)
at NHibernate.Impl.SessionImpl.FireLoad(LoadEvent event, LoadType loadType)
at NHibernate.Impl.SessionImpl.Get(String entityName, Object id)
at NHibernate.Impl.SessionImpl.Get(Type entityClass, Object id)
at NHibernate.Impl.SessionImpl.Get[T](Object id)
D:\dev\598\Code\test\unit\DataAccess.Test\NHibernatePersistenceTests.cs
When SQLite is used and the [Rollback] attribute is NOT specified, the test also completes successfully.
Question:
Is this an issue with System.Data.SQLite's implementation of TransactionScope which MbUnit3 uses for [Rollback] or a limitation of the SQLite engine?
Is there some way to write this unit test, working against SQLite, that will rollback so as to avoid affecting the database each time the test is run?
This is not a real answer to your question, but probably a solution to the problem.
I use an in-memory SQLite database for my integration tests. I build up the schema and fill the database before each test. The schema creation and initial data filling happen really fast (less than 0.01 seconds per test) because it's an in-memory database.
Why do you use a physical database?
Edit: in response to the answers to the question above:
1.) Because I migrated my schema and data directly from SQL Server 2005 and I want it to persist in source control.
I recommend storing a file with the database schema, and a file or script that creates the sample data, in source control. You can generate the file using SQL Server Management Studio Express, you can generate it from your NHibernate mappings, or you can use a tool like SQL Compare, and you can probably find other solutions for this when you need them. Plain text files are easier to store in version control systems than complete binary database files.
2.) Does something about the in-memory SQLite engine differ such that it would resolve this difficulty?
It might solve your problems, because you can recreate your database before each test: your database under test will be in the state you expect it to be in before each test is executed. A benefit of that is that there is no need to roll back your transactions. I have run similar tests with in-memory SQLite and they worked as expected.
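For reference, a rough sketch of that per-test setup with MbUnit; GetInMemoryDbConfig() is a made-up helper that builds a configuration pointed at an in-memory SQLite connection string, the seed objects are placeholders, and the SchemaExport overload may vary between NHibernate versions:

[TestFixture]
public class NodePersistenceTests
{
    private NHibernate.Cfg.Configuration _configuration;
    private ISessionFactory _sessionFactory;
    private ISession _session;

    [SetUp]
    public void CreateFreshInMemoryDatabase()
    {
        // Like GetDbConfig(), but using "Data Source=:memory:;Version=3;New=True;".
        _configuration = GetInMemoryDbConfig();
        _sessionFactory = _configuration.BuildSessionFactory();
        _session = _sessionFactory.OpenSession();

        // The in-memory database lives only as long as this connection,
        // so the schema must be exported onto the session's own connection.
        new NHibernate.Tool.hbm2ddl.SchemaExport(_configuration)
            .Execute(false, true, false, _session.Connection, null);

        // Seed the reference data the tests expect (placeholder values).
        _session.Save(new NodeType { /* ... */ });
        _session.Save(new NodeLocation { /* ... */ });
        _session.Flush();
    }

    [TearDown]
    public void DisposeEverything()
    {
        _session.Dispose();
        _sessionFactory.Dispose();
    }

    // [Test] methods use _session directly; no [Rollback] is needed because
    // the whole database disappears with the connection after each test.
}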
Check that you're not missing connection.release_mode=on_close in your SQLite NHibernate configuration (see the reference docs).
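If the configuration is built in code rather than XML, that property can be set with a one-liner (assuming an NHibernate.Cfg.Configuration instance named configuration):

// keep the connection open for the session's lifetime instead of releasing it after each transaction
configuration.SetProperty(NHibernate.Cfg.Environment.ReleaseConnections, "on_close");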
BTW: always dispose your ISession and ISessionFactory.
Ditch [Rollback] and use NDbUnit. I use this myself for this exact scenario and it has been working great.