TDD with Sitecore 7 and TFS 2012 CI - unit-testing

So, testing with Sitecore. It's a topic of its own, and I've already found a lot of reading material on it (Sitecore Development chapter 8, Alistair Deneys' blog, the NextDigital blog, the iStern blog, ...), but in most of these cases they go with NUnit, custom test runners, etc. The most useful one so far (to me, in my context) was the iStern blog on mocking out Sitecore using Microsoft Fakes. But is this really the way to go?
I'm surprised that, with Hedgehog's TDS integrating so deeply with TFS and enabling CI for Sitecore development, there isn't more out there (yet) on how to use this setup for solid testing executed by TFS.
We're gearing up for a large new project that uses Sitecore to handle front-end user interaction, where 95% of the data lives behind a WCF service. That part can easily be tested and developed test-first. It's the last 5% residing within Sitecore (which, sadly, carries the highest business value: online payments) that still needs to be tested. Can we ever have intimate enough knowledge of Sitecore to mock it out? I'd be inclined to think not... and if so, how do we then run conclusive tests against Sitecore in our TFS CI build?
Last but not least, I get the feeling that the information currently out there is getting a bit dated (mainly judging by the remarks on the NextDigital blog). Does Sitecore 7 open up new ways to tackle this issue?
For those who'd see this as a philosophical rather than a technical question: there can only be one answer, and that is a technically accurate description of a way to use the Microsoft test framework, running in the TFS CI environment, to test code written for Sitecore.

Is Microsoft Fakes the way to go? In my opinion, no. Microsoft Fakes allows you to test code that is not designed to be testable. If you design your solution properly, a standard mocking framework should be sufficient.
Can we ever have enough intimate knowledge of sitecore to mock it out? This is kind of a trick question. Unless a third-party library was specifically designed for it and is something that you would consider a "stable dependency", you shouldn't try to mock it. Instead, wrap it with your own classes and abstractions and mock those.
Take a look at Synthesis and Glass Mapper. They are object-mapping frameworks that allow you to map Sitecore items to interfaces while maintaining page editor support. Glass, in particular has a wrapper around Sitecore.Context that can be mocked. Synthesis is supposed to be pretty testable as well, but I haven't tried it yet.
Using one of those mapping frameworks and a good SOLID design, you should be able to make most of your code testable. Just remember that the classes on the edges of your solution should be simple enough to not require testing.
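To illustrate the "wrap it and mock the wrapper" advice above, here is a minimal sketch. ISitecoreContext, SitecoreContextWrapper and GreetingService are made-up names (not Sitecore or Glass Mapper types), and Moq plus MSTest simply stand in for whatever standard mocking and test frameworks you use:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Moq;

// Hypothetical abstraction over the static Sitecore context; the name and
// members are illustrative, not an existing Sitecore or Glass Mapper API.
public interface ISitecoreContext
{
    bool IsLoggedIn { get; }
    string SiteName { get; }
}

// Production implementation delegates to the real static context
// (the exact Sitecore.Context members used here are illustrative).
public class SitecoreContextWrapper : ISitecoreContext
{
    public bool IsLoggedIn { get { return Sitecore.Context.IsLoggedIn; } }
    public string SiteName { get { return Sitecore.Context.Site.Name; } }
}

// Business logic depends only on the abstraction...
public class GreetingService
{
    private readonly ISitecoreContext _context;

    public GreetingService(ISitecoreContext context)
    {
        _context = context;
    }

    public string GetGreeting()
    {
        return _context.IsLoggedIn ? "Welcome back" : "Please log in";
    }
}

// ...so a plain mocking framework is all a unit test needs.
[TestClass]
public class GreetingServiceTests
{
    [TestMethod]
    public void Anonymous_Visitor_Is_Asked_To_Log_In()
    {
        var context = new Mock<ISitecoreContext>();
        context.Setup(c => c.IsLoggedIn).Returns(false);

        var service = new GreetingService(context.Object);

        Assert.AreEqual("Please log in", service.GetGreeting());
    }
}
```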

I was in the exact same situation as you, IvanL, a few weeks ago. I wanted to test some of my business logic running against Sitecore 7 without a mocking framework. I managed to do it, but only in a very specific scenario. Unfortunately, I haven't published my prototype solution or the slides explaining it yet, but I'll explain the basics of what I did.
In Sitecore 7, the move towards querying against the index with the Sitecore.ContentSearch namespace and using LINQ opened up a way for me to very easily unit test with fake index data.
There are some unit test examples out there, as you've seen, that use mocking frameworks. However, the classes they mock are actually quite simple to fake out yourself. If you implement ISearchIndex, you really only need to implement the CreateSearchContext method in order to start returning an IQueryable to work with in your tests.
To implement CreateSearchContext, you will likely need to create a fake provider search context implementation that will do the GetQueryable implementation.
Once you have those two classes set up, you've essentially got your 'index' covered. Add a property onto it where you can set the data collection from the unit test and then make sure the context returns that data collection.
That will let you build up a fake index with whatever data POCOs you want, and then pass that through to your standard provider implementations that are running your business data.
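To give a rough idea, the two fakes described above might look like the sketch below. Both Sitecore.ContentSearch interfaces have more members than shown here (and the exact signatures differ between Sitecore 7 releases), so treat this as an outline: the omitted members would simply be stubbed to throw NotImplementedException. The class names and the TestData property are my own inventions.

```csharp
using System.Collections.Generic;
using System.Linq;
using Sitecore.ContentSearch; // additional Sitecore.ContentSearch.* namespaces may be needed

// Fake search context: GetQueryable<T>() just returns the in-memory data.
public class FakeProviderSearchContext : IProviderSearchContext
{
    private readonly IEnumerable<object> _data;

    public FakeProviderSearchContext(IEnumerable<object> data)
    {
        _data = data;
    }

    public IQueryable<TItem> GetQueryable<TItem>()
    {
        return _data.OfType<TItem>().AsQueryable();
    }

    // Remaining IProviderSearchContext members omitted for brevity;
    // a real fake would stub them (e.g. throw NotImplementedException).
}

// Fake index: the only member the tests really care about is CreateSearchContext.
public class FakeSearchIndex : ISearchIndex
{
    // Assigned from the unit test before querying.
    public IEnumerable<object> TestData { get; set; }

    // Signature may vary slightly by Sitecore version.
    public IProviderSearchContext CreateSearchContext(SearchSecurityOptions options)
    {
        return new FakeProviderSearchContext(TestData ?? Enumerable.Empty<object>());
    }

    // Remaining ISearchIndex members omitted for brevity; stub them as well.
}
```

A test can then create a FakeSearchIndex, set TestData to a list of POCOs shaped like the production search result items, and hand the index to the class under test.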
The big thing to remember is that this only works for any code you may be writing that will use the new Sitecore 7 way of using LINQ and the IQueryable implementation. Older style code that is running using the Sitecore.Data.Item API still works the way it used to, and has the same limitations as before.
Update: The prototype I mentioned is now available for download: http://blog.nonlinearcreations.com/2014/02/sitecore-7-developers-quest-successful-unit-testing/


What are some good books and online resources for test-driven SharePoint development?

I am seeking articles, tutorials, examples, links, videos, books, or anything to get me started with test-driven SharePoint development.
I am also interested in finding out the pros and cons of this practice and things to look out for.
I don't think you need to understand it specifically for SharePoint. The concept of test-driven development is general enough to be applicable to any programming language or framework.
The wiki can get you started, and there are plenty of links on Google to go from there.
But if you really want some SharePoint-specific articles, there is The SharePoint Cowboy's article and an article on 21 apps.
I agree with Larry, "normal" test-driven applies to SharePoint just fine.
You should look at TypeMock for your unit / integration tests. It's a unit-test / faking framework that allows you to fake SharePoint objects. Not sure if it's still the only one that does it, but it certainly is a good product.
I find that doing test-driven development in SharePoint is a bit more time-consuming than regular .NET TDD, mostly because SharePoint objects are more complex and harder to isolate. For example, a SharePoint list often depends on a content type, which depends on site columns, etc. It means that you have to put in a bit more work to get something working.
A big pro (in my eyes anyway) is that once your setup is done, you can work fast. Setting up your environment for a test through the UI is a pain and takes a long time. For example, if you need to create your columns and then your content type through the UI every time you make a change to code that uses them, you'll lose a lot of time, and even more so if your test fails and you need to delete your content type because you broke something. Going with unit tests and fake SharePoint objects, you'll be able to do that in no time, plus you'll quickly get a better understanding of how the SharePoint objects work.
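To make the "fake SharePoint objects" idea concrete, here is a rough sketch using Typemock Isolator's Isolate API. AnnouncementRepository is a made-up class under test, the list title is a placeholder, and MSTest is used only as an example test framework:

```csharp
using Microsoft.SharePoint;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using TypeMock.ArrangeActAssert;

// Hypothetical class under test: it only reads from the list it is given.
public class AnnouncementRepository
{
    private readonly SPList _list;

    public AnnouncementRepository(SPList list)
    {
        _list = list;
    }

    public string ListTitle
    {
        get { return _list.Title; }
    }
}

[TestClass]
public class AnnouncementRepositoryTests
{
    [TestMethod, Isolated]
    public void Returns_Title_Of_Faked_List()
    {
        // Isolator can fake sealed SharePoint types such as SPList directly,
        // so no site collection or content type needs to exist.
        SPList fakeList = Isolate.Fake.Instance<SPList>();
        Isolate.WhenCalled(() => fakeList.Title).WillReturn("Announcements");

        var repository = new AnnouncementRepository(fakeList);

        Assert.AreEqual("Announcements", repository.ListTitle);
    }
}
```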

Devising a test strategy

As part of a new job, I have to devise and implement a complete test strategy for the company's new product. So far, all I really know about it is that it is written in C++, uses an SQL database and has a web API which is used by a browser client written using GWT.
As far as I know, there isn't much of an existing strategy, except for using Python scripts to test the web API.
I need to develop and implement a suitable strategy for unit, system, regression and release testing, preferably a fully automated one.
I'm looking for good references for:
Devising the complete test strategy.
Testing the web API.
Testing the GWT based application.
Unit testing C++ code.
In addition, any suitable tools would be appreciated.
Testing Computer Software is a great soup-to-nuts book on the entire testing process. In addition to the items you mentioned, you'll need to think about other types of testing (performance, security, localization, stress testing, to name a few) and how to manage the test process; test plans, issue tracking, test data, test cases, in addition to the tools.
There's a lot there, and you can't do everything at once. I think a phased approach would be best, where you identify the gaps, weaknesses, and risks in the current process, prioritize them, then set up a high level plan to address them one by one.
Software QA Testing and Test Tool Resources is a good starting place for finding some tools to fit your process. StickyMinds is a nice web site dedicated to software testing, and the folks here at StackOverflow certainly know their stuff, so don't be afraid to ask.
Good luck :)
You can find a ton of excellent information on testing, and on devising a test strategy in general, over on James Bach's blog, specifically by searching it for tips on testing strategies.
James is an excellent resource for information about how to do great software testing.
Best of luck.
There's a good conversation here on The Purpose of a Test Strategy.
As for testing tools, you could use Selenium for web testing and CppUnit for C++ unit testing.
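As an illustration of the Selenium side (Selenium has bindings for several languages; C# is used here only as an example), a minimal browser test might look like the sketch below. The URL, element ids, and page title are placeholders for whatever the GWT client actually exposes:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

[TestClass]
public class LoginSmokeTest
{
    [TestMethod]
    public void Valid_Login_Shows_Dashboard()
    {
        // Placeholder URL and element ids; substitute the real application's.
        using (IWebDriver driver = new ChromeDriver())
        {
            driver.Navigate().GoToUrl("http://localhost:8080/app");
            driver.FindElement(By.Id("username")).SendKeys("tester");
            driver.FindElement(By.Id("password")).SendKeys("secret");
            driver.FindElement(By.Id("loginButton")).Click();

            Assert.IsTrue(driver.Title.Contains("Dashboard"));
        }
    }
}
```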

GWT Unit Testing TDD and Tooling

I'm just starting out with GWT and so far so good. However, after reading some sample code, I wonder: is it necessary to have a high level of test coverage? (Most of the code is declarative with some attributes added, so I can see the sense in checking that some particular attributes are there, but not all of them.)
I would also be interested in hearing about any gotchas when doing TDD with GWT.
I'm using Eclipse, so if you're really happy with particular add-ins for GWT, I'd be happy to hear about those as well.
Thanks for the input
Edit: maybe I'm asking a very broad question, but even small pieces of information will help.
I come from NVelocity views with jQuery/ExtJS/Prototype/script.aculo.us, and this is a bit different.
When designing GWT applications to be easily testable, it's best to move as much logic out of the view as possible. Use a design pattern that makes GUI testing easier, such as Model-View-Presenter (MVP), which is widely used in building desktop applications (the C#/.NET folks have written a lot about this pattern).
You can use GWTTestCases to test remote communication and code that ultimately executes raw JavaScript (most of the GWT core classes require this, especially widgets). However, these tests are slow to execute, so you should prefer designs which put all that logic in objects that can be tested in plain ol' JUnit TestCases.
For more information about writing GWT applications test-first, I've written an article for Better Software magazine, which is available as a PDF online at my blog.
I think the best reference at the moment would be this Testing Methodologies Using Google Web Toolkit
I think you asked a pretty broad question, which is part of the reason why you didn't get a reply for a while.
Compared to traditional AJAX web development, one could argue a GWT application requires less testing. Because the GWT team has worked so hard to make sure that its widgets work consistently across all web browsers, you don't have to worry about cross-browser compatibility nearly as much for your own application.
That frees you up to focus on your own application. Create a separate test case for each of your own custom widgets and test that they behave as you expect, and then write higher-level tests for each module. Take the extra step to make your tests fully automatable - that way every time you make a change or are about to release, it's easy to run all of your tests.
http://code.google.com/docreader/#p=google-web-toolkit-doc-1-5&s=google-web-toolkit-doc-1-5&t=DevGuideJUnitIntegration

TDD and ADO.NET Entity Framework

I've been playing with the ADO.NET Entity Framework lately, and I find that it suits my needs for a project I'm developing. I also like its non-invasive nature.
After generating a data model from an existing database, you are faced with the task of integrating the generated model and your business logic. More specifically, I'm used to testing my classes that interact with the data store against mocks/stubs of the DAL interfaces. The problem is that you cannot do this with the ADO.NET Entity Framework, because the entities it generates are simple classes with no interfaces.
The question is: how do I apply a TDD approach to the development of an application that uses ADO.NET Entity Framework? Is this even possible or should I migrate to another DAL-generation toolset?
One of the big critiques against the Entity Framework has been that it is inherently hard to test, for example in the ALT.Net Vote of No Confidence that gef quoted.
Here is a blog post discussing how to get around this, and be able to test your code without hitting the database, when using Entity Framework.
If testability is a big concern, you might want to look at another ORM framework, such as NHibernate, at least until Entity Framework 2.0 is released.
Although the original question has been answered, I feel I might add something:
I am currently using the Entity Framework 4.0 on an intranet site I'm building. I am able to test everything in my business logic and controllers without a database connection using the POCO support that has been added.
Although the POCOs can be generated from the new T4 template included in VS 2010, something I haven't been able to find in VS 2010 is a T4 template for generating your object context (the object context basically works as a built-in unit of work for EF and is essential for mapping your EF objects to POCOs). Luckily, Joachim Lykke Andersen, in his blog post Entity Framework 4.0 Beta 1 – POCO, ObjectSet, Repository and UnitOfWork, wrote a T4 template for generating it, and it has been very helpful. If you pursue a solution using EF4 that is testable without a database connection, I highly recommend implementing something similar to his solution, which includes a generic repository, a unit-of-work wrapper, and a unit-of-work factory.
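The overall shape of such a setup is roughly as follows. The interface and class names are illustrative (they mirror the kind of abstractions described in that post rather than any EF API), and the in-memory fakes are what let business logic and controllers be tested without a connection string:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative abstractions; the names are made up, not EF types.
public interface IRepository<T> where T : class
{
    IQueryable<T> Query();
    void Add(T entity);
}

public interface IUnitOfWork : IDisposable
{
    IRepository<T> GetRepository<T>() where T : class;
    void Commit();
}

// In-memory fake used by unit tests instead of the ObjectContext-backed one.
public class InMemoryRepository<T> : IRepository<T> where T : class
{
    private readonly List<T> _items = new List<T>();

    public IQueryable<T> Query() { return _items.AsQueryable(); }
    public void Add(T entity) { _items.Add(entity); }
}

public class InMemoryUnitOfWork : IUnitOfWork
{
    private readonly Dictionary<Type, object> _repositories = new Dictionary<Type, object>();

    public IRepository<T> GetRepository<T>() where T : class
    {
        object repo;
        if (!_repositories.TryGetValue(typeof(T), out repo))
        {
            repo = new InMemoryRepository<T>();
            _repositories[typeof(T)] = repo;
        }
        return (IRepository<T>)repo;
    }

    public void Commit() { /* nothing to persist in memory */ }
    public void Dispose() { }
}
```

The production implementations would wrap the generated ObjectContext and its ObjectSets, while your controllers only ever see IUnitOfWork, so swapping in the in-memory version inside a test is a one-line change.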
Best of luck.
I agree that version 1 of the Entity Framework is a crime against design, and it definitely got my vote of no confidence. I credit the EF product team, though, for acknowledging the failure and responding by opening up their design process to the community. The next release isn't going to be perfect; it might not even be ready for use in a production-level application. But I think they're finally starting to understand what's important to those of us who know that bad design is bad business. That being said... I'm still suspicious. Continuous design-time feedback is new to these guys, and I've read quite a few statements on the ADO.NET blog that raise bright red flags. We'll see how it goes with the release of .NET 4.0.
They appear to be trying though:
Test-Driven Development Walkthrough with the Entity Framework 4.0
"The tight coupling of the persistence
infrastructure to the entity classes
largely eliminates the ability to
efficiently use very tight feedback
cycles on the business logic with
automated testing. In its current
state, EF entity classes cannot be
effectively unit tested independently
of the database.
The efficiency of automated unit
testing of behavioral objects is
largely a matter of how easy the
mechanics of test data setup are and
how quickly the tests can be executed.
Using the actual database will make
test data setup more laborious,
introduce data to satisfy relational
constraints that are not germane to
the test, and make test execution an
order of magnitude slower.
A team’s ability to do evolutionary
design and incremental delivery is
damaged by the Entity Framework’s
inattention to fundamental software
design principles like Separation of
Concerns."
Blatantly stolen from here:
http://efvote.wufoo.com/forms/ado-net-entity-framework-vote-of-no-confidence/
If you're looking specifically at DAL-generation tools, you'll have a hard time integrating them with TDD. Most DAL-generation tools I know also generate your business objects and tightly couple them to the DAL, making testing difficult.
You can look at OR-mapping tools like NHibernate, and maybe LINQ to SQL, that enable "persistence ignorance": you define your business objects yourself, and they have no links to the DAL or any other infrastructure code. This makes testing your business logic separately from your database much easier. I found it also supports other scenarios, like occasionally connected clients, far better.
This answer has changed to "Yes, you can".
You can generate POCOs and interfaces using customized T4 templates such as https://entityinterfacegenerator.codeplex.com/, then create mock objects to test EF inside and out without hitting the database.

Best practice for integrating TDD with web application development?

Unit testing and ASP.NET web applications are an ambiguous point in my group. More often than not, good testing practices fall through the cracks and web applications end up going live for several years with no tests.
The cause of this pain point generally revolves around the hassle of writing UI automation mid-development.
How do you or your organization integrate best TDD practices with web application development?
Unit testing will be achievable if you separate your layers appropriately. As Rob Cooper implied, don't put any logic in your WebForm other than logic to manage your presentation. All other logic and the persistence layer should be kept in separate classes, and then you can test those individually.
To test the GUI, some people like Selenium. Others complain that it is a pain to set up.
I layer out the application and at least unit test from the presenter/controller (whichever is your preference, MVC/MVP) down to the data layer. That way I have good test coverage over most of the code that is written.
I have looked at FitNesse, WatiN and Selenium as options to automate the UI testing, but I haven't got around to using these on any projects yet, so we stick with human testing. FitNesse was the one I was leaning toward, but I couldn't introduce it at the same time as introducing TDD (does that make me bad? I hope not!).
This is a good question, one that I will be subscribing too :)
I am still relatively new to web dev, and I too am looking at a lot of code that is largely untested.
For me, I keep the UI as light as possible (normally only a few lines of code) and test the crap out of everything else. At least I can then have some confidence that everything that makes it to the UI is as correct as it can be.
Is it perfect? Perhaps not, but at least it is still quite highly automated, and the core code (where most of the "magic" happens) still has pretty good coverage.
I would generally avoid testing that involves relying on UI elements. I favor integration testing, which tests everything from your database layer up to the view layer (but not the actual layout).
Try to start a test suite before writing a line of actual code in a new project, since it's harder to write tests later.
Choose carefully what you test - don't mindlessly write tests for everything. Sometimes it's a boring task, so don't make it harder. If you write too many tests, you risk abandoning that task under the weight of time-consuming maintenance.
Try to bundle as much functionality as possible into a single test. That way, if something goes wrong, the errors will propagate anyway. For example, if you have a digest-generating class - test the actual output, not every single helper function.
Don't trust yourself. Assume that you will always make mistakes, and so you write tests to make your life easier, not harder.
If you are not feeling good about writing tests, you are probably doing it wrong ;)
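To make the "test the actual output, not every single helper" advice above concrete, here is a small sketch; DigestGenerator is a made-up stand-in for whatever class produces your final artifact, and MSTest is just an example framework:

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Made-up class under test: builds a digest string from article titles.
public class DigestGenerator
{
    public string Generate(IEnumerable<string> titles)
    {
        return "Weekly digest: " + string.Join(", ", titles.ToArray());
    }
}

[TestClass]
public class DigestGeneratorTests
{
    [TestMethod]
    public void Digest_Mentions_Every_Article()
    {
        var digest = new DigestGenerator().Generate(new[] { "First post", "Second post" });

        // One test on the end result implicitly covers the private helpers.
        StringAssert.Contains(digest, "First post");
        StringAssert.Contains(digest, "Second post");
    }
}
```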
A common practice is to move all the code you can out of the codebehind and into an object you can test in isolation. Such code will usually follow the MVP or MVC design patterns. If you search for "Rhino Igloo" you will probably find the link to its Subversion repository. That code is worth studying, as it demonstrates one of the best MVP implementations on Web Forms that I have seen.
Your codebehind will, when following this pattern, do two things:
Transmit all user actions to the presenter.
Render data provided by the presenter.
Unit testing the presenter should be trivial.
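A bare-bones sketch of that split (view interface, presenter, passive code-behind) might look like the following; the names are illustrative rather than taken from Rhino Igloo, and Moq/MSTest stand in for whatever mocking and test frameworks you use:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Moq;

// The code-behind implements this; the presenter only knows the interface.
public interface ICustomerView
{
    string CustomerName { set; }
}

public class CustomerPresenter
{
    private readonly ICustomerView _view;

    public CustomerPresenter(ICustomerView view)
    {
        _view = view;
    }

    // Called by the code-behind when the page loads.
    public void Display(string customerId)
    {
        // Real lookup omitted; the point is that this logic lives off the page.
        _view.CustomerName = "Customer " + customerId;
    }
}

/* The code-behind stays dumb:
public partial class CustomerPage : System.Web.UI.Page, ICustomerView
{
    public string CustomerName { set { nameLabel.Text = value; } }

    protected void Page_Load(object sender, System.EventArgs e)
    {
        new CustomerPresenter(this).Display(Request.QueryString["id"]);
    }
}
*/

[TestClass]
public class CustomerPresenterTests
{
    [TestMethod]
    public void Display_Pushes_Name_To_View()
    {
        var view = new Mock<ICustomerView>();

        new CustomerPresenter(view.Object).Display("42");

        view.VerifySet(v => v.CustomerName = "Customer 42");
    }
}
```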
Update: Rhino Igloo can be found here: https://svn.sourceforge.net/svnroot/rhino-tools/trunk/rhino-igloo/
There have been attempts to get Microsoft's free UI Automation (included in .NET Framework 3.0) to work with web applications (ASP.NET). A German company called Artiso has written a blog entry that explains how to achieve that (link).
However, their blog post also links to an MSDN webcast that explains the UI Automation framework with WinForms, and after having a look at it, I noticed you need the AutomationId to get a reference to the respective controls. In web applications, however, the controls do not have an AutomationId.
I asked Thomas Schissler (Artiso) about this, and he explained that this was a major drawback of Internet Explorer. He referenced an older Microsoft technology (MSAA) and was himself hoping that IE8 would do better.
However, I also gave WatiN a try, and it seems to work pretty well. I even liked Wax, which lets you implement simple test cases via Microsoft Excel worksheets.
Ivonna can unit test your views. I'd still recommend moving most of the code to other parts. However, some code just belongs there, like references to controls or control event handlers.