Unit testing DTS packages

Does anybody have experience writing unit tests for SQL Server 2000 DTS packages?
I'm about to start working with DTS and jobs, so I want to be able to unit test as much as possible. I guess I could invoke dtsrun.exe via the command line, but perhaps someone else has better ideas.
Thanks
Fede

I came here looking for insight, but since no one else has given you ideas, I'll share one I came up with.
In my case (I know it won't be all cases), we use a lot of ActiveX (VBScript) scripts to accomplish things. I'm theorizing (I've not tried this) that if I move my ActiveX functionality into VBScript classes, I can unit test those classes, and then do very, very basic class instantiation and usage in the main function of the ActiveX script.
I've not tried to implement this; my project does not have the budget to do so. But in theory it seems sound. I'm also unaware of any challenges it may cause.
Please see another question I posted, which is loosely related to yours. The answer I accepted there doesn't fit your scenario; you'd be more interested in Michal's answer: Creating unit tests for your asp application

Saw this question had been here for a while, so I'm just throwing some ideas out there...
I'm wondering if you could write some code that uses the DTS API to call your packages, and then write assertions about those packages in the unit test tool for whatever language you used. For example, you could write your code in C# and use NUnit.
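Something like this rough sketch (untested; it assumes the DTS COM interop assembly, Microsoft SQLServer DTSPkg80, is referenced, and the package name and trusted-connection settings are placeholders for your own):

```csharp
using DTS; // COM interop: Microsoft SQLServer DTSPkg80
using NUnit.Framework;

[TestFixture]
public class OrderImportPackageTests
{
    [Test]
    public void Package_Executes_With_All_Steps_Succeeding()
    {
        var package = new Package2Class();
        object persist = null;
        // "OrderImport" and the trusted connection are placeholders.
        package.LoadFromSQLServer(
            "localhost", null, null,
            DTSSQLServerStorageFlags.DTSSQLStgFlag_UseTrustedConnection,
            null, null, null, "OrderImport", ref persist);
        try
        {
            package.Execute();
            // A package doesn't throw on step failure, so assert per step.
            foreach (Step step in package.Steps)
            {
                Assert.AreEqual(
                    DTSStepExecResult.DTSStepExecResult_Success,
                    step.ExecutionResult,
                    "Step failed: " + step.Name);
            }
        }
        finally
        {
            package.UnInitialize();
        }
    }
}
```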
Additionally, if your DTS packages are all just calling SQL stored procedures, you can unit test the procedures themselves (which would be true unit testing) using T-SQL Unit.
If it's higher level (integration) tests you want, then you might also consider hooking FitNesse up to the DTS API.
Not sure if any of this helps, but I hope I've at least given you some ideas.

Related

Big project, huge lack of test coverage, how would you approach this?

So I have this huge SF2 project, which is luckily written pretty 'OK'. Services are there, background jobs are there, no god classes, it's testable. But I've never gotten any further than just unit-testing stuff, so the question is basically: where do I start taking this further?
The project consists of SF2 and all the yada yada: Doctrine2, Beanstalkd, Gaufrette, some other abstractions. It's fine.
The one problem it has is some glue code in controllers here and there, but I don't see that as a big problem, since functional tests are going to be the main focus.
The infrastructure is set up pretty OK as well; it's covered by Docker, so CI is going to work out well too.
But it has basically gotten too large to test manually any longer, so I want full functional coverage on short notice and will let the unit testing grow over time. (I'm going to dive into the isolated objects as they need future adjustments and build tests for them in due course.)
So I've got the unit testing covered; that's going to need to grow over time. But I want to make some steps towards functional testing to get some quick gains on the testing department. YESTERDAY.
My plan as of now is to use Behat and Mink for this. The tests are going to be huge, so I might as well have them written as stories instead of code. Behat also seems to have an extension for Symfony's BrowserKit.
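For illustration, a story written against the standard MinkExtension step definitions might look something like this (the page, field names, and messages are just placeholders):

```gherkin
Feature: Login
  In order to use my account
  As a registered user
  I need to be able to log in

  Scenario: Logging in with valid credentials
    Given I am on "/login"
    When I fill in "email" with "user@example.com"
    And I fill in "password" with "secret"
    And I press "Log in"
    Then I should see "Welcome back"
```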
There are plenty of services and external things happening, but they are all isolated behind services, so I guess I can mock them through the test environment's service config.
Please give some advice here if there is a better way.
I'm also going to need fixtures. I'm using Alice for generating some fixtures so far; it seems nice together with the Doctrine extension, and I don't think there are "better" options on this one.
How should I test external services? I'm mocking things like a Facebook service, but I also want to really test it against some test account. Is this advisable? I know this goes beyond the scope; according to the purists, the service has to be mocked and tested in every way possible to "ensure it's working". But at the end of the day it still breaks because of some API key or other problem in the connection, which I can't really afford. So please advise here as well.
All your suggestions for other tools are of course welcome, especially if there is a good book that covers my story.
I'm glad you brought up Behat; I was going to suggest the same thing.
I would consider starting with your most business-critical pieces: unit test the extremely important business logic and use Behat on the rest.
For the most part, I would create stubs for your services that return expected output for expected input. That way you can create failures based on specific input. You can override your services in your test config.
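As a rough sketch (the service id and stub class are made-up names), the test environment's config can simply redefine a service so the stub is injected everywhere the real one would be:

```yaml
# app/config/config_test.yml
# Redefines a service id assumed to exist in services.yml (hypothetical names)
services:
    app.facebook_client:
        class: Acme\AppBundle\Tests\Stub\FacebookClientStub
```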
Another approach would be very thin functional testing, where you make GET requests to all of your endpoints and look for 200s. This is a very quick way to make sure that your pages are at least loading. From there, you can start writing tests for your POST endpoints and expand your suite further with more detailed test cases.

TDD with Sitecore 7 and TFS 2012 CI

So, testing with Sitecore. It's a special topic, and I've already found a lot of reading material on it (Sitecore Development chapter 8, Alistair Deneys' blog, the NextDigital blog, the iStern blog, ...), but in most of these cases they're going with NUnit and custom test runners, etc. The most useful to me in my context so far was the iStern blog on mocking out Sitecore using Microsoft Fakes. But is this really the way to go?
I'm surprised that, with the Hedgehog TDS system integrating so deeply with TFS and enabling CI in Sitecore development, there isn't more material (yet) on how to utilize this system for setting up solid testing executed by TFS.
We're gearing up for a large new project that uses Sitecore to handle front-end user interaction, where 95% of the data being used sits behind a WCF service. That part can easily be tested and developed TDD-style. It's the last 5% (which sadly includes the highest business value: online payments) residing within Sitecore that needs to be tested. Can we ever have enough intimate knowledge of Sitecore to mock it out? I'd be inclined to think not... if so, how then do we run conclusive tests against Sitecore on our TFS CI build?
Last but not least, I get the feeling that the information currently available is getting a bit out of date (mainly judging by the remarks on the NextDigital blog). Does Sitecore 7 open new ways to tackle this issue?
For those who'd see this as a philosophical rather than a technical question: there can be only one answer here, and that is a technically accurate description of a method of using the Microsoft test framework, capable of running in the TFS CI environment, to test code written for Sitecore.
Is Microsoft Fakes the way to go? In my opinion, no. Microsoft Fakes allows you to test code that is not designed to be testable. If you design your solution properly, a standard mocking framework should be sufficient.
Can we ever have enough intimate knowledge of Sitecore to mock it out? This is kind of a trick question. Unless a third-party library was specifically designed for it and is something you would consider a "stable dependency", you shouldn't try to mock it. Instead, wrap it with your own classes and abstractions and mock those.
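For example, a minimal sketch (names are hypothetical) of hiding the static Sitecore.Context behind your own interface, which any standard mocking framework can then substitute:

```csharp
// Your own abstraction: the rest of your code depends only on this.
public interface ISitecoreContext
{
    string SiteName { get; }
    bool IsLoggedIn { get; }
}

// Thin wrapper over the static Sitecore API; deliberately too simple
// to need tests of its own.
public class SitecoreContextWrapper : ISitecoreContext
{
    public string SiteName
    {
        get { return Sitecore.Context.Site.Name; }
    }

    public bool IsLoggedIn
    {
        get { return Sitecore.Context.IsLoggedIn; }
    }
}
```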
Take a look at Synthesis and Glass Mapper. They are object-mapping frameworks that allow you to map Sitecore items to interfaces while maintaining Page Editor support. Glass, in particular, has a wrapper around Sitecore.Context that can be mocked. Synthesis is supposed to be pretty testable as well, but I haven't tried it yet.
Using one of those mapping frameworks and a good SOLID design, you should be able to make most of your code testable. Just remember that the classes on the edges of your solution should be simple enough to not require testing.
I was in the exact same situation as you, IvanL, a few weeks ago. I wanted to test some of my business logic running against Sitecore 7 without a mocking framework. I managed to do it, but only in a very specific scenario. Unfortunately, I haven't published my prototype solution or the slides explaining it yet, but I'll explain the basics of what I did.
In Sitecore 7, the move towards querying against the index with the Sitecore.ContentSearch namespace and using LINQ opened up a way for me to very easily unit test with fake index data.
There are some unit test examples out there, as you've seen, that use mocking frameworks. However, the classes they mock are actually quite simple to fake out yourself. If you implement ISearchIndex, you really only need to implement the CreateSearchContext method in order to start returning an IQueryable to work with in your tests.
To implement CreateSearchContext, you will likely need to create a fake provider search context implementation that provides the GetQueryable implementation.
Once you have those two classes set up, you've essentially got your 'index' covered. Add a property to it where you can set the data collection from the unit test, and then make sure the context returns that data collection.
That will let you build up a fake index with whatever data POCOs you want, and then pass it through to your standard provider implementations that run against your business data.
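To make the payoff concrete, here is a rough sketch of what a test against fake index data can look like, using the Microsoft test framework the question asks for. The POCO and repository are hypothetical, and I'm assuming the fake index ultimately hands the business code its data collection as a plain IQueryable:

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical search POCO, standing in for a mapped index document.
public class ProductSearchItem
{
    public string Name { get; set; }
    public decimal Price { get; set; }
}

// Hypothetical repository: it depends only on IQueryable<T>, which is
// why an in-memory list can stand in for the real index.
public class ProductRepository
{
    private readonly IQueryable<ProductSearchItem> _items;

    public ProductRepository(IQueryable<ProductSearchItem> items)
    {
        _items = items;
    }

    public IEnumerable<ProductSearchItem> FindCheaperThan(decimal price)
    {
        return _items.Where(i => i.Price < price).ToList();
    }
}

[TestClass]
public class ProductRepositoryTests
{
    [TestMethod]
    public void FindCheaperThan_Returns_Only_Matching_Items()
    {
        // The "index": just a list exposed as IQueryable.
        var fakeIndexData = new List<ProductSearchItem>
        {
            new ProductSearchItem { Name = "Cheap", Price = 5m },
            new ProductSearchItem { Name = "Pricey", Price = 50m }
        }.AsQueryable();

        var repository = new ProductRepository(fakeIndexData);

        var results = repository.FindCheaperThan(10m).ToList();

        Assert.AreEqual(1, results.Count);
        Assert.AreEqual("Cheap", results[0].Name);
    }
}
```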
The big thing to remember is that this only works for code written the new Sitecore 7 way, using LINQ and the IQueryable implementation. Older-style code that runs against the Sitecore.Data.Item API still works the way it used to and has the same limitations as before.
Update: The prototype I mentioned is now available for download: http://blog.nonlinearcreations.com/2014/02/sitecore-7-developers-quest-successful-unit-testing/

Unit testing installation of services

Our installer program is going to install a number of system services, under both Windows and UNIX, using JavaServiceWrapper. There will be a class responsible for creating the JavaServiceWrapper config files, installing the services, and so on.
Can I have some suggestions on how to unit-test this class?
I would not struggle too much with unit testing such a class; rather, I would go for integration/smoke tests. You need these anyway to verify that your installation works properly, preferably not only on your own machine but also in the target environment, in real life, before you are about to demonstrate it to your boss and most important client :-)
Update: I assume that the class in question would not contain much complicated logic, rather just glue together different pieces supplied by other APIs. However, if this is not the case, and you feel you can't easily test a significant part of its functionality via integration tests, you can still try unit testing with good ol' mocks and/or dependency injection.
Lol! Found this last night: Environmentally Friendly Deployment. I really think that the more complex your deployment is, the more you need to validate your environment.

GWT Unit Testing TDD and Tooling

I'm just starting to use GWT, and so far so good. However, after reading some sample code, I wonder: is it necessary to have a high level of test coverage? (I can see that most of the code is declarative and then adds some attributes, so I can see the sense in checking that some particular attributes are there, but not all of them.)
Also, I would be interested to know about any gotchas in TDDing with GWT.
I'm using Eclipse, so if you are really happy with some particular add-ins for GWT, I would be happy to hear about them too.
Thanks for the input.
Edit: maybe I'm asking a very wide question, but even little pieces of information will help.
I come from NVelocity views with jQuery/Ext JS/Prototype/script.aculo.us, and this is a bit different.
When designing GWT applications to be easily testable, it's best to move as much logic out of the view as possible. Use a design pattern which makes GUI testing easier such as Model-View-Presenter (MVP), which is used widely in building desktop applications (The C#/.NET folks have written a lot about this pattern.)
You can use GWTTestCases to test remote communication and code that ultimately executes raw JavaScript (most of the GWT core classes require this, especially widgets). However, these tests are slow to execute, so you should prefer designs which put all that logic in objects that can be tested in plain ol' JUnit TestCases.
For more information about writing GWT applications test-first, I've written an article for Better Software magazine, which is available as a PDF online at my blog.
I think the best reference at the moment would be this: Testing Methodologies Using Google Web Toolkit
I think you asked a pretty broad question, which is part of the reason why you didn't get a reply for a while.
Compared to traditional AJAX web development, one could argue a GWT application requires less testing. Because the GWT team has worked so hard to make sure that its widgets work consistently across all web browsers, you don't have to worry about cross-browser compatibility nearly as much for your own application.
That frees you up to focus on your own application. Create a separate test case for each of your own custom widgets and test that they behave as you expect, and then write higher-level tests for each module. Take the extra step to make your tests fully automatable - that way every time you make a change or are about to release, it's easy to run all of your tests.
http://code.google.com/docreader/#p=google-web-toolkit-doc-1-5&s=google-web-toolkit-doc-1-5&t=DevGuideJUnitIntegration

Best practice for integrating TDD with web application development?

Unit testing ASP.NET web applications is an ambiguous point in my group. More often than not, good testing practices fall through the cracks, and web applications end up live for several years with no tests.
The cause of this pain point generally revolves around the hassle of writing UI automation mid-development.
How do you or your organization integrate TDD best practices with web application development?
Unit testing will be achievable if you separate your layers appropriately. As Rob Cooper implied, don't put any logic in your WebForm other than logic to manage your presentation. All other logic and persistence should be kept in separate classes; then you can test those individually.
To test the GUI, some people like Selenium. Others complain that it is a pain to set up.
I layer the application and at least unit test from the presenter/controller (whichever is your preference, MVC/MVP) down to the data layer. That way I have good test coverage over most of the code that is written.
I have looked at FitNesse, WatiN and Selenium as options to automate the UI testing, but I haven't gotten around to using these on any projects yet, so we stick with human testing. FitNesse was the one I was leaning toward, but I couldn't introduce it at the same time as introducing TDD (does that make me bad? I hope not!).
This is a good question, one that I will be subscribing to :)
I am still relatively new to web dev, and I too am looking at a lot of code that is largely untested.
For me, I keep the UI as light as possible (normally only a few lines of code) and test the crap out of everything else. At least I can then have some confidence that everything that makes it to the UI is as correct as it can be.
Is it perfect? Perhaps not, but at least it is still quite highly automated, and the core code (where most of the "magic" happens) still has pretty good coverage.
I would generally avoid testing that involves relying on UI elements. I favor integration testing, which tests everything from your database layer up to the view layer (but not the actual layout).
Try to start a test suite before writing a line of actual code in a new project, since it's harder to write tests later.
Choose carefully what you test - don't mindlessly write tests for everything. Sometimes it's a boring task, so don't make it harder. If you write too many tests, you risk abandoning that task under the weight of time-consuming maintenance.
Try to bundle as much functionality as possible into a single test. That way, if something goes wrong, the errors will propagate anyway. For example, if you have a digest-generating class - test the actual output, not every single helper function.
Don't trust yourself. Assume that you will always make mistakes, and so you write tests to make your life easier, not harder.
If you are not feeling good about writing tests, you are probably doing it wrong ;)
A common practice is to move all the code you can out of the codebehind and into an object you can test in isolation. Such code will usually follow the MVP or MVC design patterns. If you search for "Rhino Igloo", you will probably find the link to its Subversion repository. That code is worth a study, as it demonstrates one of the best MVP implementations on Web Forms that I have seen.
Your codebehind will, when following this pattern, do two things:
Transmit all user actions to the presenter.
Render data provided by the presenter.
Unit testing the presenter should be trivial.
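A minimal sketch of the pattern (all names are hypothetical); the codebehind implements the view interface and delegates to the presenter, which can then be unit tested with stubbed dependencies:

```csharp
// The view contract the codebehind implements.
public interface ICustomerView
{
    string CustomerName { set; }
}

// Hypothetical domain pieces, just enough to make the sketch compile.
public class Customer
{
    public string Name { get; set; }
}

public interface ICustomerRepository
{
    Customer GetById(int id);
}

// All presentation logic lives here, testable without a web server.
public class CustomerPresenter
{
    private readonly ICustomerView _view;
    private readonly ICustomerRepository _repository;

    public CustomerPresenter(ICustomerView view, ICustomerRepository repository)
    {
        _view = view;
        _repository = repository;
    }

    // Called by the codebehind when the page loads.
    public void Load(int customerId)
    {
        _view.CustomerName = _repository.GetById(customerId).Name;
    }
}
```

In a test, a stub ICustomerView simply records what was assigned to CustomerName and a stub repository returns a canned Customer; no HttpContext or page lifecycle is involved.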
Update: Rhino Igloo can be found here: https://svn.sourceforge.net/svnroot/rhino-tools/trunk/rhino-igloo/
There have been attempts to get Microsoft's free UI Automation (included in .NET Framework 3.0) to work with web applications (ASP.NET). A German company called Artiso has written a blog entry that explains how to achieve that (link).
However, their blog post also links to an MSDN webcast that explains the UI Automation framework with WinForms, and after I had a look at it, I noticed you need the AutomationId to get a reference to the respective controls. In web applications, however, the controls do not have an AutomationId.
I asked Thomas Schissler (Artiso) about this, and he explained that this was a major drawback of Internet Explorer. He referenced an older Microsoft technology (MSAA) and was hoping himself that IE8 would do this better.
However, I also gave WatiN a try, and it seems to work pretty well. I even liked Wax, which allows you to implement simple test cases via Microsoft Excel worksheets.
Ivonna can unit test your views. I'd still recommend moving most of the code to other parts. However, some code just belongs there, like references to controls or control event handlers.