Contributing R test scripts - unit-testing

Although tools like RUnit, svUnit, and testthat are good for package developers, I think it would be useful to have some means of uploading test scripts or even just usage examples for particular packages. Users who do continuous integration (e.g. Jenkins) or even basic unit testing may already have such tests and could find it beneficial to contribute scripts for package maintainers' use.
Does such functionality exist, either through CRAN or RForge, or via other sites, such as github? If so, is there a major example of using a repository, e.g. github, to allow users to contribute tests?
(Thanks to #mariotomo for reminding me of svUnit.)

I would suggest refraining from overcomplicating things. Why not just do something like this:
Look at a given package's sources (i.e. on r-forge, rforge, github, ... or straight CRAN sources),
understand its testing scheme (i.e. the tests/ directory, examples in manual pages, or one of the three unit testing frameworks from CRAN), and
contribute new tests.
That's really all there is to it. Same for contributing documentation, demo scripts, new code, .... We can and should focus on the open in open source.

Related

ColdFusion, and documenting code as well as tools for multiple developer teams

I am beginning a fairly large new project using ColdFusion. This new project will include several developers, and as such, documentation of code will be key.
Another issue I am hoping to avoid (either with adequate code documentation or some other tool) is the duplication of code. A tool that would be able to "index" the code for searching or diagramming would likely help here.
What are others out there using, either specifically for ColdFusion or language-agnostic? We will likely be using ColdBox for the underlying framework if that makes a difference.
Thanks for any and all suggestions.
-c
Well, it's impossible to tell you which framework to use without knowing more about your project, but I can list out some tools that will be useful no matter which framework you use.
Language-agnostic tools:
GitHub.com organization+teams
Jenkins continuous integration
Apache ANT build scripts
Apache Maven for project management
Coldfusion-specific tools:
MXunit unit testing framework
MockBox for unit testing (if you use ColdBox: ColdBox-specific tutorial)
ColdDoc documentation generator
Javascript-specific tools:
JSLint or JSHint for JS code cleanup
Jasmine unit testing
Ideally, your Jenkins build server should:
Do a fresh checkout from source control
Run all unit tests and stop the build if they fail
Generate documentation
Generate a production-ready package of your project
At a minimum, I highly recommend using source control, setting up Jenkins with MXunit tests, and scheduling daily automated builds.
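To make those build stages concrete, here is a minimal, language-agnostic sketch of a driver script Jenkins could call after checkout, written in Python purely for illustration; the ANT target names and the ordering are placeholders for whatever your own build file defines.

#!/usr/bin/env python
"""Minimal build driver a Jenkins job could invoke after a fresh checkout.

The ANT targets below are placeholders; adapt them to your own build
file and MXUnit runner.
"""
import subprocess
import sys

STAGES = [
    ["ant", "test"],     # run unit tests; a non-zero exit stops the build
    ["ant", "docs"],     # generate documentation (e.g. via ColdDoc)
    ["ant", "package"],  # produce a production-ready package
]

for cmd in STAGES:
    print("running:", " ".join(cmd))
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(result.returncode)  # fail the Jenkins build immediately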
We used the Atlassian suite at my last job. Mostly Jira for tracking and GreenHopper for agile, but the other tools (FishEye, Bamboo, and Crucible) may help as well. If you host it yourself, I believe they have a one-time $10-per-product price tag that, depending on your team's size, may or may not work for you. If money is no object, the suite worked really nicely. It also has built-in support for SVN, and maybe more by now.
http://www.atlassian.com/software
Sounds to me like what you need is a methodology, not a tool. If you have a clearly defined set of objects/responsibilities, there should be no crossover in scripting, and if you determine a common API for the objects being coded, I would think you'd be fine.

buildbot vs hudson/jenkins for C++ continuous integration

I'm currently using Jenkins/Hudson for continuous integration of a large, mostly C++ project. We have separate projects for trunk and every branch. There are also some related projects for the Java code, but the setup for those is fairly basic right now (we may do more later, though). The C++ projects do the following:
Builds everything with options for whether to reconfigure, do a clean build, or use a fresh checkout
Optionally builds and runs all tests
Optionally runs all tests using Valgrind's memcheck
Runs cppcheck
Generates doxygen documentation
Publishes reports: unit tests, valgrind, cppcheck, compiler warnings, SLOC, open tasks, and code coverage (using gcov, gcovr, and the cobertura plugin)
Deploys code nightly or on demand to a test environment and a package repository
Everything is configurable for automatic builds and optional for on-demand builds. Underneath, there's a bash script that controls much of this, which further depends on our build system, which uses automake and autoconf along with custom bash scripts.
We started using Hudson (at the time) because that's what the Java guys were using and we just wanted nightly builds. Since then, we've added a lot more and continue to add more. In some ways Hudson is great, but certainly isn't ideal.
I've looked at other solutions and the only one that looks like it could be a replacement is buildbot. Would buildbot be better for this situation? Is the investment worth it since we're already using Hudson? Why?
EDIT: Someone asked why I haven't found Hudson/Jenkins to be ideal. The short answer is that everything can be improved. I'm simply wondering if Jenkins is the best current solution for my use case or whether there is something better (buildbot?) that would be easier to maintain in the long run even as new requirements come up.
Both are open source projects, but you do not need to change buildbot code to "extend" it; it is actually quite easy to import your own packages in its configuration, in which you can sub-class most of the features with your own additions. Examples: your own compilation or test code, some parsing of outputs/errors to be passed on to the next steps, your own formatting of alert emails, etc. There are lots of possibilities.
Generally I would say that buildbot is the most "general-purpose" of the automatic build tools. Jenkins, however, might be the best for running tests, especially for parsing and presenting results in nice ways (results, details, charts... a few clicks away), things that buildbot does not do "out-of-the-box". I'm actually thinking of using both to have sexier test result pages. :-)
Also, as a rule of thumb, it should not be difficult to create a new tool's config: if the specification of what to do (configs, builds, tests) is too hard to switch from one tool to another, it is a (bad) sign that not enough of the configuration scripts have been moved into the sources. Buildbot (or Jenkins) should only call simple commands. If it is simple to run tests, then developers will do it as well, and this will improve the success rate; whereas if only the continuous integration system runs the tests, you will be running after it to fix the new code failures and will lose its non-regression value. Just my 0.02€ :-)
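To make the "simple commands plus small sub-classes" idea concrete, here is a rough sketch using the modern buildbot plugin API (0.9+/2.x); the repository URL, make targets, and step names are placeholders, not anything from the original setup.

# Fragment of a buildbot master.cfg (which is ordinary Python).
from buildbot.plugins import steps, util

# A tiny sub-class: all the real logic stays in `make check`;
# buildbot only runs the command and reports the result.
class RunTests(steps.ShellCommand):
    name = "run tests"
    command = ["make", "check"]
    haltOnFailure = True

factory = util.BuildFactory()
factory.addStep(steps.Git(repourl="https://example.org/project.git",  # placeholder repo
                          mode="incremental"))
factory.addStep(steps.ShellCommand(name="build", command=["make", "all"]))
factory.addStep(RunTests())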
Hope it'll help.
The 'result integration' is also in jenkins/hudson, and you can relatively easily capture build products without having to 'copy them elsewhere'.
For our instance, the coverage reports, unit test metrics, and javadoc for the Java code are all integrated. For our C++ code, the plugins are a little lacking, but you can still get most of it.
We have run buildbot since pre-0.7 and are now running 0.8, and we are only now seeing any real reason to switch, as buildbot 0.8 forgot about Windows slaves for an extended period of time and the support was pretty poor.
There are many other solutions out there, besides Jenkins/Hudson/BuildBot:
TeamCity by Jetbrains
Bamboo by Atlassian
Go by Thoughtworks
Cruise Control
OpenMake Meister
The specifics of what you are doing are not so important, in fact, as long as the agents (aka nodes) that you are doing them on support those tasks.
The beauty of a CI server is noticing when the source changes, triggering a new build (and tests), publishing the artifacts, and publishing the test results.
When you compare CI tools like those we mentioned, consider features like the usability of the interface, how easy branching is (and features it might offer like automatic merging), notifications (like XMPP/Jabber), or an information radiator (like hooking up a monitor to always show status). Product support is another thing to consider: Jenkins' support is only as good as whoever is responding to community questions at the time you have questions.
My personal favorite is Bamboo, but it comes with a license fee.
I'm a long-time Jenkins user in the middle of evaluating Buildbot and would like to offer a few items for folks considering using Buildbot for multi-module solutions:
*) Buildbot doesn't have any out-of-the-box concept of file artifacts related to each build. It's not in the UI and it's not in any of the builtin "steps" modules as far as I can see:
http://docs.buildbot.net/current/manual/configuration/buildsteps.html
...and I see no third party plugin:
https://github.com/buildbot/buildbot/wiki/PluginList#steps
Buildbot does collect all the console output from a given build, but critically, you can't collect files related to it.
*) Given that artifacts are not supported, it's not easy to create "collector" projects that bring multiple modules into, say, a single installer. Jenkins has a great feature that lets you parameterize a build with builds from other modules (the parameter type is a run).
*) Establishing dependencies between modules is trickier in Buildbot. Say you have a library that three binaries depend on, and you want those binaries to rebuild each time the library changes. Jenkins has triggers built into the UI. If you want to do triggers in Buildbot you have to script them using schedulers.Dependent (see the sketch after this list), and it causes a lot of item congestion in the Schedulers UI.
*) When you're working in Buildbot, it seems that pretty much all of the configuration is done in master.cfg in code. This is awesome and frustrating.
*) Buildbot forces you to create a worker in addition to a master server. This is annoying for beginners and systems for which a single build server is sufficient.
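As a sketch of the scheduler scripting mentioned in the library/binaries point above, here is a rough master.cfg fragment using schedulers.Dependent from the modern plugin API; the branch and builder names are placeholders.

# Fragment of a buildbot master.cfg: rebuild three binaries whenever
# the library they depend on builds successfully.
from buildbot.plugins import schedulers, util

lib_sched = schedulers.SingleBranchScheduler(
    name="library",
    change_filter=util.ChangeFilter(branch="master"),  # placeholder branch
    builderNames=["libcore"],                          # placeholder builder
)

bins_sched = schedulers.Dependent(
    name="binaries-after-library",
    upstream=lib_sched,                                # fires only after libcore succeeds
    builderNames=["app-a", "app-b", "app-c"],          # placeholder builders
)

c["schedulers"] = [lib_sched, bins_sched]  # `c` is the usual BuildmasterConfig dict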
My impression after two days of Buildbot evaluation is that we'll stick with Jenkins, primarily because it supports artifacts. Buildbot is a tool we'd only use if we had more extensive customization needs, and the time to do it.
On the subject of buildbot and artifacts (I don't have enough reputation to make a comment): you can get artifacts from the buildbot 2.x series pretty easily with the built-in file/directory upload actions. However, you rarely want to just move files. Typically you make a triggered buildstep that does deployment directly off the worker for best results, e.g. pushing to cloud storage, containers, third-party services (Steam uploads), etc.
This way you can get metrics on the uploads and conditionally control them better (or even mix and match artifacts across worker machines).
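As an illustration of both approaches (copying a file to the master versus deploying straight from the worker), here is a buildbot 2.x fragment; the paths and the upload script are placeholders, and `factory` is assumed to be an existing BuildFactory.

# Fragment of a buildbot master.cfg, continuing an existing BuildFactory.
from buildbot.plugins import steps

# Built-in artifact collection: copy one build product to the master.
factory.addStep(steps.FileUpload(
    workersrc="build/myapp.tar.gz",                       # placeholder path on the worker
    masterdest="/var/lib/buildbot/artifacts/myapp.tar.gz",
))

# Or skip the master entirely and deploy straight off the worker,
# e.g. to cloud storage or a package registry (script is hypothetical).
factory.addStep(steps.ShellCommand(
    name="deploy",
    command=["./scripts/upload_artifact.sh", "build/myapp.tar.gz"],
))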

Which acceptance testing frameworks are better for backend development?

By backend I mean software systems that consume data, process files, or communicate over machine interfaces (REST, SOAP, CORBA, etc...). No fancy web or UI testing is necessary. I have in mind Cucumber and Robot Framework, but I don't know how well suited they are to the task at hand.
There isn't an easy answer to this question.
Sounds like you got your domain right... because if you were trying to test UI/web components, acceptance testing frameworks may not be the right tool for the job.
You have a few options
SpecFlow
Cucumber
StoryTeller
FitNesse
mSpec
a few others.
I'm partial to StoryTeller & mSpec... but each of these tools has its pros and cons.
Questions you should be asking yourself (and your team) are:
Who is going to be writing/maintaining the tests?
Do self-documenting acceptance tests provide value for your organization?
Which technology would integrate most easily with your current build process?
I have used Cucumber to test a batch application written in Perl and PL/SQL, an Informatica transformation, and am currently using it to test a telephony IVR/queueing system. Ruby provided the gems I needed to drive the system, and it was very easy for the testers to learn the language/syntax.
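Cucumber itself runs on Ruby or the JVM; purely as an illustration of the same step-definition style against a REST backend, here is a sketch using behave (the Python port of Cucumber) and requests. The /health and /orders endpoints and the feature wording are hypothetical.

# features/steps/orders_steps.py -- step definitions for a hypothetical
# "place an order" feature; the base URL and routes are placeholders.
import requests
from behave import given, when, then

BASE_URL = "http://localhost:8080/api"

@given("the order service is running")
def step_service_running(context):
    assert requests.get(BASE_URL + "/health").status_code == 200

@when('I submit an order for "{item}"')
def step_submit_order(context, item):
    context.response = requests.post(BASE_URL + "/orders", json={"item": item})

@then("the order is accepted")
def step_order_accepted(context):
    assert context.response.status_code == 201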
As for Robot Framework, my experience with IntelliJ support for it (via the IntelliBot plugin) has been poor. In many cases you only find out about a mistake, such as wrong syntax or a keyword that cannot be found, late, at test runtime.
There are also problems with finding imported libraries, no debugging, and no simple refactoring such as renaming a keyword across a project.

Subversioning System / Changelog and Bugsubmission

I am looking for a small software versioning (changelog) and bug submission system with a web-frontend.
The only features I need are a change-log where users can see what they can expect and a tiny bug-submission system. I don't need the many features SVN offers for software versioning, as the project is quite small and I do all development locally.
Any ideas?
It sounds like you'll do just fine coding raw HTML with your requirements. If you can code in any language, the HTML you'll need is minimal, and it'll be easy to pick up in case you're not familiar with it.
Although, I do still recommend rethinking your decision not to use SVN. If your project is open source, have a look at Google Code, which offers free source code hosting including bug tracker, SVN repository, release management and wiki. It'll also make your project more discoverable. If it's not open source, you can purchase private hosting on github, but that uses git which is more complicated.
Mantis is probably what you want: a web-based bug system that's quite nice, fast, and simple. It integrates with SVN using a PHP page you can access with curl from an SVN hook.
We used it as a bug tracker; when code was committed (with a special pattern in the log comment), the files that were changed were added to the Mantis bug as a bugnote. I'm not 100% sure of your situation, but at first glance this kind of arrangement looks similar to what you want.
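As a rough sketch of that kind of hook, here it is reimagined as a Python post-commit script; the Mantis integration URL, the parameter names, and the "issue #123" log-message convention are all assumptions for illustration, not Mantis defaults.

#!/usr/bin/env python
"""Sketch of an SVN post-commit hook (hooks/post-commit) that notifies a
Mantis integration page; endpoint and field names are placeholders."""
import re
import subprocess
import sys
import urllib.parse
import urllib.request

MANTIS_URL = "http://bugs.example.com/svn_commit.php"  # placeholder integration page

repo, rev = sys.argv[1], sys.argv[2]  # SVN passes REPOS and REV to the hook

# svnlook ships with Subversion and reads the log message and changed paths.
log = subprocess.check_output(["svnlook", "log", repo, "-r", rev], text=True)
changed = subprocess.check_output(["svnlook", "changed", repo, "-r", rev], text=True)

match = re.search(r"issue #(\d+)", log)  # assumed log-message convention
if match:
    data = urllib.parse.urlencode({
        "issue_id": match.group(1),
        "revision": rev,
        "note": changed,
    }).encode()
    urllib.request.urlopen(MANTIS_URL, data=data)  # same role as the curl call above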

What are some of the best resources to learn MSBuild with?

I am looking for any and all suggestions of the best and most effective resources that the StackOverflow community has used to learn MSBuild, with an emphasis on integrating unit tests and, later, static code analysis tools such as FxCop and StyleCop into the build process.
I have tried to find good clear documentation on adding unit tests into my build but I still am searching - even Google searches have come up empty or with just bits and pieces. Ideally I want to add unit tests, report results, and eventually add code coverage statistics, etc into the build results.
I know it has to be in MSDN somewhere but I seem unable to find anything which explains and teaches well. I am using Visual Studio Team System 2008.
Continuous Integration From Theory to Practice by Carel Lotz. It covers the entire scope of your problem, and then some. It is well written and complete, and a full sample is included.
Hands down best resource. Use it as a tutorial first, then use it as a guide, then use it as a reference.
MSDN and others are good for clarifying (or confusing) the details.
Edit: The guide by Carel Lotz uses MBUnit for unit tests (see his earlier document version for NUnit, though you can replace the MBUnit with NUnit pretty easily if you follow the NUnit help files).
Also, it is written to use Cruise Control.NET to run the MSBuild script in various configurations.
Personally, I run unit tests in a secondary MSBuild script, but have found that wrapping the NUnit calls in MSBuild gives more flexibility than running from CCNet directly.
Here's a book that might help: http://blogs.msdn.com/microsoft_press/archive/2009/01/31/sayed.aspx
I guess I need to ask if you are sure you want to use MSBuild directly? Might want to check out WIX as the MSI producing tool - there is an extensive manual and it is built on top of MSBuild.
As for automating your tests with reporting and integrating with NUnit, FxCop, NCover, FitNesse, etc., I think the best (free) tool out there is CruiseControl.NET. It works with all of these tools and more. It can do versioning and automated builds with automated testing, and it creates reports for each...
Here is a sample of one of my builds...
http://img84.imageshack.us/img84/3664/cruisecontrolnetsamplezn0.jpg