Where should I place my lib tests?

Note that I have already read the layout convention.
In my lib directory I usually have a few libraries that I could extract into their own packages. Very often the code is not complete enough, and/or I want to wait to create a new package until I actually need to reuse the code in another project.
I would really like to keep the unit-test code, examples and doc in the same directory as the library itself.
Example:
Let's say I have a string-helper library in lib → lib/string-helper.
I would like to place my tests, examples and doc in lib/string-helper/tests, lib/string-helper/examples and lib/string-helper/doc.
However the layout convention says that I should put them outside the lib directory.
This makes it unnecessarily hard to extract the code into its own package. (pub serve even went into an endless loop when I ignored this and created a symbolic link to my own package.)
How do you handle this?

The only valid place for tests is the my_package/test directory or any subdirectory of test.
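One arrangement that keeps extraction easy (just a sketch, using the conventional test/, example/ and doc/ directories and mirroring the per-library folder names from the question) is:

my_package/
  lib/
    string-helper/        # the library code itself
  test/
    string-helper/        # unit tests for the string-helper code
  example/
    string-helper/
  doc/
    string-helper/

When a library is finally promoted to its own package, you can move the matching subdirectories of lib/, test/, example/ and doc/ together without having to untangle them from the rest of the package.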

Related

How to determine the tree of files which are imported during a test case?

When I run a test in Go, is there any way for me to get the list of files that the code imports, directly or indirectly? For example, this could help me rule out changes from certain parts of the codebase when debugging a failing test.
Alternatively, with Git, can we find out what the lowest common ancestor git tree node is for the files exercised in a given test?
Context: I'm looking into automated flakiness detection for my test suite, and I want to be able to know the dependency tree for every test so that I can detect flaky tests better.
For example, if TestX fails for version x of the code, and later on some files in the same codebase which are not used at all by TestX are changed, and then TestX passes, I want to be able to detect that this is a flaky test, even though the overall codebase that the test suite ran on has changed.
You are probably looking for go list -test -deps [packages].
For an explanation of what the flags do, you can check the Go command documentation, "List packages or modules":
-deps:
The -deps flag causes list to iterate over not just the named packages but also all their dependencies. It visits them in a depth-first post-order traversal, so that a package is listed only after all its dependencies. [...]
-test:
The -test flag causes list to report not only the named packages but also their test binaries (for packages with tests), to convey to source code analysis tools exactly how test binaries are constructed. The reported import path for a test binary is the import path of the package followed by a ".test" suffix, as in "math/rand.test". [...]
Maybe I'm stating the obvious, but remember that go list works on packages, not single files, so the command above will include the dependencies of the non-test sources as well (which should be what you want anyway).
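If you want to turn that output into something a flakiness detector can consume, a rough Python sketch could look like the following (the ./... package pattern, the helper name, and the choice of JSON fields are assumptions, not part of the answer above):

#!/usr/bin/env python3
# Rough sketch (not a polished tool): run `go list -test -deps -json` for a
# package pattern and print the Go source files the tests depend on.
import json
import os
import subprocess

def test_dependency_files(pattern="./..."):
    out = subprocess.run(
        ["go", "list", "-test", "-deps", "-json", pattern],
        capture_output=True, text=True, check=True,
    ).stdout
    decoder = json.JSONDecoder()
    files, idx = [], 0
    # `go list -json` prints a stream of JSON objects, one per package.
    while idx < len(out):
        while idx < len(out) and out[idx].isspace():
            idx += 1
        if idx >= len(out):
            break
        pkg, idx = decoder.raw_decode(out, idx)
        for name in pkg.get("GoFiles", []) + pkg.get("TestGoFiles", []):
            files.append(os.path.join(pkg.get("Dir", ""), name))
    return files

if __name__ == "__main__":
    for path in test_dependency_files():
        print(path)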

Using AsConfigured and still being able to get unit test results in TFS

I am running into an issue when I build my projects using the TFS build controller: with the Output location set to "AsConfigured", it will not detect my unit tests. Let me give a little info on my setup.
TFS 2013 Update 2, Default Process Template
Here are a few screenshots that can hopefully fill in what I can't in typing. I am copying my build output to a file share on our network so that other utilities can use the output. I don't want to use "PerProject" or "SingleFolder" because they mess up the file structure we have configured (both of those will run the tests). So I have the files copied to a folder named "SingleOutputFolder", which is a child of the DropLocation. I would like to be able to run the tests from the drop folder or from the bin folder for each of my test projects (I don't care which). However, it doesn't seem to detect/run ANY of the tests. Any help would be greatly appreciated. Please let me know if you need any additional information.
I have tried using **\*test*.dll, Install\SingleFolderOutput\**\*test*.dll, and $(TF_BUILD_DROPLOCATION)\Install\SingleFolderOutput\*test*.dll.
But I am not sure what variables are available, or what the scope of execution is.
Given that you're using the Build Output location set to AsConfigured, you have to change the default value of the Test sources spec setting to allow the build to find the test libraries in the bin folders. Here's an example.
If the full path to the unit test libraries is:
E:\Builds\7\<TFS Team Project>\<Build Definition>\src\<Unit Test Project>\bin\Release\*test*.dll
use
..\src\*UnitTest*\bin\*\*test*.dll;
This question was asked on MSDN forums here.
MSDN Forums Suggested Workaround
The suggested workaround in the accepted answer (as of 8 a.m. on June 20) is to specify the full path to the test projects' binary folders. For example:
C:\Builds\{agentId}\{teamProjectName}\{buildDefinitionName}\src\{solutionName}\{testProjectName}\bin\Debug\*test*.dll
which really should have been shown as
{agentWorkingFolder}\src\{relativePathToTestProjectBinariesFolder}\*test*.dll
However, this approach is very brittle, for the following reasons:
1. Any new test projects you add to the solution will not be executed until you add them to the build definition's list of test sources.
2. It will break under any of the following circumstances:
   - the build definition is renamed
   - the working folder in build agent properties is modified
   - you have multiple build agents, and a different agent than the one you specified in {id} runs the build
Improved Workaround
My workaround mitigates the issues listed in #2 (can't do anything about #1).
In the path specified above, replace the initial part:
{agentWorkingFolder}
with
..
so you have
..\src\{relativePathToTestProjectBinariesFolder}\*test*.dll
This works because the internal working directory is apparently the \binaries\ folder, which is a sibling of the \src\ folder. Navigating up to the parent folder (whatever it is named, we don't care) and back into \src\ before specifying the path to the test projects' binaries does the trick.
Note: If you have multiple test projects, add additional entries separated by semicolons:
..\src\{relativePathToTestProjectONEBinariesFolder}\*test*.dll;..\src\{relativePathToTestProjectTWOBinariesFolder}\*test*.dll;..\src\{relativePathToTestProjectTHREEBinariesFolder}\*test*.dll;
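If you're not sure what a given spec will match, one way to sanity-check it is to evaluate the pattern from the build agent's working directory. A rough Python sketch (Python's glob rules are not identical to the wildcard matching TFS uses, and the working-folder path below is hypothetical):

import glob
import os

# Hypothetical agent working directory: the folder that is a sibling of \src\
# (see the explanation above). Adjust to your own agent's layout.
os.chdir(r"E:\Builds\7\TeamProject\BuildDefinition\bin")

# Same wildcard as in the Test sources spec; print whatever it matches.
for dll in glob.glob(r"..\src\*UnitTest*\bin\*\*test*.dll"):
    print(dll)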
What I ended up doing was adding a post-build event to each test project that copies all of the test DLLs into the staging location folder, which is basically equivalent to where they would go in a SingleFolder build.
if "$(TeamBuildOutDir)" == "" (
echo "Building Interactively not in TFS"
) else (
echo "Building in TFS"
xcopy "$(TargetDir)*.*" "$(TeamBuildBinaries)\" /Y /E /S
)
I also added an MSBuild parameter to the build definition that tells it to drop the binaries in the folder where TFS looks for them:
/p:TeamBuildBinaries="$(TF_BUILD_BINARIESDIRECTORY)"
I kept the default Test assembly file specification:
**\*test*.dll
See this link for information on the variable I used and the relative path it refers to.
Another solution is to do the reverse.
Leave all of the files in the root so that all of the built-in functionality works. There is more than just test execution in there: static code analysis, impact analysis, among others. You would have to do something custom for all of them.
Instead, use a pre-drop PowerShell script to create your Install arrangement from the root files.
If it is an application, you can use the _ApplicationFolder NuGet package to create a _PublishApplications folder, the same as you get for web applications.

TFS Build and absolute path in app.config

First, I'd like to apologize for my English; it's not my first language!
I'm a total n00b in the wonderful world of TFS Build (2010), and I've got a problem.
I'll try to explain it to you using a simple example (but my actual situation is much more complicated).
I have a project with a console application "MyApp1", its location on my computer is "D:\MyProjets\MyApp1".
I have another project "Res" which contains only resources, including a file named emailTemplate.html.
My project "MyApp1" uses this file. Therefore, in the "App.config" file there's a key that stores the path of this resource : "D:\MyProjets\Res\emailTemplate.html"
Finally, I have a test for that application "MyApp1". This test checks if an e-mail has been sent. To send the e-mail "MyApp1" will need the file "emailTemplate.html", and will use the key in the configuration file to find it.
When I run the test on my computer : everything's ok.
But if I build the solution with TFS Build, when the tests are run I have a problem with this resource. During the build, the source files are copied in a directory (for example "D:\Build\1\My build projet\Sources\MyProjets\Res", and therefore "MyApp1" will look for "emailTemplate.html" in "D:\MyProjets\Res\emailTemplate.html" (configuration file) and of course won't be able to find it.
What should I do?
I already know that my project shouldn't work with resources this way, but it's almost impossible for us to change that now, since it's the way we've been working at my company for a long time...
I thought about modifying the BuildProcessTemplate to force the build server to run a Get Latest on the Res project exactly where I want, but I don't know if it's a good idea, or if it's even possible...
Thanks a lot for your help! :)
Edit your build definition to include the "Res" project directory in the workspace as well. It should be automatically downloaded/updated at each build (if you use any of the default process templates), and as long as you use relative paths in your tests you should be fine.

Using closure library with jsTestDriver

I'm learning about Google Closure Tools by writing a simple JavaScript game. I'm having trouble figuring out how to set up jsTestDriver so that it works well with the Closure Library.
Specifically: I'd like to use the goog.require mechanism to include any additional JavaScript files rather than have to manually add them all to the config file.
Following meyertee's suggestion, I made a simple script to automatically write the dependencies to a config file:
#!/bin/bash
# Copy the template, then let closurebuilder.py compute the dependency list
# and append it (each entry prefixed with " - ../") to the jsTestDriver config.
cp tests/jsTestDriver.conf.proto tests/jsTestDriver.conf
libs/closure-library/closure/bin/build/closurebuilder.py --root="./libs/closure-library" --root="./js" --namespace="lds" | sed "s#^# - \.\./#" >> tests/jsTestDriver.conf
The tests/jsTestDriver.conf.proto file is a simple template:
test:
- "*.js"
load:
- ../libs/knockout-2.1.0.js
# Crucial, the load key needs to be last, and this comment must be followed by a newline.
It is a very fragile script, but hopefully someone (other than me) will find it useful.
You can do it semi-automatically by letting the Closure Compiler generate a manifest file, which will output all files in the correct order of dependency. You can then transform that file to relative paths and paste them into the JsTestDriver config file. That's how I do it.
You could even write a script that does this transformation automatically.
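A minimal sketch of such a transformation script in Python (the manifest.MF name comes from the compiler flag below; the tests/jsTestDriver.conf.proto template and the relative-path base are assumptions borrowed from the script above):

#!/usr/bin/env python3
# Sketch only: prepend the jsTestDriver template, then append every file from
# the Closure Compiler manifest as a path relative to the tests/ directory.
import os

MANIFEST = "manifest.MF"                    # written via --output_manifest
TEMPLATE = "tests/jsTestDriver.conf.proto"  # must end with the "load:" key
CONF = "tests/jsTestDriver.conf"

with open(TEMPLATE) as template, open(MANIFEST) as manifest, open(CONF, "w") as conf:
    conf.write(template.read())
    for line in manifest:
        path = line.strip()
        if not path:
            continue
        # jsTestDriver resolves entries relative to its config file in tests/.
        conf.write(" - %s\n" % os.path.relpath(path, "tests"))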
This is the relevant compiler argument:
--output_manifest manifest.MF
There are some details on the Closure Compiler's Google Code Wiki.
Edit:
There are also some Python scripts to help you calculate dependencies. You can use calcdeps.py or closurebuilder.py to generate a manifest file, which even includes files that haven't been 'required' by your code.
Since JsTestDriver does not follow the Closure Library convention of declaring dependencies with goog.provide() and goog.require(), your best option may be meyertee's solution.
However, the Closure Library includes its own testing framework. See:
Test Driven Development with the Closure Framework
Asserts API

Using Non-Local Data/Media Files with a C++ Application (gtkmm)

I'm beginning development on an acoustic spectrum analysis tool (inspired by spek) written in C++ with gtkmm (C++ bindings for the GTK+ GUI toolkit). I would imagine that I should know how to do this by now, however...
My directory structure is à la GNOME, e.g. src/, data/, po/, man/. The specific situation that prompted this inquiry is the use of a GTK UI Manager definition that will be located in data/ui. For this specific situation, I want to be able to load the user interface from this file in an install-independent manner (i.e. loading the file does not depend on a make install; the executable may be run [and load the UI file] either from src/ after running make [which compiles the sources into that executable] or from its install prefix). How would I refer to the UI file in my source code (keeping in mind that the file is not loaded by creating a file object (fopen(...)) but rather by passing a file location as a string argument to (UIManager).add_ui_from_file(...))?
In addition to this particular situation of a UI file, how would I make similar references to files (e.g. databases, INI files, XML schemas) when using the Autotools build process? Is there a piece of relevant Automake code to quickly set up a project to use this type of directory structure?
Simply try both files, with the uninstalled one taking precedence:
if (!(UIManager).add_ui_from_file("../data/ui/mygui"))
    (UIManager).add_ui_from_file("/installed/location/mygui");
In Glom, I created a helper function that tries both locations, with both locations defined in the Makefile.am (this is easier if you have only one Makefile.am, by using non-recursive automake, which is simpler anyway):
http://git.gnome.org/browse/glom/tree/glom/glade_utils.h#n38