How do I manage deployment item references that may be on either an x86 or an x64 install for an MSTest-based project?

Much related to this question, we have a scenario on my team where, as part of the test's deployment step, we need to copy the contents of a folder containing a suite of libraries and their configuration files to the folder our test code runs from.
Due to the installation size, and other factors, checking this install folder into source control for sharing between team members just isn't viable.
The install path for the folder is either /Program Files/InternalTool/ or /Program Files (x86)/InternalTool/ depending on the installed environment. I want to set up my .testrunconfig file so that when a person gets the latest version of the solution, they don't have to worry about fixing up the path to the shared internal library suite.
Is there a way to make this seamless for all members involved, and if so, how could one accomplish this?
Restrictions are as follows:
can't check in shared suite
shared suite has no override for installation path
Is this possible, or am I asking for too much?

We handle this sort of issue (our issues are not the same but are similar) by having different config files with different names and copying the correct one over when it is needed.
In some cases we automate this within the batch job that gets the latest version.

This was actually way, way easier than I expected.
While the UI doesn't support many things with the local test run config file, I was able to set the path using the standard %ProgramFiles%.
On x86 systems, this resolves, on most systems, to C:\Program Files\.
On x64 systems, it also resolves, on most systems, to C:\Program Files\.
But! If the calling process is 32-bit (built as x86 rather than x64 or MSIL/AnyCPU), %ProgramFiles% resolves to C:\Program Files (x86)\. Since there is no 64-bit mstest, the resolution should happen seamlessly. As an example, this is ripped from my LocalTestRun.testrunconfig file and then properly sanitized:
<Deployment>
    <DeploymentItem filename="%ProgramFiles%\InternalSuite\" />
</Deployment>
While I haven't had the chance to fully test this yet, this should resolve our issue just fine. I have tested this on my 32-bit system, and have found that it resolves right as rain.
Hope this helps someone else!

Related

Is it possible to edit the hardcoded path in a Windows (custom built) installation of Qt 5?

We are building Qt 5.10 internally, and installing it to a given prefix on the build environments.
We would like to be able to relocate the installation (notably, but not only, for distribution). We are aware of qt.conf, as pointed out by this answer.
Yet, is there a maintained way to directly edit the values of those hardcoded paths in the installed files?
EDIT:
More rationale behind why we think qt.conf is inferior to directly patching the binaries.
On development machines, it means that instead of simply patching the installed binaries once, we have to provide a configuration file in each folder containing an application depending on Qt.
Even worse than that, we discovered through failures (and with the help of this post) that qtwebengineprocess.exe, in qtprefix/bin, expects its own qt.conf file, otherwise it will use the paths hardcoded in the libraries. This means that we have to touch the library folder anyway, in order to edit the configuration file to make it match the folder location on each development machine.

TFS Build 2015 - Using Globally Referred Files in Every Build

So, we are in this process of migrating XAML Builds to vNext (2015) Builds on TFS, and we are trying to "do things as clean as possible", since we had many, many customizations on the XAML builds that could be avoided and actually gave us problems along the way.
One major issue we are facing is with paths and "global files". Let me explain:
There are some files that, for convenience, we keep in a single place, and every SLN file in that Collection refers to them. These include Code Analysis rule sets, signing files (SNK), and so on. So a change is made in one place only and it affects every build.
Well, in XAML Builds we have a Build that runs with CI and downloads (Gets) those files, and since we hammered in the same exact pathing for TFS and for the machines (with an environment variable for the beginning of the path), the path is the same on the developer and build machines. However, this creates dependencies between builds and workspace issues.
My question here is: is there a configuration that I am missing that allows referring to files in branches other than the build one? Since I'm trying to keep the build machines as "disposable" as possible, each is running with an out-of-the-box agent config: no custom paths, no hardwiring.
I already tried, for example, referring to the files directly by their source control path. The only options I'm seeing are either creating a PowerShell/CMD script that downloads those files right into the same folder as the SLN, or keeping it "as it is", using relative paths and putting a "Build" build step before the actual build step so it downloads the files to the server.
Isn't there a more elegant way of doing this? Or is our methodology wrong from the get-go?
You can add a Copy Files step to the build definition to copy the files that the build needs: point its Source Folder at the location of the shared files and its Target Folder at the directory the build expects them in.

Is It Ok to Move Boost Library Installation To New Computer Without Reinstalling

I originally installed boost per the instructions at http://www.boost.org/doc/libs/1_55_0/doc/html/bbv2/installation.html
I transferred most of my Windows user profile to a new computer, including a folder called CodeLibs. This folder is where I originally installed boost (in place of PREFIX in the above documentation).
I compiled a project that uses the serialization library, and I didn't receive any errors.
My question is, is there any reason to go through the documented installation process again or is the above directory transfer sufficient?
Thanks in advance.
Copying should be fine, so long as the target architecture is the same.
Boost doesn't need to be "installed" in the typical way. There are no registry settings to set, no COM servers to install, no daemons to set up. Nothing like that.
The install process you went through originally mostly consisted of compiling code. That code, once compiled, was then copied to some destination folder and some environment variables might have been updated.
None of this is truly necessary, but once you get the code on your target machine you might have to tweak a few paths etc so that the compiler can find the headers and libs (if any libs are needed), and executables can find the shared libraries.
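For example, if the consuming project is a Visual Studio C++ project, the tweaks might amount to something like this fragment in the .vcxproj. This is a sketch only; the CodeLibs sub-folder names are illustrative and should match wherever the headers and compiled libs actually landed:
<!-- Sketch: point the compiler and linker at the transferred CodeLibs folder. -->
<ItemDefinitionGroup>
  <ClCompile>
    <AdditionalIncludeDirectories>$(UserProfile)\CodeLibs\include;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
  </ClCompile>
  <Link>
    <AdditionalLibraryDirectories>$(UserProfile)\CodeLibs\lib;%(AdditionalLibraryDirectories)</AdditionalLibraryDirectories>
  </Link>
</ItemDefinitionGroup>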
Assuming you have a high level of proficiency with such things -- as is suggested by the fact that you were able to install it the first time at all -- I'm sure none of this will be a major challenge for you.

Where to install SDK DLLs on a system so that they can be found by apps that need them

I've got an SDK I'm working on and the previous developer just dropped the DLLs in System32 (Apparently a serious offense: see here)
So assuming I move them out into \Program Files\\SDK (or whatever), how do I make sure that all the apps that need those DLLs can access them? And to clarify, all apps that access these are doing early (static) binding to the DLLs at compile time, so I can't pass the full path to them or anything. They need to be able to find the DLLs given the filename only.
Along the same lines, what about including a particular version of MSVCR80.dll? They all depend on this but I need to make sure they get a specific version (the one I include).
Any ideas?
An SDK is by definition a development kit. It's not a deployment patch...
What this means is that the applications that depend on those assemblies should ship with them and install them into their own directories under \Program Files.
The reason for this is that if you ever make a breaking change, for example by eliminating an entry point, installing your "SDK" has the potential to stop older programs from functioning.
You could take a play from the Java handbook and update the PATH environment variable. When a program loads an external assembly, the loader searches the directories listed in that variable until it finds it.
Of course, this could still result in the problem showing up. So your best bet is to just install the SDK into Program Files and let the developers of the products that depend on your toolkit decide whether they want to update their versions or not.
UPDATE
As I'm thinking about this, one last possibility is to GAC your assemblies. If you do so, bear in mind that they should be strongly named and properly versioned so as not to step on each other. I don't recommend this route because it hides the actual locations of the assemblies and makes uninstalling a little more difficult than simply hitting delete on your directory.
I can't tell you about your own DLLs, but you should never redistribute Microsoft DLLs alone.
You always have to use Microsoft Redistributable Package.
For example, if your application depends on a DLL from Visual Studio 2005 SP1, you should redistribute your application with the Microsoft Visual Studio 2005 SP1 redistributable. The same applies to 2008. MS provides an MSI-based installer and a merge module to include in your own product installer.
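If, say, you author your installer with WiX, consuming the CRT merge module might look roughly like this (a sketch only; the .msm file name and the IDs are illustrative):
<!-- Sketch: pull the VC++ 2005 SP1 x86 CRT merge module into your own MSI. -->
<Directory Id="TARGETDIR" Name="SourceDir">
  <Merge Id="VC80CRT" SourceFile="Microsoft_VC80_CRT_x86.msm" DiskId="1" Language="1033" />
</Directory>
<Feature Id="MainFeature" Title="SDK" Level="1">
  <MergeRef Id="VC80CRT" />
</Feature>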
You are asking about "DLL Hell", something I had thought every Windows developer was familiar with. The order of search for DLLs is:
the directory the exe that calls them was loaded from
the current directory
various Windows directories (as discussed in your previous question)
directories in the PATH variable
As the Windows directories should be ruled out, that leaves you with the other three locations and, realistically, two options.
You can put your install path in the search path, which will allow the applications to find them.
Alternatively, you can deploy the DLLs into the same directory as the application that depends on them.
I believe the first is better from an SDK perspective, since it'll make development easier. But I think the second is better when the application gets deployed to end users, unless you expect there to be many consumers on a single system, in which case the disk and memory footprint of having multiple copies of the DLLs may be prohibitive.
If you can't install the DLLs into the same directory as the exe that uses them, you could append your directory to the PATH environment variable.
You don't say which version of Windows you're using, and the details differ slightly between versions as I remember.
You could also put your version of MSVCR80.dll in the same folder. However, you'd have to ensure that your folder came before the system one on the path, otherwise the loader would pick up the "standard" one first. If you adopted the "local" DLLs approach, though, you wouldn't have this problem, as Windows searches the local directory first and so will pick up your version of MSVCR80.dll.
Is your version the latest or a previous version? You might be better off getting your app to work with that version or later and then allowing users to update their machines as required. This also illustrates why you should never mess with the DLLs in \Windows or \Windows\system32: as others have pointed out, you could break other applications by changing the version of this DLL.

How to compensate for differences in environment between development machines and Build servers?

I have NUnit installed on my machine in "C:\Program Files\NUnit 2.4.8\" but on my integration server (running CruiseControl.NET) I have it installed in "D:\Program Files\NUnit 2.4.8\". The problem is that my NAnt build file works correctly on my development machine, because in the task I'm using the path "C:\Program Files\NUnit 2.4.8\bin\NUnit.Framework.dll" to add a reference to the 'NUnit.Framework.dll' assembly, but this same build file cannot build on my integration server (because the reference path is different). Do I have to have NUnit installed at the same location on my development machine as on my integration server? That solution seems too restrictive to me. Are there any better ones? What is the general solution to this kind of problem?
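Roughly, the relevant part of the build file looks like this (names simplified; I'm assuming a <csc> task here just to illustrate, and it's the absolute path that is the problem):
<!-- Sketch of the current, machine-specific reference that breaks on the build server. -->
<csc target="library" output="build\MyApp.Tests.dll">
  <sources>
    <include name="src\**\*.cs" />
  </sources>
  <references>
    <include name="C:\Program Files\NUnit 2.4.8\bin\nunit.framework.dll" />
  </references>
</csc>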
Typically I distribute NUnit and any other dependencies with my project, in some common location (for me that's a libs directory in the top level).
/MyApp
    /libs
        /NUnit
        /NAnt
        /etc...
    /src
        /etc...
I then just reference those libs from my application, and they're always in the same location relative to the project solution.
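For instance, in a .csproj the reference then becomes a hint path relative to the project, so it is identical on every machine that gets the source tree (a sketch; adjust the relative path to your layout):
<!-- Sketch: the hint path points into the checked-in libs folder. -->
<Reference Include="nunit.framework">
  <HintPath>..\libs\NUnit\nunit.framework.dll</HintPath>
</Reference>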
In general, dependencies on absolute paths should be avoided. As far as CI goes, you should be able to build and run your solution on a clean machine completely from scratch, using only resources found in your source code control, via automated scripts.
The "ultimate" solution can be to have the entire tool chain stored in your source control, and to store any libraries/binaries you build in source control as well. Set up correctly, this can ensure you have the ability to rebuild any release, from any point in time, exactly as it was shipped; furthermore, you rarely need to, because every binary you've ever generated is already source-controlled.
However, getting to that point is some serious work.
I'd use two approaches:
1) use two different staging scripts (dev build/integration build) with different paths.
2) put all the needed executables in a folder on your PATH and call them directly.
I'd agree that absolute paths are evil. If you can't get around them, you can at least set an NUNIT_HOME property within your script that defaults to C:..., and on your CI server call your script passing in the NUNIT_HOME property at the command line.
Or you can set your script to require an NUNIT_HOME environment variable to be set in order for NUNIT to work. Now, instead of requiring that the machine it runs on has nUnit in some exact location, your script requires that nunit be present and available in the environment variable.
Either approach would allow you to change the version of NUnit you are using without modifying the build script. Is that what you want?
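For the first variant, a minimal NAnt sketch might look like this (the property name and default path are just examples):
<!-- Sketch: defaults to the dev-machine location unless the CI server overrides it,
     e.g. nant -D:nunit.home="D:\Program Files\NUnit 2.4.8" -->
<property name="nunit.home" value="C:\Program Files\NUnit 2.4.8" overwrite="false" />
<property name="nunit.framework.dll" value="${nunit.home}\bin\nunit.framework.dll" />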
The idea of having all the tools in the tool chain under version control is a good one. But while you're on your way there, you can use a couple of different techniques to specify different paths per machine.
NAnt lets you define a <property> that you can override with -D:name=value on the command line. You could use this to set a default location for your development machines that you override in your CI system.
You can also get the values of environment variables using environment::get-variable to change the location per machine.
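For instance, a rough sketch (the variable name is whatever you standardize on):
<!-- Sketch: take the location from an NUNIT_HOME environment variable when it is set,
     otherwise fall back to a default. -->
<property name="nunit.home"
          value="${environment::get-variable('NUNIT_HOME')}"
          if="${environment::variable-exists('NUNIT_HOME')}" />
<property name="nunit.home" value="C:\Program Files\NUnit 2.4.8" overwrite="false" />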