I have a solution containing both C++ and C++/CLI projects, and a set of projects which unit test all of these using the Microsoft Unit Test Framework. For the C++/CLI projects, the unit test projects are C# unit tests. The solution currently has 32- and 64-bit platforms, and for each platform the unit test projects are set to the matching 32- or 64-bit target.
The issue I have is that when I switch between 32-bit and 64-bit I need to go to TEST > TEST SETTINGS > DEFAULT PROCESSOR ARCHITECTURE and flip between 32 and 64 as needed. If I don't, I get a warning from Visual Studio that a 64-bit image cannot run in a 32-bit process. This makes sense, but surely there is some way to automate this?
Otherwise if I do a batch build on a build machine I will not have control of this and the unit tests will fail.
Also, I have tried setting the unit test projects to AnyCPU, but this fails with the error "An attempt was made to load a program with an incorrect format".
Is there a better way perhaps?
If you're looking to automate test runs on a build machine, you can set the project to AnyCPU and run corflags /32bit+ (or /32bit-) to set the .NET assembly to the right platform before running the unit tests.
I'm not aware of an automatic solution for VS other than having two projects (one x64 and one x86) that link to the same files.
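For example, a rough build-machine sketch (the assembly path and project names are placeholders, and which corflags switch you need depends on which native build you are testing against):

rem flip the AnyCPU test assembly to 32-bit and run it against the Win32 build
corflags MyTests\bin\Release\MyTests.dll /32bit+
mstest /testcontainer:MyTests\bin\Release\MyTests.dll

rem clear the flag again and run the same assembly against the x64 build
corflags MyTests\bin\Release\MyTests.dll /32bit-
mstest /testcontainer:MyTests\bin\Release\MyTests.dll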
Related
I'm trying to add unit tests for my x86 .NET Core project.
I've created a new project inside my solution and Visual Studio made it x64 by default. I added a reference to the new project and tried to run an example test, but it failed because there was a mismatch between the projects' platform targets: x86 vs. x64.
Then I tried to change the platform target to x86 in the test project.
It gives me these warnings:
Test run will use DLL(s) built for framework .NETCoreApp,Version=v1.0 and platform X86. Following DLL(s) do not match framework/platform settings.
Project.UnitTests.dll is built for Framework 2.1 and Platform X86.
Exception discovering tests from Project.UnitTests: System.BadImageFormatException: Could not load file or assembly 'Project.UnitTests, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null'. An attempt was made to load a program with an incorrect format.
No test matches the given testcase filter `FullyQualifiedName=Project.UnitTests.UnitTest1.Test1
Any ideas how to make it work?
Thanks
Is it possible to compile a C++ project for Windows, Mac and Linux in Visual Studio 2017?
If not, what is the best way to compile for multiple platforms?
No. It is not generally possible to do that with Visual Studio.
In my opinion, the best approach is to configure your CI system to spin up virtual machines running those other operating systems and then perform the build natively in the VM using whatever compiler those systems provide (like GCC & Clang). With the help of a build system like SCons or CMake you can abstract away most of the platform specific compiler bits.
A bonus is that building your code (and running your tests) with multiple compilers is a good way to find bugs.
Visual Studio 2017 added support for building and debugging for Linux, either on a remote machine or using the new built-in local subsystem.
A fully cross-platform solution for you, porting your existing projects from VS, could be as follows:
Start by converting your entire solution tree to a CMake project (VS 2017 fully supports loading such a project instead of the MS project format of the .sln and .vcxproj files). You can try a conversion tool like this one.
Now that you have a CMake project, you can use the CMake build system directly from any other platform, for example on a virtual machine running your target OS. There, all you need to do is configure the project to build with clang/gcc instead of MSVC (a rough command-line sketch follows after these steps).
If you prefer staying closer to home for now, in terms of an editor/IDE that can be configured to build from the GUI (much like the VS you're used to), look for a cross-platform editor that supports CMake projects, such as VS Code.
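As a rough sketch of that build step, assuming the conversion left you with a top-level CMakeLists.txt, an out-of-source build on the target OS could look like this:

# configure and build with the system's default compiler (gcc or clang)
mkdir build && cd build
cmake ..
cmake --build .
# run the tests, if any were registered with add_test()
ctest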
We would like to measure code coverage for our own automated regression test system run over a fairly large native app. This is a sophisticated, scripted test system using the inbuilt scripting of our app. It has thousands of tests and is not going to be replaced by MSTest unit tests.
Whilst we're currently using VS2012 (Premium) as the IDE, the app is still compiled with the VS2010 compilers & libraries. That could change sooner if it were a prerequisite to getting code coverage going.
We can do separate builds for this - instrumenting is not a problem.
I'm just confused reading the MS documentation, which all seems to start from the assumption that you're running unit tests using their built-in test framework. That's when I'm not struggling to find material that actually talks about native code support in ALM in the first place!
thanks
Visual Studio 2012's code coverage tool is entirely separate from the test execution system (full disclosure: I wrote it, but the team that inherited it after I left Microsoft removed some fairly useful functionality). It was rewritten from the ground up in VS 2012 to dynamically instrument native (x86 and x86-64) and managed code (.NET and Silverlight) when it loads into the process instead of modifying executables on disk.
You can find CodeCoverage.exe in "%ProgramFiles%\Microsoft Visual Studio 11.0\Team Tools\Dynamic Code Coverage Tools".
To collect data:
CodeCoverage.exe collect /output:foo.coverage foo.exe foos_args
A configuration file (there's a default one in that directory called CodeCoverage.config) can be specified to control collection.
To analyze the coverage data, you can open foo.coverage in Visual Studio 2012 or use the coverage tool itself to do the analysis:
CodeCoverage.exe analyze /output:results.xml foo.coverage
Note: for instrumentation to take place, .pdb files must be discoverable for your modules. Since you are building with 2010, they may not work with 2012's DIA, so you may have to rebuild with 2012's toolset. If you are not seeing the modules you expect in the coverage analysis, pass /include_skipped_modules to the analyze command; there will be a "reason" attribute telling you why the module was skipped (excluded, no debug information, etc.).
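For example, assuming the same switch style as the analyze command above:

CodeCoverage.exe analyze /include_skipped_modules /output:results.xml foo.coverage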
Edit: Also, unlike previous versions of Visual Studio, 2012's coverage file format is completely self-contained. The modules and .pdbs don't need to be present at analysis time.
I realize this is an old post, but I believe the answer still is relevant.
With all the things I used to have at my disposal in C#, I didn't really like what I saw when I moved to Visual C++. Also, like you, MSTest only partially worked for me; I'm used to having my own test applications as well.
Basically what I wanted was the following:
Run MS tests or an EXE file.
Get code coverage right in Visual Studio.
After doing some research, I noticed that VS Enterprise supports this feature today with test adapters.
If you're not on VSE, I noticed there are some other tools, each providing users with an independent UI. Personally I don't like that; I want my coverage right in Visual Studio, preferably in Visual Studio Community edition.
So I decided to build this add-in myself, and while it's not as sophisticated as VSE, it does the trick for me.
I wrote a VSIX code coverage tool, available at https://github.com/atlaste/CPPCoverage. Basically it manages the highlighting in Visual Studio, generates a clickable report, and integrates into Solution Explorer.
For the coverage measurements themselves, I used to use https://opencppcoverage.codeplex.com/, which lets you perform code coverage tests on any debuggable (native) executable. Nowadays, I'm using my own code coverage measuring tools (which are also open-sourced at the link above).
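For reference, a minimal OpenCppCoverage run looks roughly like this (the paths and test runner name are placeholders; by default it writes an HTML report into the working directory):

OpenCppCoverage.exe --sources C:\Dev\MyApp\src -- MyTestRunner.exe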
I have a Visual Studio 2012 Solution with a number of native c++ test projects.
I can run all of these correctly and successfully from within Visual Studio 2012 using the Test Explorer tab.
However, I cannot get the tests to run when running from the command line.
Following the documentation I have been running the following command line
mstest /testcontainer:PathToTestProject\Win32\Release\testproject.dll
I also need to run
mstest /testcontainer:PathToTestProject\x64\Release\testproject.dll
for the testing of the 64bit version of the code.
When I run these command lines I get the following error message.
Microsoft (R) Test Execution Command Line Tool Version 11.0.50727.1
Copyright (c) Microsoft Corporation. All rights reserved.
Loading PathToTestProject\Win32\Release\testproject.dll...
PathToTestProject\Win32\Release\testproject.dll
Unable to load the test container PathToTestProject\Win32\Release\testproject.dll' or one of its dependencies. If you build your test project assembly as a 64 bit assembly, it cannot be loaded. When you build your test project assembly, select "Any CPU" for the platform. To run your tests in 64 bit mode on a 64 bit processor, you must change your test settings in the Hosts tab to run your tests in a 32 bit process. Error details: Could not load file or assembly 'file:///c:\PathToTestProject\Win32\Release\testproject.dll' or one of its dependencies. The module was expected to contain an assembly manifest.
The code is native C++ and has two build configurations, one on the Win32 platform and the other on the x64 platform. I cannot have an AnyCPU platform configuration.
What am I missing here to be able to run the tests from the command line?
After a lot of searching, I finally discovered a very well-hidden MSDN documentation page here which states the compatibility of mstest with different test project types.
And it turns out that mstest is not compatible with native unit tests (nice of MSDN to document this in such an easy-to-find location).
Instead, you need to use the Visual Studio test runner (vstest.console.exe) rather than mstest for native unit test projects.
For example:
vstest.console.exe /Platform:x64 PathToTestProject\x64\Release\testproject.dll
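For the 32-bit build, the same runner should work with the platform switched:

vstest.console.exe /Platform:x86 PathToTestProject\Win32\Release\testproject.dll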
I'm considering running unit tests for my Visual Studio 2010 projects on our build server at build time. The problem is that when I'm working locally, I want to test against DEV, when building for QA, I want the tests to run against QA, when building/promoting for UAT/PROD... you get the picture.
I think VS 2010 might have support for per-environment configs. If so, does this apply to test projects as well? If not, what are some alternatives?
thanks,
Mark
I don't really have much experience with VS 2010, but there was no such functionality in VS 2008.
I usually have an MSBuild script to build the solution and run its tests. In your case I would add a step to this script that sets the correct configuration after the code is built and before the tests are run. Last time I used the XmlUpdate task from http://msbuildtasks.tigris.org/ for that.
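As a rough sketch, assuming a hypothetical BuildAndTest.proj wired up this way (none of these target or property names are generated for you), the build machine would invoke something like:

msbuild BuildAndTest.proj /t:Build;UpdateTestConfig;RunTests /p:TargetEnvironment=QA

where the UpdateTestConfig target would use XmlUpdate to rewrite the test project's .config for the environment named in TargetEnvironment before RunTests executes the test assemblies.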