Prevent a newly added csproj from adding AnyCPU back to the solution file - visual-studio-2017

We have a solution that we only want to have the x86 platform, but every time we add a new project to the solution, it adds AnyCPU back for every single project in the solution. Removing all the AnyCPU lines from the solution file is tedious because we have 70+ projects in the solution. Is there any way to configure Visual Studio to prevent this from being added?
Not sure if this is relevant, but we are on the legacy project system and only use csproj files in our solution.
EDIT 1:
The reason I would like to keep AnyCPU from being added back to the solution is warnings and issues when building with certain NuGet packages.
Some of our third-party dependencies are built against x86, and referencing them produces warnings with no warning codes, so I am unable to suppress them.
The NuGet package I specifically know causes issues is CefSharp. Our desktop application that references it will fail to build if the developer selects AnyCPU, because CefSharp uses the platform to determine whether it should copy its unmanaged x86 or x64 DLLs.
EDIT 2:
Here is the section of the solution file that causes issues when we go to build. From what I have read, Visual Studio looks through this list alphabetically for a platform if one is not provided. This example is from an unrelated solution.
GlobalSection(SolutionConfigurationPlatforms) = preSolution
    Debug|Any CPU = Debug|Any CPU
    QA|Any CPU = QA|Any CPU
    Release|Any CPU = Release|Any CPU
EndGlobalSection
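For reference, this is roughly what I want that section to look like once Any CPU is gone (a hand-edited sketch based on the entries above, not copied from a real solution; the per-project ProjectConfigurationPlatforms rows are the other lines that have to be cleaned up):
GlobalSection(SolutionConfigurationPlatforms) = preSolution
    Debug|x86 = Debug|x86
    QA|x86 = QA|x86
    Release|x86 = Release|x86
EndGlobalSection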
EDIT 3:
As far as I can tell, Hans' answer is the correct way to handle this. I have looked for other ways to handle it, but the only other thing I was able to find was a UserVoice suggestion for this from 2011.

This is a very common mistake, and VS2010 is in large part responsible for it: its project templates chose x86 instead of AnyCPU. This was fixed again in VS2012, but without otherwise repairing any damage done to solutions that were once exposed to VS2010, or helping programmers get it right.
The platform selection is meaningless for C# projects. You use the exact same build tools for any platform, and the generated code is truly compatible with "any cpu". It is the just-in-time compiler that locks in the target processor, and it does so at runtime. The only setting that matters at all to affect what the jitter does is in the Project > Properties > Build tab. Only the settings for the EXE project matter; libraries have no choice but to be compatible with the bitness of the process.
It does matter for C++ projects. A lot: they use a completely different compiler and linker for each platform. Necessarily so, because C++ projects generate machine code up front and that code must be compatible with the user's machine. This is also the reason this got fumbled in VS2010; that's when the C++ build system moved to MSBuild.
The typical reason AnyCPU pops back into the solution is adding a new project. Since new projects default to AnyCPU again, that platform needs to be added back to the solution platforms.
By far the best solution is to stop fighting the machine: AnyCPU should be your preference. Use Build > Configuration Manager > Active solution platform combobox > Edit and remove x86 so only AnyCPU remains. And do focus on what you want to accomplish; it is the EXE project settings that matter. Beware of yet another trap: even though the default platform is AnyCPU, a project template turns on the "Prefer 32-bit" checkbox by default, which is not any cpu anymore. High time Microsoft fixed this, by the way; the 64-bit debugger and jitter have been stable and capable long enough to no longer need it.
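To make the point about the Build tab concrete, here is a minimal sketch of where those settings live in the EXE project's .csproj (the property names are standard MSBuild; the Release|AnyCPU condition is an illustrative assumption, not taken from the question):
<!-- EXE project .csproj: the settings that actually affect the jitter live here, not in the .sln -->
<PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'Release|AnyCPU'">
  <PlatformTarget>AnyCPU</PlatformTarget>
  <!-- unchecking "Prefer 32-bit" so the process can run 64-bit when the OS allows it -->
  <Prefer32Bit>false</Prefer32Bit>
</PropertyGroup>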

Related

How to find out exactly why Visual Studio 2019 rebuilds unmodified .NET Core projects

I'm building a solution containing .NET Standard 2.0 and .NET Core 2.0 projects (C# and F#) in VS2019 (16.1.1). If I build multiple times without changes, the second and subsequent builds should say "Build: 0 succeeded, 0 failed, X up-to-date", but it sometimes rebuilds some projects every time. How do I find out exactly why?
There are many SO questions and blog posts about this, most of them suggesting setting the build log verbosity to "Diagnostic" and looking for "not up to date". I've done that and the string is not found, nor is "not up-to-date" (but "up-to-date" occurs many times), so this appears to have changed in VS2019. I also know about the U2DCheckVerbosity registry setting, but that's only for .NET Framework. Reading through the build log output is unrealistic, as it's over 360 thousand lines, so I need to know what to search for.
Please note, I'm not looking for guesses as to what the problem might be - I'm looking for a way to get VS/the compiler to tell me.
I'm looking for a way to get VS/the compiler to tell me. (For VS2019)
It's hard to reproduce the same issue, so I'm not sure about the cause. But since what you're asking for is a way to find up-to-date-related info in the Output window, you can check the Up To Date Checks option for .NET Core.
Go to Tools => Options => Projects and Solutions => .NET Core => Up To Date Checks. Make sure "Don't call MSBuild if a project appears to be up-to-date" is checked, then change the Logging Level to Info or Verbose (choose the level that suits your needs).
For normal .NET Framework or C++ projects, the build output verbosity setting (under Build and Run) is a great help, but when trying to find the reason why VS considers a .NET Core or .NET Standard project out of date, this option is better since its output is clearer.
For example, with the Info-level .NET Core Up To Date Check enabled, the Output window shows why a .NET Core project that depends on a .NET Standard project is or isn't considered up to date.
And if you have many projects in one solution, I suggest building one project at a time instead of the whole solution, so you can locate the cause of the rebuild more easily.
VS writes a file called NETCoreApp,Version=v2.0.AssemblyAttributes.cs into the temp folder. If you build several .NET Core projects, the file gets changed by the other projects, and VS thinks the old project was modified and rebuilds it.
Move the generated file into the project to reduce the rebuilds:
<!-- generate the AssemblyAttributes file inside the project folder (and don't clean it) instead of sharing one in %TEMP% -->
<PropertyGroup>
  <TargetFrameworkMonikerAssemblyAttributesFileClean>False</TargetFrameworkMonikerAssemblyAttributesFileClean>
  <TargetFrameworkMonikerAssemblyAttributesPath>$(MSBuildThisFileDirectory)SharedAssemblyAttributes.cs</TargetFrameworkMonikerAssemblyAttributesPath>
</PropertyGroup>
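The answer doesn't say where to put this. One option (my assumption; it requires MSBuild 15 / VS2017 or later) is a Directory.Build.props file next to the .sln, so every project picks it up without editing each .csproj; $(MSBuildProjectDirectory) resolves per project, so each project still gets its own copy of the file:
<!-- Directory.Build.props in the solution root (hypothetical placement) -->
<Project>
  <PropertyGroup>
    <TargetFrameworkMonikerAssemblyAttributesFileClean>False</TargetFrameworkMonikerAssemblyAttributesFileClean>
    <TargetFrameworkMonikerAssemblyAttributesPath>$(MSBuildProjectDirectory)\SharedAssemblyAttributes.cs</TargetFrameworkMonikerAssemblyAttributesPath>
  </PropertyGroup>
</Project>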
In VS2019 the option to control this logging is under Tools > Options > Projects and Solutions > .NET Core > Up To Date Checks > Logging Level.
The default is "None", but honestly "Minimal" is a good setting in general. When set to that level, only a single line is output per project, and only if that project is not up-to-date. That line will explain exactly why the project is considered out-of-date.
It's worth remembering that this is Visual Studio's up-to-date check, which it uses to quickly assess project state and avoid the comparatively expensive call to MSBuild. It is possible, with exotic project configurations, that VS determines your project needs building, but MSBuild doesn't actually build. This is rare, but can be worth understanding if you're debugging issues here.

How to profile the build process?

I am working on a large (~1 MLOC) C++ application which takes too long to build from source (on Windows using Visual Studio, on the Mac using a Makefile or Xcode). I would like to know where to start optimizing (e.g. precompiled headers, forward declarations, ...).
As with performance of the application itself, I would like to profile the build process before I start optimizing.
What tools are available to support this?
Firstly, please state exactly which version of Visual Studio you're using. If possible, upgrade to VS2010 as this has much better support for parallel building. Here are several things to consider:
Put the source tree on a different disk from the system disk. If you can extend to 2 SSDs (1 for system, 1 for source), this makes a huge difference.
Enable parallel builds. In VS2010 this halved our build time for a project about the same size as yours. Enable the 'Multiprocessor compilation' switch (/MP); a project-file sketch of this setting follows the list. You may find that one or two of your projects need this turned off if they have strange dependencies, but as long as it's on for most projects you'll get a massive boost.
VS2010 has verbose build timing logging options which can help you isolate the time spent in different projects. VS2005/2008 also have a build timing option.
If you have VS2005 or VS2008 then try out the MPCL plugin (it's not free but very cheap), which will do better parallel building than VS itself. If you have the budget, there are tools like Incredibuild.
If you're using Makefiles then use the -j flag to parallelise. If you're using Xcode then you can use distributed builds if you have other Macs available (I've never had any luck with this myself though).
You could look into using ccache with gcc.
Enable precompiled headers for all or most projects (also shown in the sketch below). It may take a bit of experimenting to work out how much benefit you get -- you do hit diminishing returns quite quickly the more you put in them (and the more you have in them, the more rebuilds you'll need to do).
Read John Lakos's book Large-Scale C++ Software Design, which is a fantastic source of advice on how to split up large projects to isolate dependencies.
Consider a two-stage build process. If you have lots of third-party libraries that need to be built, or other libraries that don't change all that often, then set up a separate project for them. Try building that in parallel with your main project, or save the binaries. Consider checking the binaries into your source control system (yes, I know checking binaries into SCM is generally considered evil, but I believe you have to be pragmatic).
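As a rough illustration of the /MP and precompiled-header suggestions above, these are the corresponding settings in a VS2010-style .vcxproj (a sketch only; the stdafx.h name and the Release|Win32 condition are assumptions):
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
  <ClCompile>
    <!-- "Multi-processor Compilation" (/MP): compile source files in parallel -->
    <MultiProcessorCompilation>true</MultiProcessorCompilation>
    <!-- use a precompiled header built from stdafx.h -->
    <PrecompiledHeader>Use</PrecompiledHeader>
    <PrecompiledHeaderFile>stdafx.h</PrecompiledHeaderFile>
  </ClCompile>
</ItemDefinitionGroup>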
There are many ways of improving build times. One of them is of course more hardware, i.e. faster disks and more RAM. Another is compiler features like precompiled headers. There are also external tools that can help, like distcc or ccache. For GNU make, there is also the -j option to run several make processes in parallel.

How do I make a fully statically linked .exe with Visual Studio Express 2005?

My current preferred C++ environment is the free and largely excellent Microsoft Visual Studio 2005 Express edition. From time to time I have sent release .exe files to other people with pleasing results. However, recently I made the disturbing discovery that the pleasing results were based on more luck than I would like. Attempting to run one of these programs on an old (2001 vintage, not scrupulously updated) XP box gave me nothing but a nasty "System cannot run x.exe" (or similar) message.
Some googling revealed that with this toolset, even specifying static linking results in a simple hello-world.exe actually relying on extra .dll files (msvcm80.dll etc.). An incredibly elaborate versioning scheme (manifest files, anyone?) then will not let the .exe run without exactly the right .dll versions. I don't want or need this stuff; I just want an old-fashioned, self-contained .exe that does nothing but lowest-common-denominator Win32 operations and runs on any old Win32 OS.
Does anyone know if it's possible to do what I want with my existing toolset?
Thank you.
For the C runtime, go to the project settings, choose C/C++, then 'Code Generation'. Change the 'Runtime Library' setting to 'Multi-threaded' instead of 'Multi-threaded DLL'.
If you are using any other libraries you may need to tell the linker to ignore the dynamically linked CRT explicitly.
My experience in Visual Studio 2010 is that two changes are needed in order to not need DLLs. From the project property page (right-click on the project name in the Solution Explorer window):
Under Configuration Properties --> General, change the "Use of MFC" field to "Use MFC in a Static Library".
Under Configuration Properties --> C/C++ --> Code Generation, change the "Runtime Library" field to "Multi-Threaded (/MT)". (A project-file sketch of both settings follows below.)
Not sure why both were needed. I used this to remove a dependency on glut32.dll.
Added later: When making these changes to the configurations, you should make them to "All Configurations" --- you can select this at the top of the Properties window. If you make the change to just the Debug configuration, it won't apply to the Release configuration, and vice versa.
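For reference, here is roughly what those two changes look like inside a VS2010 .vcxproj (a sketch; the Release|Win32 condition is assumed rather than taken from the answer):
<!-- "Use of MFC" = "Use MFC in a Static Library" -->
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
  <UseOfMfc>Static</UseOfMfc>
</PropertyGroup>
<!-- "Runtime Library" = "Multi-threaded (/MT)" -->
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
  <ClCompile>
    <RuntimeLibrary>MultiThreaded</RuntimeLibrary>
  </ClCompile>
</ItemDefinitionGroup>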
I've had this same dependency problem, and I also know that you can include the VS 8.0 DLLs (release only, not debug; and your program has to be a release build too) in a folder of the appropriate name, in the parent folder with your .exe:
How to: Deploy using XCopy (MSDN)
Also note that things are guaranteed to go awry if you need to have C++ and C code in the same statically linked .exe because you will get linker conflicts that can only be resolved by ignoring the correct libXXX.lib and then linking dynamically (DLLs).
Lastly, with a different toolset (VC++ 6.0) things "just work", since Windows 2000 and above have the correct DLLs installed.
Regarding Jared's response, having Windows 2000 or better will not necessarily fix the issue at hand. Rob's response does work; however, it is possible that this fix introduces security issues, as Windows updates will not be able to patch applications built as such.
In another post, Nick Guerrera suggests packaging the Visual C++ Runtime Redistributable with your applications, which installs quickly, and is independent of Visual Studio.

Using Makefile instead of Solution/Project files under Visual Studio (2005)

Does anyone have experience using makefiles for Visual Studio C++ builds (under VS 2005) as opposed to using the project/solution setup? For us, the way that projects/solutions work is not intuitive and leads to configuration explosion when you are trying to tweak builds with specific compile-time flags.
Under Unix, it's pretty easy to set up a makefile that has its default options overridden by user settings (or other configuration setting). But doing these types of things seems difficult in Visual Studio.
By way of example, we have a project that needs to get built for 3 different platforms. Each platform might have several configurations (for example debug, release, and several others). One of my goals on a newly formed project is to have a solution that can have all platform builds living together, which makes building and testing code changes easier since you don't have to open 3 different solutions just to test your code. But Visual Studio will require 3 * (number of base configurations) configurations, i.e. PC Debug, X360 Debug, PS3 Debug, etc.
It seems like a makefile solution is much better here. Wrapped with some basic batch files or scripts, it would be easy to keep the configuration explosion to a minimum and only maintain a small set of files for all of the different builds that we have to do.
However, I have no experience with makefiles under visual studio and would like to know if others have experiences or issues that they can share.
Thanks.
(post edited to mention that these are C++ builds)
I've found some benefits to makefiles with large projects, mainly related to unifying the location of the project settings. It's somewhat easier to manage the list of source files, include paths, preprocessor defines and so on, if they're all in a makefile or other build config file. With multiple configurations, adding an include path means you need to make sure you update every config manually through Visual Studio's fiddly project properties, which can get pretty tedious as a project grows in size.
Projects which use a lot of custom build tools can be easier to manage too, such as if you need to compile pixel / vertex shaders, or code in other languages without native VS support.
You'll still need to have various different project configurations, however, since you'll need to differentiate the invocation of the build tool for each config (e.g. passing in different command line options to make); a per-configuration sketch follows at the end of this answer.
Immediate downsides that spring to mind:
Slower builds: VS isn't particularly quick at invoking external tools, or even working out whether it needs to build a project in the first place.
Awkward inter-project dependencies: It's fiddly to set up so that a dependee causes the base project to build, and fiddlier to make sure that they get built in the right order. I've had some success getting SCons to do this, but it's always a challenge to get working well.
Loss of some useful IDE features: Edit & Continue being the main one!
In short, you'll spend less time managing your project configurations, but more time coaxing Visual Studio to work properly with it.
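One concrete way to drive an external makefile from within VS is a "Makefile" project, where each configuration just supplies a different build command. This is a sketch in the VS2010+ .vcxproj format (the make command lines and the myapp.exe output name are made up for illustration):
<!-- Makefile-type project: each configuration invokes make with different options -->
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
  <NMakeBuildCommandLine>make CONFIG=debug</NMakeBuildCommandLine>
  <NMakeCleanCommandLine>make CONFIG=debug clean</NMakeCleanCommandLine>
  <NMakeOutput>$(OutDir)myapp.exe</NMakeOutput>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
  <NMakeBuildCommandLine>make CONFIG=release</NMakeBuildCommandLine>
  <NMakeCleanCommandLine>make CONFIG=release clean</NMakeCleanCommandLine>
  <NMakeOutput>$(OutDir)myapp.exe</NMakeOutput>
</PropertyGroup>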
Visual Studio is built on top of MSBuild configuration files. You can consider *proj and *sln files as makefiles; they allow you to fully customize the build process.
While it's technically possible, it's not a very friendly solution within Visual Studio. It will be fighting you the entire time.
I recommend you take a look at NAnt. It's a very robust build system where you can do basically anything you need to.
Our NAnt script does this on every build (a skeletal sketch follows the list):
Migrate the database to the latest version
Generate C# entities off of the database
Compile every project in our "master" solution
Run all unit tests
Run all integration tests
Additionally, our build server leverages this and adds 1 more task, which is generating Sandcastle documentation.
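A skeletal sketch of what such a NAnt build file can look like (the target names, file names, and tool names here are made up for illustration; the real script obviously does much more):
<?xml version="1.0"?>
<project name="master" default="build" basedir=".">
  <!-- compile every project in the "master" solution via MSBuild -->
  <target name="build">
    <exec program="msbuild">
      <arg value="Master.sln" />
      <arg value="/p:Configuration=Release" />
    </exec>
  </target>
  <!-- run the test assemblies after a successful build -->
  <target name="test" depends="build">
    <exec program="nunit-console.exe">
      <arg value="UnitTests.dll" />
      <arg value="IntegrationTests.dll" />
    </exec>
  </target>
</project>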
If you don't like XML, you might also take a look at Rake (ruby), Bake/BooBuildSystem (Boo), or Psake (PowerShell)
You can use NAnt to build the projects individually, thus replacing the solution, and have one coding solution and no build solutions.
One thing to keep in mind is that the solution and csproj files from VS 2005 and up are MSBuild scripts. So if you get acquainted with MSBuild, you might be able to wield the existing files, to make VS easier and to make your deployment easier.
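For example, because the project files are MSBuild scripts, you can get the "defaults overridden by user settings" behaviour asked about above by declaring defaults and then importing an optional per-user file (a minimal sketch; the user.props name and the property are made up):
<!-- inside a .csproj/.vcxproj: defaults first, then an optional user override -->
<PropertyGroup>
  <MyExtraDefines>STANDARD_BUILD</MyExtraDefines>
</PropertyGroup>
<Import Project="user.props" Condition="Exists('user.props')" />
The same properties can also be overridden on the command line, e.g. msbuild MySolution.sln /p:Configuration=Release /p:MyExtraDefines=FAST_MATH.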
We have a similar setup to the one you are describing. We support at least 3 different platforms, and we found that using CMake to manage the different Visual Studio solutions works well. Setup can be a bit painful, but it pretty much boils down to reading the docs and a couple of tutorials. You should be able to do virtually everything you can do by going to the properties of the projects and the solution.
Not sure if you can have all three platform builds living together in the same solution, but you can use CruiseControl to take care of your builds and run your testing scripts as often as needed.