I'm on a BizTalk 2013 solution and I'm trying to grow into automated testing. However, when I try to run my tests after changing only the test project, or even just run the tests after changing nothing anywhere, I'm stuck building the same amount of projects that I build when I invoke a full rebuild on the project being tested. This eats up an enormous amount of time, and it's a death sentence for my ability to sell future investments into this type of thing.
Is this a known deficiency with BizTalk, or with its interaction with MSBuild? Is it a known pitfall that I can repair on my end?
EDIT: After reviewing the "possible duplicate" thread, I believe this question to be similar, but distinct. The explanation in that thread highlights the mechanics by which MSBuild determines that a rebuild is necessary, but MSBuild is used across all project types in Visual Studio, and its behavior can differ significantly depending on the targets a given project type imports. I've edited the question title to reflect that I want to learn how to prevent this for BizTalk solutions rather than simply asking why it's happening (although knowing why is always helpful).
So, what you're seeing is not a problem with BizTalk (because BizTalk is perfect and wonderful and never has any problems ever...:).
It's actually a behavior of Visual Studio. To note, BizTalk projects are just specialized C# projects.
The best workaround, which I do all the time, is to uncheck the Build and Deploy options in the Solution Configuration (Configuration Manager) for projects I'm not actively working on. If a project is not checked for Build, it will not build even when you choose Rebuild Solution.
One possible solution is to reference not the projects themselves, but the DLLs those same projects produce once they are already compiled and built.
This way, when building your test project, it is built against these existing assemblies and doesn't spend time rebuilding them.
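For example, in an old-style .csproj this amounts to swapping a project reference for a plain file reference to the built assembly; the project names and paths below are purely illustrative:

<ItemGroup>
  <!-- Before (illustrative): a project reference, which drags the whole project into the build -->
  <ProjectReference Include="..\MySchemas\MySchemas.btproj" />
  <!-- After: a file reference to the already-built assembly -->
  <Reference Include="MySchemas">
    <HintPath>..\..\lib\MySchemas.dll</HintPath>
  </Reference>
</ItemGroup>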
You have to make sure, however, that these DLLs are updated whenever the project behind them changes. You could do this by rebuilding them, whenever necessary, in a separate Visual Studio instance.
It takes some practice and thinking to make sure you are building against the latest version, but it WILL save you a lot of time.
I've noticed this as well. After turning on diagnostic output for MSBuild, it turned out that the project settings .user files were being modified after the .pdb files. I've tried several ways of resolving this, including changing the modify date on the .pdb file, setting the .user file to read-only, removing (renaming) the .user file, etc.
Unfortunately, the build task for BizTalk will overwrite/recreate/create a new .user file after every build, and I haven't come up with a way to convince MSBuild that it can just ignore the .user file being created anew. Because of that, I'd go with one of the other suggestions here.
Even creating an exclusive lock on the file so that MSBuild can't update it causes a rebuild, since then MSBuild thinks the build is dirty ("Project 'Schemas' is not up to date. Project dirty in MSBuild.")
I'm building a solution containing .NET Standard 2.0 and .NET Core 2.0 projects (C# and F#) in VS2019 (16.1.1). If I build multiple times without changes the second and subsequent builds should say "Build: 0 succeeded, 0 failed, X up-to-date", but it sometimes rebuilds some projects every time. How do I find out exactly why?
There are many SO questions and blog posts about this, most of them suggesting setting the build log verbosity to "Diagnostic" and looking for "not up to date". I've done that and the string is not found, nor is "not up-to-date" (but "up-to-date" occurs many times). So this appears to have changed in VS2019. I also know about the U2DCheckVerbosity registry setting, but that's only for .NET Framework. Reading through the build log output is unrealistic, as it's over 360 thousand lines, so I need to know what to search for.
Please note, I'm not looking for guesses as to what the problem might be - I'm looking for a way to get VS/the compiler to tell me.
I'm looking for a way to get VS/the compiler to tell me. (For VS2019)
It's hard to reproduce the same issue, so I'm not sure about the cause in your case. But since what you're asking for is a way to find the up-to-date related info in the Output window, you can check the Up-To-Date Checks option for .NET Core.
Go to Tools => Options => Projects and Solutions => .NET Core => Up To Date Checks. Make sure you've checked "Don't call MSBuild if a project appears to be up-to-date". Then change the Logging Level to Info or Verbose (choose the suitable level according to your needs).
For normal .NET Framework projects or C++ projects, the build output verbosity under Build and Run would be of great help. But when trying to find out why VS considers a .NET Core or .NET Standard project out-of-date, I think this option is the one to try, since its output is clearer.
E.g., building a .NET Core project that depends on the .NET Standard project, with the Info-level .NET Core Up-To-Date Check enabled, will report in the Output window why the project is or isn't considered up to date.
And if you have too many projects in one solution, I suggest you build one project at a time instead of building the whole solution, so you can locate the cause of the rebuild more easily.
VS writes a file called NETCoreApp,Version=v2.0.AssemblyAttributes.cs into the temp folder. If you build several .NET Core projects, the file gets changed by the other projects, and VS then thinks the old project has been modified and builds it.
Move the generated file into the project to reduce the builds:
<PropertyGroup>
  <TargetFrameworkMonikerAssemblyAttributesFileClean>False</TargetFrameworkMonikerAssemblyAttributesFileClean>
  <TargetFrameworkMonikerAssemblyAttributesPath>$(MSBuildThisFileDirectory)SharedAssemblyAttributes.cs</TargetFrameworkMonikerAssemblyAttributesPath>
</PropertyGroup>
In VS2019 the option to control logging is under Tools => Options => Projects and Solutions => .NET Core => Up To Date Checks (the Logging Level setting mentioned above).
The default is "None", but honestly "Minimal" is a good setting in general. When set to that level, only a single line is output per project, and only if that project is not up-to-date. That line will explain exactly why the project is considered out-of-date.
It's worth remembering that this is Visual Studio's up-to-date check, which it uses to quickly assess project state and avoid the comparatively expensive call to MSBuild. It is possible, with exotic project configurations, that VS determines your project needs building, but MSBuild doesn't actually build. This is rare, but can be worth understanding if you're debugging issues here.
We have two distinct team projects, both running in TFS 2013 / VS 2013. One of them always builds the whole solution when asked to run all tests in the Test Explorer window, while the other one does not build anything and just tries to run the tests again.
Sometimes we would like to prevent VS from building the whole solution, since we know it did not change or we just don't want to test against the modified changes, for instance.
What setting controls this behavior? I can't really see any differences between the projects to warrant different test behavior. I'm not using any settings file in either of them, and this was tested with both projects completely cleaned (deleting the .suo file, for example).
I'm aware of two things that may help.
Set files in your solution to Do not copy or Copy if newer
In Tools -> Options -> Projects and Solutions -> Build and Run check the checkbox: Only build startup projects and dependencies on Run
I'm working with these settings and I don't have the issue, though I cannot guarantee it will help in your case.
To build only a few projects, you can create a sub-solution - meaning a solution that contains only those few projects and their dependencies.
To stop building every time, go to Tools -> Options -> Projects and Solutions -> Build and Run,
check "Only build startup projects and dependencies on Run",
and select "Never build" in the "On run, when projects are out of date" drop-down.
Is it possible to combine the following properties, and if so, how?
Store in our version control system some Visual Studio 2008 native C++ (VCPROJ) project files for the developers in our team that use this IDE.
Allow some of those developers to tweak their projects (e.g. using debug version of third-party libraries instead of the usual ones).
Make sure these modifications are done in files that are not versioned.
In other words, I would like to allow developers to tweak some settings in their projects without risking that these changes are committed.
An 'optional VSPROP' file approach seems doomed to fail, as VS2008 refuses to load projects that refer to non-existent VSPROP files...
Any other suggestion? Is this possible with VS2010?
You may not be able to do this with the project files alone, but using a tool that generates the vcproj, like CMake for example, would let you do this. Script all your projects with CMake and conditionally include a config file (if it is present, for example) that developers can change in their own setup.
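And since VS2010 moves C++ projects to MSBuild (.vcxproj), the same "include it only if it exists" idea can be expressed there directly. A minimal sketch, where local.props is a hypothetical developer-local file kept out of version control:

<!-- Inside the .vcxproj: imported only when the developer has created the file -->
<Import Project="$(MSBuildThisFileDirectory)local.props"
        Condition="Exists('$(MSBuildThisFileDirectory)local.props')" />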
Branches could solve this problem: you create a branch, play with different versions of third-party, merge changes to trunk if results are good.
Well, as a preliminary solution you could put the project file into something like .hgignore or .gitignore after its initial commit.
This way changes to it can't be done accidentally.
At least that's how I handle .hgignore itself.
We use a versioned "common_configuration" folder, and a script which copies project files from this "common_configuration" folder to the "project" folder.
We have another script to copy the configuration back, so the developers need to take a conscious action to commit their local changes to the global version control system.
It partly answers your needs:
The upside: we have a way to keep a common configuration for everyone, and no accidental committing of local configuration.
The downside: blindly copying the files clobbers local changes. We live with it. We could write a more clever merge tool (using diff, or XML-specific manipulations), but we don't want to spend too much time supporting the deployment tools.
I haven't done much "front-end" development in about 15 years since moving to database development. I'm planning to start work on a personal project using C++ and since I already have MSDN I'll probably end up doing it in Visual Studio 2010. I'm thinking about using Subversion as a version control system eventually. Of course, I'd like to get up and running as quickly as I can, but I'd also like to avoid any pitfalls from a poorly organized project environment.
So, my question is, are there any good resources with common best practices for setting up a development environment? I'm thinking along the lines of where to break down a solution into multiple projects if necessary, how to set up a unit testing process, organizing resources, directories, etc.
Are there any great add-ons that I should make sure I have set up from the start?
Most tutorials just have one simple project, type in your code and click on build to see that your new application says, "Hello World!".
This will be a Windows application with several DLLs as well (no web development), so there doesn't need to be a deploy to a web server kind of process.
Mostly I just want to make sure that I don't miss anything big and then have to extensively refactor because of it.
Thanks!
I would also like a good answer to this question. What I've done is set it up so that each solution references a $(SolutionDir)\build directory for includes and libraries. That way each project that depends on other projects can access them, and versions won't compete. Then there are post-build commands that package up the headers and .lib files into a "distribution" folder. I use CC.NET to build each package on check-in. When we decide to update a dependency project we "release" it to ourselves, which requires manually tagging, manually copying current.zip into a releases area and giving it a version number, and copying that into the /build of the projects that depend on the upgrade.
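For what it's worth, a rough sketch of such a post-build packaging step as it might appear in a VS2010 .vcxproj; the folder layout and file names here are assumptions, not the exact setup described above:

<ItemDefinitionGroup>
  <PostBuildEvent>
    <!-- Copy public headers and the import library into the shared distribution folder -->
    <Command>xcopy /Y /I "$(ProjectDir)include\*.h" "$(SolutionDir)distribution\include\"
xcopy /Y "$(TargetDir)$(TargetName).lib" "$(SolutionDir)distribution\lib\"</Command>
  </PostBuildEvent>
</ItemDefinitionGroup>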
Everything works pretty great except this manual process at the end. I'd really love to get rid of it but can't seem to. I read an article from the ACM about "Continuous Release" that it would be really nice to have an implementation of, but there isn't one. I keep telling myself I'll make one.
If I use "junctions" in the Windows filesystem I can link "distribute" to "build" and then build a secondary solution that includes all the interdependent projects to build a product. When I did that, though, it encouraged developers to use it for active development, which discouraged TDD and proper releasing.
Does anyone have experience using makefiles for Visual Studio C++ builds (under VS 2005) as opposed to using the project/solution setup? For us, the way projects/solutions work is not intuitive and leads to configuration explosion when you are trying to tweak builds with specific compile-time flags.
Under Unix, it's pretty easy to set up a makefile that has its default options overridden by user settings (or other configuration setting). But doing these types of things seems difficult in Visual Studio.
By way of example, we have a project that needs to get built for 3 different platforms. Each platform might have several configurations (for example debug, release, and several others). One of my goals on a newly formed project is to have a solution that can have all platform builds living together, which makes building and testing code changes easier since you aren't having to open 3 different solutions just to test your code. But Visual Studio will require 3 * (number of base configurations) configurations, i.e. PC Debug, X360 Debug, PS3 Debug, etc.
It seems like a makefile solution is much better here. Wrapped with some basic batch files or scripts, it would be easy to keep the configuration explosion to a minimum and only maintain a small set of files for all of the different builds that we have to do.
However, I have no experience with makefiles under Visual Studio and would like to know if others have experiences or issues they can share.
Thanks.
(post edited to mention that these are C++ builds)
I've found some benefits to makefiles with large projects, mainly related to unifying the location of the project settings. It's somewhat easier to manage the list of source files, include paths, preprocessor defines and so on, if they're all in a makefile or other build config file. With multiple configurations, adding an include path means you need to make sure you update every config manually through Visual Studio's fiddly project properties, which can get pretty tedious as a project grows in size.
Projects which use a lot of custom build tools can be easier to manage too, such as if you need to compile pixel / vertex shaders, or code in other languages without native VS support.
You'll still need various different project configurations, however, since you'll need to differentiate the invocation of the build tool for each config (e.g. passing in different command-line options to make).
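For reference, this is roughly what that per-configuration hook looks like in a "Makefile" project in the MSBuild-era .vcxproj format (VS2010 onwards); VS2005's .vcproj expresses the same idea through its VCNMakeTool settings. The make invocations below are made up:

<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
  <!-- Each configuration simply forwards to make with its own flags -->
  <NMakeBuildCommandLine>make CONFIG=pc_debug</NMakeBuildCommandLine>
  <NMakeReBuildCommandLine>make clean &amp;&amp; make CONFIG=pc_debug</NMakeReBuildCommandLine>
  <NMakeCleanCommandLine>make clean</NMakeCleanCommandLine>
</PropertyGroup>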
Immediate downsides that spring to mind:
Slower builds: VS isn't particularly quick at invoking external tools, or even working out whether it needs to build a project in the first place.
Awkward inter-project dependencies: It's fiddly to set up so that a dependee causes the base project to build, and fiddlier to make sure that they get built in the right order. I've had some success getting SCons to do this, but it's always a challenge to get working well.
Loss of some useful IDE features: Edit & Continue being the main one!
In short, you'll spend less time managing your project configurations, but more time coaxing Visual Studio to work properly with it.
Visual Studio is built on top of MSBuild configuration files. You can consider *proj and *.sln files as makefiles; they allow you to fully customize the build process.
While it's technically possible, it's not a very friendly solution within Visual Studio. It will be fighting you the entire time.
I recommend you take a look at NAnt. It's a very robust build system where you can do basically anything you need to.
Our NAnt script does this on every build:
Migrate the database to the latest version
Generate C# entities off of the database
Compile every project in our "master" solution
Run all unit tests
Run all integration tests
Additionally, our build server leverages this and adds 1 more task, which is generating Sandcastle documentation.
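A heavily trimmed sketch of what such a NAnt build file can look like; every target name, tool path, and argument below is an assumption for illustration, not our actual script:

<?xml version="1.0"?>
<project name="master" default="full">
  <!-- Illustrative only: tool locations and arguments are assumptions -->
  <target name="migrate-db">
    <exec program="tools\migrate.exe" commandline="--to-latest" />
  </target>
  <target name="generate-entities" depends="migrate-db">
    <exec program="tools\entity-gen.exe" commandline="--connection local --out src\Entities" />
  </target>
  <target name="compile" depends="generate-entities">
    <exec program="C:\Windows\Microsoft.NET\Framework\v4.0.30319\msbuild.exe"
          commandline="Master.sln /p:Configuration=Release" />
  </target>
  <target name="test" depends="compile">
    <exec program="tools\nunit\nunit-console.exe"
          commandline="UnitTests\bin\Release\UnitTests.dll IntegrationTests\bin\Release\IntegrationTests.dll" />
  </target>
  <target name="full" depends="test" />
</project>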
If you don't like XML, you might also take a look at Rake (ruby), Bake/BooBuildSystem (Boo), or Psake (PowerShell)
You can use NAnt to build the projects individually, thus replacing the solution, and have one coding solution and no build solutions.
One thing to keep in mind is that the solution and csproj files from VS 2005 and up are MSBuild scripts. So if you get acquainted with MSBuild, you might be able to wield the existing files, to make VS easier and to make your deployment easier.
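For instance, a hand-written MSBuild "traversal" file can build just the projects you care about without a separate solution; the file name and project paths below are hypothetical:

<!-- build.proj: invoke with "msbuild.exe build.proj" -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Build">
  <ItemGroup>
    <ProjectsToBuild Include="Core\Core.csproj" />
    <ProjectsToBuild Include="Service\Service.csproj" />
  </ItemGroup>
  <Target Name="Build">
    <!-- The MSBuild task builds each listed project (and its project references) -->
    <MSBuild Projects="@(ProjectsToBuild)" Targets="Build" Properties="Configuration=Release" />
  </Target>
</Project>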
We have a similar setup to the one you are describing. We support at least 3 different platforms, so we found it worthwhile to use CMake to manage the different Visual Studio solutions. Setup can be a bit painful, but it pretty much boils down to reading the docs and a couple of tutorials. You should be able to do virtually everything you can otherwise do by going to the properties of the projects and the solution.
Not sure if you can have all three platform builds living together in the same solution, but you can use CruiseControl to take care of your builds and to run your testing scripts as often as needed.