I have a project with a multi-framework target: <TargetFrameworks>netstandard2.0;net471</TargetFrameworks>.
I want to build the solution for .NET Framework and .NET Standard separately.
Currently I use this MSBuild command:
MSBuild MySln.sln /t:Build /p:Configuration=Release /p:Platform="Any CPU" /m /nr:False
I tried to run this command:
MSBuild CxAudit.sln /t:Build /p:Configuration=Release /p:Platform="Any CPU" /p:TargetFramework=netstandard2.0 /m /nr:False
(with /p:TargetFramework=netstandard2.0)
But it failed, even though the first command passes and builds the netstandard output.
I suggest setting custom properties and conditioning on them. This way, you won't affect other projects or references:
<TargetFrameworks Condition="'$(BuildNetStdOnly)' == 'true'">netstandard2.0</TargetFrameworks>
<TargetFrameworks Condition="'$(BuildNetFxOnly)' == 'true'">net471</TargetFrameworks>
<TargetFrameworks Condition="'$(TargetFrameworks)' == ''">netstandard2.0;net471</TargetFrameworks>
This way you can build using
msbuild -p:BuildNetStdOnly=true -p:Configuration=Release -m -nr:false
msbuild -p:BuildNetFxOnly=true -p:Configuration=Release -m -nr:false
Note that this sets the plural TargetFrameworks property, because TargetFramework needs to be set as a global property for the inner builds to work if the project was restored for both target frameworks. If you want to set the singular TargetFramework instead, you also need to restore again for each invocation, by passing the -restore argument to msbuild as well.
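For context, the three conditional definitions would sit together in a PropertyGroup of the .csproj. A sketch - the property names BuildNetStdOnly and BuildNetFxOnly are the custom ones introduced here, not built-in MSBuild properties:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- Custom switches: set at most one of them from the command line. -->
    <TargetFrameworks Condition="'$(BuildNetStdOnly)' == 'true'">netstandard2.0</TargetFrameworks>
    <TargetFrameworks Condition="'$(BuildNetFxOnly)' == 'true'">net471</TargetFrameworks>
    <!-- Fallback: build both frameworks when neither switch is set. -->
    <TargetFrameworks Condition="'$(TargetFrameworks)' == ''">netstandard2.0;net471</TargetFrameworks>
  </PropertyGroup>
</Project>
```

The order matters: the fallback condition checks that TargetFrameworks is still empty, so it only applies when neither switch was passed.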
Posting this after reading https://www.hanselman.com/blog/FasterBuildsWithMSBuildUsingParallelBuildsAndMulticoreCPUs.aspx. Either this is clear as mud or I am just plain stupid.
I have always run msbuild.exe on the command line with /m, without any /p:BuildInParallel=true, and it always spawned the expected number of MSBuild nodes (12 on my desktop, 4 on my laptop) and built the solution projects with the respective degree of concurrency (12 or 4 at a time, most of the time).
On the other hand, when I called the MSBuild task from my targets file and passed it more than one project (or a solution file), I have always set the BuildInParallel property of the MSBuild task. Because this is how you build projects in parallel using the MSBuild task, right?
Note that this is a property of the MSBuild task, not a build property (like Configuration) passed in MSBuild.Properties.
The article suggests that there is actually a build property under the same name - BuildInParallel - and that it complements the /m switch, which is complete news to me. I searched all the *.targets files under *c:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\MSBuild* and the only mentions of it are in the following contexts:
If the property value is empty, set it to true.
Pass its value to the MSBuild task's BuildInParallel property.
A hard-coded false is passed when building Silverlight projects. (We still have these and I can confirm - they are always built sequentially :-()
My conclusion: the article is misleading with respect to passing BuildInParallel to msbuild.exe, because it has no significance unless we pass false there - which is also not needed; just do not pass /m.
But there is always a chance I am missing something here, so my question: what do I lose when running msbuild.exe /m without /p:BuildInParallel=true?
(I think I know the answer - absolutely nothing, but just in case)
what do I lose when running msbuild.exe /m without /p:BuildInParallel=true?
Normally, you lose nothing. Running with /p:BuildInParallel=true only changes the behavior if you manually adjusted your MSBuild project file accordingly, as proposed by the article.
Basically, the article suggests defining a new global property called BuildInParallel and passing it to selected MSBuild tasks to be able to control their built-in BuildInParallel property. But the global property could have any other name, as there is no predefined global property - i.e. no built-in support to automatically pick up the value of a certain global property. The only thing built in is the BuildInParallel property of the MSBuild task (most/all other tasks don't have this property).
BuildInParallel allows the MSBuild task to process the list of projects which were passed to it in a parallel fashion, while /m tells MSBuild how many processes it is allowed to start.
More Granularity (if needed)
If you are custom-crafting your MSBuild files, you could turn off parallelism on just certain MSBuild tasks by adding:
BuildInParallel="$(BuildInParallel)"
to specific MSBuild Tasks and then just those sub-projects wouldn't build in parallel if you passed in a property from the command line:
MSBuild /m:4 /p:BuildInParallel=false
But this is an edge case as far as I'm concerned.
Here is an example MSBuild project file illustrating the use cases: build some projects in parallel only if /m is passed on the command line, build some in parallel only if both /m and /p:BuildInParallel=true are passed, and always build some sequentially.
<!-- built in parallel if both /m and /p:BuildInParallel=true are specified -->
<MSBuild
    Projects="Solution_with_projects_to_build_in_parallel_1.sln"
    Targets="Build"
    BuildInParallel="$(BuildInParallel)" />

<!-- built in parallel if /m is specified -->
<MSBuild
    Projects="Solution_with_projects_to_build_in_parallel_2.sln"
    Targets="Build" />

<!-- always built sequentially -->
<MSBuild
    Projects="Solution_with_projects_always_built_sequentially.sln"
    Targets="Build"
    BuildInParallel="false" />
I have a VS 2015 C++ project with both PreBuild and PostBuild steps.
In addition, I have a custom target added to the project via "Build Dependencies -> Build Customization". The custom target runs a Perl script which invokes nmake to build files with the Intel Compiler. The custom target always runs; specifically, the Perl script always runs, while nmake checks for changes and skips building if the input files have not changed.
Invoking the custom target causes the PreBuild and PostBuild steps to run even if the custom target did not produce any new output (it ran but did nothing except checks).
I want to prevent PreBuild and PostBuild from running if my custom target didn't produce any new output. So far I haven't found a way to do this.
Another option is to prevent the custom target from running if the sources have not changed. Unfortunately, the files built by the Intel compiler are marked as "Exclude From Build" and thus do not trigger the custom target. I tried to define Inputs & Outputs for the task run by the custom target, with no luck.
Any help will be highly appreciated!
Make sure your custom targets have Inputs and Outputs attributes which properly describe which files are used as input and which are produced as output. MSBuild compares the timestamps of these files to decide whether you actually changed anything: the target is skipped only when every output is newer than every input.
Example:
<Target Name="Custom"
        Inputs="@(CSFile)"
        Outputs="hello.exe">
  <Csc
      Sources="@(CSFile)"
      OutputAssembly="hello.exe" />
</Target>
See also:
https://msdn.microsoft.com/en-us/library/ms171483.aspx
You can use transforms to map input to output if there is a logical relationship between the two:
<Target Name="Convert"
        Inputs="@(TXTFile)"
        Outputs="@(TXTFile->'%(Filename).content')">
  <GenerateContentFiles
      Sources="@(TXTFile)">
    <Output TaskParameter="OutputContentFiles"
            ItemName="ContentFiles" />
  </GenerateContentFiles>
</Target>
https://msdn.microsoft.com/en-us/library/ms171483.aspx
Do not rely on BeforeTargets and AfterTargets, and never rely on PreBuildEvent: that target itself doesn't have any inputs or outputs and thus always triggers. These are quite old constructs, stemming from the 2003 era. Instead, override BuildDependsOn and inject your target into the chain.
Example:
<PropertyGroup>
  <BuildDependsOn>
    Convert;
    $(BuildDependsOn);
  </BuildDependsOn>
</PropertyGroup>
See:
https://blogs.msdn.microsoft.com/msbuild/2006/02/10/how-to-add-custom-process-at-specific-points-during-build-method-2/
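Applied to the scenario in the question, the custom target could declare its own Inputs/Outputs so MSBuild skips it (and, via the BuildDependsOn chain, avoids the spurious PreBuild/PostBuild runs) when nothing changed. A sketch - the script name build.pl, the IntelSource item list, and the stamp file are hypothetical placeholders:

```xml
<!-- Hypothetical item group listing the files fed to nmake/the Intel compiler. -->
<ItemGroup>
  <IntelSource Include="intel\**\*.c" />
</ItemGroup>

<!-- Runs only when some IntelSource file is newer than the stamp file. -->
<Target Name="IntelBuild"
        Inputs="@(IntelSource)"
        Outputs="$(IntDir)intel_build.stamp">
  <Exec Command="perl build.pl -config $(Configuration)" />
  <!-- Touch the stamp so the next build can skip this target. -->
  <Touch Files="$(IntDir)intel_build.stamp" AlwaysCreate="true" />
</Target>

<PropertyGroup>
  <BuildDependsOn>
    IntelBuild;
    $(BuildDependsOn);
  </BuildDependsOn>
</PropertyGroup>
```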
I have setup a custom NuGet server for my company. It all works great - I can publish, view packages, etc.
My only concern is that I can publish a package with the same name and version number, thereby overwriting the existing package. This is not ideal and I would like the NuGet server to return an error if a package with the same name and version already exists.
Any clues on how I can accomplish this?
I would also greatly appreciate disallowing overwriting of existing packages. However, it does not seem to be possible with the NuGet server out of the box. A similar feature request was closed about two years ago.
But looking at the source code opens some options. Have a look at the CreatePackage() method. It uses an IPackageAuthenticationService to check whether the specified package is allowed to be added (it only checks the API key) and an IServerPackageRepository to actually add the package:
// Make sure they can access this package
if (Authenticate(context, apiKey, package.Id))
{
    _serverRepository.AddPackage(package);
    WriteStatus(context, HttpStatusCode.Created, "");
}
Both are passed in using constructor injection, so it is easy to extend the behaviour by passing in custom implementations (modify the Ninject bindings for that).
At first sight I would go for a custom IServerPackageRepository. The current implementation uses IFileSystem.AddFile(...) to add the package. You can use IFileSystem.FileExists(...) to check whether the package already exists.
From a continuous integration perspective it makes total sense to disallow overwriting an existing package, since NuGet follows Semantic Versioning: a new build should incorporate a bugfix, a new feature, or a breaking change.
I would choose to allow overwriting snapshots/pre-releases, however.
Update: It seems v2.8 will have an option allowOverrideExistingPackageOnPush, which defaults to true for backwards compatibility. It was committed with 1e7345624d. I realized that after forking. Seems I was too late again ;-)
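With that version, the option could then be turned off in the server's web.config. A sketch, assuming the setting is read from the standard NuGet.Server appSettings section like its other options:

```xml
<configuration>
  <appSettings>
    <!-- Reject pushes that would overwrite an existing package ID + version. -->
    <add key="allowOverrideExistingPackageOnPush" value="false" />
  </appSettings>
</configuration>
```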
I ran into the same problem. I run my own SymbolSource server. I decided to maintain a log of published packages. Before I publish a package, I can check the log to see if it has already been published and then not publish it. This is all done in an MS-DOS batch file. See below.
@echo off
rem Requires that the Visual Studio directory is in your
rem PATH environment variable. It will be something like:
rem C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE
rem API key for publishing to SymbolSource server
set apiKey=<<<GUID>>>
rem URL of the SymbolSource web app
set lib=http://<<<address>>>
rem Path to a simple text file on a file share - which happens to be the
rem same place that the SymbolSource server web app is published.
set log=\\<<<path>>>\publish_log.txt
rem Path to the Visual Studio solution that contains the projects to be published.
set sln=..\<<<solution name>>>.sln
rem Build all projects in the solution.
devenv %sln% /rebuild Debug
rem Delete packages produced during last run.
del *.nupkg
rem Each line in projects.txt is a path to a .csproj file that we want to
rem create a nuget package for. Each .csproj file has a corresponding .nuspec
rem file that lives in the same directory.
for /F %%i in (projects.txt) do nuget.exe pack %%i -IncludeReferencedProjects -Prop Configuration=Debug -Symbols
rem Delete any local packages that have already been published.
for /F %%i in (%log%) do if exist %%i del %%i
for %%F in (".\*.symbols.nupkg") do nuget push %%~nxF %apiKey% -source %lib%
rem Log information about published packages so, in the next run,
rem we can tell what has been published and what has not.
for %%F in (".\*.symbols.nupkg") do echo %%~nxF >> %log%
I wrote a PowerShell script that deletes the existing package version but only if it matches the version which I wish to push:
param (
[string]$buildconfiguration = "Debug"
)
function Update-Package ([string]$package,[string]$version,[string]$path)
{
dotnet nuget delete $package $version -s https://<mynugetserver>/nuget -k <my access code if used> --non-interactive
dotnet nuget push "$path\bin\$buildconfiguration\$package.$version.nupkg" -s https://<mynugetserver>/nuget -k <my access code if used>
}
Update-Package -package "My.Package" -version "2.2.0" -path "MyPackage"
The major drawback to this is the possibility of changing the package while forgetting to update the package version in the .nuspec or the project's package section, as well as forgetting to change the version number in the script file.
I would never use this technique on a public NuGet server.
I also use a version of this file that doesn't push, just deletes which is used as a PowerShell task in my Azure DevOps (VSTS) build.
I know that NuGet says it will list versions available, but I didn't really feel like writing a script which could read back the result of a list to determine if the version number that I'm building already exists.
The one good thing is that if the new version number doesn't exist, the CLI call to delete doesn't complain too much and no package version is affected.
I've got an NMake project in Visual Studio 2008 that has the Build command set to a custom tool that performs a long build task.
Build = "#call MyTool -config $(ConfigurationName)"
I want a way to pass a special flag ("-quickbuild") to my tool to tell it to do a quick subset of the overall build.
Build = "#call MyTool -config $(ConfigurationName) -quickbuild"
However I want it to be easy to switch between them so I don't actually want to change the build command.
My thought was to change the build command to this:
Build = "#call MyTool -config $(ConfigurationName) $(ShouldQuickbuild)"
and create a visual studio macro that will set the "ShouldQuickbuild" environment variable to "-quickbuild" then call DTE.Solution.SolutionBuild.BuildProject(...) on the project. The problem is it doesn't see the "ShouldQuickbuild" environment variable.
Any ideas on how I can get this working? Is there a better approach for doing what I want?
Use a batch file as the build command and check whether the environment variable is passed on to it. If it is, you can read it in the batch file and call the actual tool you want.
The batch file would look like this:
@echo off
MyTool -config %1 %ShouldQuickbuild%
If the environment variable is not passed to the batch file, you have to get the info across some other way, globally. Is it possible to create a file from a VS macro, or call an EXE? Then it's quite simple.
Try putting your variable inside of % delimiters, as in %ShouldQuickBuild%.
You can control this with the solution configuration. Create two new configurations, "Debug Quick" and "Release Quick", as copies of the originals, then change the build command for each configuration.
Does anyone know of any MSBuild or NAnt tasks for controlling Wise Installation Studio?
I know, I should probably just use WiX but my current project is already in Wise and all I need to automate is updating of a product code, the upgrade section and a few bits of text.
I'm using CruiseControl.NET - this could be adapted for use in a NAnt-only solution if desired. I call wfwi.exe, which is included in the Wise installation and is meant for command-line access (here's the Wise installer manual, which contains instructions for wfwi.exe). Here's a snippet from my ccnet.config:
<!-- build installers -->
<exec>
  <executable>C:\Path\To\WiseWrapper.bat</executable>
  <buildArgs>"C:\Path\To\wfwi.exe" "C:\Path\To\Output.wsi" /c /p /s</buildArgs>
</exec>
<!-- build installer exes -->
<exec>
  <executable>C:\Path\To\Wise32.exe</executable>
  <buildArgs>/c /s C:\Path\To\Your.wse</buildArgs>
</exec>
And WiseWrapper.bat allows the installer ProductVersion to be updated using the CC.NET build label. The entire contents of WiseWrapper.bat are:
%1 %2 %3 %4 ProductVersion=%CCNetLabel% %5