I'm creating my first Azure build pipeline for a .NET Core 2.1 solution. I've had success with DotNetCoreCLI@2 for all of my steps, that is, except for the pack step.
This works, and is currently what I have resorted to:
- script: |
    dotnet pack src/MyProject/MyProject.csproj --version-suffix $(VersionSuffix) --configuration $(BuildConfiguration) --no-restore --no-build --output $(Build.ArtifactStagingDirectory)
  displayName: 'dotnet pack [$(BuildConfiguration)]'
This does not work, in that it ignores the --version-suffix directive:
- task: DotNetCoreCLI@2
  inputs:
    command: 'pack'
    # packagesToPack: '**/*.csproj; **/!*Test*.csproj' - TODO pack all projects, except test projects
    packagesToPack: 'src/MyProject/MyProject.csproj'
    arguments: '--version-suffix $(VersionSuffix) --configuration $(BuildConfiguration) --no-restore --no-build --output $(Build.ArtifactStagingDirectory)'
  displayName: 'dotnet pack [$(BuildConfiguration)]'
(I've left one of my TODOs in there as a side quest)
Also, the version prefix resides in the csproj file:
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <PackageId>MyProject</PackageId>
    <Authors>Me</Authors>
    <Description>A description</Description>
    <VersionPrefix>0.1.0</VersionPrefix>
    <IsPackable>true</IsPackable>
  </PropertyGroup>
</Project>
When I use the dotnet pack script step I see a NuGet package with the full version (i.e. <prefix>-<suffix>), as I expect; e.g. 0.1.0-190813.02.abcdef.
When I use the DotNetCoreCLI@2 task the version is limited to the version prefix; e.g. 0.1.0.
What have I missed? Ideally I would like the pipeline yaml file to be consistent.
No, you did not miss anything. This behavior is by design for DotNetCoreCLI@2.
When you check this task in the classic editor (without YAML), you can see that there is no Arguments option; instead there are Pack options:
So we could use these options to define the package version.
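For example, a minimal sketch of that approach (it assumes the full version, prefix plus suffix, is available in a pipeline variable; PackageVersion is a hypothetical name):
- task: DotNetCoreCLI@2
  displayName: 'dotnet pack [$(BuildConfiguration)]'
  inputs:
    command: 'pack'
    packagesToPack: 'src/MyProject/MyProject.csproj'
    versioningScheme: 'byEnvVar'   # take the full package version from a pipeline variable
    versionEnvVar: 'PackageVersion'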
Besides, according to the .NET Core CLI task documentation, this is the description of the arguments input when you use it in YAML:
The version argument for pack is not accepted, so we need to use the custom command, which is the method you are using now.
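If you would rather keep the DotNetCoreCLI@2 task for consistency instead of a script step, a minimal sketch of the custom command (reusing the arguments from your script step, not verified against your project) would be:
- task: DotNetCoreCLI@2
  displayName: 'dotnet pack [$(BuildConfiguration)]'
  inputs:
    command: 'custom'
    custom: 'pack'
    arguments: 'src/MyProject/MyProject.csproj --version-suffix $(VersionSuffix) --configuration $(BuildConfiguration) --no-restore --no-build --output $(Build.ArtifactStagingDirectory)'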
So you are on the right track now and do not need to worry about it.
Hope this helps.
I also stumbled onto this problem and found out that DotNetCoreCLI is not going to help me with version suffixes, per Leo's answer above.
A way to overcome this problem is to pack your project with a PowerShell task. A simple example:
- powershell: 'dotnet pack -o $(build.artifactstagingdirectory) --no-build --no-restore -c ${{ parameters.configuration }} ${{ parameters.projectPath }}'
The way I fixed it was to use the NuGet command task:
- task: NuGetCommand@2
  displayName: 'Pack Test'
  inputs:
    command: 'pack'
    packagesToPack: '**/test.nuspec'
    versioningScheme: 'byBuildNumber'
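Note that 'byBuildNumber' takes the package version from $(Build.BuildNumber), so the build number has to be a valid version string; a minimal sketch of setting it via the pipeline's name property (the 0.1 prefix is just a placeholder):
name: 0.1.$(Rev:r)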
I haven't changed anything recently in my project, but when I tried to deploy it last, I received this error in the logs: ERROR: Could not build wheels for pyarrow, which is required to install pyproject.toml-based projects
See the full log here: log-d20114fe-3eeb-4a8d-8926-3a971882894c.txt
This is my requirements.txt:
requirements.txt
It seems like it is an issue with the dependencies for the snowflake-connector-python package, but I am not really sure what would have caused this. I see in the logs:
-- Running cmake for pyarrow
Step #0 - "Buildpack": cmake -DPYTHON_EXECUTABLE=/layers/google.python.runtime/python/bin/python3 -DPython3_EXECUTABLE=/layers/google.python.runtime/python/bin/python3 "" -DPYARROW_BUILD_CUDA=off -DPYARROW_BUILD_FLIGHT=off -DPYARROW_BUILD_GANDIVA=off -DPYARROW_BUILD_DATASET=off -DPYARROW_BUILD_ORC=off -DPYARROW_BUILD_PARQUET=off -DPYARROW_BUILD_PARQUET_ENCRYPTION=off -DPYARROW_BUILD_PLASMA=off -DPYARROW_BUILD_S3=off -DPYARROW_BUILD_HDFS=off -DPYARROW_USE_TENSORFLOW=off -DPYARROW_BUNDLE_ARROW_CPP=off -DPYARROW_BUNDLE_BOOST=off -DPYARROW_GENERATE_COVERAGE=off -DPYARROW_BOOST_USE_SHARED=on -DPYARROW_PARQUET_USE_SHARED=on -DCMAKE_BUILD_TYPE=release /tmp/pip-install-w1g_50oc/pyarrow_4a54282bee5f4c3c8399d3428e4134e6
Step #0 - "Buildpack": error: command 'cmake' failed: No such file or directory
This makes me think CMake is the problem, but I tried explicitly adding CMake to my requirements file and had the same result.
I also looked at the last successful build, and it looks like I was running Python version 3.10.8, and the one that failed first was running 3.11. How can I change what Python version Cloud Build uses? I am using a cloudbuild.yaml file instead of Docker.
Figured it out! The issue was that I was not specifying a Python version in Cloud Build, so it was defaulting to 3.11, which does not yet have support for pyarrow. I ended up setting the version in the Cloud Build YAML file to 3.10.8 like so:
steps:
  - name: gcr.io/k8s-skaffold/pack
    env:
      - GOOGLE_ENTRYPOINT=$_ENTRYPOINT
      - GOOGLE_RUNTIME_VERSION=$_RUNTIME_VERSION
    args:
      - build
      - '$_GCR_HOSTNAME/$PROJECT_ID/$REPO_NAME/$_SERVICE_NAME:$COMMIT_SHA'
      - '--builder=gcr.io/buildpacks/builder:v1'
      - '--network=cloudbuild'
      - '--path=.'
      - '--env=GOOGLE_ENTRYPOINT'
      - '--env=GOOGLE_RUNTIME_VERSION'
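The 3.10.8 value reaches the buildpack through the _RUNTIME_VERSION substitution; a minimal sketch of supplying it in the same cloudbuild.yaml (assuming it is not already defined on the build trigger):
substitutions:
  _RUNTIME_VERSION: '3.10.8'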
For those of you who are using a Procfile:
I was able to fix this by creating a .python-version file that contains just the version number needed, i.e.
3.10.4
I'm trying to set up a GitHub Actions script for a project of mine. The project is private at the moment since I want it in an RC state before the first release. As it is close to ready, I intended to set up an automated build now, but I'm seeing some strange behavior. The project is a simple C# library, and so the .yml file is quite simple:
name: .NET Core Desktop

on: [push, pull_request]

jobs:
  build:
    strategy:
      matrix:
        configuration: [Debug, Release]
    runs-on: windows-latest
    env:
      Solution_Name: Replacement.sln      # Replace with your solution name, i.e. MyWpfApp.sln.
      Test_Project_Path: UnitTest.csproj  # Replace with the path to your test project, i.e. MyWpfApp.Tests\MyWpfApp.Tests.csproj.
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          fetch-depth: 0
      # Run build
      - name: Run build
        run: ./build.cmd --target pack --configuration ${{ matrix.configuration }}
Sometimes, one of the builds (Debug or Release, or both) fails with an entry such as
Terminate batch job (Y/N)?
Error: The operation was canceled.
in the log. I certainly did not cancel the build in any way. This may happen after about 3 minutes.
Am I running into the GitHub build time limit (and how would I know)? Or is there something else wrong that I'm missing?
YAML config file:
# .NET Desktop
# Build and run tests for .NET Desktop or Windows classic desktop solutions.
# Add steps that publish symbols, save build artifacts, and more:
# https://learn.microsoft.com/azure/devops/pipelines/apps/windows/dot-net

trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@1

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
Error:
##[error]Solution not found using search pattern 'D:\a\1\s\**\*.sln'.
RAW log:
2020-06-26T13:39:42.7115236Z ##[section]Starting: VSBuild
2020-06-26T13:39:42.7374773Z ==============================================================================
2020-06-26T13:39:42.7375115Z Task : Visual Studio build
2020-06-26T13:39:42.7375427Z Description : Build with MSBuild and set the Visual Studio version property
2020-06-26T13:39:42.7375688Z Version : 1.166.2
2020-06-26T13:39:42.7375904Z Author : Microsoft Corporation
2020-06-26T13:39:42.7376240Z Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/build/visual-studio-build
2020-06-26T13:39:42.7376608Z ==============================================================================
2020-06-26T13:39:46.1317137Z ##[error]Solution not found using search pattern 'D:\a\1\s\**\*.sln'.
2020-06-26T13:39:46.1880482Z ##[section]Finishing: VSBuild
Agree with Daniel, the error message Solution not found using search pattern 'D:\a\1\s\**\*.sln' clearly states that the cause of this error is that no .sln file was found in the s (sources) folder.
You need to check whether your .sln file has been pushed to version control. You can search for it in the source repo in Azure DevOps.
I had the same error in my YAML pipelines,
##[error]Solution not found using search pattern 'D:\a\1\s\**\*.sln'
So I made sure my code and the YAML file are in the same branch. Now it works; I hope it will help you.
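If the wildcard still does not match, another option, sketched here assuming the solution lives at a known, checked-in path (src/MySolution.sln is a hypothetical name), is to skip the pattern search and point the variable at the file directly:
variables:
  solution: 'src/MySolution.sln'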
I have a .NET Framework project that is being built on a self-hosted Windows build agent.
There is a step to run tests, and that step needs to provide code coverage reports and stats.
When I try using "dotnet test" the step runs and the tests complete, and the .coverage files are generated. When I check the build summary after it completes, I see the standard test results and report, and also the code coverage tab. The code coverage tab has a download link to get the file. There is no code coverage report. There is also a link to "Setup Code Coverage" on the initial build summary screen.
Why is there no code coverage report, and why is the "Setup Code Coverage" link still visible?
This is incredibly frustrating! I must be missing something incredibly obvious, but the docs suggest that what I have done is correct.
Using the VSTest task rather than dotnet test results in the same outcome, but runs far slower.
- task: DotNetCoreCLI@2
  displayName: dotnet test
  inputs:
    command: test
    arguments: '--configuration $(BuildConfiguration) --collect:"Code Coverage"'
    workingDirectory: '$(Build.SourcesDirectory)\src'
I eventually achieved this by using Hugh Lin's answer for help and modifying it for my own purposes.
We have Coverlet as a reference in the project, and ReportGenerator installed into Azure DevOps, so that made this a little easier.
I found that we had an issue with a SOAP API reference that was causing huge performance issues when generating a report. Once I filtered that out with a "classfilter" the process became more manageable. I also found that without the "disable.coverage.autogenerate" variable, the "PublishCodeCoverageResults" task will take forever and will likely fail, because it tries to do the "ReportGenerator" step itself but without the "classfilters". It does this because "ReportGenerator" is now built into the "PublishCodeCoverageResults" step, but with no filters it doesn't work for this scenario.
This is running against a .NET Framework project, so a few adjustments to the projects were needed to ensure "dotnet test" works successfully.
variables:
  disable.coverage.autogenerate: 'true'

- task: DotNetCoreCLI@2
  displayName: dotnet test
  inputs:
    command: test
    publishTestResults: true
    arguments: '/p:CollectCoverage=true /p:CoverletOutputFormat=cobertura --no-restore'
    workingDirectory: '$(Build.SourcesDirectory)\src'
    configuration: "$(buildConfiguration)"

- task: reportgenerator@4
  inputs:
    reports: '$(Build.SourcesDirectory)\src\*.UnitTests\coverage.cobertura.xml'
    targetdir: '$(Common.TestResultsDirectory)/CoverageReport/'
    classfilters: '-NAMESPACE*'

- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: 'Cobertura'
    summaryFileLocation: '$(Build.SourcesDirectory)\src\*.UnitTests\coverage.cobertura.xml'
    reportDirectory: '$(Common.TestResultsDirectory)/CoverageReport/'
By default, the code coverage for the dotnet test task is output to a .coverage file, which Azure DevOps does not know how to interpret and only provides as a downloadable file. The Code Coverage tab only supports code coverage data in the Jacoco or Cobertura formats, so the results in the *.coverage file cannot be shown as tables and graphs.
If you want a more detailed code coverage report, you need to use Coverlet; for a .NET Framework project you can install the tools during the pipeline and then generate the report, for example in a PowerShell script:
# Install ReportGenerator and Coverlet as local tools in the working directory
dotnet tool install dotnet-reportgenerator --tool-path . --version 4.0.12
dotnet tool install coverlet.console --tool-path . --version 1.4.1

# Folder for the generated HTML report
mkdir .\reports

# Locate the built unit test assembly
$unitTestFile = gci -Recurse | ?{ $_.FullName -like "*bin\*test*.dll" }

# Run the tests under Coverlet and emit Cobertura-format coverage
$coverlet = "$pwd\coverlet.exe"
& $coverlet $unitTestFile.FullName --target "dotnet" --targetargs "vstest $($unitTestFile.FullName) --logger:trx" --format "cobertura"

# Turn the Cobertura XML into an HTML report
gci -Recurse |
  ?{ $_.Name -eq "coverage.cobertura.xml" } |
  %{ &"$pwd\reportgenerator.exe" "-reports:$($_.FullName)" "-targetdir:reports" "-reporttypes:HTMLInline;HTMLChart" }
Then add the Publish Code Coverage Results task:
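A minimal sketch of that step (the paths are assumptions based on the script above, which writes the Cobertura XML next to the tests and the HTML report into a reports folder):
- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: 'Cobertura'
    summaryFileLocation: '$(System.DefaultWorkingDirectory)/**/coverage.cobertura.xml'
    reportDirectory: '$(System.DefaultWorkingDirectory)/reports'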
For details, you can refer to the case mentioned in the comment and this ticket.
I am building the project and running test cases in AppVeyor. After successfully executing the test cases, the coverage data must be uploaded to Coveralls.
But in my case no error is thrown, yet no coverage is being recorded.
The details of my project are:
.NET Core 1.1.0
Visual Studio 2017
xunit - 2.2.0
OpenCover - 4.6.519
coveralls.net - 0.7.0
My appveyor.yml is as below :
version: 1.0.{build}
os: Visual Studio 2017
skip_tags: true
configuration: Release
environment:
nodejs_version: "0.12"
COVERALLS_REPO_TOKEN:
secure: rstgrtert
cache:
- "%LOCALAPPDATA%\\Yarn"
install:
- npm i -g yarn#0.16.1
- npm i -g typescript typings
- yarn global add typescript typings
- cd ".\Promact.Oauth.Server\src\Promact.Oauth.Server\"
- yarn
- cd..
- cd..
build_script:
- ps: dotnet restore
build:
project: .\Promact.Oauth.Server\Promact.Oauth.Server.sln
verbosity: minimal
test_script:
- cd ".\src\"
- ps: >-
C:\Users\appveyor\.nuget\packages\OpenCover\4.6.519\tools\OpenCover.Console.exe-target:"C:\Program Files\dotnet\dotnet.exe" -targetargs:"test -f netcoreapp1.1 -c Release .\Promact.Oauth.Server.Tests\Promact.Oauth.Server.Tests.csproj" -mergeoutput -hideskipped:File -output:opencover.xml -oldStyle -filter:"+[Promact.Oauth.Server]*Repository -[Promact.Oauth.Server.Tests*]*" -register:user
if(![string]::IsNullOrEmpty($env:COVERALLS_REPO_TOKEN)){
$coveralls = (Resolve-Path "C:\Users\appveyor\.nuget\packages\coveralls.net\0.7.0\tools\csmacnz.coveralls. exe").ToString()
& $coveralls --opencover -i opencover.xml --repoToken
$env:COVERALLS_REPO_TOKEN --commitId $env:APPVEYOR_REPO_COMMIT --commitBranch $env:APPVEYOR_REPO_BRANCH --commitAuthor $env:APPVEYOR_REPO_COMMIT_AUTHOR --commitEmail $env:APPVEYOR_REPO_COMMIT_AUTHOR_EMAIL --commitMessage $env:APPVEYOR_REPO_COMMIT_MESSAGE --jobId $env:APPVEYOR_JOB_ID
}
In AppVeyor it runs the tests and simply shows:
Committing...
No results, this could be for a number of reasons. The most common reasons are:
1) missing PDBs for the assemblies that match the filter please review the
output file and refer to the Usage guide (Usage.rtf) about filters.
2) the profiler may not be registered correctly, please refer to the Usage
guide and the -register switch.
Coverage data uploaded to coveralls.
link to the appveyor build