I am building the project and running the test cases in AppVeyor. After the test cases execute successfully, the coverage data should be uploaded to Coveralls.
But in my case no error is thrown, yet no coverage is being recorded.
The details of my project are:
.NET Core 1.1.0
Visual Studio 2017
xUnit - 2.2.0
OpenCover - 4.6.519
coveralls.net - 0.7.0
My appveyor.yml is as below:
version: 1.0.{build}
os: Visual Studio 2017
skip_tags: true
configuration: Release
environment:
  nodejs_version: "0.12"
  COVERALLS_REPO_TOKEN:
    secure: rstgrtert
cache:
  - "%LOCALAPPDATA%\\Yarn"
install:
  - npm i -g yarn@0.16.1
  - npm i -g typescript typings
  - yarn global add typescript typings
  - cd ".\Promact.Oauth.Server\src\Promact.Oauth.Server\"
  - yarn
  - cd ..
  - cd ..
build_script:
  - ps: dotnet restore
build:
  project: .\Promact.Oauth.Server\Promact.Oauth.Server.sln
  verbosity: minimal
test_script:
  - cd ".\src\"
  - ps: >-
      C:\Users\appveyor\.nuget\packages\OpenCover\4.6.519\tools\OpenCover.Console.exe -target:"C:\Program Files\dotnet\dotnet.exe" -targetargs:"test -f netcoreapp1.1 -c Release .\Promact.Oauth.Server.Tests\Promact.Oauth.Server.Tests.csproj" -mergeoutput -hideskipped:File -output:opencover.xml -oldStyle -filter:"+[Promact.Oauth.Server]*Repository -[Promact.Oauth.Server.Tests*]*" -register:user

      if(![string]::IsNullOrEmpty($env:COVERALLS_REPO_TOKEN)){
        $coveralls = (Resolve-Path "C:\Users\appveyor\.nuget\packages\coveralls.net\0.7.0\tools\csmacnz.coveralls.exe").ToString()
        & $coveralls --opencover -i opencover.xml --repoToken $env:COVERALLS_REPO_TOKEN --commitId $env:APPVEYOR_REPO_COMMIT --commitBranch $env:APPVEYOR_REPO_BRANCH --commitAuthor $env:APPVEYOR_REPO_COMMIT_AUTHOR --commitEmail $env:APPVEYOR_REPO_COMMIT_AUTHOR_EMAIL --commitMessage $env:APPVEYOR_REPO_COMMIT_MESSAGE --jobId $env:APPVEYOR_JOB_ID
      }
In AppVeyor it runs the tests and then simply shows:
Committing...
No results, this could be for a number of reasons. The most common reasons are:
1) missing PDBs for the assemblies that match the filter please review the
output file and refer to the Usage guide (Usage.rtf) about filters.
2) the profiler may not be registered correctly, please refer to the Usage
guide and the -register switch.
Coverage data uploaded to coveralls.
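The "missing PDBs" message above is, in my experience, the most common failure mode when running OpenCover 4.6.x against .NET Core projects: the SDK emits portable PDBs by default, which OpenCover cannot read. A hedged sketch of forcing classic Windows PDBs in the test project's .csproj (the property group below is an assumption about your project file, not something from the question):

```xml
<!-- sketch: emit full (classic) PDBs so OpenCover can resolve symbols;
     without this, OpenCover reports "No results" even though tests pass -->
<PropertyGroup>
  <DebugType>full</DebugType>
  <DebugSymbols>true</DebugSymbols>
</PropertyGroup>
```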
link to the appveyor build
YAML config file:
# .NET Desktop
# Build and run tests for .NET Desktop or Windows classic desktop solutions.
# Add steps that publish symbols, save build artifacts, and more:
# https://learn.microsoft.com/azure/devops/pipelines/apps/windows/dot-net
trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@1

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
Error:
##[error]Solution not found using search pattern 'D:\a\1\s\**\*.sln'.
RAW log:
2020-06-26T13:39:42.7115236Z ##[section]Starting: VSBuild
2020-06-26T13:39:42.7374773Z ==============================================================================
2020-06-26T13:39:42.7375115Z Task : Visual Studio build
2020-06-26T13:39:42.7375427Z Description : Build with MSBuild and set the Visual Studio version property
2020-06-26T13:39:42.7375688Z Version : 1.166.2
2020-06-26T13:39:42.7375904Z Author : Microsoft Corporation
2020-06-26T13:39:42.7376240Z Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/build/visual-studio-build
2020-06-26T13:39:42.7376608Z ==============================================================================
2020-06-26T13:39:46.1317137Z ##[error]Solution not found using search pattern 'D:\a\1\s\**\*.sln'.
2020-06-26T13:39:46.1880482Z ##[section]Finishing: VSBuild
Agree with Daniel: the error message Solution not found using search pattern 'D:\a\1\s\**\*.sln' clearly states that the cause is that no .sln file was found in the s folder.
You need to check whether your .sln file has been pushed to version control. You can search for it in the source repo in Azure DevOps.
I had the same error in my YAML pipelines:
##[error]Solution not found using search pattern 'D:\a\1\s\**\*.sln'
I fixed it by making sure my code and the YAML file were in the same branch. Now it works; I hope this helps you.
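If the wildcard keeps failing, one way to rule out search-pattern problems is to point the task at the solution file explicitly instead of globbing for it. A sketch (the path below is a hypothetical placeholder; substitute wherever your .sln actually lives in the repo):

```yaml
# sketch: bypass the '**/*.sln' search by naming the solution directly
- task: VSBuild@1
  inputs:
    solution: 'MyFolder/MySolution.sln'   # hypothetical path
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
```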
I have a .NET Framework project that is being built on a self-hosted Windows build agent.
There is a step to run tests, and that step needs to provide code coverage reports and stats.
When I try using "dotnet test", the step runs and the tests complete, and the .coverage files are generated. When I check the build summary afterwards, I see the standard test results and report, and also the Code Coverage tab. The Code Coverage tab has a download link to get the file, but there is no code coverage report. There is also still a "Setup Code Coverage" link on the initial build summary screen.
Why is there no code coverage report, and why is the "Setup Code Coverage" link still visible?
This is incredibly frustrating! I must be missing something obvious, but the docs suggest that what I have done is correct.
Using the VSTest task rather than dotnet test gives the same outcome, but runs far slower.
- task: DotNetCoreCLI@2
  displayName: dotnet test
  inputs:
    command: test
    arguments: '--configuration $(BuildConfiguration) --collect:"Code Coverage"'
    workingDirectory: '$(Build.SourcesDirectory)\src'
I eventually achieved this by using Hugh Lin's answer for help and modifying for my own purposes.
We have Coverlet as a reference in the project, and ReportGenerator installed into Azure DevOps, so that made this a little easier.
I found that we had an issue with a SOAP API reference that was causing huge performance problems when generating the report. Once I filtered that out with a "classfilters" setting, the process became manageable. I also found that without the "disable.coverage.autogenerate" variable, the "PublishCodeCoverageResults" task takes forever and will likely fail, because it tries to run the "ReportGenerator" step itself without the "classfilters". It does this because "ReportGenerator" is now built into the "PublishCodeCoverageResults" step, but with no filters it does not work for this scenario.
This is running against a .NET Framework project so there were a few adjustments to the projects needed to ensure "dotnet test" works successfully.
variables:
  disable.coverage.autogenerate: 'true'

- task: DotNetCoreCLI@2
  displayName: dotnet test
  inputs:
    command: test
    publishTestResults: true
    arguments: '/p:CollectCoverage=true /p:CoverletOutputFormat=cobertura --no-restore'
    workingDirectory: '$(Build.SourcesDirectory)\src'
    configuration: "$(buildConfiguration)"

- task: reportgenerator@4
  inputs:
    reports: '$(Build.SourcesDirectory)\src\*.UnitTests\coverage.cobertura.xml'
    targetdir: '$(Common.TestResultsDirectory)/CoverageReport/'
    classfilters: '-NAMESPACE*'

- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: 'Cobertura'
    summaryFileLocation: '$(Build.SourcesDirectory)\src\*.UnitTests\coverage.cobertura.xml'
    reportDirectory: '$(Common.TestResultsDirectory)/CoverageReport/'
By default, the code coverage for the dotnet test task is written to a .coverage file, which Azure DevOps does not know how to interpret and only provides as a downloadable file. The Code Coverage tab only supports coverage data in Jacoco or Cobertura format, so the results in the *.coverage file cannot be shown as tables and graphs.
If you want a more detailed code coverage report, you need to use Coverlet with .NET Framework by installing the tool during the pipeline and then generating the report. For example, in a PowerShell script:
dotnet tool install dotnet-reportgenerator --tool-path . --version 4.0.12
dotnet tool install coverlet.console --tool-path . --version 1.4.1
mkdir .\reports
$unitTestFile = gci -Recurse | ?{ $_.FullName -like "*bin\*test*.dll" }
$coverlet = "$pwd\coverlet.exe"
& $coverlet $unitTestFile.FullName --target "dotnet" --targetargs "vstest $($unitTestFile.FullName) --logger:trx" --format "cobertura"
gci -Recurse |
?{ $_.Name -eq "coverage.cobertura.xml"} |
%{ &"$pwd\reportgenerator.exe" "-reports:$($_.FullName)" "-targetdir:reports" "-reporttypes:HTMLInline;HTMLChart" }
Then add Publish code coverage task:
For details, you can refer to the case mentioned in the comment and this ticket.
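A minimal sketch of that publish step, assuming the report generator above wrote Cobertura XML and HTML output into the locations used in the script (both paths below are illustrative):

```yaml
# sketch: publish the Cobertura summary plus the generated HTML report
- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: 'Cobertura'
    summaryFileLocation: '$(System.DefaultWorkingDirectory)/**/coverage.cobertura.xml'
    reportDirectory: '$(System.DefaultWorkingDirectory)/reports'
```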
My code is written in C++
GitLab CI Compiler: MSVC2017_x64
My project is compiled by a GitLab pipeline on a Windows server. I want to compile parts of this project twice and somehow change something in the code before it is compiled, so that I end up with two versions of the same application with different predefined settings.
Something simple like the compiler setting a #define in a header or cpp file would be great.
Using this same technique, I'd also like to hard wire the build number (pipeline ID) into the application.
I've already tried the /D or -D parameter but it doesn't set a #define.
The solution needs to work no matter what the user's setup is so something like environment variables won't work.
This is my .gitlab-ci.yml file (the application is ultimately built with the qmake and nmake calls):
stages:
  - build
  - test

variables:
  GIT_SUBMODULE_STRATEGY: recursive

build:
  stage: build
  only:
    - master
  tags:
    - windows
    - qt
  script:
    - call "C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\VC\Auxiliary\Build\vcvars64.bat"
    - call "C:\Qt\5.9.1\msvc2017_64\bin\qtenv2.bat"
    - set CL=/MP16
    - set PATH="C:\Program Files (x86)\Inno Setup 5";%PATH%
    - cd %CI_PROJECT_DIR%\(...)\Application
    - qmake CONFIG+=release
    - nmake
    - cd %CI_PROJECT_DIR%\(...)\3rdParty\Windows
    - copy (...)\Application.exe .
    - copy (...)\Application.ico .
    - windeployqt .
    - iscc InnoScript.iss
  artifacts:
    name: "%CI_COMMIT_REF_NAME%"
    expire_in: 1 month
    paths:
      - (...)\3rdParty\Windows\Application.exe
And this is how I would want it to be (with the -D mode=xy's):
stages:
  - build
  - test

variables:
  GIT_SUBMODULE_STRATEGY: recursive

build:
  stage: build
  only:
    - master
  tags:
    - windows
    - qt
  script:
    - call "C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\VC\Auxiliary\Build\vcvars64.bat"
    - call "C:\Qt\5.9.1\msvc2017_64\bin\qtenv2.bat"
    - set CL=/MP16
    - set PATH="C:\Program Files (x86)\Inno Setup 5";%PATH%
    - cd %CI_PROJECT_DIR%\(...)\Application
    - qmake CONFIG+=release
    - nmake -D mode=prod
    - qmake CONFIG+=release
    - nmake -D mode=stage
    - cd %CI_PROJECT_DIR%\(...)\3rdParty\Windows
    - copy (...)\Application.exe .
    - copy (...)\Application.ico .
    - windeployqt .
    - iscc InnoScript.iss
  artifacts:
    name: "%CI_COMMIT_REF_NAME%"
    expire_in: 1 month
    paths:
      - (...)\3rdParty\Windows\Application.exe
How do I modify my code using only GitLab CI, or is there another, better way?
I've just found out how to do it.
Before, I was trying to add the -D argument to the nmake call, when instead I needed to add DEFINES+=MY_VAR='foo bar' to the qmake call.
After doing this, and deleting the build folder to get rid of the outdated Makefile.Debug and Makefile.Release files, it worked correctly.
I have a .NET Core 2.2 project, with some unit tests that test a class library project, run via the Visual Studio Test task.
Visual Studio Test - YAML
steps:
- task: VSTest@2
  displayName: 'VsTest - testAssemblies'
  inputs:
    testAssemblyVer2: |
      **\*test*.dll
      !**\*TestAdapter.dll
      !**\obj\**
      **\$(BuildConfiguration)\*\*unittests.dll
      **\$(BuildConfiguration)\*\*test*.dll
      !**\*Microsoft.VisualStudio.TestPlatform*
    vstestLocationMethod: location
    vstestLocation: 'C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\Extensions\TestPlatform\'
    codeCoverageEnabled: true
    otherConsoleOptions: '/Framework:.NETCoreApp,Version=v2.2 /logger:console;verbosity="normal"'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
  timeoutInMinutes: 1
And the error message:
##[warning]No test assemblies found matching the pattern:
**\*test*.dll,!**\*TestAdapter.dll,!**\obj\**,**\Release\*\*unittests.dll,**\Release\*\*test*.dll,!**\*Microsoft.VisualStudio.TestPlatform*.
Can anyone identify what needs to be done to make it work?
I have tried various different options for the above
I believe you need to be pointing to the projects themselves, not the output DLLs.
So your pattern needs to look more like this: **/*[Tt]ests/*.csproj
This will cause DevOps to load all projects that have Tests in their project name.
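For .NET Core test projects, a common way to apply that pattern is to run the tests through the DotNetCoreCLI task against .csproj files rather than globbing for DLLs with VSTest. A sketch (the projects glob is an assumption about your repo layout):

```yaml
# sketch: run tests from project files instead of searching for test DLLs
- task: DotNetCoreCLI@2
  displayName: dotnet test
  inputs:
    command: test
    projects: '**/*[Tt]ests/*.csproj'
    arguments: '--configuration $(buildConfiguration)'
```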
I have a server with Jenkins and a Qt project. The server runs on CentOS 7. I installed Cppcheck on the server, and I also installed the "Cppcheck Plug-in" into Jenkins.
The script for build project:
cd FlashClipboard;
/usr/lib64/qt5/bin/qmake FlashClipboard.pro;
make clean;
make;
cppcheck --enable=all --suppress=missingIncludeSystem . --xml --xml-version=2 . 2> ./tmp/cppcheck.xml;
Post-Build Actions:
But I have error:
[Cppcheck] Starting the cppcheck analysis.
[Cppcheck] Processing 1 files with the pattern 'tmp/cppcheck.xml'.
[Cppcheck] Parsing throws exceptions. javax.xml.bind.UnmarshalException
- with linked exception:
[org.xml.sax.SAXParseException; systemId: file:/var/lib/jenkins/workspace/Flash%20Clipboard/tmp/cppcheck.xml; lineNumber: 1; columnNumber: 1; Premature end of file.]
Build step 'Publish Cppcheck results' changed build result to FAILURE
Build step 'Publish Cppcheck results' marked build as failure
What is my mistake?
Sorry, but are you sure the path is correct? Shouldn't it be:
Cppcheck report XML: FlashClipboard/tmp/cppcheck.xml
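The "Premature end of file" parse error usually means the plugin found an empty or truncated XML file at the path it was given (here, because the build script runs inside FlashClipboard while the plugin looks in the workspace root). A small sketch of guarding the publish step by checking the report before Jenkins parses it (paths are illustrative, and the printf stands in for the real cppcheck run):

```shell
# sketch: fail the build early if the cppcheck report is empty, instead of
# letting the Cppcheck plugin die on "Premature end of file"
mkdir -p FlashClipboard/tmp
# stand-in for: cppcheck ... --xml --xml-version=2 . 2> FlashClipboard/tmp/cppcheck.xml
printf '<?xml version="1.0"?><results version="2"></results>' > FlashClipboard/tmp/cppcheck.xml
if [ -s FlashClipboard/tmp/cppcheck.xml ]; then
  echo "report ok"
else
  echo "report empty" >&2
  exit 1
fi
```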