if statement on gitlab.ci yaml - if-statement

In my .gitlab-ci.yml, the job should succeed only when the response status is 200; otherwise the job should fail and print that the transfer has already started.
But I don't know how to write an if statement for that.
variables:
  NUGET_PATH: 'C:\Tools\Nuget\nuget.exe'
  MSBUILD_PATH: 'C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Current\Bin\amd64\msbuild.exe'
  SOLUTION_PATH: 'Textbox_ComboBox.sln'
stages:
  - build
  - job1
  - job2
before_script:
  - "cd Source"
build_job:
  stage: build
  except:
    - schedules
  script:
    - '& "$env:NUGET_PATH" restore'
    - '& "$env:MSBUILD_PATH" "$env:SOLUTION_PATH" /nologo /t:Rebuild /p:Configuration=Debug'
job1:
  stage: job1
  script:
    - 'curl adress1'
job2:
  stage: trigger_SAP_service
  when: delayed
  start_in: 5 minutes
  only:
    - schedules
  script:
    - 'curl adress2'
How can I add an if condition that checks for status=200 and the "transfer started" message in the response?
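One way to do this, sketched below, is to have curl write out the HTTP status code and test it in the shell. This is only a sketch: it assumes a bash-compatible shell on the runner and keeps adress1 as the placeholder URL from the config above.
job1:
  stage: job1
  script:
    # save the response body to response.json and capture only the HTTP status code
    - http_code=$(curl -sS -o response.json -w '%{http_code}' adress1)
    - |
      if [ "$http_code" = "200" ]; then
        echo "Transfer started, response saved to response.json"
      else
        echo "transfer already started"
        exit 1
      fi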

Related

Cypress AWS codebuild error: spec must be a string or comma-separated list

I am trying to implement parallel testing in AWS CodeBuild. I created a buildspec.yml file like this sample project:
https://github.com/cypress-io/cypress-realworld-app/blob/develop/buildspec.yml
My problem is that the environment variables I use in the cypress command come through empty.
- echo $CY_GROUP_SPEC
- CY_GROUP=$(echo $CY_GROUP_SPEC | cut -d'|' -f1)
- CY_BROWSER=$(echo $CY_GROUP_SPEC | cut -d'|' -f2)
- CY_SPEC=$(echo $CY_GROUP_SPEC | cut -d'|' -f3)
- CY_CONFIG=$(echo $CY_GROUP_SPEC | cut -d'|' -f4)
And then the cypress code build fails with this error:
Opening Cypress...
Cypress encountered an error while parsing the argument: --spec
You passed: true
The error was: spec must be a string or comma-separated list
I use this command to run cypress:
- NO_COLOR=1 ./node_modules/.bin/cypress run --browser $CY_BROWSER --spec "$CY_SPEC" --config "$CY_CONFIG" --headless. --record --key $CYPRESS_KEY --parallel --ci-build-id $CODEBUILD_INITIATOR --group "$CY_GROUP"
I defined these env variables at the top of the file like this:
batch:
  build-matrix:
    dynamic:
      env:
        image:
          - ${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_REGION}.amazonaws.com/cypress:latest
        variables:
          CY_GROUP_SPEC:
            - "UI - Chrome|chrome|cypress/e2e/account/*"
            - "UI - Chrome|chrome|cypress/e2e/auth/*"
            - "UI - Chrome|chrome|cypress/e2e/mastering/*"
            - "UI - Chrome|chrome|cypress/e2e/pages/**/*"
            - "UI - Chrome|chrome|cypress/e2e/user-flows/**/*"
          WORKERS:
            - 1
            - 2
            - 3
            - 4
            - 5
How can I fix this problem?
Thanks
The errors definitely tell you that the command is wrong. Check that carefully.
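Building on that comment: one thing that stands out in the command above is the stray period after --headless. A cleaned-up sketch of the same command (identical flags and variables, plus an echo to confirm the matrix variables are actually populated) would be:
# sketch only: print the derived variables first, then run Cypress without the stray period
- echo "group=$CY_GROUP browser=$CY_BROWSER spec=$CY_SPEC config=$CY_CONFIG"
- NO_COLOR=1 ./node_modules/.bin/cypress run --browser "$CY_BROWSER" --spec "$CY_SPEC" --config "$CY_CONFIG" --headless --record --key "$CYPRESS_KEY" --parallel --ci-build-id "$CODEBUILD_INITIATOR" --group "$CY_GROUP"
If the echo line prints empty values, the variables are not reaching the build environment at all, which points at the batch matrix configuration rather than the cypress command itself.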

Gitlab CI pipeline to run jobs parallel in same stage and invoke/trigger other jobs of same stage

I am trying to create an automation pipeline for data load. I have a scenario as explained below:
stages:
  - stage1
  - stage2
job1:
  stage: stage1
  script:
    - echo "stage 1 job 1"
job2:
  stage: stage1
  script:
    - echo "stage 1 job 2"
job3:
  stage: stage1
  script:
    - echo "stage 1 job 3"
job4:
  stage: stage1
  script:
    - echo "stage 1 job 4"
I want to run job1 and job2 in parallel in the same stage. Then:
job1 will invoke/trigger job3, meaning job3 starts automatically when job1 succeeds
job2 will invoke/trigger job4, meaning job4 starts automatically when job2 succeeds
I am writing the pipeline in .gitlab-ci.yml.
Can anyone help me implement this?
A strict implementation of your requirements is not possible (to my knowledge): jobs 3 and 4 would need to be in a separate stage (although support for putting them in the same stage is planned). To be clear, the other functional requirements can be fulfilled, i.e.:
job1 and job2 start in parallel
job1 will trigger job3 (immediately, without waiting for job2 to finish)
job2 will trigger job4 (immediately, without waiting for job1 to finish)
The key is using the needs keyword to convert the pipeline into a directed acyclic graph:
stages:
  - stage-1
  - stage-2
job-1:
  stage: stage-1
  needs: []
  script:
    - echo "job-1 started"
    - sleep 5
    - echo "job-1 done"
job-2:
  stage: stage-1
  needs: []
  script:
    - echo "job-2 started"
    - sleep 60
    - echo "job-2 done"
job-3:
  stage: stage-2
  needs: [job-1]
  script:
    - echo "job-3 started"
    - sleep 5
    - echo "job-3 done"
job-4:
  stage: stage-2
  needs: [job-2]
  script:
    - echo "job-4 started"
    - sleep 5
    - echo "job-4 done"
In the resulting pipeline, job-3 is started even though job-2 is still running.

Publish code coverage result failed in azuredevops pipeline

I have a .NET Core test command and a publish code coverage results task in my pipeline.
The config is as below:
steps:
- task: DotNetCoreCLI@2
  displayName: 'Test Public API Project '
  inputs:
    command: test
    projects: '**/DWP.CDA.API.Test.csproj'
    arguments: '--output publish_output --configuration $(BuildConfiguration) /p:CollectCoverage=true /p:CoverletOutputFormat=cobertura /p:Threshold=99 /p:ThresholdStat=total /p:CoverletOutput=$(Build.SourcesDirectory)\TestResults\Coverage\ --collect "Code coverage"'
steps:
- task: PublishCodeCoverageResults@1
  displayName: 'Publish code coverage'
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: '$(Build.SourcesDirectory)/TestResults/Coverage/*cobertura.xml'
    reportDirectory: '($(Build.SourcesDirectory)/Src/TestResults/Coverage'
But the publish step does not seem to take effect; a message like this shows up:
[warning]No code coverage results were found to publish.
Did you install and run the ReportGenerator tool as well, to get the code coverage report in the proper format? Your warning looks like the build task isn't finding the xml file to publish in the folder you're looking in.
I've used the following YAML in the past to run and publish code coverage results. You will need to change it to find your projects, but otherwise it should work.
- task: DotNetCoreCLI@2
  displayName: 'Install ReportGenerator'
  inputs:
    command: custom
    custom: tool
    arguments: 'install --global dotnet-reportgenerator-globaltool'
- task: DotNetCoreCLI@2
  displayName: 'Run unit tests - $(buildConfiguration)'
  inputs:
    command: 'test'
    arguments: '--no-build --configuration $(buildConfiguration) /p:CollectCoverage=true /p:CoverletOutputFormat=cobertura /p:CoverletOutput=$(Build.SourcesDirectory)/TestResults/Coverage/'
    publishTestResults: true
    projects: '**/*.Tests.csproj'
- script: |
    reportgenerator -reports:$(Build.SourcesDirectory)/**/coverage.cobertura.xml -targetdir:$(Build.SourcesDirectory)/CodeCoverage -reporttypes:HtmlInline_AzurePipelines
  displayName: 'Create code coverage report'
- task: PublishCodeCoverageResults@1
  displayName: 'Publish code coverage report'
  inputs:
    codeCoverageTool: 'cobertura'
    summaryFileLocation: '$(Build.SourcesDirectory)/**/coverage.cobertura.xml'
Yes, as SapuSevn commented, it should be Agent.TempDirectory.
- task: DotNetCoreCLI@2
  displayName: 'dotnet test'
  inputs:
    command: 'test'
    arguments: '--configuration $(buildConfiguration) --collect:"XPlat Code Coverage" -- DataCollectionRunSettings.DataCollectors.DataCollector.Configuration.Format=cobertura'
    publishTestResults: true
    projects: 'tests/Frontends/GloboTicket.Web.Tests' # update with your test project directory
- task: PublishCodeCoverageResults@1
  displayName: 'Publish code coverage report'
  inputs:
    codeCoverageTool: 'Cobertura'
    #summaryFileLocation: '$(System.DefaultWorkingDirectory)/**/coverage.cobertura.xml' ## This is not working. What you need is the following.
    summaryFileLocation: '$(Agent.TempDirectory)/**/coverage.cobertura.xml'
I had a similar issue. Looking at the output, my pipeline was putting the reports into a temp folder, and my test project was also missing the coverlet.msbuild NuGet package.
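For reference, a minimal way to add that package to a test project (assuming the .NET CLI is available) is:
# run from the test project directory; coverlet.msbuild is what makes the
# /p:CollectCoverage=true arguments above produce a cobertura report
dotnet add package coverlet.msbuild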

How to read json file in gitlab ci yaml and use if else command?

I have a .gitlab-ci.yml file, and one of its jobs gets a response that contains a message field. If the message matches the expected text, the job should succeed; if the message is different, the job should fail.
variables:
  NUGET_PATH: 'C:\Tools\Nuget\nuget.exe'
  MSBUILD_PATH: 'C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Current\Bin\amd64\msbuild.exe'
  SOLUTION_PATH: 'Textbox_ComboBox.sln'
stages:
  - build
  - job1
  - job2
before_script:
  - "cd Source"
build_job:
  stage: build
  except:
    - schedules
  script:
    - '& "$env:NUGET_PATH" restore'
    - '& "$env:MSBUILD_PATH" "$env:SOLUTION_PATH" /nologo /t:Rebuild /p:Configuration=Debug'
job1:
  stage: job1
  script:
    - 'curl adress1'
    - if [ "$message" == "SAP transfer started. Please check in db" ]; then exit 0; else exit 1; fi
job2:
  stage: trigger_SAP_service
  when: delayed
  start_in: 5 minutes
  only:
    - schedules
  script:
    - 'curl adress2'
This is the job output. The job should succeed, because the response message and the message in the if command are the same.
Skipping Git submodules setup
Authenticating with credentials from job payload (GitLab Registry)
$ cd Source
$ curl adress1
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 146 0 146 0 0 877 0 --:--:-- --:--:-- --:--:-- 879
{"status":200,"message":"SAP transfer started. Please check in db","errorCode":0,"timestamp":"2019-10-04T07:59:58.436+0300","responseObject":null}$ if ( [ '$message' == 'SAP transfer started. Please check in db' ] ); then exit 0; else exit 1; fi
ERROR: Job failed: exit code 1
The message variable you use in your condition is empty.
You need to assign the curl response to your message variable:
message=$(curl -Ss adress1)
and then test the content of $message.
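Putting that together, the job1 script could look roughly like the sketch below. It assumes a bash-compatible shell and uses jq to pull the message field out of the JSON response (so jq must be available on the runner); the expected message text is taken from the job log above.
job1:
  stage: job1
  script:
    # extract the "message" field from the JSON returned by adress1
    - message=$(curl -sS adress1 | jq -r '.message')
    - |
      if [ "$message" == "SAP transfer started. Please check in db" ]; then
        exit 0
      else
        echo "Unexpected message: $message"
        exit 1
      fi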

Copy Files to: $(Build.ArtifactStagingDirectory) -Input required: TargetFolder

I am new to YAML and build pipelines. I am receiving the following error; can anyone advise what's wrong with the target folder?
Unhandled: Input required: TargetFolder
[warning]Directory 'D:\a\1\a' is empty. Nothing will be added to build artifact 'drop'.
Below is my YAML file:
# Build app using Azure Pipelines
pool:
  vmImage: 'vs2017-win2016'
steps:
- script: echo hello world
- task: NodeTool@0
  inputs:
    versionSpec: '8.x'
- task: CopyFiles@1
  displayName: 'Copy Files to: $(Build.ArtifactStagingDirectory)'
  inputs:
    SourceFolder: '$(build.sourcesdirectory)'
    Contents: \C:\VSCodeGit\CollMod.Web\Web.config\
    TartgetFolder: '$(Build.ArtifactStagingDirectory)'
  condition: succeededOrFailed()
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
  condition: succeededOrFailed()
I think it's the contents field that looks to be invalid here.
The docs at https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/copy-files?view=vsts&tabs=yaml and the file-matching-pattern documentation at https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/file-matching-patterns?view=vsts both give some great examples.
If you're unsure, set the contents to **/*, which will copy absolutely everything in $(build.sourcesdirectory). That will give you a feel for the shape of the directory structure, so that you can then change **/* into something more selective, scoped to the file(s) you want to copy.
The source folder should be Build.SourcesDirectory instead of '$(build.sourcesdirectory)'.
This is from: https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml#build-variables
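Putting those suggestions together, and noting that the error "Input required: TargetFolder" matches the misspelled TartgetFolder key in the YAML above, a corrected copy task might look like the following sketch (the **/* pattern is the broad catch-all suggested above and should be narrowed once the directory layout is clear):
- task: CopyFiles@1
  displayName: 'Copy Files to: $(Build.ArtifactStagingDirectory)'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: '**/*'                                    # broad pattern; narrow this later
    TargetFolder: '$(Build.ArtifactStagingDirectory)'   # note the corrected spelling
  condition: succeededOrFailed()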