Cloudbuild.yaml command with nested quotes - google-cloud-platform

I am trying to run the following command found at http://blog.wrouesnel.com/articles/Totally%20static%20Go%20builds/:
CGO_ENABLED=0 GOOS=linux go build -a -ldflags '-extldflags "-static"' .
The two inner layers of quotes are tripping me up. How do I deal with this in a cloudbuild.yaml file?
Escaping the quotes doesn't seem to work:
steps:
- name: 'gcr.io/cloud-builders/go'
  args: ['build', '-o', 'main', '-ldflags', "'-extldflags \"-static\"'", '.']
  env:
  - 'GOOS=linux'

Update:
There is no need for such quotes. See this comment on GitHub: https://github.com/GoogleCloudPlatform/cloud-builders/issues/146#issuecomment-337890587
===
Original Answer
Well, to quote a ' within a '-quoted string, use '' as per the YAML specification:
http://yaml.org/spec/current.html#id2534365
e.g. 'here''s to a toast!'
For the above args, it would be:
['build', '-o', 'main', '-ldflags', '''-extldflags "-static"''', '.']
Whether or not the command works within Cloud Builder is beyond the scope of this question.
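Putting that together with the flags from the original command (CGO_ENABLED=0 and -a, which the question's snippet omits), a full step could look like the sketch below; as noted above, whether the builder handles the embedded quotes the way a shell would is a separate matter:
steps:
- name: 'gcr.io/cloud-builders/go'
  args: ['build', '-a', '-o', 'main', '-ldflags', '''-extldflags "-static"''', '.']
  env:
  - 'GOOS=linux'
  - 'CGO_ENABLED=0'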

Related

How to delete environment variables using Ansible

I want to delete the environment variables below from /etc/environment using Ansible.
export http_proxy="http://194.138.0.25:9400/"
export https_proxy="http://194.138.0.25:9400/"
export ftp_proxy="http://194.138.0.25:9400/"
The code below deletes only one env variable.
- name: Delete variables from etc/environment
  replace:
    path: /etc/environment
    regexp: 'export http_proxy="http://194.138.0.25:9400/"'
    replace: ''
How can I delete all three environment variables?
Also, after deleting/replacing any one env variable, an empty line is added. How can I avoid this?
Use lineinfile. For example, the task below will remove all lines starting with export and containing the address 194.138.0.25:9400/
- lineinfile:
    path: /tmp/environment
    regex: '^export.*194\.138\.0\.25:9400.*$'
    state: absent
Given the file
shell> cat /tmp/environment
first line
export http_proxy="http://194.138.0.25:9400/"
export https_proxy="http://194.138.0.25:9400/"
export ftp_proxy="http://194.138.0.25:9400/"
last line
Running the playbook with options --check --diff gives (abridged)
TASK [lineinfile] ***************************************************
--- before: /tmp/environment (content)
+++ after: /tmp/environment (content)
@@ -1,5 +1,2 @@
first line
-export http_proxy="http://194.138.0.25:9400/"
-export https_proxy="http://194.138.0.25:9400/"
-export ftp_proxy="http://194.138.0.25:9400/"
last line
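For context, a minimal playbook wrapping this task might look like the sketch below (the host pattern is a placeholder, and become is assumed because editing /etc/environment normally requires root):
---
- hosts: all
  become: yes
  tasks:
    - name: Remove proxy exports from /etc/environment
      lineinfile:
        path: /etc/environment
        regexp: '^export.*194\.138\.0\.25:9400.*$'
        state: absent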

How to set variable inline in gitlab-ci.yaml based on regex matching?

I am trying to create a variable in gitlab-ci.yaml based on the name of the branch.
Suppose I am pushing to a branch named 3.2.7
Here is the situation:
include:
  - template: "Workflows/Branch-Pipelines.gitlab-ci.yml"

variables:
  PRODUCTION_BRANCH: "master"
  STAGING_BRANCH: (\d)\.(\d)\.(\d)

.deploy_rules:
  rules:
    - if: '$CI_COMMIT_BRANCH =~ /$STAGING_BRANCH/'
      variables:
        SERVER_PORT: 3007 # TODO: should be 300d ; d is the second digit
I want to generate 3002 inline using regex matching.
How can I do this?
I have done some research, and it seems I have to use sed, but I am not sure whether that is the best way to do it or how to do it.
TO MAKE THE PROBLEM SIMPLER
include:
  - template: "Workflows/Branch-Pipelines.gitlab-ci.yml"

variables:
  TEST_VAR: sed -E 's/(\d)\.(\d)\.(\d)/300\2/gm;t;d' <<< $CI_COMMIT_BRANCH

stages:
  - temp

temp:
  stage: temp
  script:
    - echo $TEST_VAR
It should echo 3002, but it echoes sed -E 's/(\d)\.(\d)\.(\d)/300\2/gm;t;d' <<< 3.2.7 instead.
You can't use variables in the regex pattern: the regex has to be written verbatim and cannot be parameterized. You also cannot use sed or other Linux utilities in variables: or other parts of your YAML. You're bound to the limitations of the YAML specification and the features provided by GitLab.
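So, for the rule itself, the pattern has to be spelled out literally; a small sketch based on the rule from the question (the exact pattern is an assumption about your branch names):
.deploy_rules:
  rules:
    - if: '$CI_COMMIT_BRANCH =~ /^\d+\.\d+\.\d+$/'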
However, there is an option available to you that will fit your stated use case.
Dynamic variables
TEST_VAR: sed -E 's/(\d).(\d).(\d)/300\2/gm;t;d' <<< $CI_COMMIT_BRANCH
While you can't use sed or other utilities directly in variables: declarations, you can use dotenv artifacts via artifacts:reports:dotenv to set variables dynamically.
For example, a job can use sed or whatever other utilities you like to create variables which will be used by the rest of the pipeline.
stages:
  - temp

create_variables:
  stage: .pre
  script:
    - TEST_VAR="$(sed -E 's/(\d)\.(\d)\.(\d)/300\2/gm;t;d' <<< ${CI_COMMIT_BRANCH})"
    - echo "TEST_VAR=${TEST_VAR}" >> dotenv.txt
  artifacts:
    reports:
      dotenv: dotenv.txt

temp:
  stage: temp
  script:
    - echo $TEST_VAR
Here, the .pre stage is used, which is a special stage that is always ordered before every other stage. The dotenv artifact from the create_variables job will dynamically create variables for the jobs in subsequent stages that receive the artifact.
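If you later switch to DAG-style pipelines with needs:, the consuming job can reference the producing job explicitly and will still receive the dotenv variables; a small sketch reusing the job names from above:
temp:
  stage: temp
  # pull the dotenv artifact from the job that created the variables
  needs: ["create_variables"]
  script:
    - echo $TEST_VAR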

Azure DevOps pipeline template - how to concatenate a parameter

All afternoon I have been trying to get my head around concatenating a parameter in an ADO template. The parameter is a source path, and in the template the next folder level needs to be appended. I would like to achieve this with a "simple" concatenation.
The simplified template takes the parameter and uses it to form the inputPath for a PowerShell script, like this:
parameters:
  sourcePath: ''

steps:
- task: PowerShell@2
  inputs:
    filePath: 'PSRepo/Scripts/MyPsScript.ps1'
    arguments: '-inputPath ''$(sourcePath)/NextFolder'''
I have tried various ways to achieve this concatenation:
'$(sourcePath)/NextFolder' (see above)
'$(variables.sourcePath)/NextFolder' (I know sourcePath is not a variable, but I tried this because, when using a parameter in a task condition, it apparently only works when referenced through variables)
'${{ parameters.sourcePath }}/NextFolder'
And some other variations, all to no avail.
I also tried to introduce a variables section in the template, but that is not possible.
I have searched the internet for examples and documentation, but found no direct answers; other issues seemed to hint at a solution, but nothing worked.
I would be very pleased if someone could help me out.
Thanks in advance.
We can add a variable in our temp YAML file, pass the sourcePath to that variable, and then use it. Here is my demo script:
Main.yaml
resources:
  repositories:
  - repository: templates
    type: git
    name: Tech-Talk/template

trigger: none

variables:
- name: Test
  value: TestGroup

pool:
  # vmImage: windows-latest
  vmImage: ubuntu-20.04

extends:
  template: temp.yaml@templates
  parameters:
    agent_pool_name: ''
    db_resource_path: $(System.DefaultWorkingDirectory)
    # variable_group: ${{variables.Test}}
temp.yaml
parameters:
- name: db_resource_path
  default: ""
# - name: 'variable_group'
#   type: string
#   default: 'default_variable_group'
- name: agent_pool_name
  default: ""

stages:
- stage:
  jobs:
  - job: READ
    displayName: Reading Parameters
    variables:
    - name: sourcePath
      value: ${{parameters.db_resource_path}}
    # - group: ${{parameters.variable_group}}
    steps:
    - script: |
        echo sourcePath: ${{variables.sourcePath}}
    - powershell: echo "$(sourcePath)"
Here, I just use the default working directory as the test path. You can use your own variables as well.
Thanks, Yujun. In the meantime I did get it working. Apparently there was some typo that blocked the script from executing correctly, as the solution looks like one of the options mentioned above.
parameters:
  sourcePath: ''

steps:
- task: PowerShell@2
  inputs:
    filePath: 'PSRepo/Scripts/MyPsScript.ps1'
    arguments: '-inputPath ''$(sourcePath)/NextFolder'''
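For reference, when the value is only ever passed in as a template parameter (not also defined as a pipeline variable), compile-time template expression syntax, one of the variants listed in the question, is the usual way to concatenate it; a minimal sketch:
parameters:
  sourcePath: ''

steps:
- task: PowerShell@2
  inputs:
    filePath: 'PSRepo/Scripts/MyPsScript.ps1'
    # the template parameter is expanded at compile time, before the task runs
    arguments: '-inputPath ''${{ parameters.sourcePath }}/NextFolder'''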

Is it possible to set a top-level only/except in a .gitlab-ci.yml file?

I have three stages in my CI file; they all have only/except like this:
test:
  only:
    - tags
  except:
    - branches
  script:
    - npm run test
It seems redundant to have the only/except in three places. Is there a way to set this at the top level of the CI config? I don't see anything like that in the docs.
You can use the map merging feature: https://docs.gitlab.com/ee/ci/yaml/#special-yaml-features
.job_template: &job_definition
  only:
    - tags
  except:
    - branches

test1:
  <<: *job_definition
  script:
    - npm run test

test2:
  <<: *job_definition
  script:
    - # ...
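As an aside, the same de-duplication can also be expressed with GitLab's extends: keyword, which, unlike YAML anchors, also works across files pulled in with include:; a rough equivalent for one of the jobs:
.job_template:
  only:
    - tags
  except:
    - branches

test1:
  extends: .job_template
  script:
    - npm run test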

Ansible: assemble regexp to assemble specific files

I am using the assemble module to make one file from multiple files. I have a list of files, and from that list I want only the .pub files to be assembled, but I am not sure how to do this.
List of files:
root_rsa_comiskey-v01
root_rsa_comiskey-v01.pub
root_rsa_comiskey-v02
root_rsa_comiskey-v02.pub
root_rsa_comiskey-v03
root_rsa_comiskey-v03.pub
root_rsa_comiskey-v05
root_rsa_comiskey-v05.pub
Playbook file:
---
- hosts: 10.1.31.81
  become_user: yes
  tasks:
    - name: list files
      shell: ls -1 /tmp/root_ssh_key*
      register: dumpfiles

    - name: fetch files
      assemble: src=/tmp/root_ssh_key/ dest=/tmp/root_ssh_key/id_rsa regexp='(*.pub)'
      register: test

    - debug: var=test
The regexp parameter is a Python regular expression, not a shell glob.
If you want to join all files ending with pub, use:
assemble: src=/tmp/root_ssh_key/ dest=/tmp/root_ssh_key/id_rsa regexp='pub$'
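The same task in YAML dictionary syntax (the task name is illustrative):
- name: Assemble public keys
  assemble:
    src: /tmp/root_ssh_key/
    dest: /tmp/root_ssh_key/id_rsa
    regexp: 'pub$'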