I am using the assemble module to build one file from multiple files. I have a list of files, and from that list I want only the .pub files to be assembled, but I am not sure how to do this.
List of files:
root_rsa_comiskey-v01
root_rsa_comiskey-v01.pub
root_rsa_comiskey-v02
root_rsa_comiskey-v02.pub
root_rsa_comiskey-v03
root_rsa_comiskey-v03.pub
root_rsa_comiskey-v05
root_rsa_comiskey-v05.pub
Playbook file:
---
- hosts: 10.1.31.81
  become_user: yes
  tasks:
    - name: list files
      shell: ls -1 /tmp/root_ssh_key*
      register: dumpfiles

    - name: fetch files
      assemble: src=/tmp/root_ssh_key/ dest=/tmp/root_ssh_key/id_rsa regexp='(*.pub)'
      register: test

    - debug: var=test
The regexp parameter is a Python regular expression, not a shell glob.
If you want to join all files ending with .pub, use:
assemble: src=/tmp/root_ssh_key/ dest=/tmp/root_ssh_key/id_rsa regexp='pub$'
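For reference, a minimal sketch of the corrected task in YAML dictionary form (paths taken from the question; untested):

- name: fetch files
  assemble:
    src: /tmp/root_ssh_key/
    dest: /tmp/root_ssh_key/id_rsa
    regexp: 'pub$'
  register: test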
I have two roughly formatted YAML files with key/value pairs in them. I then imported the values of both files successfully into a running playbook using the include_vars module.
Now I want to compare the value of a key/value pair from file/list 1 to all of the keys of file/list 2. When there is a match, I want to print, and preferably save/register, the value of the matching key from file/list 2.
Essentially I am comparing a machine name to an IP list to try to grab the IP the machine needs out of that list. The name is "dynamic" and is different each time the playbook is run, as file/list 1 is always dynamically populated on each run.
Examples:
File/list 1 contents:
machine_serial: m60
s_iteration: a
site_name: dud
t_number: '001'
File/list 2 contents:
m51: 10.2.5.201
m52: 10.2.5.202
m53: 10.2.5.203
m54: 10.2.5.204
m55: 10.2.5.205
m56: 10.2.5.206
m57: 10.2.5.207
m58: 10.2.5.208
m59: 10.2.5.209
m60: 10.2.5.210
m61: 10.2.5.211
In a nutshell, I want to take the machine_serial key from file/list 1, whose value is currently m60, find its matching key in file/list 2, and then print and/or preferably register its value of 10.2.5.210.
What I've tried so far:
Playbook:
- name: IP gleaning comparison.
  hosts: localhost
  remote_user: ansible
  become: yes
  become_method: sudo
  vars:
    ansible_ssh_pipelining: yes
  tasks:
    - name: Try to do a variable import of the file1 file.
      include_vars:
        file: ~/active_ct-scanner-vars.yml
        name: ctfile1_vars
      become: no

    - name: Try to do an import of file2 file for lookup comparison to get an IP match.
      include_vars:
        file: ~/machine-ip-conversion.yml
        name: ip_vars
      become: no

    - name: Best, but failing attempt to get the value of the match-up IP.
      debug:
        msg: "{{ item }}"
      when: ctfile1_vars.machine_serial == ip_vars
      with_items:
        - "{{ ip_vars }}"
Every task except the final one works perfectly.
The output of my failing final task:
TASK [Best, but failing attempt to get the value of the match-up IP.] ***********************************************************************************
skipping: [localhost] => (item={'m51': '10.200.5.201', 'm52': '10.200.5.202', 'm53': '10.200.5.203', 'm54': '10.200.5.204', 'm55': '10.200.5.205', 'm56': '10.200.5.206', 'm57': '10.200.5.207', 'm58': '10.200.5.208', 'm59': '10.200.5.209', 'm60': '10.200.5.210', 'm61': '10.200.5.211'})
skipping: [localhost]
What I hoped for hasn't happened: it simply skips the task and doesn't iterate over the list as I expected, so there must be a problem somewhere. Hopefully there is an easy solution I just missed. What could be the correct answer?
Given the files
shell> cat active_ct-scanner-vars.yml
machine_serial: m60
s_iteration: a
site_name: dud
t_number: '001'
shell> cat machine-ip-conversion.yml
m58: 10.2.5.208
m59: 10.2.5.209
m60: 10.2.5.210
m61: 10.2.5.211
Read the files
- include_vars:
    file: active_ct-scanner-vars.yml
    name: ctfile1_vars

- include_vars:
    file: machine-ip-conversion.yml
    name: ip_vars
Q: "Compare the machine name to an IP list and grab the IP."
A: Both variables ip_vars and ctfile1_vars are dictionaries. Use ctfile1_vars.machine_serial as an index into ip_vars:
match_up_IP: "{{ ip_vars[ctfile1_vars.machine_serial] }}"
gives
match_up_IP: 10.2.5.210
Example of a complete playbook for testing
- hosts: localhost
  gather_facts: false
  vars:
    match_up_IP: "{{ ip_vars[ctfile1_vars.machine_serial] }}"
  tasks:
    - include_vars:
        file: active_ct-scanner-vars.yml
        name: ctfile1_vars
    - include_vars:
        file: machine-ip-conversion.yml
        name: ip_vars
    - debug:
        var: match_up_IP
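If you also want to register the value for use in later tasks, as the question mentions, a minimal set_fact sketch using the same variable names:

- set_fact:
    match_up_IP: "{{ ip_vars[ctfile1_vars.machine_serial] }}"

- debug:
    var: match_up_IP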
All afternoon I have been trying to get my head around concatenating a parameter in an Azure DevOps (ADO) template. The parameter is a source path, and in the template the next folder level needs to be appended. I would like to achieve this with a "simple" concatenation.
The simplified template takes the parameter and uses it to form the inputPath for a PowerShell script, like this:
parameters:
  sourcePath: ''

steps:
  - task: PowerShell@2
    inputs:
      filePath: 'PSRepo/Scripts/MyPsScript.ps1'
      arguments: '-inputPath ''$(sourcePath)/NextFolder'''
I have tried various ways to achieve this concatenation:
'$(sourcePath)/NextFolder'
see above
'$(variables.sourcePath)/NextFolder'
I know sourcePath is not a variable, but I tried this based on the fact that when using a parameter in a task condition, it apparently only works when referenced through variables
'${{ parameters.sourcePath }}/NextFolder'
And some other variations, all to no avail.
I also tried to introduce a variables section in the template, but that is not possible.
I have searched the internet for examples and documentation, but found no direct answers; other issues seemed to hint at a solution, but nothing worked.
I would be very pleased if someone could help me out.
Thanx in advance.
We can add a variables section in our template YAML file, pass the sourcePath to a variable there, and then use it. Here is my demo script:
Main.yaml
resources:
  repositories:
    - repository: templates
      type: git
      name: Tech-Talk/template

trigger: none

variables:
  - name: Test
    value: TestGroup

pool:
  # vmImage: windows-latest
  vmImage: ubuntu-20.04

extends:
  template: temp.yaml@templates
  parameters:
    agent_pool_name: ''
    db_resource_path: $(System.DefaultWorkingDirectory)
    # variable_group: ${{variables.Test}}
temp.yaml
parameters:
  - name: db_resource_path
    default: ""
  # - name: 'variable_group'
  #   type: string
  #   default: 'default_variable_group'
  - name: agent_pool_name
    default: ""

stages:
  - stage:
    jobs:
      - job: READ
        displayName: Reading Parameters
        variables:
          - name: sourcePath
            value: ${{parameters.db_resource_path}}
          # - group: ${{parameters.variable_group}}
        steps:
          - script: |
              echo sourcePath: ${{variables.sourcePath}}
          - powershell: echo "$(sourcePath)"
Here, I just use the working directory as the test path. You can use your own variables as well.
Thanx, Yujun. In the meantime I did get it working. Apparently there must have been some typo that blocked the script from executing correctly, as the solution looks like one of the options mentioned above.
parameters:
  sourcePath: ''

steps:
  - task: PowerShell@2
    inputs:
      filePath: 'PSRepo/Scripts/MyPsScript.ps1'
      arguments: '-inputPath ''$(sourcePath)/NextFolder'''
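For completeness, a minimal sketch of how such a step template might be consumed from a pipeline; the template path and the value passed for sourcePath are assumptions, not from the original post:

steps:
  # Include the step template and hand it the folder to work from.
  - template: templates/my-template.yml
    parameters:
      sourcePath: '$(Build.SourcesDirectory)/Data'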
I wrote a task that is responsible for changing the supervisor config file. The issue is that on some servers we have more than one app running workers, so sometimes more than one path needs to be added to the include section of supervisor.conf.
Currently I have this task in roles/supervisor/tasks/main.yml:
- name: Add apps paths in include section
  lineinfile:
    dest: /etc/supervisor/supervisord.conf
    regex: '^files ='
    line: 'files = /etc/supervisor/conf.d/*.conf /home/app/{{ app_name }}/releases/app/shared/supervisor/*.conf /home/dev/{{ app_name2 }}/releases/dev/shared/supervisor/*.conf'
  when: ansible_hostname == 'ser-db-10'
  notify: restart supervisor
  tags: multi_workers
... and added this in roles/supervisor/defaults/main.yml:
app_name: bla
app_name2: blabla
It works, but I don't like that the two application paths are hardcoded in line, and maybe I should also use a variable in place of ser-db-10.
I am wondering how to rebuild this task to make it more independent.
What I mean is: if there are 4 apps, add 4 paths; if there are 2 apps, add 2 paths.
What is the most efficient way to do this?
As an example of how to put together the parameter line, the play below
- hosts: test_01
  vars:
    app_name1: A
    app_name2: B
    my_conf:
      test_01:
        lines:
          - '/etc/*.conf'
          - '/etc/{{ app_name1 }}/*.conf'
          - '/etc/{{ app_name2 }}/*.conf'
  tasks:
    - debug:
        msg: "files = {{ my_conf[inventory_hostname].lines|join(' ') }}"
gives
"msg": "files = /etc/*.conf /etc/A/*.conf /etc/B/*.conf"
With an appropriate dictionary my_conf, the task below should do the job:
- name: Add apps paths in include section
  lineinfile:
    dest: /etc/supervisor/supervisord.conf
    regex: '^files ='
    line: "files = {{ my_conf[inventory_hostname].lines|join(' ') }}"
  notify: restart supervisor
  tags: multi_workers
(not tested)
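A hypothetical variant, in case you prefer to derive the paths from a plain list of app names instead of listing each pattern: the app_names list, the single /home/app/... path pattern, and the host name below are assumptions, not taken from the original setup.

- hosts: ser-db-10
  vars:
    # One entry per application that runs workers on this host (assumed names).
    app_names:
      - bla
      - blabla
    # Build one include pattern per app by prefixing and suffixing each name.
    supervisor_includes: "{{ app_names
      | map('regex_replace', '^', '/home/app/')
      | map('regex_replace', '$', '/releases/app/shared/supervisor/*.conf')
      | list }}"
  tasks:
    - name: Add apps paths in include section
      lineinfile:
        dest: /etc/supervisor/supervisord.conf
        regexp: '^files ='
        line: "files = /etc/supervisor/conf.d/*.conf {{ supervisor_includes | join(' ') }}"
      notify: restart supervisor
      tags: multi_workers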
I am trying to run the following command found at http://blog.wrouesnel.com/articles/Totally%20static%20Go%20builds/:
CGO_ENABLED=0 GOOS=linux go build -a -ldflags '-extldflags "-static"' .
The two inner layers of quotes are tripping me up. How do I deal with this in a cloudbuild.yaml file?
Escaping the quotes doesn't seem to work:
steps:
- name: 'gcr.io/cloud-builders/go'
  args: ['build', '-o', 'main', '-ldflags', "'-extldflags \"-static\"'", '.']
  env:
  - 'GOOS=linux'
Update:
There is no need for such quotes. See comment in Github here: https://github.com/GoogleCloudPlatform/cloud-builders/issues/146#issuecomment-337890587
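Based on that comment, a sketch of what the step might look like without the extra layer of quotes; untested, and the CGO_ENABLED setting is carried over from the original command rather than from the question's attempt:

steps:
- name: 'gcr.io/cloud-builders/go'
  # Pass the ldflags value as a single plain argument; no nested quoting needed.
  args: ['build', '-o', 'main', '-ldflags', '-extldflags "-static"', '.']
  env:
  - 'GOOS=linux'
  - 'CGO_ENABLED=0'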
===
Original Answer
Well, to quote ' within '-quoted strings, use '' as per YAML specification:
http://yaml.org/spec/current.html#id2534365
e.g. 'here''s to a toast!'
For the above args, it would be:
['build', '-o', 'main', '-ldflags', '''-extldflags "-static"''', '.']
Whether or not the command works within Cloud Builder is beyond the scope of this question.
---
- hosts: localhost
  user: root
  tasks:
    - command: "ls /root/Tmp/Deployment/script_files/Hotfix"
      register: dir_out
    - debug: msg="The hotfix ids are: {{dir_out.stdout_lines}}"
The output I got was:
but I want it as
The hotfix ids are: ["1001","1002"]
How do I do this?
I needed to change {{dir_out.stdout_lines}} to {{dir_out.stdout_lines|join(',')}}.
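If you want the output to look exactly like the bracketed list shown above, the to_json filter is another option; this is a sketch, not part of the original fix:

- debug:
    # Renders the registered lines as a JSON list, e.g. ["1001", "1002"].
    msg: "The hotfix ids are: {{ dir_out.stdout_lines | to_json }}"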