I'm trying to extract a specific number from my stdout_lines in Ansible and use that as a variable. I'm running a show command in my playbook, and all I want to get from the output is the highest sequence number from my crypto map. For example, this is my playbook:
- asa_command:
    commands:
      - show run crypto map
    provider: "{{ base_provider }}"
  register: result

- debug: var=result.stdout_lines
This produces the output fine, but I'm not sure how to go about extracting the sequence number from the following (I have omitted most of the crypto map output just to make it easier to explain).
"crypto map map1 60 set ikev1 transform-set test",
"crypto map map1 60 set security-association lifetime seconds 3600",
"crypto map map1 61 set peer 1.1.1.1 ",
"crypto map map1 61 set ikev1 transform-set test1",
"crypto map map1 61 set security-association lifetime seconds 3600",
"crypto map map1 interface outside"
So basically, I would like to extract the highest sequence number (in this case "61") so I can input it as a variable in the same playbook. Any thoughts would be appreciated :-)
I tried looking at some jinja2 filters but I couldn't figure out what would be most appropriate for my usage.
http://ansible-docs.readthedocs.io/zh/stable-2.0/rst/playbooks_filters.html
I also tried the suggestions on this page but I didn't get far with that either.
ansible parse text string from stdout
Note that I'm winging this in a text editor without full access to the tools, so please check my syntax, especially the double-backslash escapes. That said, let's take a stab at a chain of filters that gets what you need. How about:
- debug:
    msg: "{{ result.stdout |
             regex_findall('^\"crypto map map1 \\d\\d set ') |
             regex_replace('^\"crypto map map1 (\\d\\d) set .*', '\\1') |
             max }}"
Related
I am trying to process a dictionary of tags to produce a command line that will be passed to a Python program that applies tags to an object. Tags can be either single-cardinality (i.e. the "key" can appear only once on the tagged element) or multi-cardinality (i.e. the "key" can appear multiple times on the element). Single-cardinality tagging is fine; my problem is with multi-cardinality.
The dictionary in the ansible host_vars file would be:
multi_tags:
  multi_tag1: value1
  multi_tag2: mvalue1
  multi_tag2: mvalue2
  multi_tag3: value3
But Ansible will keep only "mvalue2" for "multi_tag2", because you can't have two keys with the same name in a dictionary. Therefore the dictionary actually looks like:
multi_tags:
  multi_tag1: value1
  multi_tag2: ["mvalue1", "mvalue2"]
  multi_tag3: value3
At the end of this I need to produce the following list:
multi_tag1:value1, multi_tag2:mvalue1, multi_tag2:mvalue2, multi_tag3:value3
So far I've been able to get multi_tag1:value1, multi_tag2:[mvalue1, mvalue2], multi_tag3:value3, and I can detect that multi_tag2 is a list, but I cannot figure out how to pull "multi_tag2" out of the multi_tags dict and then expand that list into its individual key:value components.
NOTE: "multi_tag2" may be any name and there may be any number of lists like this in the dictionary. The challenge is in reading the dictionary, finding all of the list keys and spinning each list key into its own expanded key:value set.
I can detect and capture the list-valued keys with:
- name: Capture lists for processing
  vars:
    multilist: []
  set_fact:
    multilist: "{{ multilist + [item] }}"
  with_items: '{{ multi_tags.keys() }}'
  when: ( multi_tags[item] is defined ) and ( multi_tags[item] | type_debug == "list" )
But once I have the "list of lists", I don't know how to expand each list into its key:value pairs and fold them into the final consolidated group of key:value pairs.
This does not need to be done in one step; I'm happy to add the expanded key:value pairs with additional set_fact tasks and then combine the single facts with the "list" facts to create the final key:value list if that's needed.
I've been racking my brain over this for hours now. Any advice would be helpful!
Create a list of products first. The challenge is converting a string into a list where this string is a single item. Simply using item|list won't work, because a string is iterable and would be split into characters. Hence it must be enclosed in brackets: [item]. If the item is already a list, this creates a nested list, which must be flattened, e.g.
- set_fact:
    mtl: "{{ mtl|d([]) + [item.key]|product([item.value]|flatten) }}"
  loop: "{{ multi_tags|dict2items }}"
gives
mtl:
- - multi_tag1
  - value1
- - multi_tag2
  - mvalue1
- - multi_tag2
  - mvalue2
- - multi_tag3
  - value3
Then, join the items, e.g.
- debug:
    msg: "{{ mtl|map('join', ':')|join(', ') }}"
gives
msg: multi_tag1:value1, multi_tag2:mvalue1, multi_tag2:mvalue2, multi_tag3:value3
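If the product filter gives you trouble in your Ansible version, an equivalent sketch builds the joined strings directly in the loop; tag_strings is just a name I made up, and the regex_replace('^', ...) step prefixes each value with its key:

- set_fact:
    tag_strings: "{{ tag_strings | default([])
                     + ([item.value] | flatten
                        | map('regex_replace', '^', item.key ~ ':')
                        | list) }}"
  loop: "{{ multi_tags | dict2items }}"

- debug:
    msg: "{{ tag_strings | join(', ') }}"

This prints the same consolidated string as above.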
Is there a way to combine multiple lists in a list in Jinja2?
For instance, if I have:
[['foo', 'moo'],['py','jinga','template'],['example'],['stack','overflow']]
I expect to get :
['foo', 'moo','py','jinga','template','example','stack','overflow']
(I don't know the number of lists in the list in advance.)
I already tried to use join() but it doesn't work because I get a string and not a list of strings.
If you don't know the number of sub-lists upfront, and since Jinja won't allow assignment inside loops, what you could do is join the sub-lists back together via map and then split the result back into a list.
It is not the perfect way, but it does the job.
{{ ([['foo', 'moo'],['py','jinga','template'],['example'],['stack','overflow']] | map('join','|||') | join('|||')).split('|||') }}
You can do a concatenation of lists using the + operator:
['foo','moo'] + ['py','jinga','template'] + ['example'] + ['stack','overflow']
Gives
['foo', 'moo', 'py', 'jinga', 'template', 'example', 'stack', 'overflow']
+ Adds two objects together. Usually the objects are numbers, but if both are strings or lists, you can concatenate them this way.
Source: https://jinja.palletsprojects.com/en/2.11.x/templates/#math
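Since + only helps when you can write out every sub-list explicitly, a related trick for an unknown number of sub-lists is the sum filter with an empty list as its start value. This leans on Python's sum() concatenating lists, so treat it as a sketch rather than a documented idiom:

{{ [['foo', 'moo'],['py','jinga','template'],['example'],['stack','overflow']] | sum(start=[]) }}

It yields the same flattened list as above.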
I came here looking for a solution for this problem in Ansible, and used β.εηοιτ.βε's solution first. Then I found another solution using json_query and its flatten projection ([]):
- name: stackoverflow test
  hosts: localhost
  gather_facts: no
  tasks:
    - name: flatten list of lists using json_query's flatten projection
      vars:
        nested: [["foo", "moo"],["py","jinga","template"],["example"],["stack","overflow"]]
      # json_query requires the jmespath Python library on the controller
      debug:
        msg: '{{ nested | json_query("[]") }}'
Not sure if the OP had been using ansible, but if someone else like me ends up here, they might find it useful.
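If you are in Ansible anyway, the built-in flatten filter is another way to collapse one level of nesting; a minimal sketch, assuming the same nested variable is in scope:

- debug:
    msg: "{{ nested | flatten(levels=1) }}"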
Basically my goal is to capture the Cisco interfaces (i.e. Gi1/0) from the string that I have stored in the variable intf. I am puzzled about how to construct my regex with set_fact so that it captures the interface from the intf variable.
Based on regex101.com, this regex will match the Gi1/0 interface:
^\w+(-\w+)?\d+(([\/:]\d+)+(\.\d+)?)?$
I tried the code below to capture the interface (Gi1/0, for example) and store it in the variable storehere, but I only encountered errors.
- name: Catch interface only ie. Gi1/0 and store in storehere variable
  set_fact:
    storehere: "{{ intf | regex_findall(^\w+(-\w+)?\d+(([\/:]\d+)+(\.\d+)?)?$) }}"
This is my full code:
(Screenshots: full script, execution without the regex, error with the regex.)
Your full code seems to imply that there might be multiple lines in the output. Presumably they all follow the same pattern, and the data that you need from each is just the first column, the interface name? If so, does this fulfill your needs:
- set_fact:
    interfaces: "{{ interfaces | default([]) }} + {{ [ item | regex_search('^([^ ])+') ] }}"
  loop: "{{ rl00.stdout_lines | first }}"

- debug:
    var: interfaces
(Note: in your example, rl00.stdout_lines contains a list, with a list of output lines as its only element. That looks a little odd and I am not sure if there are cases where other elements would be present. This answer should work for the example data you provided, but may run into problems if other elements are returned.)
This:
- loops through the lines contained in the first list in your rl00 registered variable,
- filters the first non-space characters from the beginning of each line,
- adds them as an item to the 'interfaces' list.
That should leave you with a list containing the interface names. You can then treat them however you like; for example, if you need them on a single, space-separated line, you could then:
- debug:
    var: interfaces | join(' ')
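For completeness, you can usually get the same list without the loop by mapping regex_search over the lines; a sketch, under the same assumption that rl00.stdout_lines | first holds the output lines:

- set_fact:
    interfaces: "{{ rl00.stdout_lines | first | map('regex_search', '^[^ ]+') | list }}"

The join(' ') step above works the same way on this list.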
Consider these commands:
First I create a sample trip as a hash:
HMSET trip:1734 id 1734 start 08:35 end 09:30
Then I need to store some route points for the trip; I do it with a list, like this:
RPUSH trip:1734:routes 22444.345566,32.875553 77.3,44.3
Now I need to retrieve only the trips (a list of trips) without the route part. How can I do that?
I tried:
SCAN 0 MATCH trip:*[^:routes]
and it is working well.
But why does this not work?
SCAN 0 MATCH trip:*[^:*]
In a hash, I have a bunch of key-value pairs.
My keys are in the following format: name:city
john:newyork
kate:chicago
lisa:atlanta
I'm using Python to access Redis, and in https://redis-py.readthedocs.org/en/latest/ I don't see any hash operation that does partial matching.
I would like to be able to get all keys in the hash with a given city name.
Is that possible?
It is possible, but not with HASH objects; use sorted sets instead. As long as all elements in a sorted set have the same score, you can do lexicographical prefix matching.
Let's say you do the following (raw Redis commands, but the same applies with the Python client):
ZADD foo 0 john:newyork:<somevalue>
ZADD foo 0 john:chicago:<somevalue>
ZADD foo 0 kate:chicago:<somevalue>
....
You can then query by using ZRANGEBYLEX:
ZRANGEBYLEX foo [john: (john:\xff
will give you all entries that start with john, and you can extract the value with regular expressions or splitting.
Note that this is a prefix search, not a suffix search. If you want "all entries in new york", you need to reverse the order of the components in the sorted-set members.
I was able to achieve partial matching of hash keys with:
import redis

pool = redis.ConnectionPool(host='localhost', port=6379, db=0)
r = redis.StrictRedis(connection_pool=pool)
# HSCAN with MATCH does glob-style matching on the hash's field names
cmd = "hscan <hashname> 0 match *:atlanta"
print r.execute_command(cmd)