I'm trying to solve an issue with Postman Collections.
Test scripts added to a collection generate an additional "id" field.
The id value changes after each export of the collection to a file.
Because of this, PRs with changes to Postman collections are very hard to read.
I want to solve the issue with a git pre-commit hook and a bash script that removes all ids from the script objects of a collection.
There are three possible locations of the id in a script object:
First element of object
"script":{
"id": "83d9076e-64c7-47fa-9b50-b7635718c925",
"exec": [
"console.log(\"foo\");"
],
"type": "text/javascript"
}
Middle of object
"script":{
"exec": [
"console.log(\"foo\");"
],
"id": "83d9076e-64c7-47fa-9b50-b7635718c925",
"type": "text/javascript"
}
End of object
"script":{
"exec": [
"console.log(\"foo\");"
],
"type": "text/javascript",
"id": "83d9076e-64c7-47fa-9b50-b7635718c925"
}
From a regex point of view, cases 1 and 2 are the same:
.*"id": "[a-f0-9-]*",
Case 3 is different; the regex that handles it is:
,\n.*"id": "[a-f0-9-]*",
As I mentioned before, I want to use these regexes in a bash script:
postmanClean.sh
#!/bin/bash
COLLECTION_FILES=$(find . -type f -name "*postman_collection.json")
for POSTMAN_COLLECTION in ${COLLECTION_FILES}
do
echo "Harmonizing Postman $POSTMAN_COLLECTION"
sed -i -e 's/.*"id": "[a-f0-9-]*"\,//' ${POSTMAN_COLLECTION} # Remove test/script ID
sed -i -e 's/\,\n.*"id": "[a-f0-9-]*"//' ${POSTMAN_COLLECTION} # Remove test/script ID
done
The above solution is incorrect. I have tried different options, but these regexes are not working.
How do I build them properly so that they work with sed?
Collection file:
demo.postman_collection.json
{
"info": {
"_postman_id": "258b2fe2-5768-47f8-9e82-70971bab6bbd",
"name": "demo",
"schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
},
"item": [
{
"name": "One",
"item": [
{
"name": "Demo 1",
"event": [
{
"listen": "test",
"script": {
"id": "83d9076e-64c7-47fa-9b50-b7635718c925",
"exec": [
"console.log(\"foo\");"
],
"type": "text/javascript"
}
}
],
"protocolProfileBehavior": {
"disableBodyPruning": true
},
"request": {
"method": "GET",
"header": [],
"body": {
"mode": "raw",
"raw": "foo"
},
"url": {
"raw": "https://postman-echo.com/delay/1",
"protocol": "https",
"host": [
"postman-echo",
"com"
],
"path": [
"delay",
"1"
]
}
},
"response": []
}
],
"protocolProfileBehavior": {}
},
{
"name": "Two",
"item": [
{
"name": "Demo 2",
"event": [
{
"listen": "test",
"script": {
"exec": [
"console.log(\"bar\");"
],
"type": "text/javascript",
"id": "facb28f7-c54d-46e2-adb2-4c929fd1edd3"
}
}
],
"protocolProfileBehavior": {
"disableBodyPruning": true
},
"request": {
"method": "GET",
"header": [],
"body": {
"mode": "raw",
"raw": "bar"
},
"url": {
"raw": "https://postman-echo.com/delay/2",
"protocol": "https",
"host": [
"postman-echo",
"com"
],
"path": [
"delay",
"2"
]
}
},
"response": []
},
{
"name": "Demo 3",
"event": [
{
"listen": "test",
"script": {
"exec": [
"console.log(\"foobar\");"
],
"id": "facb28f7-c54d-46e2-adb2-4c929fd1edd3",
"type": "text/javascript"
}
}
],
"protocolProfileBehavior": {
"disableBodyPruning": true
},
"request": {
"method": "GET",
"header": [],
"body": {
"mode": "raw",
"raw": "bar"
},
"url": {
"raw": "https://postman-echo.com/delay/3",
"protocol": "https",
"host": [
"postman-echo",
"com"
],
"path": [
"delay",
"3"
]
}
},
"response": []
}
],
"protocolProfileBehavior": {}
}
],
"protocolProfileBehavior": {}
}
I think jq is the right tool for this job, and the solution will be as simple as walk(del(.id?)). Here is a rewrite of your script using jq:
#!/bin/bash
COLLECTION_FILES=$(find . -type f -name "*postman_collection.json")
for f in ${COLLECTION_FILES}
do
echo "Harmonizing Postman $f"
jq --indent 4 'walk(del(.id?))' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
and a demo (please note how jq takes care of removing the extra , after "type": "text/javascript", which would otherwise invalidate the JSON):
$ cp demo.postman_collection.json demo.postman_collection.json.bak
$ ./postmanClean.sh
Harmonizing Postman ./demo.postman_collection.json
$ diff demo.postman_collection.json.bak demo.postman_collection.json
17d16
< "id": "83d9076e-64c7-47fa-9b50-b7635718c925",
65,66c64
< "type": "text/javascript",
< "id": "facb28f7-c54d-46e2-adb2-4c929fd1edd3"
---
> "type": "text/javascript"
104d101
< "id": "facb28f7-c54d-46e2-adb2-4c929fd1edd3",
$
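For comparison, the recursive deletion that walk(del(.id?)) performs can be sketched in plain Python; a minimal illustration with a trimmed sample, not a replacement for jq:

```python
import json

def walk_del_id(node):
    """Recursively drop every "id" key, mirroring jq's walk(del(.id?))."""
    if isinstance(node, dict):
        return {k: walk_del_id(v) for k, v in node.items() if k != "id"}
    if isinstance(node, list):
        return [walk_del_id(v) for v in node]
    return node

script = {"script": {"id": "83d9076e-64c7-47fa-9b50-b7635718c925",
                     "exec": ["console.log(\"foo\");"],
                     "type": "text/javascript"}}
print(json.dumps(walk_del_id(script), indent=4))
```

Note that "_postman_id" is a different key name, so it survives the deletion, just as with the jq filter.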
You don't need to distinguish the two patterns: you can use sed to match any line that contains the "id": "..." pattern and delete the entire line with the d command. That way you don't need to care about newlines, whitespace, or whether the trailing comma is there. (One caveat: if the id line is the last member of its object, deleting it leaves a trailing comma on the preceding line, which strict JSON parsers will reject; a JSON-aware tool such as jq avoids this.)
Executed on your example
sed -i '/"id": "[a-f0-9-]*"/d' demo.postman_collection.json
removes all the id lines (except the "_postman_id" of course).
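The effect of the d command can be sanity-checked outside sed with a quick line-filter sketch in Python, using the same pattern on a hypothetical sample:

```python
import re

sample = """{
    "script": {
        "id": "83d9076e-64c7-47fa-9b50-b7635718c925",
        "exec": [
            "console.log(\\"foo\\");"
        ],
        "type": "text/javascript"
    }
}"""
# Keep every line that does NOT match the pattern, like sed's /.../d
kept = [line for line in sample.splitlines()
        if not re.search(r'"id": "[a-f0-9-]*"', line)]
print("\n".join(kept))
```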
My ListInputSecurityGroup task returns this json:
{
"output": [
{
"Arn": "arn:aws:medialive:eu-north-1:xxx:inputSecurityGroup:1977625",
"Id": "1977625",
"Inputs": [],
"State": "IDLE",
"Tags": {},
"WhitelistRules": [
{
"Cidr": "5.5.5.5/32"
}
]
},
{
"Arn": "arn:aws:medialive:eu-north-1:xxx:inputSecurityGroup:5411101",
"Id": "5411101",
"Inputs": [],
"State": "IDLE",
"Tags": {
"use": "some_other_use"
},
"WhitelistRules": [
{
"Cidr": "1.1.1.1/0"
}
]
},
{
"Arn": "arn:aws:medialive:eu-north-1:xxx:inputSecurityGroup:825926",
"Id": "825926",
"Inputs": [
"4011716"
],
"State": "IN_USE",
"Tags": {
"use": "for_rtmp_pipeline"
},
"WhitelistRules": [
{
"Cidr": "0.0.0.0/0"
}
]
}
]
}
I want to use OutputPath to extract the InputSecurityGroup with the tag {use: for_rtmp_pipeline}. According to this JSONPath tester, the expression $.output[?(@.Tags.use == for_rtmp_pipeline)] works and returns the 3rd element of the array. But when used in the Step Function itself, or in the Data Flow Simulator, it doesn't return anything. Is this a limitation of the JSONPath engine in AWS, or is there a different syntax? How can I extract the one element I want?
Note that in the tester the searched string should be in quotes, while in AWS there's no need for quotes.
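Outside Step Functions, the intended filter can be mirrored in a few lines of Python; a sketch over a trimmed copy of the data above, not the AWS JSONPath engine:

```python
data = {"output": [
    {"Id": "1977625", "Tags": {}},
    {"Id": "5411101", "Tags": {"use": "some_other_use"}},
    {"Id": "825926", "Tags": {"use": "for_rtmp_pipeline"}},
]}

# Mirrors $.output[?(@.Tags.use == 'for_rtmp_pipeline')]
matches = [g for g in data["output"]
           if g.get("Tags", {}).get("use") == "for_rtmp_pipeline"]
print(matches[0]["Id"])  # → 825926
```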
I have a flow where I want to edit a column in a CSV table and replace the "," with a ".".
How do I do that? The replace function expression in the Logic App does not return the column:
it asks me to take the complete body when I use the replace function,
whereas the details column, which I want to edit, is available:
How should I replace the "," in the details column?
I tried this, but then I don't see the variable I initialized.
For instance, I've taken this sample .csv file, which I'm retrieving from my storage account.
First I used Parse CSV as you did, then initialized a string variable and used the Append to string variable connector on the Productsname column. Lastly, I used the replace function expression to replace ',' with '.'.
NOTE: I have appended '|' after each Productsname value for later use.
Here is my Logic App workflow
THE COMPOSE CONNECTOR EXPRESSION:
split(replace(variables('Productname'),',','.'),'|')
OUTPUT:
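For illustration, what the expression computes can be sketched in Python, using a hypothetical appended value:

```python
# Hypothetical value built by the Append to string variable loop
productname = "Biscuits,Salted|Chips,Fried|"

# Mirrors split(replace(variables('Productname'),',','.'),'|')
result = productname.replace(",", ".").split("|")
print(result)  # → ['Biscuits.Salted', 'Chips.Fried', '']
```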
Here is my workflow that you can refer to:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose": {
"inputs": "#split(replace(variables('Productname'),',','.'),'|')",
"runAfter": {
"For_each_2": [
"Succeeded"
]
},
"type": "Compose"
},
"For_each_2": {
"actions": {
"Append_to_string_variable": {
"inputs": {
"name": "Productname",
"value": "#{items('For_each_2')?['Productname']}|"
},
"runAfter": {},
"type": "AppendToStringVariable"
}
},
"foreach": "#body('Parse_CSV')",
"runAfter": {
"Initialize_variable": [
"Succeeded"
]
},
"type": "Foreach"
},
"Get_blob_content_(V2)": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files/#{encodeURIComponent(encodeURIComponent('JTJmY29udGFpbmVyMjQwOCUyZlByb2R1Y3RzLmNzdg=='))}/content"
},
"metadata": {
"JTJmY29udGFpbmVyMjQwOCUyZlByb2R1Y3RzLmNzdg==": "/container2408/Products.csv"
},
"runAfter": {},
"type": "ApiConnection"
},
"Initialize_variable": {
"inputs": {
"variables": [
{
"name": "Productname",
"type": "string"
}
]
},
"runAfter": {
"Parse_CSV": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Parse_CSV": {
"inputs": {
"body": {
"content": "#{base64(body('Get_blob_content_(V2)'))}",
"headers": "Productid,Productname"
},
"host": {
"connection": {
"name": "#parameters('$connections')['plumsail']['connectionId']"
}
},
"method": "post",
"path": "/flow/v1/Documents/jobs/ParseCsv"
},
"runAfter": {
"Get_blob_content_(V2)": [
"Succeeded"
]
},
"type": "ApiConnection"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"manual": {
"inputs": {
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/<subscription id>/resourceGroups/<Your resource group name>/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/<subscription id>/providers/Microsoft.Web/locations/northcentralus/managedApis/azureblob"
},
"plumsail": {
"connectionId": "/subscriptions/<subscription id >/resourceGroups/<Your resource group name>/providers/Microsoft.Web/connections/plumsail",
"connectionName": "plumsail",
"id": "/subscriptions/<subscription id>/providers/Microsoft.Web/locations/northcentralus/managedApis/plumsail"
}
}
}
}
}
I used the items function expression and did it directly:
@replace(item()?['details'],',','')
This was a bit strange: it didn't work at first, but now it is working.
My file has the following pattern.
[
{
"id": 8050879,
"coord": { "lon": -1.65825, "lat": 42.808472 },
"country": "ES",
"geoname": { "cl": "P", "code": "PPLL", "parent": 6359749 },
"name": "Iturrama",
"stat": { "level": 1.0, "population": 24846 },
"stations": [
{ "id": 5493, "dist": 4, "kf": 1 },
{ "id": 28697, "dist": 32, "kf": 1 }
],
"zoom": 14
},
{
"id": 5406990,
"coord": { "lon": -122.064957, "lat": 37.906311 },
"country": "US",
"geoname": { "cl": "P", "code": "PPL", "parent": 5339268 },
"langs": [
{ "bg": "Уолнът Крийк" },
{ "de": "Walnut Creek" },
{ "en": "Walnut Creek" },
{ "eo": "Walnut Creek" },
{ "link": "http://en.wikipedia.org/wiki/Walnut_Creek%2C_California" },
{ "post": "94595" }
],
"name": "Walnut Creek",
"stat": { "level": 1.0, "population": 64173 },
"stations": [
{ "id": 374, "dist": 9, "kf": 1 },
{ "id": 10103, "dist": 9, "kf": 1 },
],
"zoom": 11
},
...
]
I would like to get
[
{
"country": "ES",
"name": "Iturrama"
},
{
"country": "US",
"name": "Walnut Creek"
},
...
]
I have been using
grep -v id filename > result
then
grep -v coord result > result
grep -v geoname result > result
...
until I get my pattern. But I noticed I am deleting anything that has "id" in it,
so if I have a name like "cIDadel" it will be deleted too.
Can anyone help me with that?
Don't use non-syntax-aware tools like grep to parse structured data like JSON; they can't differentiate the underlying types (object, array, or any other). Use a proper parser like jq, with which you can simply do
jq 'map({country, name})' json_file
See it work in the jq playground. Downloading and setting up is pretty easy: see Download jq.
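If jq is not an option, the same projection is a short Python script; a sketch assuming a trimmed copy of the file above:

```python
import json

cities = [
    {"id": 8050879, "country": "ES", "name": "Iturrama", "zoom": 14},
    {"id": 5406990, "country": "US", "name": "Walnut Creek", "zoom": 11},
]

# Equivalent of jq 'map({country, name})'
result = [{"country": c["country"], "name": c["name"]} for c in cities]
print(json.dumps(result, indent=4))
```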
If you need to use shell tools for some reason instead of JSON parsing, use AWK.
file.awk
/^\[$/ {print($0)}
/^\{$/ {print($0)}
/"country"/ {print($0)}
/"name"/ {print($0)}
/^ *\},$/ {print($1)}
/^\]$/ {print($0)}
Call:
awk -f file.awk yourdata.txt
I am trying to extract the credentials client secret from the Cloud Foundry env JSON string.
cf env myapp
gives exactly the following (not proper JSON, which is why I can't feed it to jq directly):
Getting env variables for app icm in org myorg / space myspace as
xxyy...
OK
{
"myenv_env_json": {
"http_proxy": "http://mycompany-svr-proxy-qa.mycompany.com:7070",
"https_proxy": "http://mycompany-svr-proxy-qa.mycompany.com:7070",
"no_proxy": "*.mycompany.com"
},
"running_env_json": {},
"system_env_json": {
"VCAP_SERVICES": {
"user-provided": [
{
"name": "myapp-parameters",
"instance_name": "myapp-parameters",
"binding_name": null,
"credentials": {
"auth-domain": "https://sso.login.run-np.mycompany.com",
"backend-url-other": "https://myservice-other.apps-np.mycompany.com",
"client-secret": "121322332-32322-23232-232-32-23232",
"stage": "mystg",
"backend-url": "https://myservice-other.apps-np.mycompany.com",
"client-secret-other": "121322332-32322-23232-232-32-23232"
},
"syslog_drain_url": "",
"volume_mounts": [],
"label": "user-provided",
"tags": []
},
{
"name": "appdynamics",
"instance_name": "appdynamics",
"binding_name": null,
"credentials": {
"account-access-key": "1213232-232-322-2322323-2323232-311",
"account-name": "customer1",
"application-name": "myenv-dev",
"host-name": "appdx-qa.mycompany.com",
"node-name": "$(ruby -e \"require 'json'; a = JSON.parse(ENV['VCAP_APPLICATION']); puts \\\"#{a['application_name']}-#{a['cf_api'].split(/\\.|-/)[2]}:#{a['instance_index']}\\\"\")",
"port": "9401",
"ssl-enabled": "true",
"tier-name": "$(ruby -e \"require 'json'; a = JSON.parse(ENV['VCAP_APPLICATION']); puts \\\"#{a['application_name']}-#{a['cf_api'].split(/\\.|-/)[2]}\\\"\")",
"version": "4.2.7_1"
},
"syslog_drain_url": "",
"volume_mounts": [],
"label": "user-provided",
"tags": []
}
],
"p-identity": [
{
"name": "sso",
"instance_name": "sso",
"binding_name": null,
"credentials": {
"auth_domain": "https://sso.login.run-np.mycompany.com",
"client_secret": "123232-23232-2323243-242323r3r",
"client_id": "afdvdf-dvdfdd-fgdgdf-d23232"
},
"syslog_drain_url": null,
"volume_mounts": [],
"label": "p-identity",
"provider": null,
"plan": "sso",
"tags": []
}
]
}
},
"application_env_json": {
"VCAP_APPLICATION": {
"cf_api": "https://api.run-np.mycompany.com",
"limits": {
"fds": 16384
},
"application_name": "myapp",
"application_uris": [
"myapp-dev.apps-np.mycompany.com"
],
"name": "myapp",
"space_name": "myapp-dev",
"space_id": "392929-23223-2323-2322-2322",
"uris": [
"myapp-dev.apps-np.mycompany.com"
],
"users": null,
"application_id": "fwew78cc-wewc5c-dfd8a7-89d5-fdfefwewb"
}
}
}
User-Provided:
APP_ENV: development
GRANT_TYPE: authorization_code
SSO_AUTO_APPROVED_SCOPES: openid
SSO_IDENTITY_PROVIDERS: mycompany-single-signon
SSO_LAUNCH_URL: https://myapp-dev.apps-np.mycompany.com/
SSO_REDIRECT_URIS: https://myapp-dev.apps-np.mycompany.com/callback,http://myapp-dev.apps-np.mycompany.com/callback
SSO_SCOPES: openid,user_attributes
callback_url: https://myapp-dev.apps-np.mycompany.com/callback
client_secret: secret
client_secret_other: secretother
No running env variables have been set
Staging Environment Variable Groups:
http_proxy: http://myapp-svr-proxy-qa.mycompany.com:7070
https_proxy: http://myapp-svr-proxy-qa.mycompany.com:7070
no_proxy: *.mycompany.com
Here is what I am trying to use; so far, no luck extracting the p-identity sub-JSON. What is wrong in my sed?
cf env myapp|sed 's/.*\(p-identity[^}].*}\).*/\1/p'
My expected output is as follows:
"p-identity": [
{
"name": "sso",
"instance_name": "sso",
"binding_name": null,
"credentials": {
"auth_domain": "https://sso.login.run-np.mycompany.com",
"client_secret": "123232-23232-2323243-242323r3r",
"client_id": "afdvdf-dvdfdd-fgdgdf-d23232"
}
I found a dirty workaround; it may not be efficient, but it works for now:
cf env myapp|sed 1,4d|sed -n '/User-Provided:/q;p'|jq -c -r '.VCAP_SERVICES."p-identity"[0].credentials.client_secret'| head -n1
For your case it may be easier to pipe the output to grep to extract the JSON, then use jq to extract the field that you want, for example:
cf env myapp | grep -oz '{.*}' | jq 'your filter here'
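The same idea (grab everything between the first { and the last }, then parse) can be sketched in Python, using a hypothetical stub of the cf env output:

```python
import json

# Hypothetical stub of `cf env myapp` output: banner, JSON, trailing text
cf_output = """Getting env variables for app icm in org myorg / space myspace as
xxyy...
OK
{
    "system_env_json": {
        "VCAP_SERVICES": {
            "p-identity": [
                {"credentials": {"client_secret": "123232-23232-2323243-242323r3r"}}
            ]
        }
    }
}
User-Provided:
APP_ENV: development"""

# Same effect as grep -oz '{.*}': first '{' through last '}'
blob = cf_output[cf_output.index("{"):cf_output.rindex("}") + 1]
doc = json.loads(blob)
print(doc["system_env_json"]["VCAP_SERVICES"]["p-identity"][0]
      ["credentials"]["client_secret"])
```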
How would I access the following values using regex in PowerShell, and assign each one to an individual variable?
id (i.e. get the value TOKEN_ID), under token
id (i.e. get the value TENANT_ID), under token, tenant
adminURL (i.e. get the value http://10.100.0.222:35357/v2.0), the first value under serviceCatalog, endpoints
Since I am using PowerShell v2, I can't use the ConvertFrom-Json cmdlet. So far I've tried converting the document to an XML file using a third-party PS script, but it doesn't always get it right. I'd like to use regex, but I am not very comfortable with it.
$json =
"{
"access": {
"metadata": {
"is_admin": 0,
"roles": [
"9fe2ff9ee4384b1894a90878d3e92bab"
]
},
"serviceCatalog": [
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8774/v2/TENANT_ID",
"id": "0eb78b6d3f644438aea327d9c57b7b5a",
"internalURL": "http://10.100.0.222:8774/v2/TENANT_ID",
"publicURL": "http://8.21.28.222:8774/v2/TENANT_ID",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "nova",
"type": "compute"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:9696/",
"id": "3f4b6015a2f9481481ca03dace8acf32",
"internalURL": "http://10.100.0.222:9696/",
"publicURL": "http://8.21.28.222:9696/",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "neutron",
"type": "network"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8776/v2/TENANT_ID",
"id": "16f6416588f64946bdcdf4a431a8f252",
"internalURL": "http://10.100.0.222:8776/v2/TENANT_ID",
"publicURL": "http://8.21.28.222:8776/v2/TENANT_ID",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "cinder_v2",
"type": "volumev2"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8779/v1.0/TENANT_ID",
"id": "be48765ae31e425cb06036b1ebab694a",
"internalURL": "http://10.100.0.222:8779/v1.0/TENANT_ID",
"publicURL": "http://8.21.28.222:8779/v1.0/TENANT_ID",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "trove",
"type": "database"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:9292",
"id": "1adfcb5414304f3596fb81edb2dfb514",
"internalURL": "http://10.100.0.222:9292",
"publicURL": "http://8.21.28.222:9292",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "glance",
"type": "image"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8777",
"id": "350f3b91d73f4b3ab8a061c94ac31fbb",
"internalURL": "http://10.100.0.222:8777",
"publicURL": "http://8.21.28.222:8777",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "ceilometer",
"type": "metering"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8000/v1/",
"id": "2198b0d32a604e75a5cc1e13276a813d",
"internalURL": "http://10.100.0.222:8000/v1/",
"publicURL": "http://8.21.28.222:8000/v1/",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "heat-cfn",
"type": "cloudformation"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8776/v1/TENANT_ID",
"id": "7c193c4683d849ca8e8db493722a4d8c",
"internalURL": "http://10.100.0.222:8776/v1/TENANT_ID",
"publicURL": "http://8.21.28.222:8776/v1/TENANT_ID",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "cinder",
"type": "volume"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8773/services/Admin",
"id": "11fac8254be74d7d906110f0069e5748",
"internalURL": "http://10.100.0.222:8773/services/Cloud",
"publicURL": "http://8.21.28.222:8773/services/Cloud",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "nova_ec2",
"type": "ec2"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8004/v1/TENANT_ID",
"id": "38fa4f9afce34d4ca0f5e0f90fd758dd",
"internalURL": "http://10.100.0.222:8004/v1/TENANT_ID",
"publicURL": "http://8.21.28.222:8004/v1/TENANT_ID",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "heat",
"type": "orchestration"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:35357/v2.0",
"id": "256cdf78ecb04051bf0f57ec11070222",
"internalURL": "http://10.100.0.222:5000/v2.0",
"publicURL": "http://8.21.28.222:5000/v2.0",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "keystone",
"type": "identity"
}
],
"token": {
"audit_ids": [
"gsjrNoqFSQeuLUo0QeJprQ"
],
"expires": "2014-12-15T15:09:29Z",
"id": "TOKEN_ID",
"issued_at": "2014-12-15T14:09:29.794527",
"tenant": {
"description": "Auto created account",
"enabled": true,
"id": "TENANT_ID",
"name": "USERNAME"
}
},
"user": {
"id": "USER_ID",
"name": "USERNAME",
"roles": [
{
"name": "_member_"
}
],
"roles_links": [],
"username": "USERNAME"
}
}
}"
If you are using .NET 3.5 or higher on your machines with PowerShell 2.0, you can use a JSON serializer (from the linked answer):
[System.Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions")
$json = "{a:1,b:2,c:{nested:true}}"
$ser = New-Object System.Web.Script.Serialization.JavaScriptSerializer
$obj = $ser.DeserializeObject($json)
This would be preferable to using regex.
For admin URL for example, you'd refer to:
$obj.access.serviceCatalog[0].endpoints[0].adminURL
Using RegEx Anyway
if ($json -match '(?s)"serviceCatalog".+?"endpoints".+?"adminURL"[^"]+"(?<adminURL>[^"]+)".+?"token".+?"id"[^"]+"(?<tokenID>[^"]+)".+?"tenant".+?"id"[^"]+"(?<tenantID>[^"]+)') {
$Matches['adminURL']
$Matches['tokenID']
$Matches['tenantID']
}
RegEx Breakdown:
(?s) tells the regex engine that . matches anything, including newlines (by default it wouldn't).
Of course all of the "whatever" parts just match literally.
.+? matches 1 or more of any character (including newlines, since we're using (?s)), and the ? makes it non-greedy.
[^"]+ this matches 1 or more characters that are not a double quote.
() is a capturing group. By using (?<name>) we can refer back to the group later by name rather than number, just a nicety.
So the basic idea is to look for the literals, then get to a point where we can capture the values needed. After a -regex operator match in PowerShell, the $Matches variable is populated with the matches, groups, etc.
Note that this relies on the values being in the order they are in the posted JSON. If they were in a different order it would fail.
To work around that you could split this into 3 different regex matches.
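The same named-group regex translates almost verbatim to other engines; a Python sketch over a trimmed copy of the JSON (the same caveat about field order applies):

```python
import re

json_text = """
"serviceCatalog": [
  { "endpoints": [ { "adminURL": "http://10.100.0.222:35357/v2.0" } ] }
],
"token": {
  "id": "TOKEN_ID",
  "tenant": { "id": "TENANT_ID" }
}
"""
pattern = (r'(?s)"serviceCatalog".+?"endpoints".+?'
           r'"adminURL"[^"]+"(?P<adminURL>[^"]+)".+?'
           r'"token".+?"id"[^"]+"(?P<tokenID>[^"]+)".+?'
           r'"tenant".+?"id"[^"]+"(?P<tenantID>[^"]+)')
m = re.search(pattern, json_text)
print(m.group("adminURL"))  # → http://10.100.0.222:35357/v2.0
print(m.group("tokenID"))   # → TOKEN_ID
print(m.group("tenantID"))  # → TENANT_ID
```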