Use regex in PowerShell v2 to get values from a JSON file

How would I access the following values using regex in PowerShell, and assign each one to an individual variable?
id (i.e. get the value: TOKEN_ID) - under token
id (i.e. get the value: TENANT_ID) - under token, tenant
adminURL (i.e. get the value: http://10.100.0.222:35357/v2.0) - the first value under serviceCatalog,endpoints
As I am using PowerShell v2, I can't use the ConvertFrom-Json cmdlet. So far I've tried converting the document to an XML file using a third-party PS script, but it doesn't always get it right. I'd like to use regex, but I am not very comfortable with it.
$json = @'
{
"access": {
"metadata": {
"is_admin": 0,
"roles": [
"9fe2ff9ee4384b1894a90878d3e92bab"
]
},
"serviceCatalog": [
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8774/v2/TENANT_ID",
"id": "0eb78b6d3f644438aea327d9c57b7b5a",
"internalURL": "http://10.100.0.222:8774/v2/TENANT_ID",
"publicURL": "http://8.21.28.222:8774/v2/TENANT_ID",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "nova",
"type": "compute"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:9696/",
"id": "3f4b6015a2f9481481ca03dace8acf32",
"internalURL": "http://10.100.0.222:9696/",
"publicURL": "http://8.21.28.222:9696/",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "neutron",
"type": "network"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8776/v2/TENANT_ID",
"id": "16f6416588f64946bdcdf4a431a8f252",
"internalURL": "http://10.100.0.222:8776/v2/TENANT_ID",
"publicURL": "http://8.21.28.222:8776/v2/TENANT_ID",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "cinder_v2",
"type": "volumev2"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8779/v1.0/TENANT_ID",
"id": "be48765ae31e425cb06036b1ebab694a",
"internalURL": "http://10.100.0.222:8779/v1.0/TENANT_ID",
"publicURL": "http://8.21.28.222:8779/v1.0/TENANT_ID",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "trove",
"type": "database"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:9292",
"id": "1adfcb5414304f3596fb81edb2dfb514",
"internalURL": "http://10.100.0.222:9292",
"publicURL": "http://8.21.28.222:9292",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "glance",
"type": "image"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8777",
"id": "350f3b91d73f4b3ab8a061c94ac31fbb",
"internalURL": "http://10.100.0.222:8777",
"publicURL": "http://8.21.28.222:8777",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "ceilometer",
"type": "metering"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8000/v1/",
"id": "2198b0d32a604e75a5cc1e13276a813d",
"internalURL": "http://10.100.0.222:8000/v1/",
"publicURL": "http://8.21.28.222:8000/v1/",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "heat-cfn",
"type": "cloudformation"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8776/v1/TENANT_ID",
"id": "7c193c4683d849ca8e8db493722a4d8c",
"internalURL": "http://10.100.0.222:8776/v1/TENANT_ID",
"publicURL": "http://8.21.28.222:8776/v1/TENANT_ID",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "cinder",
"type": "volume"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8773/services/Admin",
"id": "11fac8254be74d7d906110f0069e5748",
"internalURL": "http://10.100.0.222:8773/services/Cloud",
"publicURL": "http://8.21.28.222:8773/services/Cloud",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "nova_ec2",
"type": "ec2"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:8004/v1/TENANT_ID",
"id": "38fa4f9afce34d4ca0f5e0f90fd758dd",
"internalURL": "http://10.100.0.222:8004/v1/TENANT_ID",
"publicURL": "http://8.21.28.222:8004/v1/TENANT_ID",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "heat",
"type": "orchestration"
},
{
"endpoints": [
{
"adminURL": "http://10.100.0.222:35357/v2.0",
"id": "256cdf78ecb04051bf0f57ec11070222",
"internalURL": "http://10.100.0.222:5000/v2.0",
"publicURL": "http://8.21.28.222:5000/v2.0",
"region": "RegionOne"
}
],
"endpoints_links": [],
"name": "keystone",
"type": "identity"
}
],
"token": {
"audit_ids": [
"gsjrNoqFSQeuLUo0QeJprQ"
],
"expires": "2014-12-15T15:09:29Z",
"id": "TOKEN_ID",
"issued_at": "2014-12-15T14:09:29.794527",
"tenant": {
"description": "Auto created account",
"enabled": true,
"id": "TENANT_ID",
"name": "USERNAME"
}
},
"user": {
"id": "USER_ID",
"name": "USERNAME",
"roles": [
{
"name": "_member_"
}
],
"roles_links": [],
"username": "USERNAME"
}
}
}"

If you are using .NET 3.5 or higher on your machines with PowerShell 2.0, you can use a JSON serializer (from the linked answer):
[System.Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions")
$json = "{a:1,b:2,c:{nested:true}}"
$ser = New-Object System.Web.Script.Serialization.JavaScriptSerializer
$obj = $ser.DeserializeObject($json)
This would be preferable to using regex.
For the admin URL, for example, you'd refer to:
$obj.access.serviceCatalog[0].endpoints[0].adminURL
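Applied to the JSON in the question, a sketch along the same lines (assuming $json holds the full response shown above) would be:
# Load the serializer (requires .NET 3.5+) and deserialize the response
[System.Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions") | Out-Null
$ser = New-Object System.Web.Script.Serialization.JavaScriptSerializer
$obj = $ser.DeserializeObject($json)
# The token and tenant IDs live under access.token
$tokenID  = $obj['access']['token']['id']
$tenantID = $obj['access']['token']['tenant']['id']
# The first adminURL under serviceCatalog/endpoints
$adminURL = $obj['access']['serviceCatalog'][0]['endpoints'][0]['adminURL']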
Using RegEx Anyway
if ($json -match '(?s)"serviceCatalog".+?"endpoints".+?"adminURL"[^"]+"(?<adminURL>[^"]+)".+?"token".+?"id"[^"]+"(?<tokenID>[^"]+)".+?"tenant".+?"id"[^"]+"(?<tenantID>[^"]+)') {
$Matches['adminURL']
$Matches['tokenID']
$Matches['tenantID']
}
RegEx Breakdown:
(?s) tells the regex engine that . matches anything, including newlines (by default it wouldn't).
All of the quoted literal parts ("serviceCatalog", "endpoints", "adminURL", and so on) just match literally.
.+? matches 1 or more of any character (including newlines since we're using s), and the ? makes it non-greedy.
[^"]+ this matches 1 or more characters that are not a double quote.
() is a capturing group. By using (?<name>) we can refer back to the group later by name rather than number, just a nicety.
So the basic idea is to look for the literals, then get to a point where we can capture the values needed. After a successful -match in PowerShell, the $Matches variable is populated with the matches, groups, etc.
Note that this relies on the values being in the order they are in the posted JSON. If they were in a different order it would fail.
To work around that you could split this into 3 different regex matches.
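A sketch of that order-independent variant (the same patterns, just matched separately) might look like:
# Each value gets its own -match, so the relative order of the sections no longer matters
if ($json -match '(?s)"token".+?"id"[^"]+"(?<tokenID>[^"]+)"')   { $tokenID  = $Matches['tokenID'] }
if ($json -match '(?s)"tenant".+?"id"[^"]+"(?<tenantID>[^"]+)"') { $tenantID = $Matches['tenantID'] }
if ($json -match '(?s)"serviceCatalog".+?"endpoints".+?"adminURL"[^"]+"(?<adminURL>[^"]+)"') { $adminURL = $Matches['adminURL'] }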

Related

How to extract an element in an array if the filter element is 2 levels down

My ListInputSecurityGroup task returns this json:
{
"output": [
{
"Arn": "arn:aws:medialive:eu-north-1:xxx:inputSecurityGroup:1977625",
"Id": "1977625",
"Inputs": [],
"State": "IDLE",
"Tags": {},
"WhitelistRules": [
{
"Cidr": "5.5.5.5/32"
}
]
},
{
"Arn": "arn:aws:medialive:eu-north-1:xxx:inputSecurityGroup:5411101",
"Id": "5411101",
"Inputs": [],
"State": "IDLE",
"Tags": {
"use": "some_other_use"
},
"WhitelistRules": [
{
"Cidr": "1.1.1.1/0"
}
]
},
{
"Arn": "arn:aws:medialive:eu-north-1:xxx:inputSecurityGroup:825926",
"Id": "825926",
"Inputs": [
"4011716"
],
"State": "IN_USE",
"Tags": {
"use": "for_rtmp_pipeline"
},
"WhitelistRules": [
{
"Cidr": "0.0.0.0/0"
}
]
}
]
}
I want to use OutputPath to extract the InputSecurityGroup with the tag {use: for_rtmp_pipeline}. According to this JSONPath tester the expression $.output[?(@.Tags.use == for_rtmp_pipeline)] works and returns the 3rd element of this array. But when used in the Step Function itself, or in the Data Flow Simulator, it doesn't return anything. Is this a limitation of the JSONPath engine in AWS, or is there a different syntax? How can I extract the one element I want?
Note that in the tester the searched string should be in quotes, while in AWS there's no need for quotes.
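For reference, those two forms side by side (only the quoting of the compared value differs):
$.output[?(@.Tags.use == 'for_rtmp_pipeline')]   (quoted, as the online tester expects)
$.output[?(@.Tags.use == for_rtmp_pipeline)]     (unquoted, as attempted in Step Functions)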

LogicApp: replace the "," in a csv table column with a "."

I have a flow where I want to edit a column in the csv table and replace the "," with a ".".
How do I do that? The replace function expression in Logic Apps does not return the column:
It asks me to take the complete body when I use the replace function.
Whereas the details column, which I want to edit, is available:
How should I replace the "," in the details column?
I did this then, but then I don't see the variable I initialized.
For instance, I've taken this as my sample .csv file, which I'm retrieving from my storage account.
Firstly, I used Parse CSV like you did, then initialised a string variable and used the Append to string variable connector, taking the Productname column. Lastly, I used the replace function expression to replace ',' with '.'.
NOTE: I have appended '|' after each Productname value so that the combined string can be split apart later.
Here is my Logic App workflow
The Compose connector expression:
split(replace(variables('Productname'),',','.'),'|')
OUTPUT:
Here is my workflow that you can refer to:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose": {
"inputs": "#split(replace(variables('Productname'),',','.'),'|')",
"runAfter": {
"For_each_2": [
"Succeeded"
]
},
"type": "Compose"
},
"For_each_2": {
"actions": {
"Append_to_string_variable": {
"inputs": {
"name": "Productname",
"value": "#{items('For_each_2')?['Productname']}|"
},
"runAfter": {},
"type": "AppendToStringVariable"
}
},
"foreach": "#body('Parse_CSV')",
"runAfter": {
"Initialize_variable": [
"Succeeded"
]
},
"type": "Foreach"
},
"Get_blob_content_(V2)": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files/#{encodeURIComponent(encodeURIComponent('JTJmY29udGFpbmVyMjQwOCUyZlByb2R1Y3RzLmNzdg=='))}/content"
},
"metadata": {
"JTJmY29udGFpbmVyMjQwOCUyZlByb2R1Y3RzLmNzdg==": "/container2408/Products.csv"
},
"runAfter": {},
"type": "ApiConnection"
},
"Initialize_variable": {
"inputs": {
"variables": [
{
"name": "Productname",
"type": "string"
}
]
},
"runAfter": {
"Parse_CSV": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Parse_CSV": {
"inputs": {
"body": {
"content": "#{base64(body('Get_blob_content_(V2)'))}",
"headers": "Productid,Productname"
},
"host": {
"connection": {
"name": "#parameters('$connections')['plumsail']['connectionId']"
}
},
"method": "post",
"path": "/flow/v1/Documents/jobs/ParseCsv"
},
"runAfter": {
"Get_blob_content_(V2)": [
"Succeeded"
]
},
"type": "ApiConnection"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"manual": {
"inputs": {
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/<subscription id>/resourceGroups/<Your resource group name>/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/<subscription id>/providers/Microsoft.Web/locations/northcentralus/managedApis/azureblob"
},
"plumsail": {
"connectionId": "/subscriptions/<subscription id >/resourceGroups/<Your resource group name>/providers/Microsoft.Web/connections/plumsail",
"connectionName": "plumsail",
"id": "/subscriptions/<subscription id>/providers/Microsoft.Web/locations/northcentralus/managedApis/plumsail"
}
}
}
}
}
I used the items function expression and did it directly.
@replace(item()?['details'],',','.')
This was a bit strange: it didn't work at first, but now it is working.

How to build a multi-dimensional JSON native query for Druid?

I have data with multiple dimensions stored in a Druid cluster, for example data about movies and the revenue they earned in each country where they were screened.
I'm trying to build a query whose result is a table of all the movies, the total revenue of each one, and the revenue per country.
I managed to do this in Turnilo, which generated the following Druid query:
[
[
{
"queryType": "timeseries",
"dataSource": "movies_source",
"intervals": "2021-11-18T00:01Z/2021-11-21T00:01Z",
"granularity": "all",
"aggregations": [
{
"name": "__VALUE__",
"type": "doubleSum",
"fieldName": "revenue"
}
]
},
{
"queryType": "topN",
"dataSource": "movies_source",
"intervals": "2021-11-18T00:01Z/2021-11-21T00:01Z",
"granularity": "all",
"dimension": {
"type": "default",
"dimension": "movie_id",
"outputName": "movie_id"
},
"aggregations": [
{
"name": "revenue",
"type": "doubleSum",
"fieldName": "revenue"
}
],
"metric": "revenue",
"threshold": 50
}
],
[
{
"queryType": "topN",
"dataSource": "movies_source",
"intervals": "2021-11-18T00:01Z/2021-11-21T00:01Z",
"granularity": "all",
"filter": {
"type": "selector",
"dimension": "movie_id",
"value": "some_movie_id"
},
"dimension": {
"type": "default",
"dimension": "country",
"outputName": "country"
},
"aggregations": [
{
"name": "revenue",
"type": "doubleSum",
"fieldName": "revenue"
}
],
"metric": "revenue",
"threshold": 5
}
]
]
But it doesn't work when I try to use it as the body of a Postman request; I get:
{
"error": "Unknown exception",
"errorMessage": "Unexpected token (START_ARRAY), expected VALUE_STRING: need JSON String that contains type id (for subtype of org.apache.druid.query.Query)\n at [Source: (org.eclipse.jetty.server.HttpInputOverHTTP); line: 2, column: 3]",
"errorClass": "com.fasterxml.jackson.databind.exc.MismatchedInputException",
"host": null
}
How should I build the corresponding query so that it works with Postman?
I am not familiar with Turnilo but have you tried using the Druid Console to write SQL and convert to Native request with the "Explain SQL query" option under the "Run/..." menu?
Your native queries seem to be doing a Top N instead of listing all movies, so I think the SQL might be something like:
SELECT movie_id, country_id, SUM(revenue) total_revenue
FROM movies_source
WHERE __time BETWEEN '2021-11-18 00:01:00' AND '2021-11-21 00:01:00'
GROUP BY movie_id, country_id
ORDER BY total_revenue DESC
LIMIT 50
I don't have your data source to test with, but I tested against the sample wikipedia data using a similar query structure:
SELECT namespace, cityName, sum(sum_added) total
FROM "wikipedia" r
WHERE cityName IS NOT NULL
AND __time BETWEEN '2015-09-12 00:00:00' AND '2015-09-15 00:00:00'
GROUP BY namespace, cityName
ORDER BY total DESC
limit 50
which results in the following Native query:
{
"queryType": "groupBy",
"dataSource": {
"type": "table",
"name": "wikipedia"
},
"intervals": {
"type": "intervals",
"intervals": [
"2015-09-12T00:00:00.000Z/2015-09-15T00:00:00.001Z"
]
},
"virtualColumns": [],
"filter": {
"type": "not",
"field": {
"type": "selector",
"dimension": "cityName",
"value": null,
"extractionFn": null
}
},
"granularity": {
"type": "all"
},
"dimensions": [
{
"type": "default",
"dimension": "namespace",
"outputName": "d0",
"outputType": "STRING"
},
{
"type": "default",
"dimension": "cityName",
"outputName": "d1",
"outputType": "STRING"
}
],
"aggregations": [
{
"type": "longSum",
"name": "a0",
"fieldName": "sum_added",
"expression": null
}
],
"postAggregations": [],
"having": null,
"limitSpec": {
"type": "default",
"columns": [
{
"dimension": "a0",
"direction": "descending",
"dimensionOrder": {
"type": "numeric"
}
}
],
"limit": 50
},
"context": {
"populateCache": false,
"sqlOuterLimit": 101,
"sqlQueryId": "cd5aabed-5e08-49b7-af63-fe82c125d3ee",
"useApproximateCountDistinct": false,
"useApproximateTopN": false,
"useCache": false
},
"descending": false
}
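As a side note, the error in the question is Jackson complaining that the request body starts with an array (Turnilo emits a list of query lists), whereas Druid's native query endpoint expects a single query object per request. If you do want to POST a native query like the one above directly instead of going through SQL, a minimal sketch (assuming the Router or Broker is reachable at http://localhost:8888, and with native-query.json as a hypothetical file holding just the single query object) would be:
# POST one native query object (not an array of queries) to Druid's native endpoint
$body = Get-Content -Raw 'native-query.json'   # hypothetical file containing the single groupBy query above
Invoke-RestMethod -Method Post -Uri 'http://localhost:8888/druid/v2?pretty' -ContentType 'application/json' -Body $body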

Google People API - Birthday - Date object is not returned in GET requests

In the Google People API (people.connections.list and other GET endpoints), the date object of the birthday field is not returned for some contacts. I have authenticated with the full contacts scope [https://www.googleapis.com/auth/contacts].
We also do not know the format of the "text" field in order to parse it, since it can contain any arbitrary string.
How can we parse a user's birthday? When will the date object not be returned?
Sample Request
https://people.googleapis.com/v1/people/me/connections?pageSize=100&requestSyncToken=true&personFields=birthdays
Sample response Birthdays
{
"birthdays": [
{
"metadata": {
"source": {
"id": "3ebd95668aeed9d7",
"type": "CONTACT"
},
"primary": true
},
"text": "2000-07-24"
}
],
"resourceName": "people/c4520933868599957975",
"etag": "<etag>"
},
{
"birthdays": [
{
"metadata": {
"source": {
"id": "5f5712ce0861c5b0",
"type": "CONTACT"
},
"primary": true
},
"text": "1880-03-11"
}
],
"resourceName": "people/c6869980432690169264",
"etag": "<etag>"
},
{
"birthdays": [
{
"date": {
"month": 1,
"year": 1990,
"day": 26
},
"metadata": {
"source": {
"id": "a16dde58e814a36",
"type": "CONTACT"
},
"primary": true
},
"text": "01/26/1990"
}
],
"resourceName": "people/c727012367875000886",
"etag": "<etag>"
},
{
"birthdays": [
{
"date": {
"month": 1,
"year": 1998,
"day": 1
},
"metadata": {
"source": {
"id": "f350f568dd11db1",
"type": "CONTACT"
},
"primary": true
},
"text": "Jan 1, 1998"
}
],
"resourceName": "people/c1095798948755479985",
"etag": "<etag>"
},
{
"birthdays": [
{
"metadata": {
"source": {
"id": "3652e00f0d28c9f4",
"type": "CONTACT"
},
"primary": true
},
"text": "random string accept"
}
],
"resourceName": "people/c3914437381388290548",
"etag": "<etag>"
}
]
The birthday field can hold a date, a text value, or both, and the two are not guaranteed to be the same.

What is the replacement for Contract.at in web3.js for loading a contract at an address?

Getting this error in the console:
Uncaught TypeError: TestContract.at is not a function
I am implementing a sample contract on a test server using this code I got from a course I'm doing on blockchain:
var TestContract = new web3.eth.Contract([
{
"constant": false,
"inputs": [
{
"name": "_fName",
"type": "string"
},
{
"name": "_age",
"type": "uint256"
}
],
"name": "setInstructor",
"outputs": [],
"payable": false,
"stateMutability": "nonpayable",
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "getInstructor",
"outputs": [
{
"name": "",
"type": "string"
},
{
"name": "",
"type": "uint256"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
}
])
var Test = TestContract.at('0xd1d0ba6a5af6bb66490d04b99f4955eb9c9fef36');
You can just pass the address as the second parameter:
var TestContract = new web3.eth.Contract([
{
"constant": false,
"inputs": [
{
"name": "_fName",
"type": "string"
},
{
"name": "_age",
"type": "uint256"
}
],
"name": "setInstructor",
"outputs": [],
"payable": false,
"stateMutability": "nonpayable",
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "getInstructor",
"outputs": [
{
"name": "",
"type": "string"
},
{
"name": "",
"type": "uint256"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
}
],'0xd1d0ba6a5af6bb66490d04b99f4955eb9c9fef36')
You can read more about the available parameters in the web3.js documentation.
Or you can set the address afterwards via:
TestContract.options.address = '0xd1d0ba6a5af6bb66490d04b99f4955eb9c9fef36'