Terraform GCP: unsupported escape sequence in regex

I am trying to set up custom metrics in GCP using terraform code.
I have a problem with the label extractor part to get 4xx and 5xx response codes.
Based on the console response and Google's regex builder, I managed to create this regex:
\\s([4-5][0-9][0-9])\\s
When I run code with this regex I get this response:
googleapi: Error 400: Failed to parse extractor expression:
unsupported escape sequence in a string literal at line 1, column 36,
token '"\s([1-5][0-9][0-9])\s"'
When I send the regex without the whitespace tokens (\\s), the code works flawlessly.
I have tried different variations of backslashes before the \s parts, but none seemed to work.
extracted_label = "REGEXP_EXTRACT(jsonPayload.message, \"\\s([1-5][0-9][0-9])\\s\")"
I would like to be able to create a metric, but I cannot bypass the unsupported escape sequence problem.
I would be grateful for any help.

I managed to find the answer to this question myself.
The correct way is to use four backslashes: \\\\
The fixed line should look like this:
extracted_label = "REGEXP_EXTRACT(jsonPayload.message, \"\\\\s([4-5][0-9][0-9])\\\\s\")"
Also, if anyone is looking for sample Terraform GCP code to pull the HTTP response code from a readiness health check, here is one:
extracted_label = "REGEXP_EXTRACT(jsonPayload.message, \"\\\\w+\\\\/\\\\d\\\\.\\\\d\\\"\\\\s([4-5][0-9][0-9])\")"

Related

AWS CLI syntax error: Error parsing parameter '--image': Expected: '=', received: ''' for input:

I am trying to follow a walkthrough that uses AWS for live-feed facial recognition (see link below). I have done the 7 previous steps without issue and am currently on step 8, where I have uploaded a picture to an S3 bucket and have to run this command in cmd:
aws rekognition index-faces --image '{"S3Object":{"Bucket":"<S3BUCKET>","Name":"<MYFACE_KEY>.jpeg"}}' --collection-id "rekVideoBlog" --detection-attributes "ALL" --external-image-id "<YOURNAME>" --region us-west-2
I have done this, with my own information input in, but I get an error that states:
Error parsing parameter '--image': Expected: '=', received: ''' for input:
'{S3Object:{Bucket:,Name:.jpeg}}'
I have looked up what could be the issue; many solutions involve Windows systems having a different style of quoting than the walkthrough (i.e. switching single quotes to doubles and doubles to escaped quotes). No matter what I try, I cannot figure out the issue. Does anyone know the correct syntax for this on a Windows system, or am I doing something else wrong?
Walkthrough: https://aws.amazon.com/blogs/machine-learning/easily-perform-facial-analysis-on-live-feeds-by-creating-a-serverless-video-analytics-environment-with-amazon-rekognition-video-and-amazon-kinesis-video-streams/
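For what it's worth, the fix usually suggested for cmd.exe is exactly the swap the question describes: wrap the JSON value in double quotes and escape the inner double quotes with backslashes, since cmd does not treat single quotes as grouping characters. A small Python sketch that builds such an argument from a dict (the bucket and object names here are hypothetical stand-ins for the <S3BUCKET> and <MYFACE_KEY> placeholders):

```python
import json

# Hypothetical values standing in for <S3BUCKET> and <MYFACE_KEY>
image = {"S3Object": {"Bucket": "my-bucket", "Name": "myface.jpeg"}}

# cmd.exe strips the single quotes, so the JSON must use escaped double quotes:
# --image "{\"S3Object\":{\"Bucket\":\"my-bucket\",\"Name\":\"myface.jpeg\"}}"
cmd_arg = '"' + json.dumps(image, separators=(",", ":")).replace('"', '\\"') + '"'
print(cmd_arg)
```

An alternative that sidesteps shell quoting entirely is putting the JSON in a file and passing --image file://image.json.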

Google Cloud Logging shows each log line wrapped in quotes

A low-impact question, more out of curiosity than anything else. I am logging lines from a Java container on Kubernetes, which emits a JSON format that should be compatible with Google Cloud. The JSON format is as follows.
{"message":"s.s.DefaultSandboxService - Used final flags Flags(true,true,true,false,false,false,None) ","timestamp":{"seconds":1630049408,"nanos":159000000},"severity":"INFO","thread":"application-akka.actor.default-dispatcher-11639"}
In Logs Explorer, every log line in the output is surrounded with " quotes. Why is that? I don't see similar behavior in other containers.
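The question was left unanswered here, but one common cause of this symptom (an assumption, not a confirmed diagnosis) is that the line gets JSON-encoded twice: a logging layer serializes the already-serialized string again, and the extra encoding shows up as surrounding quotes. A quick Python illustration of the effect:

```python
import json

entry = {"message": "Used final flags", "severity": "INFO"}

once = json.dumps(entry)   # a proper structured log line
twice = json.dumps(once)   # the same line accidentally serialized again

print(once)   # {"message": "Used final flags", "severity": "INFO"}
print(twice)  # the whole payload wrapped in quotes, inner quotes escaped
```

If that is what is happening, checking how the Java logger's JSON encoder and the container runtime's log driver interact would be the place to start.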

Error in metric filter pattern in CloudWatch

I am trying to create a custom CloudWatch metric from the log groups.
I am trying to create a metric filter pattern for the status of the email; I just need to monitor the response in the email (success/failure).
My cloudwatch logs look like below
Email status : [EmailStatusResponse{farmId=3846, emailIds='xxx', response='success'}
So I just need to monitor two cases:
response='success'
response='failure'
Please find below the snippet of my configuration.
Can anyone please help me with the error in the filter pattern? Kindly help!
Wrap this in double quotes.
Metric filter terms that include characters other than alphanumeric or underscore must be placed inside double quotes ("").
For you it would be "response='success'"
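In other words, because the term contains = and single quotes, the whole term must be wrapped in double quotes. A small Python sketch building both filter patterns and checking them against the sample log line from the question:

```python
log_line = ("Email status : [EmailStatusResponse{farmId=3846, "
            "emailIds='xxx', response='success'}")

# Terms with non-alphanumeric characters must be double-quoted in the filter
success_pattern = "\"response='success'\""
failure_pattern = "\"response='failure'\""

# A quoted term matches log lines that contain it verbatim
print(success_pattern.strip('"') in log_line)  # True
print(failure_pattern.strip('"') in log_line)  # False
```

Two metric filters, one per pattern, would then let you graph and alarm on success and failure counts separately.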

Docker AWS ECR error parsing HTTP 404 response body: invalid character 'p' after top-level value: "404 page not found\n"

Had an issue with not being able to push or pull from an AWS ECR registry with the following cryptic error:
error parsing HTTP 404 response body: invalid character 'p' after top-level value: "404 page not found\n"
Several hours of googling indicated it was a protocol issue. It turns out the image name:
xxxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/snowshu__test
was the issue: AWS ECR errors when the image name contains double underscores.
This contradicts ECR naming documentation.
You cannot have two underscores next to each other in a repository name.
As per the Docker Registry API:
A component of a repository name must be at least one lowercase, alpha-numeric characters, optionally separated by periods, dashes or underscores. More strictly, it must match the regular expression [a-z0-9]+(?:[._-][a-z0-9]+)*.
Renaming the image to
xxxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/snowshu_test
solved the issue.
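The Docker Registry API rule quoted above can be checked locally before pushing. A small Python sketch that validates repository-name components against that regular expression (the pattern is taken verbatim from the spec text quoted above):

```python
import re

# Component rule from the Docker Registry API specification
COMPONENT = re.compile(r"[a-z0-9]+(?:[._-][a-z0-9]+)*")

def valid_component(name: str) -> bool:
    """Return True if the repository-name component is legal."""
    return COMPONENT.fullmatch(name) is not None

print(valid_component("snowshu__test"))  # False: consecutive underscores
print(valid_component("snowshu_test"))   # True
```

The pattern makes the rule visible: separators (., -, _) may only appear singly, sandwiched between alphanumeric runs, which is why the double underscore fails.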

AWS Data Pipeline Escaping Comma in emr activity step section

I am creating an AWS Data Pipeline using the Architect provided in the AWS web console.
Everything is set up OK; my EMR cluster is configured and started successfully.
But when I try to submit an EMR activity, I run into the following problem:
in the step section of the EMR activity I need to provide a --packages argument with 3 packages.
But as far as I understand, the step field in EmrActivity is a comma-separated value, and commas (,) are replaced with spaces in the resulting step argument.
On the other hand, the --packages argument is itself comma-separated when there are multiple packages.
So when I try to pass it as an argument, the commas get replaced with spaces, which makes the step invalid.
This is the statement I need, exactly as-is, in the resulting EMR step:
--packages com.amazonaws:aws-java-sdk-s3:1.11.228,org.apache.hadoop:hadoop-aws:2.6.0,org.postgresql:postgresql:42.1.4
Any solution to escape the comma?
So far I have tried the \\\\ way mentioned in http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-object-emractivity.html, but it did not work.
When you use \\\\, it escapes the backslashes and the comma still gets replaced.
You can try using three backslashes instead, like \\\, . The same has worked for me.
I hope that works.
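Putting the answer above together: with three backslashes before each comma inside the --packages value, Data Pipeline's comma-splitting leaves the package list in one argument. A Python sketch that builds such a step string (the escaping behaviour is as the answer above reports; I have not verified it against the service myself):

```python
packages = [
    "com.amazonaws:aws-java-sdk-s3:1.11.228",
    "org.apache.hadoop:hadoop-aws:2.6.0",
    "org.postgresql:postgresql:42.1.4",
]

# Join with '\\\,' (three backslashes + comma) so that Data Pipeline's
# comma-splitting does not break the --packages value apart
step_arg = "--packages " + "\\\\\\,".join(packages)
print(step_arg)
```

The resulting string goes into the EmrActivity step field; after Data Pipeline processes the escapes, the EMR step should receive the plain comma-separated list shown earlier.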