Docker AWS ECR error parsing HTTP 404 response body: invalid character 'p' after top-level value: "404 page not found\n"

Had an issue with not being able to push or pull from an AWS ECR registry, failing with the following cryptic error:
error parsing HTTP 404 response body: invalid character 'p' after top-level value: "404 page not found\n"
Several hours of googling suggested it was a protocol issue. It turned out the image name:
xxxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/snowshu__test
was the problem: AWS ECR errors when the image name contains a double underscore.
This contradicts the ECR naming documentation.

You cannot have two underscores next to each other in a repository name.
As per the Docker Registry API specification:
"A component of a repository name must be at least one lowercase, alpha-numeric characters, optionally separated by periods, dashes or underscores. More strictly, it must match the regular expression [a-z0-9]+(?:[._-][a-z0-9]+)*."
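A quick way to sanity-check a candidate name against that expression before creating the repository is a grep one-liner; this is a minimal sketch (grep -E is assumed, and the pattern uses a plain group since POSIX ERE has no (?: ) syntax):

# Prints the name if it is a valid repository name component; otherwise flags it.
echo "snowshu__test" | grep -Ex '[a-z0-9]+([._-][a-z0-9]+)*' || echo "invalid repository name component"

Note the expression applies per path component, so check each segment between slashes separately.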

Renaming the image to
xxxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/snowshu_test
solved the issue.
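For reference, the rename itself is just a retag and push; a minimal sketch, assuming the image exists locally under the old name, a snowshu_test repository has already been created in ECR, and the :latest tag is illustrative:

# Retag the local image under the underscore-safe repository name...
docker tag xxxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/snowshu__test:latest xxxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/snowshu_test:latest
# ...and push it to the new repository.
docker push xxxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/snowshu_test:latest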

Related

AWS CloudFormation keys not accepting special characters

I have noticed that AWS CloudFormation does not like special characters.
When I update a key:value pair in our pipeline.yml file with special characters,
e.g. PAR_FTP_PASS: ^XoN*H89Ie!rhpl!wan=Jcyo6mo, I see the following error:
parameters[5] ParameterKey, ParameterValue or UsePreviousValue expected
I am able to update the value through the AWS CloudFormation UI.
It seems like the issue is to do with AWS CloudFormation parsing the yml file.
Is there a workaround for this issue?
AWS Tags have some restrictions on what they can contain, see here:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/Using_Tags.html#tag-restrictions
A key note which can catch people out is: "Although EC2 allows for any character in its tags, other services are more restrictive. The allowed characters across services are: letters, numbers, and spaces representable in UTF-8, and the following characters: + - = . _ : / #."
So I'd check whether the service you are adding this to can support that string.
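Separately, if the value ends up being passed on a command line, the shell can mangle characters like ^, !, and * before CloudFormation ever sees them. Whether that applies here depends on how the pipeline invokes CloudFormation, but as a hedged sketch (the stack name is hypothetical), single-quoting the value keeps the shell from interpreting it:

# Single quotes prevent the shell from expanding ^, !, * and = in the password.
aws cloudformation update-stack \
  --stack-name my-pipeline-stack \
  --use-previous-template \
  --parameters ParameterKey=PAR_FTP_PASS,ParameterValue='^XoN*H89Ie!rhpl!wan=Jcyo6mo'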

Use SSM Document to check a particular application installation

I'm trying to use the AWS SSM Document RunPowerShellScript action to check if a particular application is installed on Windows servers. The PowerShell script is very simple, but document validation keeps failing.
The PowerShell script does contain a registry path, which contains colons and backslashes; I suspect this may contribute to the problem. I tried changing all the backslashes to forward slashes, with no luck.
schemaVersion: "2.2"
description: "Command Document to check if This Software is installed"
mainSteps:
  - action: "aws:runPowerShellScript"
    name: "CheckThisSoftware"
    inputs:
      runCommand:
        - "$ResultMsg = (Get-ItemProperty HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*).DisplayName -Contains 'Software Name Here'",
        - "Write-Output $ResultMsg"
I keep getting InvalidDocumentContent: null when I try to submit the document.
I fixed this by escaping my special characters (\).
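For illustration, the working runCommand block looks something like this, with every backslash in the registry path doubled (the stray trailing comma after the first list item is also dropped, since block-style YAML lists do not use commas):

      runCommand:
        - "$ResultMsg = (Get-ItemProperty HKLM:\\SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Uninstall\\*).DisplayName -Contains 'Software Name Here'"
        - "Write-Output $ResultMsg"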

Terraform GCP Unsupported escape sequence in regex

I am trying to set up custom metrics in GCP using terraform code.
I have a problem with the label extractor part to get 4xx and 5xx response codes.
Based on the console response and the Google builder, I managed to create this regex:
\\s([4-5][0-9][0-9])\\s\
When I run code with this regex I get this response:
googleapi: Error 400: Failed to parse extractor expression:
unsupported escape sequence in a string literal at line 1, column 36,
token '"\s([1-5][0-9][0-9])\s"'
When I send the regex without the whitespace parts (\\s), the code works flawlessly.
I have tried different variations of \ before the "\ parts, but none seemed to work.
extracted_label = "REGEXP_EXTRACT(jsonPayload.message, \"\\s([1-5][0-9][0-9])\\s\")"
I would like to be able to create a metric, but I cannot bypass the unsupported escape sequence problem.
I would be grateful for any help.
I managed to find the answer to this question myself.
The correct way is to use four backslashes: \\\\
Terraform unescapes the string literal once (so \\\\ becomes \\), and the extractor expression parser unescapes it again (so \\ becomes \), leaving the single \s the regex engine needs.
The fixed code line should look like this:
extracted_label = "REGEXP_EXTRACT(jsonPayload.message, \"\\\\s([4-5][0-9][0-9])\\\\s\")"
Also, if someone is looking for sample Terraform GCP code to pull the HTTP response code from a readiness health check, here is one:
extracted_label = "REGEXP_EXTRACT(jsonPayload.message, \"\\\\w+\\\\/\\\\d\\\\.\\\\d\\\"\\\\s([4-5][0-9][0-9])\")"

How to deploy a cloud function in a subfolder of my bucket?

I'm trying to deploy a cloud function to a subfolder of my bucket like this:
gcloud beta functions deploy $FUNCTION_NAME_STAGING --stage-bucket bucket/staging --trigger-http --entry-point handle
but I get this error:
ERROR: (gcloud.beta.functions.deploy) argument --stage-bucket: Invalid value 'bucket/staging': Bucket must only contain lower case Latin letters, digits and characters . _ -. It must start and end with a letter or digit and be from 3 to 232 characters long. You may optionally prepend the bucket name with gs:// and append / at the end.
I cannot find a way to do it in the documentation. Is it possible?
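Reading the error message literally, --stage-bucket only accepts a bucket name (optionally with a gs:// prefix and a trailing /), not a path inside the bucket. So a sketch of the form the flag will accept, staging at the bucket root (bucket name illustrative):

# A "bucket/subfolder" value fails the flag's validation; pass the bare bucket instead.
gcloud beta functions deploy $FUNCTION_NAME_STAGING --stage-bucket gs://bucket --trigger-http --entry-point handle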

AWS CloudSearch request using CLI returns Invalid Javascript Object error

I'm trying to query my AWS Cloudsearch (2013 API) domain using the AWS CLI on Ubuntu. I haven't been able to get it to work successfully when the search is restricted to a specific field. The following query:
aws --profile myprofile cloudsearchdomain search \
  --endpoint-url "https://search-mydomain-abc123xyz.eu-west-1.cloudsearch.amazonaws.com" \
  --query-options {"fields":["my_field"]} \
  --query-parser "simple" \
  --return "my_field" \
  --search-query "foo bar"
...returns the following error:
An error occurred (SearchException) when calling the Search operation: q.options contains invalid javascript object
If I remove the --query-options parameter from the above query, then it works. From the AWS CLI docs regarding the fields option of the --query-options parameter:
"An array of the fields to search when no fields are specified in a search... Valid for: simple, structured, lucene, and dismax"
aws cli version:
aws-cli/1.11.150 Python/2.7.12 Linux/4.10.0-28-generic botocore/1.7.8
I think the documentation is a bit misleading: the shell does not pass embedded double quotes through, so the value that reaches the service is no longer valid JSON. You would need to replace them with single quotes, as in
--query-options "{'fields':['my_field']}"
or you can escape the double quotes:
--query-options "{\"fields\":[\"my_field\"]}"