AWS CloudFormation keys not accepting special characters

I have noticed that AWS CloudFormation does not accept certain special characters.
When I update a key:value pair in our pipeline.yml file with special characters,
e.g. PAR_FTP_PASS: ^XoN*H89Ie!rhpl!wan=Jcyo6mo, I see the following error:
parameters[5] ParameterKey, ParameterValue or UsePreviousValue expected
I am able to update the value through the AWS CloudFormation UI.
It seems like the issue has to do with AWS CloudFormation parsing the yml file.
Is there a workaround for this issue?

AWS tags have some restrictions on what they can contain; see here:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/Using_Tags.html#tag-restrictions
A key note which can catch people out is: "Although EC2 allows for any character in its tags, other services are more restrictive. The allowed characters across services are: letters, numbers, and spaces representable in UTF-8, and the following characters: + - = . _ : / #."
So I'd check whether the service you are adding this to supports that string.
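Separately, if your pipeline ends up passing these values through the AWS CLI's shorthand --parameters syntax, the = and , characters inside the password can break the shorthand parser, which would match the parameters[5] error above. One possible workaround is to pass the parameters as JSON instead. A minimal sketch, assuming a stack named my-stack and an otherwise unchanged template:

aws cloudformation update-stack \
  --stack-name my-stack \
  --use-previous-template \
  --parameters '[{"ParameterKey":"PAR_FTP_PASS","ParameterValue":"^XoN*H89Ie!rhpl!wan=Jcyo6mo"}]'

The single quotes keep the shell from touching the special characters, and the JSON form avoids the shorthand parser entirely.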

Related

How to add a DKIM record to AWS Lightsail

Most DKIM records are much longer than 255 characters. In the AWS Lightsail DNS tool, when you go to add the TXT record, it trips the error: "Each line must be between 0 and 255 characters and contain only printable ASCII characters". There are various posts on Stack Overflow indicating that, to solve this in AWS Route 53, you must break the value into multiple strings. However, in AWS Route 53 you add this record manually (i.e., IN TXT "v=blahlblablh" "blah blah"), whereas AWS Lightsail does not let you enter it in this format. You instead put the value in directly: v=blahblahblah...
Therefore, you do not add quotes, and if you do try to wrap the value in quotes it still causes the same error. So how can you add these types of records to AWS Lightsail?
Check this:
https://serverfault.com/questions/763815/route-53-doesnt-allow-adding-dkim-keys-because-length-is-too-long
It works: just add two sets of double quotes, one set to split the value at the length limit and a wrapping set around the whole thing.
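For example, the value pasted into the Lightsail field would look something like this (the key halves are hypothetical placeholders; the split just has to land before the 255-character limit):

"v=DKIM1; k=rsa; p=FIRST_HALF_OF_KEY" "SECOND_HALF_OF_KEY"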

Add multiple suffixes to AWS Lambda?

I am configuring a Lambda function that should be triggered by .png and .pdf files from my bucket.
How can I provide the Lambda config with multiple suffixes?
Please advise how to do this.
As can be seen from the tooltip description, you can't set multiple suffixes in the AWS console:
Enter a single optional suffix to limit the notifications to objects with keys that end with matching characters.
What you can do is create multiple triggers and define each suffix in a separate S3 trigger, as in the sketch below.
The documentation has some sample XML configuration to support multiple, non-overlapping suffix/prefix options, but I don't think it is possible to set them in the web console.
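A sketch of the multiple-trigger approach via the CLI (the bucket name and function ARN are hypothetical; note the two suffixes do not overlap, which S3 requires):

aws s3api put-bucket-notification-configuration \
  --bucket my-bucket \
  --notification-configuration '{
    "LambdaFunctionConfigurations": [
      {
        "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:my-function",
        "Events": ["s3:ObjectCreated:*"],
        "Filter": {"Key": {"FilterRules": [{"Name": "suffix", "Value": ".png"}]}}
      },
      {
        "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:my-function",
        "Events": ["s3:ObjectCreated:*"],
        "Filter": {"Key": {"FilterRules": [{"Name": "suffix", "Value": ".pdf"}]}}
      }
    ]
  }'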

Creating Kinesis Analytics applications using aws cli

I want to create a Kinesis Analytics application using the AWS CLI. I use this command to create the application:
aws kinesisanalytics create-application --application-name smartfactorytest1 --application-code "CREATE OR REPLACE STREAM DESTINATION_SQL_STREAM ( "device_serial" VARCHAR(16), "uploadRate" INTEGER, "downloadRate" INTEGER);
CREATE OR REPLACE PUMP "STREAM_PUMP"
AS INSERT INTO DESTINATION_SQL_STREAM
SELECT STREAM "device_serial", "uploadRate", "downloadRate"
FROM SOURCE_SQL_STREAM_001
-- LIKE compares a string to a string pattern (_ matches all char, % matches substring)
-- SIMILAR TO compares string to a regex, may use ESCAPE
WHERE "uploadRate" >20000" --inputs NamePrefix="SOURCE_SQL_STREAM",KinesisStreamsInput={ResourceARN="sourcearn",RoleARN="rolearn"}
But I get this error:
invalid type for parameter Inputs[0].KinesisStreamsInput, value: ResourceARN=string, type: <class 'str'>, valid types: <class 'dict'>
Can anyone tell me what I am doing wrong? Any help would be appreciated.
I believe the issue is either that you need to take the quotes out in the KinesisStreamsInput section, or you need to add quotes and escape them. The documentation is unclear on which is the correct option.
According to the AWS Kinesis Analytics CLI Reference, https://docs.aws.amazon.com/cli/latest/reference/kinesisanalytics/create-application.html, the syntax for --inputs with KinesisStreamsInput should look like the example provided for KinesisStreamsOutput:
Name=string,KinesisStreamsOutput={ResourceARN=string,RoleARN=string},...
This would mean removing the quotes around your sourcearn and rolearn. However, the documentation isn't clear that this refers to the CLI syntax in all cases.
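For instance, on Linux the whole --inputs value can be wrapped in single quotes, with no quotes around the ARNs themselves (a sketch, keeping the placeholder ARNs from the question):

--inputs 'NamePrefix=SOURCE_SQL_STREAM,KinesisStreamsInput={ResourceARN=sourcearn,RoleARN=rolearn}'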
If that doesn't work, this AWS CLI usage guide page, https://docs.aws.amazon.com/cli/latest/userguide/cli-usage-parameters-quoting-strings.html, describes adding quotes and escaping the relevant ones, depending on your OS:
"Linux or macOS
Use single quotation marks (' ') to enclose the JSON data structure, as in the following example. You don't have to do anything special with the double quotation marks embedded in the JSON string.
aws ec2 run-instances --image-id ami-12345678 --block-device-mappings '[{"DeviceName":"/dev/sdb","Ebs":{"VolumeSize":20,"DeleteOnTermination":false,"VolumeType":"standard"}}]'
PowerShell
PowerShell requires single quotation marks (' ') to enclose the JSON data structure. Also, because double quotation marks have a special meaning to PowerShell, you must use a backslash (\) to escape each double quotation mark (") within the JSON structure, as in the following example.
PS C:\> aws ec2 run-instances --image-id ami-12345678 --block-device-mappings '[{\"DeviceName\":\"/dev/sdb\",\"Ebs\":{\"VolumeSize\":20,\"DeleteOnTermination\":false,\"VolumeType\":\"standard\"}}]'
Windows Command Prompt
The Windows command prompt requires double quotation marks (" ") to enclose the JSON data structure. Also, to prevent the command processor from misinterpreting the double quotation marks embedded in the JSON, you must also escape (precede with a backslash [ \ ] character) each double quotation mark (") within the JSON data structure itself, as in the following example.
C:\> aws ec2 run-instances --image-id ami-12345678 --block-device-mappings "[{\"DeviceName\":\"/dev/sdb\",\"Ebs\":{\"VolumeSize\":20,\"DeleteOnTermination\":false,\"VolumeType\":\"standard\"}}]"
Only the outermost double quotation marks are not escaped."
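Applied to the command in the question on Linux, that would mean wrapping the SQL in single quotes so the inner double quotes pass through unchanged. A sketch, keeping the placeholder ARNs from the question and flattening the SQL to one line (so the -- comments are omitted):

aws kinesisanalytics create-application \
  --application-name smartfactorytest1 \
  --application-code 'CREATE OR REPLACE STREAM DESTINATION_SQL_STREAM ( "device_serial" VARCHAR(16), "uploadRate" INTEGER, "downloadRate" INTEGER); CREATE OR REPLACE PUMP "STREAM_PUMP" AS INSERT INTO DESTINATION_SQL_STREAM SELECT STREAM "device_serial", "uploadRate", "downloadRate" FROM SOURCE_SQL_STREAM_001 WHERE "uploadRate" > 20000' \
  --inputs 'NamePrefix=SOURCE_SQL_STREAM,KinesisStreamsInput={ResourceARN=sourcearn,RoleARN=rolearn}'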
This link also references needing to escape quotes on Windows, and is using the kinesisanalytics command: https://github.com/aws/aws-cli/issues/3103
"Rishi74744 commented on Feb 6, 2018
I got it to work as -
aws kinesisanalytics add-application-reference-data-source --endpoint https://kinesisanalytics.us-east-1.amazonaws.com --region us-east-1 --application-name alerts --reference-data-source "{\"TableName\":\"DeviceData\",\"S3ReferenceDataSource\":{\"BucketARN\":\"arn:aws:s3:::bucket-name\",\"FileKey\":\"device.csv\",\"ReferenceRoleARN\":\"arn:aws:iam::account-id:role/role-name\"},\"ReferenceSchema\":{\"RecordFormat\":{\"RecordFormatType\":\"CSV\",\"MappingParameters\":{\"CSVMappingParameters\":{\"RecordRowDelimiter\":\"\n\",\"RecordColumnDelimiter\":\", \"}}},\"RecordEncoding\":\"UTF-8\",\"RecordColumns\":[{\"Name\":\"key1\",\"SqlType\":\"VARCHAR(64)\"},{\"Name\":\"key2\",\"SqlType\":\"VARCHAR(64)\"}]}}" --current-application-version-id 2
But this should be mentioned in the documentation."
One note: it may be preferable to use JSON files as inputs and use this syntax instead: --cli-input-json file://input.json. This is referenced in the AWS Kinesis CLI Command Reference (first link, under 1.) and also mentioned in the GitHub link above. It's also the method used by the majority of the AWS Kinesis documentation. For example, JSON files used for different purposes in Kinesis Analytics:
https://docs.aws.amazon.com/kinesisanalytics/latest/dev/how-it-works-input.html
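As a rough sketch, an input.json for the command in the question might look like the following (the ARNs are the placeholders from the question, the application code is elided, and the InputSchema is illustrative; it must match the actual records on your stream):

{
  "ApplicationName": "smartfactorytest1",
  "ApplicationCode": "CREATE OR REPLACE STREAM ...",
  "Inputs": [
    {
      "NamePrefix": "SOURCE_SQL_STREAM",
      "KinesisStreamsInput": {
        "ResourceARN": "sourcearn",
        "RoleARN": "rolearn"
      },
      "InputSchema": {
        "RecordFormat": {
          "RecordFormatType": "JSON",
          "MappingParameters": {"JSONMappingParameters": {"RecordRowPath": "$"}}
        },
        "RecordEncoding": "UTF-8",
        "RecordColumns": [
          {"Name": "device_serial", "SqlType": "VARCHAR(16)", "Mapping": "$.device_serial"},
          {"Name": "uploadRate", "SqlType": "INTEGER", "Mapping": "$.uploadRate"},
          {"Name": "downloadRate", "SqlType": "INTEGER", "Mapping": "$.downloadRate"}
        ]
      }
    }
  ]
}

It would then be invoked as: aws kinesisanalytics create-application --cli-input-json file://input.json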
Please let me know what works, and I will work with my AWS rep to improve the documentation.

Escaping Terraform String Properly

How can I properly escape a Terraform string that contains double curly braces so that it is not interpolated? I'm reading a JSON file using templating and it keeps failing on this line:
"customInventory": "{{ customInventory }}"
I want to keep the double braces. Nothing has worked so far, and this is preventing the correct passing of this value to an AWS SSM document. The Terraform documentation doesn't provide much insight beyond escaping quotes and dollar signs.
I've tried Unicode values, double braces, backslashes and other permutations without any success.
This syntax is AWS SSM document parameter syntax. The error was actually not from Terraform but from AWS, which reported invalid input when attempting to create the document. Changing the value to Enabled instead of {{ customInventory }} resolved the issue and allowed me to publish the doc.
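For reference, Terraform's template syntax only treats ${ ... } and %{ ... } sequences as special (escaped by doubling the sigil, as $${ and %%{ ); double curly braces pass through untouched, which is consistent with the error coming from SSM rather than Terraform. A minimal sketch, assuming a hypothetical template file ssm-doc.json.tpl that contains the line "customInventory": "{{ customInventory }}":

locals {
  # Terraform leaves {{ ... }} alone; only ${ and %{ sequences are
  # special in templates, escaped as $${ and %%{ respectively
  rendered = templatefile("${path.module}/ssm-doc.json.tpl", {})
}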

How to deploy a cloud function in a subfolder of my bucket?

I'm trying to deploy a Cloud Function to a subfolder of my bucket like this:
gcloud beta functions deploy $FUNCTION_NAME_STAGING --stage-bucket bucket/staging --trigger-http --entry-point handle
but I get this error
ERROR: (gcloud.beta.functions.deploy) argument --stage-bucket: Invalid value 'bucket/staging': Bucket must only contain lower case Latin letters, digits and characters . _ -. It must start and end with a letter or digit and be from 3 to 232 characters long. You may optionally prepend the bucket name with gs:// and append / at the end.
I cannot find a way to do this in the documentation. Is it possible?