API Gateway stage variable always null

I am trying to use stage variables, but I always get this error:
{
    "logref": "some_uid",
    "message": "Invalid stage variable value: null. Please use values with alphanumeric characters and the symbols ' ', '-', '.', '_', ':', '/', '?', '&', '=', and ','."
}
My goal is to call SNS from API Gateway without requiring the caller to specify the TopicArn and the Message in the query string.
So in the Integration Request I am mapping the query string TopicArn to stageVariables.TopicArn (I have also tried '$stageVariables.TopicArn').
And then in the Stage variables section of the AWS console I enter the name TopicArn and the value arn:aws:sns:my_region:my_account_id:test-topic.
After deploying my API, I test it from the AWS console and get the same error shown above.
What am I doing wrong? Is this achievable?
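For reference, the two mapping mechanisms use different syntax for stage variables, and mixing them up is a common source of null values. A sketch of both forms, assuming a stage variable named TopicArn is defined on the stage actually being tested (this mirrors the question's setup, it is not a verified fix):

    Integration Request -> URL Query String Parameters:
        Name:        TopicArn
        Mapped from: stageVariables.TopicArn          # no leading $

    Body mapping template (application/json):
        { "TopicArn": "$stageVariables.TopicArn" }    # leading $ required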

Related

An error occurred (InvalidRequestException) when calling the ListSteps operation: Marker '1' is not valid

I am trying to use the list_steps API in boto3. I want to use the pagination token 'Marker', but I could not find any example value, so I assigned '1' to 'Marker' (the type should be 'String').
But I got the exception Marker '1' is not valid. What value should 'Marker' have?
boto3 list_steps : https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/emr.html#EMR.Client.list_steps
aws API, listSteps: https://docs.aws.amazon.com/emr/latest/APIReference/API_ListSteps.html#API_ListSteps_RequestSyntax
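The API reference linked above describes Marker as a pagination token returned by a previous ListSteps call, not a page number you choose yourself, so the first request should omit it entirely. A minimal boto3 sketch (the cluster id is a placeholder):

    import boto3

    emr = boto3.client("emr")

    marker = None
    while True:
        kwargs = {"ClusterId": "j-XXXXXXXXXXXX"}  # placeholder cluster id
        if marker:
            # Pass the token back exactly as the previous response returned it.
            kwargs["Marker"] = marker
        resp = emr.list_steps(**kwargs)
        for step in resp["Steps"]:
            print(step["Id"], step["Status"]["State"])
        marker = resp.get("Marker")  # absent on the last page
        if not marker:
            break

boto3 can also hide the token handling entirely via client.get_paginator("list_steps").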

dynamodb PartiQL SELECT query returns ValidationException: Unexpected from source

I am using Amplify to set up a DynamoDB table with a corresponding Lambda, using the Amplify blueprint for DynamoDB.
Accessing DynamoDB the "classic" way with KeyConditionExpression etc. works just fine, but today I wanted to try PartiQL with executeStatement instead, and I just cannot get it to work.
I have added the "dynamodb:PartiQLSelect" permission to the CloudFormation template where all the other DynamoDB permissions are, so it looks like:
"Action": [
"dynamodb:DescribeTable",
"dynamodb:GetItem",
"dynamodb:Query",
"dynamodb:Scan",
"dynamodb:PutItem",
"dynamodb:UpdateItem",
"dynamodb:DeleteItem",
"dynamodb:PartiQLSelect"
],
and I do not get any permission error, so I hope that part is OK; however, the same error is returned even without that line added.
The error that is always returned is:
ValidationException: Unexpected from source
and nothing I have tried so far makes any difference.
My code is quite basic so far:
const dynamodb2 = new AWS.DynamoDB();

// Build the environment-specific table name, e.g. "habits_sensors-playground".
let tableName = "habits_sensors";
if (process.env.ENV && process.env.ENV !== "NONE") {
    tableName = tableName + '-' + process.env.ENV;
}

app.get(path, function(req, res) {
    let params = {
        // The interpolated table name is unquoted in the statement.
        Statement: `select * from ${tableName}`
    };
    dynamodb2.executeStatement(params, (err, data) => {
        if (err) {
            res.statusCode = 500;
            res.json({error: `Could not get users from : ${tableName} =>` + err});
        } else {
            res.json(data.Items);
        }
    });
});
The complete error string returned from the lambda is:
{
    "error": "Could not get users from : habits_sensors-playground =>ValidationException: Unexpected from source"
}
I do have the table habits_sensors-playground in my AWS account, and I can access it the classic way without problems. That is why the "Unexpected from source" is confusing. I read it as saying that the tableName (in the FROM clause) of the SELECT query is not correct, but the name matches what I have in AWS, and it works when using the DocumentClient.
Any suggestion on what might be wrong is very appreciated.
Answering myself in case anyone else ends up here.
I got a reply from AWS: if the table name contains dashes, you need to quote it with double quotes when using PartiQL (I had tried single quotes, and that did not work).
Maybe this will change in a future release of PartiQL.
The exception ValidationException: Unexpected from source (CLI: An error occurred (ValidationException) when calling the ExecuteStatement operation: Unexpected from source) happens when the table name contains dashes and is not quoted.
So change
aws dynamodb execute-statement --statement \
"SELECT * FROM my-table WHERE field='string'"
To:
aws dynamodb execute-statement --statement \
"SELECT * FROM \"my-table\" WHERE field='string'"
or add double quotes around the table name when building the statement in whatever SDK you are using for PartiQL.
Note that string values in the WHERE clause use single quotes ('), while the table name requires double quotes (").
For anyone else who ends up here as I did: I had a PowerShell script generating a statement for aws dynamodb execute-statement (aws --version 2.x) and was getting the same error. After far too long, I tried the interactive CLI and found that my query worked, so what I needed to do was escape the double quotes twice: once for PowerShell, and again with \ characters for the AWS CLI.
$statement = "SELECT id FROM \`"${tablename}\`" WHERE source = '990doc'"
This double escaping finally got me where I needed to be, and will hopefully save someone else a great deal of frustration.
Adding this solution for anyone who ended up here with this error from the AWS SDK for JavaScript.
const { Items = [] } = await dynamodbClient.executeStatement({
    Statement: `SELECT * FROM "${tablename}" WHERE "_usedId" IS MISSING`
}).promise();
Surround the table name with double quotes, as mentioned in the answers above.
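The same quoting rule applies in other SDKs. For completeness, a minimal boto3 sketch of the same query (the table and attribute names are taken from the examples above, not verified against a live table):

    import boto3

    client = boto3.client("dynamodb")

    # Double quotes around the dashed table name; single quotes for the string value.
    resp = client.execute_statement(
        Statement="SELECT * FROM \"habits_sensors-playground\" WHERE field = 'string'"
    )
    print(resp["Items"])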

Multiple RedactedFields in AWS WAFv2 put-logging-configuration command

I'm trying to set up logging on our web ACL with WAFv2.
I can successfully run the put-logging-configuration command with one RedactedField, but I am having issues adding more headers after the first one.
Here is the documentation in question -- I can't quite get my head around it:
The part of a web request that you want AWS WAF to inspect. Include the single FieldToMatch type that you want to inspect, with additional specifications as needed, according to the type. You specify a single request component in FieldToMatch for each rule statement that requires it. To inspect more than one component of a web request, create a separate rule statement for each component.
Here is my command which works:
aws --region="us-west-2" wafv2 put-logging-configuration \
--logging-configuration ResourceArn=${MY_WEB_ACL_ARN},LogDestinationConfigs=${MY_FIREHOSE_DELIVERY_STREAM_ARN},RedactedFields={SingleHeader={Name="cookie"}}
This gives the following result:
{
    "LoggingConfiguration": {
        "ResourceArn": "{My arn}",
        "LogDestinationConfigs": [
            "{My firehose log stream arn}"
        ],
        "RedactedFields": [
            {
                "SingleHeader": {
                    "Name": "cookie"
                }
            }
        ]
    }
}
I also wish to redact the "authorization" header.
I have tried the following as part of "RedactedFields" portion of --logging-configuration:
1) Two SingleHeader statements within brackets
RedactedFields={SingleHeader={Name="cookie"},SingleHeader={Name="cookie"}}
(Results in 'Unknown options' error.)
2) Two sets of brackets with comma
RedactedFields={SingleHeader={Name="cookie"}},{SingleHeader={Name="authorization"}}
Error parsing parameter '--logging-configuration': Expected: '=', received: '{' for input:
3) Two sets of brackets, no comma
RedactedFields={SingleHeader={Name="cookie"}}{SingleHeader={Name="authorization"}}
Error parsing parameter '--logging-configuration': Expected: ',', received: '{' for input:
4) Two SingleHeader statements within brackets, no comma
RedactedFields={SingleHeader={Name="cookie"}{SingleHeader={Name="authorization"}}
Error parsing parameter '--logging-configuration': Expected: ',', received: '{' for input:
5) One SingleHeader statement, two headers (Isn't really a SingleHeader anymore, is it?)
RedactedFields={SingleHeader={Name="cookie", "authorization"}}
Unknown options: authorization}}
What am I getting wrong here? I've tried many other ways, including [] square brackets, multiple instances of 'Name', and multiple instances of 'RedactedFields' entirely -- none work.
To add multiple SingleHeaders to RedactedFields via shorthand syntax, I had to:
1) Give each SingleHeader its own set of brackets
2) Add a comma between each bracket set
3) Wrap all of the sets in square brackets
4) Wrap everything in single quotes
For example, if I wanted two SingleHeaders, one for 'cookie' and one for 'authorization', I would need to use the following for the RedactedFields portion of --logging-configuration:
RedactedFields='[{SingleHeader={Name="cookie"}},{SingleHeader={Name="authorization"}}]'
In conclusion, if we add this to put-logging-configuration, the whole command would be:
aws --region=${MY_REGION} wafv2 put-logging-configuration \
--logging-configuration ResourceArn=${MY_WEB_ACL_ARN},LogDestinationConfigs=${MY_FIREHOSE_DELIVERY_STREAM_ARN},RedactedFields='[{SingleHeader={Name="cookie"}},{SingleHeader={Name="authorization"}}]'
Giving the following result:
{
    "LoggingConfiguration": {
        "ResourceArn": "{my acl arn}",
        "LogDestinationConfigs": [
            "{my firehose log stream arn}"
        ],
        "RedactedFields": [
            {
                "SingleHeader": {
                    "Name": "cookie"
                }
            },
            {
                "SingleHeader": {
                    "Name": "authorization"
                }
            }
        ]
    }
}
This formatting can be used for any other FieldToMatch, such as SingleQueryArgument, AllQueryArguments, QueryString, UriPath, Body, etc.
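If the shorthand syntax gets too fiddly, passing the same structure as data avoids the quoting rules entirely. A sketch using boto3 (both ARNs are placeholders):

    import boto3

    wafv2 = boto3.client("wafv2", region_name="us-west-2")

    # RedactedFields is a plain list of FieldToMatch dicts, mirroring the
    # JSON result shown above; no shorthand escaping is needed.
    wafv2.put_logging_configuration(
        LoggingConfiguration={
            "ResourceArn": "arn:aws:wafv2:...",                 # placeholder web ACL ARN
            "LogDestinationConfigs": ["arn:aws:firehose:..."],  # placeholder stream ARN
            "RedactedFields": [
                {"SingleHeader": {"Name": "cookie"}},
                {"SingleHeader": {"Name": "authorization"}},
            ],
        }
    )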

How do I avoid putting quotes around each url parameter when using AWS API Gateway?

I am trying to use AWS API Gateway to trigger a Lambda function that copies a file from a source bucket to a destination bucket. I would like the API call to have the form
https://some/api/url/my_lambda_function?key1=joe.mp4&key2=video-files&key3=edited-video-files
I set up the Lambda function, attach an API Gateway, and configure it. The problem comes when I set up the integration mapping.
When I run https://some/api/url/my_lambda_function?key1="joe.mp4"&key2="video-files"&key3="edited-video-files", everything works as it should. However, if I run it without the quotes around the parameters, I get an error. For example, if I remove the quotes around the key3 parameter, the error is:
{"message": "Could not parse request body into json: Unrecognized token \'edited\': was expecting (\'true\', \'false\' or \'null\')\n at [Source: (byte[])\"{\n \"key1\": \"joe.mp4\",\n \"key2\": \"video-files\",\n \"key3\": edited-video-files\n\n}\n\"; line: 4, column: 22]"}
Here's my setup.
Under API Gateway -> Resources -> Integration Request -> Mapping Templates, I select the option "When there are no templates defined". I use application/json, and my template is:
{
    "key1": $input.params('key1'),
    "key2": $input.params('key2'),
    "key3": $input.params('key3')
}
For completeness, my Lambda is:
import boto3

def lambda_handler(event, context):
    # initialize s3
    s3 = boto3.client("s3")
    # print event output
    print(event)
    FILENAME = event['key1']
    SOURCE_BUCKET = event['key2']
    DEST_BUCKET = event['key3']
    # formatted copy string
    copy_source = {
        'Bucket': SOURCE_BUCKET,
        'Key': FILENAME,
    }
    # copy files
    s3.copy_object(Bucket=DEST_BUCKET, Key=FILENAME, CopySource=copy_source)
    return 'Thanks for watching'
It seems to work if I put quotes around the values in the mapping template key-value pairs:
{
    "key1": "$input.params('key1')",
    "key2": "$input.params('key2')",
    "key3": "$input.params('key3')"
}
If you want to pass the URL parameters as key/value pairs, e.g. key1="joe.mp4", then you must use quotes, as they define the key's string value.
However, you can also set up mappings for the URL that don't require the quotes and are instead separated by a slash ("/"), as highlighted in this example, but these aren't as flexible as the key/value setup because they must appear in a specific order.
For example, with a key/value setup you can do either http://url?key1="value1"&key2="value2"&key3="value3" or http://url?key3="value3"&key1="value1"&key2="value2" and get the same result (note the order of the keys). With static parameters separated by slashes you can't do this: all values must be passed in a fixed order, http://url/value1/value2/value3.
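For the slash-separated style, a sketch of what the setup might look like (the resource path is hypothetical; $input.params() resolves path parameters as well as query string parameters):

    Resource path:
        /my_lambda_function/{key1}/{key2}/{key3}

    Mapping template (application/json):
    {
        "key1": "$input.params('key1')",
        "key2": "$input.params('key2')",
        "key3": "$input.params('key3')"
    }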

Ace editor snippet format for RegEx based triggers

I'm using an array of snippets with the following format:
{
    name: 'response',
    trigger: 'resp|rp',
    path: ['paths', '.', '.', '.'],
    content: [
        '${1:code}:',
        ' description: ${2}',
        ' schema: ${3}',
        '${4}'
    ].join('\n')
},
How can I use a RegEx for the trigger? I tried a regex key with no luck.
It's not possible via the public API (see the register method of snippetManager). You can make it work by accessing snippetNameMap directly, but it would be better to create a feature request on Ace's issue tracker.
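A rough, untested illustration of that workaround; snippetNameMap and the compiled trigger fields are internal to Ace's snippetManager, so the property names below are assumptions that may break between versions:

    var snippetManager = ace.require("ace/snippets").snippetManager;

    // Register the snippet through the public API first.
    snippetManager.register([mySnippet], "yaml");

    // Then reach into the internal name map and patch the stored entry.
    // "startRe" is assumed here to be the compiled trigger regex.
    var stored = snippetManager.snippetNameMap["yaml"]["response"];
    stored.startRe = /(?:resp|rp)$/;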