I want to pass a payload to a Step Function that is triggered by a CloudWatch rule. After passing the payload, how can I receive it in the Step Function?
If you want to pass a payload to the step function, you should pass in a constant: a fixed JSON string that gets passed on every trigger.
In the rule's target configuration you can specify that constant JSON string; it replaces any input that would otherwise have been passed to the step function (the event JSON).
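For example, the constant configured on the rule's target could be something like this (the keys are only placeholders for whatever your state machine expects):

{
  "source": "scheduled-rule",
  "token": "val"
}

The step function then receives exactly this document as its execution input on every trigger.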
At the beginning, I pass this input to the step function:
{
"token": "val"
}
and my step function has multiple steps.
How can I access the token variable correctly in each step, without passing the "token" variable from step to step?
You can use the Context Object via $$ and thereby access the original execution input via $$.Execution.Input:
"token.$": "$$.Execution.Input.token"
I have an AWS Step function, and I need to insert items into DynamoDB. I'm passing the following input to the Step Function execution:
{
"uuid": "dd10a857-3711-451e-91ee-d0b3ab621b2e",
"item_id": "0D98C2F77",
"item_count": 3,
"order_id": "IO-98255AX"
}
I have a DynamoDB PutItem Step, set up like so:
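Roughly, the Parameters section of the step is:

{
  "TableName": "test_item_table",
  "Item": {
    "uuid": {"S.$": "$.uuid"},
    "item_id": {"S.$": "$.item_id"},
    "item_count": {"N.$": "$.item_count"},
    "order_id": {"S.$": "$.order_id"}
  }
}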
Since item_count is a numeric value, I specified "N.$": "$.item_count" - using N at the beginning because that maps to the number type in DynamoDB. Since all of the other fields are strings, I started their keys with S.
I then tried to test the PutItem step with the above payload, and I got the following error:
{
"error": "States.Runtime",
"cause": "An error occurred while executing the state 'DynamoDB PutItem' (entered at the event id #2). The Parameters '{\"TableName\":\"test_item_table\",\"Item\":{\"uuid\":{\"S\":\"dd10a857-3711-451e-91ee-d0b3ab621b2e\"},\"item_id\":{\"S\":\"0D98C2F77\"},\"item_count\":{\"N\":3},\"order_id\":{\"S\":\"IO-98255AX\"}}}' could not be used to start the Task: [The value for the field 'N' must be a STRING]"
}
I looked up the "The value for the field 'N' must be a STRING" error, and I found two relevant results:
A post on AWS where the OP decided to just change the format of the data that gets passed to the Dynamo step
A post on GitHub, where the OP was using CDK and ended up using a numberFromString() function that's available in CDK
In my case, I have an integer value, and I'd prefer to pass it into Dynamo as an integer - but based on the first link, it seems that Step Functions can only pass string values to DynamoDB. This means that my only option is to convert the integer value to a string, but I'm not sure how to do this. I know that Step Functions have intrinsic functions, but I don't think they are applicable to JSON paths.
What's the best way to handle storing this numeric data to DynamoDB?
TL;DR "item_count": {"N.$": "States.JsonToString($.item_count)"}
it seems that Step Functions can only pass string values to DynamoDB
Yes, although technically it's a constraint of the DynamoDB API. DynamoDB accepts numbers as strings to maximize compatibility, but the underlying data type remains numeric.
This means that my only option is to convert the integer value to a string, but I'm not sure how to do this.
The JsonToString intrinsic function can stringify a number value from the State Machine execution input.
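Applied to the Parameters above, that would look roughly like this:

{
  "TableName": "test_item_table",
  "Item": {
    "uuid": {"S.$": "$.uuid"},
    "item_id": {"S.$": "$.item_id"},
    "item_count": {"N.$": "States.JsonToString($.item_count)"},
    "order_id": {"S.$": "$.order_id"}
  }
}

DynamoDB still stores item_count as a number; the string form is only what the API expects on the wire.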
Is there a way to make a Dynamodb Update return Old and New values?
something like:
updateItemSpec
.withPrimaryKey("id", id)
.withUpdateExpression(myUpdateExpression)
.withNameMap(nameMap)
.withValueMap(valueMap)
.withReturnValues("UPDATED_NEW, UPDATED_OLD");
There isn't.
It should be easy to simulate this by returning UPDATED_OLD: you already have the new values, because you set them in the update, so request the old values of the updated attributes and read the new ones straight from your value map.
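A minimal sketch with the DynamoDB Document API, reusing the names from your snippet (table, id, myUpdateExpression, nameMap, valueMap are assumed to exist as in your code):

import java.util.Map;

import com.amazonaws.services.dynamodbv2.document.UpdateItemOutcome;
import com.amazonaws.services.dynamodbv2.document.spec.UpdateItemSpec;
import com.amazonaws.services.dynamodbv2.model.ReturnValue;

UpdateItemSpec updateItemSpec = new UpdateItemSpec()
        .withPrimaryKey("id", id)
        .withUpdateExpression(myUpdateExpression)
        .withNameMap(nameMap)
        .withValueMap(valueMap)
        .withReturnValues(ReturnValue.UPDATED_OLD);   // only one ReturnValue is accepted

UpdateItemOutcome outcome = table.updateItem(updateItemSpec);

// Old values of the attributes touched by the update:
Map<String, Object> oldValues = outcome.getItem().asMap();

// New values: whatever you already put into valueMap for the update expression.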
Depending on where you want to use the data, if you don't need it in the body of code where you update a DynamoDB record, you can capture table activity using DynamoDB streams. You can configure an AWS lambda trigger on the table so it invokes the lambda when a specified event occurs, passing this event (in our case, the stream) to the lambda. From this, depending on how you have set up the stream, you can access the old and new versions of the record.
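As a rough sketch, assuming the table's stream is set to the NEW_AND_OLD_IMAGES view type and the aws-lambda-java-events library is on the classpath:

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.events.DynamodbEvent;
import com.amazonaws.services.lambda.runtime.events.DynamodbEvent.DynamodbStreamRecord;

public class TableActivityHandler {

    public void handleEvent(DynamodbEvent event, Context context) {
        for (DynamodbStreamRecord record : event.getRecords()) {
            // Each record carries the item as it was before and after the change.
            System.out.println("Old image: " + record.getDynamodb().getOldImage());
            System.out.println("New image: " + record.getDynamodb().getNewImage());
        }
    }
}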
I'm using Postman and wondering if I can use a stored JSON object to create variables for additional calls. For example, I saved an array which includes names and IDs:
[{"id":28,"name":"Action"},{"id":12,"name":"Adventure"},
{"id":16,"name":"Animation"},{"id":35,"name":"Comedy"},
{"id":80,"name":"Crime"},{"id":99,"name":"Documentary"},
{"id":18,"name":"Drama"},{"id":10751,"name":"Family"},
{"id":14,"name":"Fantasy"},{"id":36,"name":"History"},
{"id":27,"name":"Horror"},{"id":10402,"name":"Music"},
{"id":9648,"name":"Mystery"},{"id":10749,"name":"Romance"},
{"id":878,"name":"Science Fiction"},{"id":10770,"name":"TV Movie"},
{"id":53,"name":"Thriller"},{"id":10752,"name":"War"},
{"id":37,"name":"Western"}]
I'm triggering another API (second call) that retrieves only IDs, so the response is like this: "genre_ids": [35, 10402]
Is there a way to create an environment variable that takes the IDs returned by the second API, fetches the relevant names from the saved array, and builds a name-oriented variable - so that in the case above, 35=Comedy and 10402=Music and the variable becomes comedy,music?
To save an environment variable you can do the following (see the snippets on the right side of Postman):
postman.setEnvironmentVariable("variable_key", "variable_value");
If you want to save a global variable, just do:
postman.setGlobalVariable("variable_key", "variable_value");
and then use them as you want.
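Once saved, you can reference the variable anywhere in a request with the double curly brace syntax, or read it back in a script (the URL below is just an example):

https://example.com/api?genre={{variable_key}}

var value = postman.getEnvironmentVariable("variable_key");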
Alexandre
It's definitely possible with a "Tests" (i.e. post-request) script.
But since you have two requests (and Postman forces them to be two separate requests), you should save the response of the first request into a variable with setEnvironmentVariable or setGlobalVariable, and then, after getting the response to the second request, parse the saved first response and iterate over it, looking up each given id.
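For example, a rough sketch using the same legacy scripting API (the variable names genres and genre_names are just placeholders):

// "Tests" script of the 1st request: stash the whole genres array.
postman.setEnvironmentVariable("genres", responseBody);

// "Tests" script of the 2nd request: map genre_ids to names.
var genres = JSON.parse(postman.getEnvironmentVariable("genres"));
var ids = JSON.parse(responseBody).genre_ids;   // e.g. [35, 10402]
var names = [];
for (var i = 0; i < ids.length; i++) {
    for (var j = 0; j < genres.length; j++) {
        if (genres[j].id === ids[i]) {
            names.push(genres[j].name);
        }
    }
}
postman.setEnvironmentVariable("genre_names", names.join(","));   // "Comedy,Music"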
I'm looking for the hash equivalent of this question: How to pass array query params to AWS API Gateway?
Basically, I want to set up query parameters that look like this:
example.com?color[background]=yellow&color[foreground]=black
When I try to create a query parameter called color[background] in the API Gateway console, I get the following error message:
Invalid mapping expression specified: Validation Result: warnings : [], errors : [Parameter name should be a non-empty alphanumeric string]
I've also tried setting up a color query param and then passing various "hashes" to it. Here's what I've tried passing into this parameter:
{"background" => 123, "foreground" => "abc"} and removing the spaces
{"background" : 123, "foreground" : "abc"} and removing the spaces
{background:123,foreground:abc}
They all result in a request that is some form of example.com?color=%7Bbackground:123,foreground:abc%7D with the hash that I pass coming after the =.
Any ideas? Is this bad practice for query string parameters anyways, and should I stick with simple params?
Since there isn't a standard defined for passing complex data structures like arrays or maps via the query string, API Gateway does not attempt to interpret or parse the query string as anything other than simple key-value string pairs.
If you want to pass in and transform complex types it's best to do so in the body of a POST or PUT request where you can leverage JSON and API Gateway's powerful body mapping templates feature.
Alternatively, if you must stick with query string parameters, then you must either:
Collapse your data structure into simple key-value pairs as suggested by Michael -sqlbot, or
Pass the raw query string through to your backend Lambda or HTTP integration, where it can be parsed as you please (a rough sketch follows below). See this post for more details on how to do that.
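As a rough illustration of the second option, a Node.js Lambda behind a proxy integration could rebuild the hash-style keys itself (the handler and key pattern here are only an example):

exports.handler = async (event) => {
    // With a proxy integration, API Gateway passes every query string key through untouched.
    var params = event.queryStringParameters || {};
    var nested = {};
    Object.keys(params).forEach(function (key) {
        var match = key.match(/^(\w+)\[(\w+)\]$/);   // e.g. color[background]
        if (match) {
            nested[match[1]] = nested[match[1]] || {};
            nested[match[1]][match[2]] = params[key];
        } else {
            nested[key] = params[key];
        }
    });
    // e.g. { color: { background: "yellow", foreground: "black" } }
    return { statusCode: 200, body: JSON.stringify(nested) };
};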