AWS Step Functions: access original input without passing a field from step to step

At the beginning, I pass this input to the step function:
{
"token": "val"
}
The step function has multiple steps.
How can I access the token variable correctly in each step, without passing the "token" variable from step to step?

You can use the Context Object via $$. and thereby access the original input via $$.Execution.Input:
"token.$": "$$.Execution.Input.token"

Related

In AWS Step Functions, how do I pass an integer input to DynamoDB?

I have an AWS Step function, and I need to insert items into DynamoDB. I'm passing the following input to the Step Function execution:
{
"uuid": "dd10a857-3711-451e-91ee-d0b3ab621b2e",
"item_id": "0D98C2F77",
"item_count": 3,
"order_id": "IO-98255AX"
}
I have a DynamoDB PutItem step, set up as described below.
Since item_count is a numeric value, I specified "N.$": "$.item_count" - I specified N at the beginning because that maps to the number type in DynamoDB. Since all of the other fields are strings, I started their keys with S.
I then tried to test the PutItem step with the above payload, and I got the following error:
{
"error": "States.Runtime",
"cause": "An error occurred while executing the state 'DynamoDB PutItem' (entered at the event id #2). The Parameters '{\"TableName\":\"test_item_table\",\"Item\":{\"uuid\":{\"S\":\"dd10a857-3711-451e-91ee-d0b3ab621b2e\"},\"item_id\":{\"S\":\"0D98C2F77\"},\"item_count\":{\"N\":3},\"order_id\":{\"S\":\"IO-98255AX\"}}}' could not be used to start the Task: [The value for the field 'N' must be a STRING]"
}
I looked up the "The value for the field 'N' must be a STRING" error, and I found two relevant results:
A post on AWS where the OP decided to just change the format of the data that gets passed to the Dynamo step
A post on GitHub where the OP was using CDK and ended up using a numberFromString() function that's available in CDK
In my case, I have an integer value, and I'd prefer to pass it into Dynamo as an integer - but based on the first link, it seems that Step Functions can only pass string values to DynamoDB. This means that my only option is to convert the integer value to a string, but I'm not sure how to do this. I know that Step Functions have intrinsic functions, but I don't think they are applicable to JSON paths.
What's the best way to handle storing this numeric data to DynamoDB?
TL;DR "item_count": {"N.$": "States.JsonToString($.item_count)"}
it seems that Step Functions can only pass string values to DynamoDB
Yes, although technically it's a constraint of the DynamoDB API. DynamoDB accepts numbers as strings to maximize compatibility, but the underlying data type remains numeric.
This means that my only option is to convert the integer value to a string, but I'm not sure how to do this.
The JsonToString intrinsic function can stringify a number value from the State Machine execution input.
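Applied to the Parameters of the PutItem state from the question, that might look roughly like this (a sketch only; the table and field names are taken from the error message above):
"DynamoDB PutItem": {
  "Type": "Task",
  "Resource": "arn:aws:states:::dynamodb:putItem",
  "Comment": "Sketch only: States.JsonToString turns the numeric item_count into the string form the N type expects",
  "Parameters": {
    "TableName": "test_item_table",
    "Item": {
      "uuid": {"S.$": "$.uuid"},
      "item_id": {"S.$": "$.item_id"},
      "item_count": {"N.$": "States.JsonToString($.item_count)"},
      "order_id": {"S.$": "$.order_id"}
    }
  },
  "End": true
}
DynamoDB still stores the attribute as a number; the string form is only how the API transports numeric values.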

Postman: set a variable for one "run session"

I know that we can set a variable in different scopes via a pre-request script, but can we set one for a single "execution" or "test run"?
I have a folder that contains two requests to validate a scenario where the first one creates a resource with a unique id and the second one fails by trying to create a resource with the same unique id.
I would like to generate that unique value each time the collection is run. At the moment I use a collection variable that I test and set when not present, but that variable is kept between each "retry".
Can I create a variable that will be the same only for one execution of a collection?
Thanks
I have similar cases, where I store the values in Environment variables and then unset them in the Pre-request script of the first request:
pm.environment.unset("myVariable");
So, my solution is the same as the one suggested by #so cal cheesehead.
I create the variable in either the folder's pre-request script or the first request's pre-request script, and unset it after the last test in the last request.
The sad part is that the initialization and destruction of this variable are spread across different scripts.
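A minimal sketch of that pattern, using a collection variable as in the question (the name runUniqueId is just an example):
// Pre-request script of the folder (or of the first request):
// only generate the id if it is not already set for this run
if (!pm.collectionVariables.get("runUniqueId")) {
    pm.collectionVariables.set("runUniqueId", pm.variables.replaceIn("{{$guid}}"));
}
// Tests tab of the last request: clean up so the next run starts fresh
pm.collectionVariables.unset("runUniqueId");
Both requests can then reference {{runUniqueId}}, and a fresh value is generated on the next run of the collection.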

How to pass a constant value to an AWS Step Function using CloudWatch

I want to pass some payload to a Step Function triggered by a CloudWatch rule. After passing the payload, how can I receive it in the Step Function?
If you want to pass a payload to the step function, you should pass in a constant: a fixed JSON string that gets passed on every trigger.
When configuring the CloudWatch rule's target, choose the "Constant (JSON text)" input option and specify a JSON string; it will replace any input that would otherwise have been passed into the step function (the event JSON).
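For example, the constant could simply be the JSON your state machine expects as input (the token field mirrors the first question above; the second field is only an illustration):
{
"token": "val",
"source": "cloudwatch-schedule"
}
Inside the state machine this constant is the execution input, so the first state sees it at "$", and any state can read it via "$$.Execution.Input".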

How to interpolate an environment variable as a number in the JSON body of a request in Postman?

My API consumes JSON where some fields need to be numbers. I have some shared data in my environment variables; one entry is user_id, which in my case is an integer, so I need to provide it to my API as a number.
I've tried to inline the value as "userId": {{user_id}}, but Postman highlights the comma after the last } and the API can't make sense of what it receives. In the Postman console I can see that the actual value sent is this:
"user": {{user_id}},
So it doesn't seem to work. And using "userId": "{{user_id}}" doesn't work in my case either, as the user id would be sent as a string.
How do I interpolate an environment variable as a number in the JSON body of a request?
It turned out I had mistyped the name of the environment variable: I was using {{user_id}} instead of the {{userID}} that is defined in my environment, so the placeholder was never interpolated and the literal text was sent instead.
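For reference, once the variable name matches, the unquoted placeholder interpolates as a raw number in the request body (a sketch reusing the userID name from above):
{
"userId": {{userID}}
}
The body isn't valid JSON until Postman substitutes the variable, so the editor may still flag it, but the rendered request contains a numeric value.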

Using mapping variables in a post-session command

My workflow generates three files (header, detail, trailer), which I combine via a post-session command. There are two variables set in my mapping which I want to use in the post-session command, like so:
cat header1.out detail1.out trailer1.out > OUTPUT_$(date +%Y%m%d)_$$VAR1_$$VAR2.dat
But this doesn't work and the values are empty, so I get OUTPUT_20151117__.dat.
I've tried creating workflow variables and assigning them via pre-session variable assignment, but this doesn't work either.
What am I missing? Or was this never going to work?
Can you see the values assigned to those variables in the session log, or do they appear empty as well?
Creating workflow variables is what I'd try, but you need to assign the values with the post-session variable assignment.
Basically, you store values in variables in your mapping and pass the values up to the workflow after the session succeeds. Here is how you can achieve that:
1. Define workflow variables $$VAR1 and $$VAR2.
2. Define the variables in your mapping, but choose different names, e.g. $$M_VAR1 and $$M_VAR2.
3. In your mapping, assign the values to your mapping variables through the function SetVariable(var as char, value as data type).
4. In your session, select Post-session on success variable assignment.
In step 4, the current value of $$M_VAR1 (the mapping variable) is stored in your workflow variable $$VAR1 and can then be used in the workflow in command tasks like you asked (see the sketch below).
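A rough sketch of steps 3 and 4 (the port names v_file_suffix and v_batch_id are hypothetical, and the post-session assignment is really configured in the session's Components tab rather than typed as text):
-- In an Expression transformation inside the mapping, e.g. in output port expressions:
SETVARIABLE($$M_VAR1, v_file_suffix)
SETVARIABLE($$M_VAR2, v_batch_id)
-- Post-session on success variable assignment (session Components tab), shown as pseudo-notation:
$$VAR1 = $$M_VAR1
$$VAR2 = $$M_VAR2
The post-session command (or a separate command task) can then reference $$VAR1 and $$VAR2 exactly as in the question.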
A few notes:
I'm not 100% sure whether the variable assignment is executed before the post-session command. If the command is executed first, you could run your command in a separate command task after your session instead.
Pre-Session variable assignment is used if you pass a value from a workflow variable down to a mapping variable. You can use this if your variables $$VAR1 or $$VAR2 are used inside another mapping and need to be initialized at the beginning.