I have a 3 step state machine for a step function.
InputStep -> ExecuteSparkJob -> OutputLambda
ExecuteSparkJob is a Glue task. Since it cannot pass its output to the step function directly, it writes its output to an S3 bucket. OutputLambda reads it from there and passes it on to the step function.
The idea of InputStep is simply to define a common S3 URI that the following steps can use.
Below is the code I have for the Input Step.
const op1 = Data.stringAt("$.op1");
const op2 = Data.stringAt("$.op2");
const inputTask = new Pass(this, "Input Step", {
  result: Result.fromString(this.getURI(op1, op2)),
  resultPath: "$.s3path"
});

getURI(op1: string, op2: string): string {
  return op1.concat("/").concat(op2).concat("/").concat("response");
}
However, the string manipulation that I am doing in getURI is not working. The values in inputTask.result are not being substituted with the values from the path.
This is the input and output to the Input Step
{
  "op1": "test1",
  "op2": "test2"
}
Output
{
  "op1": "test1",
  "op2": "test2",
  "s3path": "$.op1/$.op2/response"
}
Is it possible to do some string manipulations using parameters in the Path in Step Function definition? If yes, what am I missing?
Thanks for your help.
You can use one or more EvaluateExpression tasks - it's still a bit clunky. You can find examples and the API reference in the aws-stepfunctions-tasks documentation for EvaluateExpression.
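For instance, a minimal sketch (the construct ID, the $.s3path result path, and the aws-cdk-lib v2 import path are assumptions) that builds the same URI with an EvaluateExpression task, which runs the expression in a small helper Lambda:

import * as tasks from 'aws-cdk-lib/aws-stepfunctions-tasks';

// Evaluates a Node.js template literal against the state input and writes
// the resulting string to $.s3path.
const buildUri = new tasks.EvaluateExpression(this, 'Build S3 URI', {
  expression: '`${$.op1}/${$.op2}/response`',
  resultPath: '$.s3path',
});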
Use a Lambda function instead of a Pass state to build the string.
Step Functions doesn't currently support string concatenation with reference paths, and the Result field of a Pass state doesn't allow reference paths either; it has to be a static value.
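For example, a minimal sketch (handler and field names are assumptions) of such a Lambda that builds the URI; the task that invokes it can then put the returned value on the state with a resultPath:

// Hypothetical handler: receives op1/op2 from the state input and returns the composed URI.
export const handler = async (event: { op1: string; op2: string }) => {
  return { s3path: `${event.op1}/${event.op2}/response` };
};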
The Pass state's Parameters field supports the intrinsic functions and substitutions you need to do this natively, without a Lambda task. The Result field doesn't.
Compose a string from the execution inputs with the Format intrinsic function:
const inputTask = new Pass(this, "Input Step", {
  parameters: {
    path: JsonPath.format(
      "{}/{}/response",
      JsonPath.stringAt("$.op1"),
      JsonPath.stringAt("$.op2")
    ),
  },
  resultPath: "$.s3",
});
The resulting string value test1/test2/response will be output to $.s3.path.
Related
Background:
I am trying to add a DynamoDB:GetItem step to my state machine in AWS Step Functions. The GetItem API takes input in the following format:
{
  "TableName": "MyDynamoDBTable",
  "Key": {
    "Column": {
      "S": "MyEntry"
    }
  }
}
where "Column" is the primary key name and "MyEntry" is the primary key value. The issue is that I want to be able to specify both the primary key name and the value dynamically, using JSONPath references.
Unfortunately, AWS won't allow me to pass a value reference for the primary key name ("Column"), so I can't do something like:
{
  "TableName": "MyDynamoDBTable",
  "Key.$": {
    "$.ColumnName": {
      "S": "MyEntry"
    }
  }
}
Problem:
The only workaround I could think of (albeit a bit ugly) is to use a combination of the States.StringToJson and States.Format intrinsic functions to first generate a stringified version of the input to the Key.$ field, and then convert the string to JSON. Something like:
{
  "TableName.$": "$.TableName",
  "Key.$": "States.StringToJson(States.Format('\{\"{}\":\{\"S.$\":\"{}\"\}\}', $.PrimaryKeyName, $.PrimaryKeyValue))"
}
It should work in theory, but it seems that AWS Step Functions is not happy about the escaped double quotes: it's not able to parse the definition above.
So my question is:
Is there a way to make this work? (either by escaping double quotes somehow, or through a totally different approach)
After lots of experimentation, I finally found a way to make dynamic keys work. I am using a Pass step with the following Parameters defined:
{
  "Key.$": "States.StringToJson(States.Format('\\{\"{}\":\\{\"S\":\"{}\"\\}\\}', $.HashKeyName, $.HashKeyValue))"
}
The secret, apparently, was using a double backslash (\\) when escaping the { and } symbols. Escaping " wasn't a problem after all, even though this isn't covered in the AWS docs.
The result of this transformation is the following:
{
  "Key": {
    "MyHashKeyName": {
      "S": "MyHashKeyValue"
    }
  }
}
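If the state machine is defined with the CDK, here is a minimal sketch (the construct ID, field names, the $.dynamo result path, and the aws-cdk-lib v2 import path are assumptions) of the same workaround, using a CustomState so the escaped intrinsic is emitted into the ASL verbatim; a downstream dynamodb:getItem task can then reference the built key with "Key.$": "$.dynamo.Key":

import * as sfn from 'aws-cdk-lib/aws-stepfunctions';

// Pass state that builds { "<HashKeyName>": { "S": "<HashKeyValue>" } } at runtime
// and stores it under $.dynamo.Key for a downstream dynamodb:getItem task.
const buildKey = new sfn.CustomState(this, 'Build Dynamo Key', {
  stateJson: {
    Type: 'Pass',
    Parameters: {
      // Same double-backslash escaping as above; CDK's JSON serialization
      // reproduces the \\{ ... \\} sequences in the generated definition.
      'Key.$': "States.StringToJson(States.Format('\\{\"{}\":\\{\"S\":\"{}\"\\}\\}', $.HashKeyName, $.HashKeyValue))",
    },
    ResultPath: '$.dynamo',
  },
});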
In the following code, the documentDetails variable should only be considered if documentDetails is available in the payload of the step function for the Read step; otherwise it shouldn't be. documentDetails is optional and may or may not be present in the payload.
const readStep = new tasks.LambdaInvoke(this, 'Read', {
  lambdaFunction: stepfLambda,
  resultSelector: {
    "s3Url.$": "$.Payload.s3Url",
    "documentDetails.$": // present only if documentDetails is present in Payload
  },
  resultPath: '$.stepEventMetaData'
});
What is the correct syntax for this?
resultSelector cannot apply conditional logic. Your best option is to have stepfLambda return the output in the desired shape.
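For example, a minimal sketch (the handler shape and field names are assumptions) where stepfLambda itself decides whether documentDetails appears in its response, so the task needs no conditional mapping:

// Hypothetical stepfLambda handler: always returns s3Url, and includes
// documentDetails only when it was present in the incoming event.
export const handler = async (event: { documentDetails?: unknown }) => {
  const result: { s3Url: string; documentDetails?: unknown } = {
    s3Url: 's3://example-bucket/response', // placeholder value
  };
  if (event.documentDetails !== undefined) {
    result.documentDetails = event.documentDetails;
  }
  return result;
};

The LambdaInvoke task can then drop the resultSelector and, for instance, set payloadResponseOnly: true with resultPath: '$.stepEventMetaData' so the whole shaped payload lands in one place.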
I have a JSON input that I would like to transform to another JSON output. I defined my list of input JSONPaths, and trying to create a simple JSON output in the template like so:
{
  "fax": \"<userFax>\"
}
This was one of the formats given in the example from AWS themselves:
{
  "instance": \"<instance>\",
  "state": [9, \"<state>\", true],
  "Transformed": "Yes"
}
However, when I try to update the changes, I get the following error:
Invalid InputTemplate for target ... : [Source: (String)"{ "fax": \"null\" }"; line: 2, column: 13].
Basically, I'd like all incoming values in the input to be converted to strings in the output via the template. This is to prevent values like zip codes from being converted into integers and having their leading zeros stripped away. But it's confusing that even following the simple example from AWS is failing.
How can I get JenkinsPipelineUnit to intercept both text() and string() param calls? I have code that triggers a build with a few params. I want to write a nice unit test to check that it does what it should. However, the string() calls are not intercepted so I cannot test them. I can test the text() call.
(Yes, it is possible to write unit tests for Jenkins pipeline code, as opposed to testing via Jenkins production jobs. This project plus shared libraries makes Jenkins a much better tool.)
Any ideas on how I could do this? I looked in the JenkinsPipelineUnit project but didn't find an example that fit and couldn't figure out where to look in the runtime objects.
I do see that the project's BasePipelineTest.groovy links string to its stringInterceptor, which seems to just swallow the string. Maybe I can unregister theirs...
Example
def triggeringParams = [:]
....
for (def param in ['text', 'string']) {
  helper.registerAllowedMethod(param, [LinkedHashMap],
    { LinkedHashMap data ->
      triggeringParams << data
    }
  )
}
thisScript = helper.loadScript('''
package resources
return this''')
def params = []
params << thisScript.text(name: 'MULTILINE_PARAM', value: '\nline1\nline2')
params << thisScript.string(name: 'STRING_PARAM', value: 'a string')
thisScript.build(job: 'myJob', parameters: params)
println triggeringParams
Results
[
[name:MULTILINE_PARAM, value:
line1
line2]
]
The wrong type was the problem. The project's BasePipelineTest.groovy links string to its stringInterceptor (which just swallows the string) and registers it with Map, not LinkedHashMap. So its interceptor is found before mine and, boom, the string never shows up in my collector.
If I modify my code to use the more generic Map, it works:
void addParameterHelpers() {
  for (def param in ['text', 'string']) {
    helper.registerAllowedMethod(param, [Map],
      { LinkedHashMap data ->
        triggeringParams << data
      }
    )
  }
}
I've been writing tests using Sinon. While doing so, I wrote a stub where some input parameters are passed and an object is returned. This object contains some values plus a random date generated by the system at the time of execution. So I need guidance on the following:
1. How can I handle this, given that the matching arguments are static and I don't know the value of the date generated by the actual code?
2. How can I skip certain key values of an object using Sinon? For example, say the object has the following values: const object = {name: "abc", employeeNumber: "123"}. I only want to check that name is "abc" and don't need to match employeeNumber.
From the sinon.match docs:
Requires the value to be not null or undefined and have at least the same properties as expectation.
From the sinon.assert.match docs:
Uses sinon.match to test if the arguments can be considered a match.
Example:
test('partial object match', () => {
  const object = { name: "abc", employeeNumber: "123" };
  sinon.assert.match(object, { name: 'abc' }); // SUCCESS
})
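To also handle the unpredictable date from the first question, here is a hedged sketch (the stub and field names are assumptions) that combines the partial object match with Sinon's built-in date matcher:

import sinon from 'sinon';

const save = sinon.stub();
save({ name: 'abc', employeeNumber: '123', createdAt: new Date() });

// Only the listed keys are checked: employeeNumber is ignored, and the
// system-generated date merely has to be a Date instance.
sinon.assert.calledWithMatch(save, {
  name: 'abc',
  createdAt: sinon.match.date,
});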