ValidationException on DynamoDB with AWS SDK for Java

I'm trying to put an item and get it back a few statements later, but I'm getting a cryptic error from AWS.
class DynamoDBEventStore<Payload extends EventPayload> {

    private final Clock clock;
    private final Table table;
    private final ObjectMapper objectMapper = new ObjectMapper();
    private final Class<Payload> payloadType;

    public DynamoDBEventStore(final Clock clock, final Class<Payload> eventClass, final String dataStoreName) {
        final AmazonDynamoDB client = AmazonDynamoDBClientBuilder.standard().build();
        final DynamoDB dynamoDB = new DynamoDB(client);
        this.table = dynamoDB.getTable(dataStoreName);
        this.clock = clock;
        this.payloadType = eventClass;
    }

    public void persist(final UUID eventId, final String aggregateId, final Long version, final Payload eventPayload) {
        final Map<String, Object> rawDomainEvent = Map.of(
                "EventId", eventId.toString(),
                "Timestamp", LocalDateTime.now(clock).toString(),
                "AggregateId", eventPayload.getAggregateKey(),
                "Version", version,
                "Payload", objectMapper.convertValue(eventPayload, Map.class)
        );
        final Item domainEvent = Item.fromMap(rawDomainEvent);
        table.putItem(domainEvent);
    }

    public void testEvent(final UUID eventId) {
        table.getItem("EventId", eventId.toString());
    }
}
If I save an item by calling persist, the item is saved as expected (see the JSON below, taken from the DynamoDB console).
{
    "EventId": {
        "S": "8a2c1733-887d-42e1-b720-2e87dcf46269"
    },
    "Timestamp": {
        "S": "2021-10-10T04:18:56.465223700"
    },
    "Payload": {
        "M": {
            "transactions": {
                "L": []
            },
            "aggregateKey": {
                "S": "bc406432-37f1-440f-9157-b0ce8da814c1"
            }
        }
    },
    "Version": {
        "N": "1"
    },
    "AggregateId": {
        "S": "bc406432-37f1-440f-9157-b0ce8da814c1"
    }
}
But when I call testEvent, it fails with the following error:
"message": "Unable to unmarshall exception response with the unmarshallers provided (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: E6M3NI8CTF05ONUC5G25H53PT7VV4KQNSO5AEMVJF66Q9ASUAAJG; Proxy: null)",
Any thoughts about what could be wrong with my code? FYI, I'm using Java 11, Spring Boot 2.5.4 and AWS SDK 1.12.81:
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-dynamodb</artifactId>
    <version>1.12.81</version>
</dependency>

The error message isn't that cryptic! Ignore the unmarshalling part, and you are left with
Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException;
I would guess your eventId can't be found in the database, but you need to debug from here.
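One common cause of a ValidationException on GetItem is that the key you pass doesn't match the table's key schema (for example, the table also defines a sort key that isn't being supplied). A minimal debugging sketch, assuming the same Document API table field as in your class; the "Version" sort key in the commented line is only an example:

public void debugGetItem(final UUID eventId) {
    // Print the key schema; GetItem must supply exactly these key attributes.
    final TableDescription description = table.describe();
    System.out.println("Key schema: " + description.getKeySchema());

    // Valid only if the table has just the partition key "EventId":
    final Item item = table.getItem("EventId", eventId.toString());
    System.out.println("Item: " + item);

    // If the table also has a sort key, both parts must be supplied, e.g.:
    // final Item item = table.getItem("EventId", eventId.toString(), "Version", 1L);
}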


DynamoDB JavaScript PutItemCommand is neither failing nor working

Please note: although this question mentions AWS SAM, it is 100% a DynamoDB JavaScript SDK question at heart and can be answered by anyone with experience writing JavaScript Lambdas (or any client-side apps) against DynamoDB using the AWS DynamoDB client/SDK.
So I used AWS SAM to provision a new DynamoDB table with the following attributes:
FeedbackDynamoDB:
  Type: AWS::DynamoDB::Table
  Properties:
    TableName: commentary
    AttributeDefinitions:
      - AttributeName: id
        AttributeType: S
    KeySchema:
      - AttributeName: id
        KeyType: HASH
    ProvisionedThroughput:
      ReadCapacityUnits: 5
      WriteCapacityUnits: 5
    StreamSpecification:
      StreamViewType: NEW_IMAGE
This configuration successfully creates a DynamoDB table called commentary. However, when I viewed this table in the DynamoDB web console, I noticed a few things:
it has a partition key of id (type S)
it has no sort key
it has no (0) indexes
it has a read/write capacity mode of "5"
I'm not sure if this raises any red flags with anyone but I figured I would include those details, in case I've configured anything incorrectly.
Now then, I have a JavaScript (TypeScript) Lambda that instantiates a DynamoDB client (using the JavaScript SDK) and attempts to add a record/item to this table:
// this code is in a file named app.ts:
import { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';
import { User, allUsers } from './users';
import { Commentary } from './commentary';
import { PutItemCommand } from "@aws-sdk/client-dynamodb";
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
export const lambdaHandler = async (event: APIGatewayProxyEvent): Promise<APIGatewayProxyResult> => {
try {
const ddbClient = new DynamoDBClient({ region: "us-east-1" });
let status: number = 200;
let responseBody: string = "\"message\": \"hello world\"";
const { id, content, createdAt, providerId, receiverId } = JSON.parse(event.body);
const commentary = new Commentary(id, content, createdAt, providerId, receiverId);
console.log("deserialized this into commentary");
console.log("and the deserialized commentary has content of: " + commentary.getContent());
await provideCommentary(ddbClient, commentary);
responseBody = "\"message\": \"received commentary -- check dynamoDb!\"";
return {
statusCode: status,
body: responseBody
};
} catch (err) {
console.log(err);
return {
statusCode: 500,
body: JSON.stringify({
message: err.stack,
}),
};
}
};
const provideCommentary = async (ddbClient: DynamoDBClient, commentary: Commentary) => {
const params = {
TableName: "commentary",
Item: {
id: {
S: commentary.getId()
},
content: {
S: commentary.getContent()
},
createdAt: {
S: commentary.getCreatedAt()
},
providerId: {
N: commentary.getProviderId()
},
receiverId: {
N: commentary.getReceiverId()
}
}
};
console.log("about to try to insert commentary into dynamo...");
try {
console.log("wait for it...")
const rc = await ddbClient.send(new PutItemCommand(params));
console.log("DDB response:", rc);
} catch (err) {
console.log("hmmm something awry. something....in the mist");
console.log("Error", err.stack);
throw err;
}
};
Where commentary.ts is:
class Commentary {
private id: string;
private content: string;
private createdAt: Date;
private providerId: number;
private receiverId: number;
constructor(id: string, content: string, createdAt: Date, providerId: number, receiverId: number) {
this.id = id;
this.content = content;
this.createdAt = createdAt;
this.providerId = providerId;
this.receiverId = receiverId;
}
public getId(): string {
return this.id;
}
public getContent(): string {
return this.content;
}
public getCreatedAt(): Date {
return this.createdAt;
}
public getProviderId(): number {
return this.providerId;
}
public getReceiverId(): number {
return this.receiverId;
}
}
export { Commentary };
When I update the Lambda with this handler code, and hit the Lambda with the following curl (the Lambda is invoked by an API Gateway URL that I can hit via curl/http):
curl -i --request POST 'https://<my-api-gateway>.execute-api.us-east-1.amazonaws.com/Stage/feedback' \
--header 'Content-Type: application/json' -d '{"id":"123","content":"test feedback","createdAt":"2022-12-02T08:45:26.261-05:00","providerId":457,"receiverId":789}'
I get the following HTTP 500 response:
{"message":"SerializationException: NUMBER_VALUE cannot be converted to String\n
Am I passing it a bad request body (in the curl) or do I need to tweak something in app.ts and/or commentary.ts?
Interestingly, the DynamoDB API expects numeric attribute values to be sent as strings. For example:
"N": "123.45"
The docs say:
Numbers are sent across the network to DynamoDB as strings, to maximize compatibility across languages and libraries. However, DynamoDB treats them as number type attributes for mathematical operations.
Have you tried sending your input with the numerical parameters as strings as shown below? (See providerId and receiverId)
{
"id":"123",
"content":"test feedback",
"createdAt":"2022-12-02T08:45:26.261-05:00",
"providerId":"457",
"receiverId":"789"
}
You can convert these IDs into strings when you're populating your input Item:
providerId: {
N: String(commentary.getProviderId())
},
receiverId: {
N: String(commentary.getReceiverId())
}
You could also use .toString() but then you'd get errors if the field is not set (null or undefined).
Try using a promise to see the outcome:
client.send(command).then(
(data) => {
// process data.
},
(error) => {
// error handling.
}
);
Everything seems alright with your table setup; I believe it's a Lambda async issue with the JS SDK. I'm guessing Lambda is not waiting on your code and exiting early. Can you include your full Lambda code?

Halt the workflow and return the response to Controller

Create Order triggers the REST endpoint and starts the workflow (it's a task): CreateOrderController.
The problem is that CreateOrderController always returns Success. I want to return ResponseEntity.ok("Not Success") as shown in the 2nd image and stop the call to Save Order Database.
How can I achieve this?
@RestController
public class CreateOrderController {

    @Autowired
    private RuntimeService runtimeService;

    @PostMapping("/rest/create/order")
    public ResponseEntity<?> createOrder(@RequestBody OrderInfo orderInfo) {
        Map<String, Object> inputData = new HashMap<String, Object>();
        inputData.put("orderInfo", orderInfo);
        ProcessInstance p = runtimeService.startProcessInstanceByKey("hello-world-process", inputData);

        return ResponseEntity.ok("Success");
    }
}
If you are executing the complete process in one transaction, then an exception along the way will create a rollback. However, you usually have a transaction boundary somewhere. You can query the status of the process instance after it has been started via the history endpoint.
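With an embedded engine you can fetch the same information through the Java history API. A minimal sketch, assuming HistoryService can be injected like the RuntimeService in your controller; the controller name and request path are made up for illustration:

@RestController
public class OrderStatusController {

    @Autowired
    private HistoryService historyService;

    @GetMapping("/rest/order/status/{processInstanceId}")
    public ResponseEntity<String> getStatus(@PathVariable String processInstanceId) {
        // Look up the instance in history after the start call has returned.
        HistoricProcessInstance hpi = historyService
                .createHistoricProcessInstanceQuery()
                .processInstanceId(processInstanceId)
                .singleResult();
        // getState() returns e.g. ACTIVE, COMPLETED or INTERNALLY_TERMINATED.
        return ResponseEntity.ok(hpi != null ? hpi.getState() : "UNKNOWN");
    }
}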
The execute method returns void. Let the delegate write process data instead of returning a value; you can find a setVariable method on the DelegateExecution you receive as a parameter.
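For example, a minimal delegate sketch; the class name and the orderSaved variable are made up for illustration:

public class SaveOrderDelegate implements JavaDelegate {

    @Override
    public void execute(DelegateExecution execution) throws Exception {
        // ... call the order service here ...
        boolean saved = false; // outcome of the business call

        // Instead of returning a value, write the outcome as a process variable.
        execution.setVariable("orderSaved", saved);
    }
}

The variable can then be read later in the process, or returned to the caller as shown below.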
You can get the data values in the REST response as shown in this example: https://docs.camunda.org/manual/7.18/reference/rest/process-definition/post-start-process-instance/#starting-a-process-instance-with-variables-in-return
Request:
{
"variables":{
"aVariable" : {
"value" : "aStringValue",
"type": "String"},
"anotherVariable" : {
"value" : true,
"type": "Boolean",
"valueInfo" : {
"transient" : true
}
}
},
"businessKey" : "myBusinessKey",
"withVariablesInReturn": true
}
Response
{
"links": [
{
"method": "GET",
"href": "http://localhost:8080/rest-test/process-instance/aProcInstId",
"rel": "self"
}
],
"id": "aProcInstId",
"definitionId": "aProcessDefinitionId",
"businessKey": "myBusinessKey",
"ended": false,
"suspended": false,
"tenantId": null,
"variables": {
"anotherVariable": {
"type": "Boolean",
"value": true,
"valueInfo": {
"transient" : true
}
},
"aVariable": {
"type": "String",
"value": "aStringValue",
"valueInfo": { }
}
}
}
Alternatively, error handling options in the delegate code / process include:
a) Simply throw an exception in your execute() method, for instance a new RuntimeException(), and observe in Cockpit how Camunda creates a technical incident for the process (https://docs.camunda.org/manual/7.18/webapps/cockpit/bpmn/failed-jobs/).
b) You can also use custom exceptions and error codes, e.g. as shown here:
// Defining a custom exception.
public class MyException extends ProcessEngineException {
public MyException(String message, int code) {
super(message, code);
}
}
// Delegation code that throws MyException with a custom error code.
public class MyJavaDelegate implements JavaDelegate {
@Override
public void execute(DelegateExecution execution) {
String myErrorMessage = "My error message.";
int myErrorCode = 22_222;
throw new MyException(myErrorMessage, myErrorCode);
}
}
Src: https://docs.camunda.org/manual/7.18/user-guide/process-engine/delegation-code/#exception-codes
c) If you don't want to create a technical incident but prefer to throw a 'business' error which you can catch in the process model, so the process can take a different (error) path:
public class BookOutGoodsDelegate implements JavaDelegate {
public void execute(DelegateExecution execution) throws Exception {
try {
...
} catch (NotOnStockException ex) {
throw new BpmnError("Business issue");
}
}
}
src: https://docs.camunda.org/manual/7.18/user-guide/process-engine/delegation-code/#throw-bpmn-errors-from-delegation-code

AWS Pinpoint: Set APNS "mutable-content": 1

AWS Pinpoint APNS by default sets "mutable-content": 0.
I am using Node.js.
The code below works fine, but "mutable-content" is always 0:
var messageRequest = {
    'Addresses': {
        [token]: {
            'ChannelType': channelType
        }
    },
    'MessageConfiguration': {
        'APNSMessage': {
            'Action': action,
            'Body': message,
            'Priority': priority,
            'SilentPush': silent,
            'Title': title,
            'TimeToLive': ttl,
            'Url': url,
        }
    }
};
Below is the payload I receive when an APNS notification is sent using the above setup:
["aps": {
alert = {
body = "TEST";
title = "Test message sent from Amazon Pinpoint.";
};
"content-available" = 1;
"mutable-content" = 0;
}, "data": {
pinpoint = {
deeplink = "https://www.example.com";
};
}]
How can I set "mutable-content": 1 for an APNS through AWS Pinpoint?
There is no documentation but this worked for me after some trial and error:
var payload = {
"aps": {
"alert": {
"title": "Bold text in the notification",
"body": "Second line in the notification"
},
"sound": "default",
"mutable-content": 1
}
};
var messageRequest = {
Addresses: {
[token]: {
ChannelType: "APNS",
},
},
MessageConfiguration: {
APNSMessage: {
RawContent: JSON.stringify(payload),
},
},
};
Just replace their template with RawContent and create the payload as you normally would. You can refer to the Apple docs on how to create the raw payload. You can also adjust the content-available key using this method. Here is the link on how to create a payload with JSON:
https://developer.apple.com/documentation/usernotifications/setting_up_a_remote_notification_server/generating_a_remote_notification
I know this is a bit old, but I just ran into this issue and wanted to share my solution.
I found that setting the "MediaUrl" parameter to a non-empty string causes Pinpoint to send "mutable-content": 1.
I did not see this in any of the Pinpoint documentation.
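There is no code attached to that finding, so here is a rough sketch of the idea rather than the answerer's actual code. It uses the AWS SDK for Java v1 (aws-java-sdk-pinpoint) instead of Node.js, and the application id, device token, and image URL are placeholders; in the Node.js MessageConfiguration shown earlier, the equivalent would be adding a MediaUrl field to the APNSMessage object.

import com.amazonaws.services.pinpoint.AmazonPinpoint;
import com.amazonaws.services.pinpoint.AmazonPinpointClientBuilder;
import com.amazonaws.services.pinpoint.model.*;

import java.util.Collections;

public class MediaUrlExample {
    public static void main(String[] args) {
        // Placeholders -- replace with your own values.
        String applicationId = "your-pinpoint-app-id";
        String deviceToken = "your-apns-device-token";

        AmazonPinpoint pinpoint = AmazonPinpointClientBuilder.standard().build();

        APNSMessage apnsMessage = new APNSMessage()
                .withTitle("Bold text in the notification")
                .withBody("Second line in the notification")
                // A non-empty MediaUrl is what reportedly flips mutable-content to 1.
                .withMediaUrl("https://www.example.com/image.png");

        MessageRequest messageRequest = new MessageRequest()
                .withAddresses(Collections.singletonMap(deviceToken,
                        new AddressConfiguration().withChannelType(ChannelType.APNS)))
                .withMessageConfiguration(new DirectMessageConfiguration()
                        .withAPNSMessage(apnsMessage));

        SendMessagesResult result = pinpoint.sendMessages(new SendMessagesRequest()
                .withApplicationId(applicationId)
                .withMessageRequest(messageRequest));
        System.out.println(result.getMessageResponse());
    }
}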

Having an issue with QueryRequest in AWS DynamoDB

I am having an issue with DynamoDB and QueryRequest. I want to implement paging with DynamoDB.
I am getting an error back from Postman. Scroll right for the full error.
{
"timestamp": "2019-06-04T00:12:42.526+0000",
"status": 500,
"error": "Internal Server Error",
"exception": "com.amazonaws.services.dynamodbv2.model.ResourceNotFoundException",
"message": "Request processing failed; nested exception is com.amazonaws.services.dynamodbv2.model.ResourceNotFoundException: Requested resource not found (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ResourceNotFoundException; Request ID: TDAUFVG205A11TFDGOD9U53MMBVV4KQNSO5AEMVJF66Q9ASUAAJG)",
My project does connect and works fine with other APIs' GET calls. dynamoDBMapper.query works fine and does return data.
The problem is in the new code using QueryRequest.
public String getRequestPage(String query, String lastEvaluatedKey, String limit) {
...
Map<String, AttributeValue> mapLastEvaluatedKey = null;
Map<String,String> expressionAttributesNames = new HashMap<>();
expressionAttributesNames.put("#tagId","tagId");
Map<String,AttributeValue> expressionAttributeValues = new HashMap<>();
expressionAttributeValues.put(":column_search",new AttributeValue().withS(query));
QueryRequest queryRequest = new QueryRequest()
.withTableName(dynamoDbTableName)
.withKeyConditionExpression("#tagId = :column_search")
.withExpressionAttributeNames(expressionAttributesNames)
.withExpressionAttributeValues(expressionAttributeValues)
.withLimit(page_limit)
.withExclusiveStartKey(mapLastEvaluatedKey);
System.out.println(" queryRequest " + queryRequest );
QueryResult queryResult = client.query(queryRequest);
Map<String, AttributeValue> mapLastEvaluatedKeyReturned = null;
mapLastEvaluatedKeyReturned = queryResult.getLastEvaluatedKey();
error -->{ "timestamp": "2019-06-04T19:58:18.156+0000", "status": 500,
"error": "Internal Server Error", "exception":
"com.amazonaws.services.dynamodbv2.model.ResourceNotFoundException",
"message": "Request processing failed; nested exception is
com.amazonaws.services.dynamodbv2.model.ResourceNotFoundException:
Requested resource not found (Service: AmazonDynamoDBv2; Status Code:
400; Error Code: ResourceNotFoundException; Request ID:
BRCI1FLM3375SB8U6JSJEAH09NVV4KQNSO5AEMVJF66Q9ASUAAJG)", "path":
"/api/v1/metadata/tag/tagIdPage/military_status/page/1/limit/2" }
This is the system.out of the queryRequest and sent to the DB
queryRequest {TableName: tagMetadata_Certified-dev,Limit:
2,FilterExpression: tagId = :tagIdValue,KeyConditionExpression: #tagId = :tagIdValue,ExpressionAttributeNames:
{#tagId=tagId},ExpressionAttributeValues: {:tagIdValue={S:
military_status,}}}
Any help would be great.
Thanks
Phil
I found my issue using the AWS CLI from the terminal window. This is very useful for testing with the --debug switch.
Scroll right for full code...
aws dynamodb query --table-name tableNameHere --region regionNameHere --key-condition-expression "tagId = :tagIdValue" --filter-expression "isReportable = :p" --expression-attribute-values '{":tagIdValue":{"S":"m_status"}, ":p":{"BOOL":true}}' --debug
public String getRequestPage(String tagIdPassed, String lastEvaluatedKey, String limit) {
.... code removed ...
Boolean isReportableBool = true;
TagMetadata tagMetadata = new TagMetadata();
Map<String,String> expressionAttributesNames = new HashMap<>();
expressionAttributesNames.put("#tagId","tagId");
expressionAttributesNames.put("#isReportable","isReportable");
Map<String,AttributeValue> expressionAttributeValues = new HashMap<>();
expressionAttributeValues.put(":tagIdValue",new AttributeValue().withS(tagIdPassed));
expressionAttributeValues.put(":isReportableValue",new AttributeValue().withBOOL(isReportableBool));
QueryRequest queryRequest = new QueryRequest()
.withTableName(dynamoDbTableName)
.withKeyConditionExpression("#tagId = :tagIdValue")
.withFilterExpression("#isReportable = :isReportableValue")
.withExpressionAttributeNames(expressionAttributesNames)
.withExpressionAttributeValues(expressionAttributeValues)
.withLimit(page_limit);
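Since the original goal was paging, here is a rough sketch (not part of the answer) of how the corrected query can be paged: client, dynamoDbTableName, page_limit, and the expression maps are the ones built above, and each page's LastEvaluatedKey is fed back as the next request's ExclusiveStartKey.

Map<String, AttributeValue> exclusiveStartKey = null;
do {
    final QueryRequest pageRequest = new QueryRequest()
            .withTableName(dynamoDbTableName)
            .withKeyConditionExpression("#tagId = :tagIdValue")
            .withFilterExpression("#isReportable = :isReportableValue")
            .withExpressionAttributeNames(expressionAttributesNames)
            .withExpressionAttributeValues(expressionAttributeValues)
            .withLimit(page_limit)
            .withExclusiveStartKey(exclusiveStartKey); // null on the first page

    final QueryResult pageResult = client.query(pageRequest);
    // ... process pageResult.getItems() for this page ...

    // A non-null LastEvaluatedKey means there are more pages to read.
    exclusiveStartKey = pageResult.getLastEvaluatedKey();
} while (exclusiveStartKey != null);

Note that Limit caps the number of items evaluated per request, so with the filter applied a page can contain fewer than page_limit items.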

How to extract a JSON array from JSON in MuleSoft

I am getting XML output, which I then convert into a JSON object. The format is given below:
{
"SOAP-ENV:Envelope": {
"#xmlns:SOAP-ENV": "http://schemas.xmlsoap.org/soap/envelope/",
"#xmlns:xsi": "http://www.w3.org/2001/XMLSchema-instance",
"#xmlns:xsd": "http://www.w3.org/2001/XMLSchema",
"SOAP-ENV:Body": {
"rpc:TestExampleResponse": {
"#xmlns:rpc": "http://Test.com/asi/",
"TestMessage": {
"listOfTESTS": {
"#xmlns:xmlns": "http://www.Test.com/xml/TEST",
"TESTS": [{
"id": "1",
"lastSyncDate": "12/16/2015 07:06:38",
"listOfTESTsyncrealtimeChild": null
}, {
"id": "2",
"lastSyncDate": "12/16/2015 07:06:38",
"listOfTESTsyncrealtimeChild": null
}
]
}
}
}
}
}
}
I want to extract the TESTS array from the JSON output in MuleSoft. I don't know how to extract that array in MuleSoft. Thanks in advance.
You can use DataWeave (the Transform Message component in Anypoint Studio, Mule EE).
Take a look at the documentation:
https://docs.mulesoft.com/mule-user-guide/v/3.7/using-dataweave-in-studio
Sample script for your input:
%dw 1.0
%input payload application/json
%output application/json
---
TESTS: payload."SOAP-ENV:Envelope"."SOAP-ENV:Body"."rpc:TestExampleResponse".TestMessage.listOfTESTS.TESTS map ((tEST , indexOfTEST) -> {
id: tEST.id,
lastSyncDate: tEST.lastSyncDate,
listOfTESTsyncrealtimeChild: tEST.listOfTESTsyncrealtimeChild
})
Output when using %output application/json:
{
"TESTS": [
{
"id": "1",
"lastSyncDate": "12/16/2015 07:06:38",
"listOfTESTsyncrealtimeChild": null
},
{
"id": "2",
"lastSyncDate": "12/16/2015 07:06:38",
"listOfTESTsyncrealtimeChild": null
}
]
}
Output when using %output application/java:
{TESTS=[{id=1, lastSyncDate=12/16/2015 07:06:38, listOfTESTsyncrealtimeChild=null}, {id=2, lastSyncDate=12/16/2015 07:06:38, listOfTESTsyncrealtimeChild=null}]}
You can write a custom transformer like the one below. It uses the Jackson (com.fasterxml.jackson) dependency.
The transformer returns a list of strings, where each string represents an element of your TESTS array.
public class JsonArrayExtractor extends AbstractTransformer {
private final ObjectMapper mapper = new ObjectMapper();
private final String testsNodeJsonPointer = "/SOAP-ENV:Envelope/SOAP-ENV:Body/rpc:TestExampleResponse/TestMessage/listOfTESTS/TESTS";
public JsonArrayExtractor() {
registerSourceType(DataTypeFactory.STRING);
}
@Override
protected Object doTransform(Object src, String enc) throws TransformerException {
String payload = Objects.toString(src);
JsonNode root;
try {
root = mapper.readTree(payload);
} catch (IOException e) {
throw new TransformerException(this, e);
}
List<String> testsList = new ArrayList<>();
JsonNode testsNode = root.at(JsonPointer.valueOf(testsNodeJsonPointer));
if (testsNode instanceof ArrayNode) {
ArrayNode testsArrayNode = (ArrayNode) testsNode;
for (JsonNode test : testsArrayNode) {
testsList.add(test.toString());
}
}
return testsList;
}
}
And you can use the above transformer in your flow as shown below:
<custom-transformer class="org.ram.JsonArrayExtractor" doc:name="extractTestsArray"/>