AWS API Gateway body mapping template for DynamoDB map - amazon-web-services

I am trying to figure out how to insert dynamic data into a DynamoDB table via an API Gateway in AWS. Currently I have a DynamoDB table and an API endpoint set up that accepts a POST like so:
POST https://{unique-id}.execute-api.us-east-1.amazonaws.com/notification/events
{
  "reference_number": 99,
  "purchase_date": "1/1/2017"
}
I've set up a body mapping template in the API Gateway to massage the data into DynamoDB.
{
  "TableName": "Events",
  "Item": {
    "reference_number": {
      "N": "$input.path('$.reference_number')"
    },
    "purchase_date": {
      "S": "$input.path('$.purchase_date')"
    }
  }
}
The above works and saves to the table.
Suppose I add an event hash to my JSON (its contents can change from event to event):
{
  "reference_number": 99,
  "purchase_date": "1/1/2017",
  "event": {
    "name": "purchase",
    "items": [1,3,6],
    "info": {
      "currencyID": "USD",
      "countryID": "US"
    }
  }
}
How do I save the event attribute to a Map in DynamoDB using the API Gateway body mapping template syntax?
{
  "TableName": "Events",
  "Item": {
    "reference_number": {
      "N": "$input.path('$.reference_number')"
    },
    "purchase_date": {
      "S": "$input.path('$.purchase_date')"
    },
    "event": {
      "M": "$input.path('$.event')"
    }
  }
}
The above template gives me the following error: "Expected map or null".

It looks like the DynamoDB API actually requires the value of an 'M' attribute to be a map of String -> AttributeValue. Unfortunately you can't pass the raw JSON; you'll have to map the whole event object manually to make the DynamoDB API happy.
http://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_AttributeValue.html#DDB-Type-AttributeValue-M
One possible workaround would be to stringify the event object and write it as type S, but that would of course require the reader to expect that behavior.
{
  "TableName": "Events",
  "Item": {
    "reference_number": {
      "N": "$input.path('$.reference_number')"
    },
    "purchase_date": {
      "S": "$input.path('$.purchase_date')"
    },
    "event": {
      "S": "$util.escapeJavaScript($input.json('$.event'))"
    }
  }
}
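For reference, here is a sketch (not from the original answer, and untested) of what that manual mapping could look like for the example event above. Each nested field is mapped to its DynamoDB type by hand, so it only works for this exact payload shape:
## Manual mapping of the nested event object; adjust field names and types to your payload
"event": {
  "M": {
    "name": { "S": "$input.path('$.event.name')" },
    "items": {
      "L": [
        #foreach($item in $input.path('$.event.items'))
        { "N": "$item" }#if($foreach.hasNext),#end
        #end
      ]
    },
    "info": {
      "M": {
        "currencyID": { "S": "$input.path('$.event.info.currencyID')" },
        "countryID": { "S": "$input.path('$.event.info.countryID')" }
      }
    }
  }
}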

As it seems you finally did. I reckon the best option is to create a simple Lambda function between your API and DynamoDB, leaving the mapping work up to the aws-sdk.
In that case, the body mapping template in the API gateway would be as simple as this:
$input.body
And the function won't be much more complicated. I used a JavaScript function:
var AWS = require("aws-sdk");

// The DocumentClient maps plain JSON to DynamoDB attribute types automatically
var docClient = new AWS.DynamoDB.DocumentClient();
var tableName = "tableName";

var saveData = function (data) {
  var params = {
    TableName: tableName,
    Item: data
  };
  docClient.put(params, function (err, data) {
    if (err) {
      console.error("Unable to add item. Error JSON:", JSON.stringify(err, null, 2));
    } else {
      console.log("Added item:", JSON.stringify(data, null, 2));
    }
  });
};

// The event is the raw request body passed through by the $input.body mapping
exports.handler = function (event) {
  try {
    console.log("Processing event: ", event);
    saveData(event);
  } catch (e) {
    console.error("Processed unsuccessfully", e, e.stack);
  }
};
http://docs.aws.amazon.com/amazondynamodb/latest/gettingstartedguide/GettingStarted.NodeJs.03.html

For my requirement (saving the input body), the mapping template is:
"rawdata": {
"M": $input.body
}
Note that there are no quotes around the input body.
And the data should be in DynamoDB format, for example:
{"username":{"S":"Vishnu"}}
You could use a JS library like dynamodb-marshaller to convert JSON to DynamoDB format. Hope this helps.
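If you are already pulling in the aws-sdk, another option along the same lines (not mentioned in the original answer) is its built-in converter, which marshals plain JSON into the DynamoDB attribute-value format. A minimal sketch:
// Sketch: convert plain JSON into DynamoDB attribute-value format with the aws-sdk (v2)
var AWS = require("aws-sdk");

var plain = { username: "Vishnu", age: 26 };

// Produces { username: { S: "Vishnu" }, age: { N: "26" } }
var marshalled = AWS.DynamoDB.Converter.marshall(plain);
console.log(JSON.stringify(marshalled, null, 2));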

Related

DynamoDB return the modified document (old or new) when using TransactWriteItems

Is there a way of making transactWriteItem return the document it updated?
const transactionParams = {
  ReturnConsumedCapacity: "INDEXES",
  TransactItems: [
    {
      Delete: {
        TableName: reactionTableName,
        Key: {
          PK: "SOME_PK_",
          SK: "SOME_SK_",
        },
        ReturnValues: 'ALL_OLD',
      },
    },
    {
      Update: {
        TableName: reviewTableName,
        Key: { PK: "SOME_PK", SK: "SOME_SK" },
        ReturnValues: 'ALL_OLD',
      },
    },
  ],
};
try {
  const result = await docClient.transactWrite(transactionParams).promise();
} catch (error) {
  context.done(error, null);
}
For example, in the above code, can I get the documents that were touched (before or after the update)?
No, the TransactWriteItems API does not provide the ability to return the values of a modified item. However, you could obtain those values using DynamoDB Streams; otherwise you would need to fall back to the singleton UpdateItem/DeleteItem APIs, which are not ACID compliant together.
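For illustration, here is a minimal sketch (not from the original answer) of the DynamoDB Streams route: a Lambda function subscribed to the table's stream, with the stream view type set to NEW_AND_OLD_IMAGES, receives both the old and new images of every item the transaction touched. The handler name is just a placeholder.
// Sketch: Lambda handler attached to the table's DynamoDB stream.
// Each record carries the item before (OldImage) and after (NewImage) the write, in DynamoDB JSON.
exports.streamHandler = async (event) => {
  for (const record of event.Records) {
    // record.eventName is INSERT, MODIFY or REMOVE
    const oldImage = record.dynamodb.OldImage;
    const newImage = record.dynamodb.NewImage;
    console.log(record.eventName, JSON.stringify({ oldImage, newImage }));
  }
};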

AWS AppSync merging, sorting and pagination of data from multiple REST calls

I'm trying to set up a GraphQL endpoint using AWS AppSync that consumes from 2 HTTP data sources, merges the responses, orders them by date, and implements pagination.
So far I've managed to make both requests and merge the results using a union.
Schema
type Event1 {
  id: ID!
  name: String
  type: String
  order_date: String
}

type Event2 {
  id: ID!
  name: String
  type: String
  order_date: String
}

union Events = Event1 | Event2

type Query {
  listEvents: [Events!]
}
Resolver 1
export function request(ctx) {
  return {
    "method": "GET",
    "resourcePath": "/event1",
    "params": {
      "headers": {
        "Content-Type": "application/json"
      },
    }
  }
}

export function response(ctx) {
  if (ctx.result.statusCode !== 200) {
    return util.appendError(ctx.result.body, `${ctx.result.statusCode}`);
  }
  var responseBody = JSON.parse(ctx.result.body);
  /*for (var i in responseBody) {
    responseBody[i].__typename = "Event1";
  }*/
  return responseBody;
}
Resolver 2
export function request(ctx) {
  return {
    "method": "GET",
    "resourcePath": "/event2",
    "params": {
      "headers": {
        "Content-Type": "application/json"
      },
    }
  }
}

export function response(ctx) {
  if (ctx.result.statusCode !== 200) {
    return util.appendError(ctx.result.body, `${ctx.result.statusCode}`);
  }
  var responseBody = JSON.parse(ctx.result.body);
  /*for (var i in responseBody) {
    responseBody[i].__typename = "Event2";
  }*/
  return [...ctx.prev.result, ...responseBody];
}
I made each service return the __typename so AppSync knows which type is in the union, but now I'm looking for ways to sort by "order_date" and then get the first page. I don't know whether I'm thinking about it correctly or whether this is just not the purpose GraphQL was intended to fulfill.
Any guidance will be very much appreciated.
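One possible direction for the sorting and first-page step, sketched as the response of a final pipeline function (the page size is an assumption, and this has not been validated against the limits of AppSync's JS runtime):
// Sketch: in the last pipeline function, sort the merged list and return a single page
export function response(ctx) {
  const merged = [...ctx.prev.result];
  // Sort descending by order_date (assumes date strings that sort lexicographically, e.g. ISO-8601)
  merged.sort((a, b) => (a.order_date < b.order_date ? 1 : -1));
  const pageSize = 20; // assumed page size
  return merged.slice(0, pageSize);
}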

Bypass custom payload from WhatsApp API or custom integration to Dialogflow ES API

I use the Dialogflow API as the NLP layer, and the interface we use is the WhatsApp API.
My problem is that when I want to pass the text and the WhatsApp client's number through to Dialogflow (my reference), I couldn't find documentation explaining how. For comparison, with the official Telegram integration for Dialogflow, we can extract data such as the name and the Telegram user ID from the request body.
const sessionId = phone_number_id; //session ID get from phone number
const sessionPath = sessionClient.projectAgentSessionPath(projectId, sessionId);
const request = {
  session: sessionPath,
  queryInput: {
    text: {
      text: msg_body,
      languageCode: "id-ID"
    },
  },
  payload: {
    data: "testing",
    phoneNumber: phone_number_id
  }
};
console.log("request", request);
await sessionClient.detectIntent(request).then(responses => {
  console.log("DetectIntent", JSON.stringify(responses));
}).catch(err => {
  console.error("ERROR:", err);
})
I tried it with a request variable like that, but it never showed up in the request body of the Dialogflow fulfillment:
{
  "responseId": "censored",
  "queryResult": {
    "queryText": "halo",
    "action": "input.welcome",
    "parameters": {},
    "allRequiredParamsPresent": true,
    "fulfillmentText": "error",
    "fulfillmentMessages": [
      {
        "text": {
          "text": [
            "error"
          ]
        }
      }
    ],
    "outputContexts": [
      {
        "name": "censored",
        "parameters": {
          "no-input": 0,
          "no-match": 0
        }
      }
    ],
    "intent": {
      "name": "censored",
      "displayName": "Default Welcome Intent"
    },
    "intentDetectionConfidence": 1,
    "languageCode": "id"
  },
  "originalDetectIntentRequest": {
    "payload": {}
  },
  "session": "censored"
}
@Maulana Ahmad, as you mentioned in the comment, the example code below can be used as a reference for extracting data from the request body.
const dialogflow = require('dialogflow');

// Import the JSON to gRPC struct converter
const structjson = require('./structjson.js');

// Instantiates a session client
const sessionClient = new dialogflow.SessionsClient();

// The path to identify the agent that owns the created intent.
const sessionPath = sessionClient.sessionPath(projectId, sessionId);

// The text query request.
const request = {
  session: sessionPath,
  queryInput: {
    event: {
      name: eventName,
      parameters: structjson.jsonToStructProto({foo: 'bar'}),
      languageCode: languageCode,
    },
  },
};

sessionClient
  .detectIntent(request)
  .then(responses => {
    console.log('Detected intent');
    logQueryResult(sessionClient, responses[0].queryResult);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
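If the goal is specifically to surface custom data (such as the WhatsApp phone number) in the fulfillment request, one approach worth trying, although it is not part of the original answer, is to send the data via queryParams.payload instead of a top-level payload field; Dialogflow exposes that value to the webhook as originalDetectIntentRequest.payload. A sketch, reusing the structjson helper from the snippet above:
// Sketch: pass custom data via queryParams.payload so it reaches the fulfillment webhook
const requestWithPayload = {
  session: sessionPath,
  queryInput: {
    text: {
      text: msg_body,
      languageCode: "id-ID",
    },
  },
  queryParams: {
    // Must be a protobuf Struct, hence the converter
    payload: structjson.jsonToStructProto({ phoneNumber: phone_number_id }),
  },
};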
This Stack Overflow link can be referred to for more information.
Posting the answer as community wiki for the benefit of the community that might encounter this use case in the future.
Feel free to edit this answer for additional information.

Adding data to DynamoDB from the browser (CodePen) fails

I am new to AWS and got the following error when I tried to input data into DynamoDB by invoking the Lambda function between the API Gateway and DynamoDB.
Error:
Expected params.Item['Age'].S to be a string........
Code:
I tried the following in the browser (CodePen), using the correct Invoke URL from the API Gateway:
var xhr = new XMLHttpRequest();
xhr.open('POST', 'The API Invoke URL');
xhr.onreadystatechange = function(event) {
  console.log(event.target.response);
};
xhr.setRequestHeader('Content-Type', 'application/json');
xhr.send(JSON.stringify({age: 26, height: 71, income: 2400}));
The following Lambda function is invoked when running the above code from CodePen. In it, the aws-sdk and the DynamoDB client are imported correctly.
exports.fn = (event, context, callback) => {
  const params = {
    Item: {
      "UserId": {
        S: "user_" + Math.random()
      },
      "Age": {
        N: event.age
      },
      "Height": {
        N: event.height
      },
      "Income": {
        N: event.income
      }
    },
    TableName: "compare-yourself"
  };
  dynamodb.putItem(params, function(err, data) {
    if (err) {
      console.log(err);
      callback(err);
    } else {
      console.log(data);
      callback(null, data);
    }
  });
};
In the above Lambda function you can see that I have declared the inputs as numbers, but in the API Gateway POST integration request I have converted the inputs to strings, so the data passed to the Lambda function is already a string. There should be no need to format it in the Lambda function again.
Body mapper in 'POST Integration-Request':
#set($inputRoot = $input.path('$'))
{
  "age": "$inputRoot.age",
  "height": "$inputRoot.height",
  "income": "$inputRoot.income"
}
I need to know the reason for the above error and am happy to provide any additional information required.
Thank you in advance.
Change the params to indicate that the value of the age field is "String" and not "Numeric":
const params = {
  Item: {
    "UserId": {
      S: "user_" + Math.random()
    },
    "Age": {
      "S": event.age // This was previously set to "N", which causes the issue
    },
    "Height": {
      N: event.height
    },
    "Income": {
      N: event.income
    }
  },
  TableName: "compare-yourself"
};
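As an aside that goes beyond the original answer: if the attributes really should be stored as DynamoDB numbers, another option would be to keep the N type and pass the numeric value as a string, since the low-level DynamoDB API expects number values to be sent as strings. A sketch:
// Sketch: low-level DynamoDB numbers are still sent as strings over the wire
const numericParams = {
  Item: {
    "UserId": { S: "user_" + Math.random() },
    "Age":    { N: String(event.age) },    // stored as a number in DynamoDB
    "Height": { N: String(event.height) },
    "Income": { N: String(event.income) }
  },
  TableName: "compare-yourself"
};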

Searching DynamoDB for non-primary keys and integrating into Alexa Skills

I am trying to search on a non-primary key using AWS Lambda and integrate it into the Alexa Skills Kit. I am very new to using DynamoDB and the Alexa Skills Kit, and I'm struggling to find any solutions to this online. The basic premise of what I am trying to do is querying the table yesno, which has two columns, id and message, looking only through the message column to find a match with the text I specify in params.
Here is the Lambda code I am working with:
const AWSregion = 'eu-west-1';
const Alexa = require('alexa-sdk');
const AWS = require('aws-sdk');

// params for searching the table
const params = {
  TableName: 'yesno',
  Key: { "message": 'Ben Davies' }
};

AWS.config.update({
  region: AWSregion
});

exports.handler = function(event, context, callback) {
  var alexa = Alexa.handler(event, context);
  // alexa.appId = 'amzn1.echo-sdk-ams.app.1234';
  // alexa.dynamoDBTableName = 'YourTableName'; // creates new table for session.attributes
  alexa.registerHandlers(handlers);
  alexa.execute();
};

const handlers = {
  'LaunchRequest': function () {
    this.response.speak('welcome to magic answers. ask me a yes or no question.').listen('try again');
    this.emit(':responseReady');
  },
  'MyIntent': function () {
    var MyQuestion = this.event.request.intent.slots.MyQuestion.value;
    console.log('MyQuestion : ' + MyQuestion);
    readDynamoItem(params, myResult => {
      var say = MyQuestion;
      say = myResult;
      say = 'you asked, ' + MyQuestion + '. I found a record for: ' + myResult;
      this.response.speak(say).listen('try again');
      this.emit(':responseReady');
    });
  },
  'AMAZON.HelpIntent': function () {
    this.response.speak('ask me a yes or no question.').listen('try again');
    this.emit(':responseReady');
  },
  'AMAZON.CancelIntent': function () {
    this.response.speak('Goodbye!');
    this.emit(':responseReady');
  },
  'AMAZON.StopIntent': function () {
    this.response.speak('Goodbye!');
    this.emit(':responseReady');
  }
};
// END of Intent Handlers {} ========================================================================================

// Helper Function =================================================================================================
function readDynamoItem(params, callback) {
  var AWS = require('aws-sdk');
  AWS.config.update({ region: AWSregion });
  var dynamodb = new AWS.DynamoDB();
  console.log('reading item from DynamoDB table');
  dynamodb.query(params, function (err, data) {
    if (err) console.log(err, err.stack); // an error occurred
    else {
      console.log(data); // successful response
      callback(data.Item.message);
    }
  });
}
I know I am probably doing this completely wrong, but there isn't much online about integrating DynamoDB with an Alexa Skill, and the only thing I was able to find was searching by ID. That doesn't work for what I want to do without pulling all the items from the table into a map or a list, and seeing as I want to create a big database, that seems quite inefficient.
On the Alexa side of things I am receiving the following service request when testing the code:
{
  "session": {
    "new": true,
    "sessionId": "SessionId.f9558462-6db8-4bf5-84aa-22ee0920ae95",
    "application": {
      "applicationId": "amzn1.ask.skill.9f280bf7-d506-4d58-95e8-b9e93a66a420"
    },
    "attributes": {},
    "user": {
      "userId": "amzn1.ask.account.AF5IJBMLKNE32GEFQ5VFGVK2P4YQOLVUSA5YPY7RNEMDPKSVCBRCPWC3OBHXEXAHROBTT7FGIYA7HJW2PMEGXWHF6SQHRX3VA372OHPZZJ33K7S4K7D6V3PXYB6I72YFIQBHMJ4QGJW3NS3E2ZFY5YFSBOEFW6V2E75YAZMRQCU7MNYPJUMJSUISSUA2WF2RA3CIIDCSEY35TWI"
    }
  },
  "request": {
    "type": "IntentRequest",
    "requestId": "EdwRequestId.7310073b-981a-41f8-9fa5-03d1b28c5aba",
    "intent": {
      "name": "MyIntent",
      "slots": {
        "MyQuestion": {
          "name": "MyQuestion",
          "value": "erere"
        }
      }
    },
    "locale": "en-US",
    "timestamp": "2018-01-25T14:18:40Z"
  },
  "context": {
    "AudioPlayer": {
      "playerActivity": "IDLE"
    },
    "System": {
      "application": {
        "applicationId": "amzn1.ask.skill.9f280bf7-d506-4d58-95e8-b9e93a66a420"
      },
      "user": {
        "userId": "amzn1.ask.account.AF5IJBMLKNE32GEFQ5VFGVK2P4YQOLVUSA5YPY7RNEMDPKSVCBRCPWC3OBHXEXAHROBTT7FGIYA7HJW2PMEGXWHF6SQHRX3VA372OHPZZJ33K7S4K7D6V3PXYB6I72YFIQBHMJ4QGJW3NS3E2ZFY5YFSBOEFW6V2E75YAZMRQCU7MNYPJUMJSUISSUA2WF2RA3CIIDCSEY35TWI"
      },
      "device": {
        "supportedInterfaces": {}
      }
    }
  },
  "version": "1.0"
}
And I am receiving a service response error simply saying 'The response is invalid'.
Any help with this would be greatly appreciated.
I would like to help you with the DynamoDB part.
In order to access non-primary-key columns in DynamoDB, you should perform a scan operation.
For your table (yesno), id is the primary key and message is an additional column.
Snippet to access the non-primary-key column [message]:
var dynamodb = new AWS.DynamoDB();

var params = {
  TableName: 'yesno',
  FilterExpression: 'message = :value',
  ExpressionAttributeValues: {
    ':value': { "S": "Ben Davies" }
  }
};

dynamodb.scan(params, function(err, data) {
  if (err) console.log(err);   // an error occurred
  else console.log(data);      // successful response
});
Snippet to access the primary key column [id]:
var docClient = new AWS.DynamoDB.DocumentClient();

// Get item by key
var params = {
  TableName: 'sis_org_template',
  Key: { "id": "1" }
};

docClient.get(params, function(err, data) {
  if (err) console.log(err);   // an error occurred
  else console.log(data);      // successful response
});
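One caveat worth adding, which goes beyond the original answer: a scan reads the whole table, so on a large table it can be slow and expensive. If lookups by message are frequent, an alternative is to create a global secondary index with message as its partition key and query that index instead. A sketch, assuming a hypothetical GSI named message-index already exists on the table:
// Sketch: query a hypothetical GSI named "message-index" instead of scanning the table
var dynamodb = new AWS.DynamoDB();

var params = {
  TableName: 'yesno',
  IndexName: 'message-index', // assumed GSI with "message" as its partition key
  KeyConditionExpression: 'message = :value',
  ExpressionAttributeValues: {
    ':value': { "S": "Ben Davies" }
  }
};

dynamodb.query(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data.Items);         // matching items, if any
});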