I recently started using DynamoDB DAX within my Node Lambda function. However, with the 'amazon-dax-client' framework I can no longer transparently capture the HTTP requests made by the framework, like so:
const AWS = AWSXRay.captureAWS(require('aws-sdk'));
const dynamoDB = AWSXRay.captureAWSClient(new AWS.DynamoDB(defaults.db.config));
I know I could create an async capture, but I am wondering if there is a better way to do this, like the previous way, and whether someone has managed to capture requests made with the DAX client in the same way as with the DynamoDB client from the AWS SDK.
DAX doesn't currently support X-Ray, because DAX doesn't use the standard AWS SDK HTTP client (it doesn't use HTTP at all).
The team has received other requests for X-Ray support, so it's certainly something we're considering.
While there is no official X-Ray support from the DAX team, I wrote a little snippet as a workaround:
// Assumes the X-Ray SDK and a DAX-backed DocumentClient created elsewhere, e.g.:
//   const xray = require("aws-xray-sdk-core");
//   const dax = new AmazonDaxClient({ endpoints: [daxEndpoint] });
//   const documentClient = new AWS.DynamoDB.DocumentClient({ service: dax });
const db = new Proxy(documentClient, {
  get: (target: any, prop: any) => {
    if (typeof target[prop] === "function") {
      return (...args: any) => {
        // Open a subsegment named after the DocumentClient method being called
        const segment = xray
          .getSegment()
          ?.addNewSubsegment(`DynamoDB::${prop}`);
        const request = target[prop](...args);
        const promise = request
          .promise()
          .then((response: any) => {
            segment?.close();
            return response;
          })
          .catch((err: any) => {
            // Close the subsegment with the error so X-Ray records the fault
            segment?.close(err);
            return Promise.reject(err);
          });
        // Hand back the original request object with promise() wrapped
        return {
          ...request,
          promise: () => promise,
        };
      };
    }
    return target[prop];
  },
});
const response = await db.query(...).promise();
Tested in AWS Lambda in a VPC private subnet with an AWS X-Ray VPC service endpoint.
I have a REST API endpoint which is probably called a few times every couple of days at most. Yet, when monitoring my API Gateway and Lambda, it shows there have been thousands of API calls. Is this expected? If not, how can I prevent this?
Here are my usage graphs for API Gateway and my Lambda function (my API Gateway is connected to my Lambda):
And here is the code for my Lambda (reads data from DynamoDB):
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();
const getData = async () => {
  const scanResult = await docClient.scan({ TableName: "redirects" }).promise();
  return scanResult;
};

exports.handler = async () => {
  let iOS = 0;
  let android = 0;
  let other = 0;

  const data = await getData();
  data.Items.forEach((item) => {
    switch (item.operating_system) {
      case 'iOS':
        iOS += 1;
        break;
      case 'Android':
        android += 1;
        break;
      default:
        other += 1;
    }
  });

  const response = {
    statusCode: 200,
    headers: {
      'Access-Control-Allow-Origin': '*',
      'Access-Control-Allow-Credentials': true,
    },
    body: JSON.stringify([
      {
        os: "iOS",
        count: iOS
      },
      {
        os: "Android",
        count: android
      },
      {
        os: "Other",
        count: other
      }
    ])
  };
  return response;
};
I hope this can help: Throttle API requests for better throughput.
You can configure throttle and quota settings for your API/account with the API Gateway service.
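For example, here is a minimal sketch of creating a usage plan with throttle and quota limits via the AWS SDK for Node.js; the API ID 'abc123' and stage 'prod' are placeholders for your own values:

const AWS = require('aws-sdk');
const apigateway = new AWS.APIGateway();

apigateway.createUsagePlan({
  name: 'basic-plan',
  apiStages: [{ apiId: 'abc123', stage: 'prod' }],  // placeholder API/stage
  throttle: { rateLimit: 10, burstLimit: 20 },      // steady-state and burst requests/second
  quota: { limit: 1000, period: 'DAY' }             // hard cap per day
}).promise()
  .then((plan) => console.log('Created usage plan', plan.id))
  .catch(console.error);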
Or you can consider using other services like Cognito to authorize access to your APIs (if they aren't public), or WAF to restrict access to certain CIDRs or regions, prevent DDoS attacks, and so on.
If you have access to the frontend code that calls your API, you can check that it doesn't have any accidental infinite loops built in that keep calling the API when the app is running.
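As a hypothetical illustration (assuming a React frontend, which the question doesn't specify), a missing dependency array is a classic way to end up hammering an API on every render:

import { useEffect, useState } from 'react';

function Stats() {
  const [stats, setStats] = useState(null);

  useEffect(() => {
    // Runs after *every* render because the dependency array is missing;
    // setStats triggers a re-render, which runs the effect again, and so on.
    fetch('https://example.com/api/stats')   // placeholder URL
      .then((res) => res.json())
      .then(setStats);
  }); // <- should be `}, []);` to fetch once on mount

  return <pre>{JSON.stringify(stats)}</pre>;
}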
Generally speaking, reducing the number of API calls will naturally help the server handle more load. You can also try to cache the data as much as possible; that way you reduce the volume of data being sent. Think query caching at the database, a reverse proxy server, caching the HTML, CDNs, etc.
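For instance, since a warm Lambda container keeps module-level state between invocations, a minimal sketch of caching the scan result in memory (the TTL value here is an arbitrary assumption) could look like:

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

const CACHE_TTL_MS = 60 * 1000; // arbitrary: serve cached data for up to a minute
let cache = null;
let cachedAt = 0;

const getData = async () => {
  const now = Date.now();
  if (cache && now - cachedAt < CACHE_TTL_MS) {
    return cache; // warm container: reuse the previous scan result
  }
  cache = await docClient.scan({ TableName: 'redirects' }).promise();
  cachedAt = now;
  return cache;
};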
What I want to do
I want to create a REST API that returns data from my DynamoDB table, which is created by a GraphQL model.
What I've done
Create a GraphQL model:
type Public @model {
  id: ID!
  name: String!
}
Create a REST API with a Lambda function with access to my PublicTable:
$ amplify add api
? Please select from one of the below mentioned services: REST
? Provide a friendly name for your resource to be used as a label for this category in the project: rest
? Provide a path (e.g., /book/{isbn}): /items
? Choose a Lambda source Create a new Lambda function
? Provide an AWS Lambda function name: listPublic
? Choose the runtime that you want to use: NodeJS
? Choose the function template that you want to use: Hello World
Available advanced settings:
- Resource access permissions
- Scheduled recurring invocation
- Lambda layers configuration
? Do you want to configure advanced settings? Yes
? Do you want to access other resources in this project from your Lambda function? Yes
? Select the category storage
? Storage has 8 resources in this project. Select the one you would like your Lambda to access Public:@model(appsync)
? Select the operations you want to permit for Public:@model(appsync) create, read, update, delete
You can access the following resource attributes as environment variables from your Lambda function
API_MYPROJECT_GRAPHQLAPIIDOUTPUT
API_MYPROJECT_PUBLICTABLE_ARN
API_MYPROJECT_PUBLICTABLE_NAME
ENV
REGION
? Do you want to invoke this function on a recurring schedule? No
? Do you want to configure Lambda layers for this function? No
? Do you want to edit the local lambda function now? No
Successfully added resource listPublic locally.
Next steps:
Check out sample function code generated in <project-dir>/amplify/backend/function/listPublic/src
"amplify function build" builds all of your functions currently in the project
"amplify mock function <functionName>" runs your function locally
"amplify push" builds all of your local backend resources and provisions them in the cloud
"amplify publish" builds all of your local backend and front-end resources (if you added hosting category) and provisions them in the cloud
Succesfully added the Lambda function locally
? Restrict API access No
? Do you want to add another path? No
Successfully added resource rest locally
Edit my Lambda function
/* Amplify Params - DO NOT EDIT
  API_MYPROJECT_GRAPHQLAPIIDOUTPUT
  API_MYPROJECT_PUBLICTABLE_ARN
  API_MYPROJECT_PUBLICTABLE_NAME
  ENV
  REGION
Amplify Params - DO NOT EDIT */
const AWS = require("aws-sdk");

const region = process.env.REGION;
AWS.config.update({ region });

const docClient = new AWS.DynamoDB.DocumentClient();

const params = {
  TableName: "PublicTable"
};

async function listItems() {
  try {
    const data = await docClient.scan(params).promise();
    return data;
  } catch (err) {
    return err;
  }
}

exports.handler = async (event) => {
  try {
    const data = await listItems();
    return { body: JSON.stringify(data) };
  } catch (err) {
    return { error: err };
  }
};
Push my updates
$ amplify push
Open my REST API endpoint /items
{
  "message": "User: arn:aws:sts::829736458236:assumed-role/myprojectLambdaRolef4f571b-dev/listPublic-dev is not authorized to perform: dynamodb:Scan on resource: arn:aws:dynamodb:us-east-1:8297345848236:table/Public-ssrh52tnjvcdrp5h7evy3zdldsd-dev",
  "code": "AccessDeniedException",
  "time": "2021-04-21T21:21:32.778Z",
  "requestId": "JOA5KO3GVS3QG7RQ2V824NGFVV4KQNSO5AEMVJF66Q9ASUAAJG",
  "statusCode": 400,
  "retryable": false,
  "retryDelay": 28.689093010346657
}
Problems
What did I do wrong?
How do I get access to my table, and why didn't I get it when I created the function?
Why are API_MYPROJECT_PUBLICTABLE_NAME and the other constants needed?
Solution
The problem turned out to be either the Node.js version or the amplify-cli version. After updating amplify-cli and installing Node 14.16.0, everything worked.
I also changed the table name to the one Amplify creates for us (taken from the environment variable), although this code did not work before. The code became:
/* Amplify Params - DO NOT EDIT
  API_MYPROJECT_GRAPHQLAPIIDOUTPUT
  API_MYPROJECT_PUBLICTABLE_ARN
  API_MYPROJECT_PUBLICTABLE_NAME
  ENV
  REGION
Amplify Params - DO NOT EDIT */
const AWS = require("aws-sdk");

const region = process.env.REGION;
const tableName = process.env.API_MYPROJECT_PUBLICTABLE_NAME;
AWS.config.update({ region });

const docClient = new AWS.DynamoDB.DocumentClient();

const params = {
  TableName: tableName
};

async function listItems() {
  try {
    const data = await docClient.scan(params).promise();
    return data;
  } catch (err) {
    return err;
  }
}

exports.handler = async (event) => {
  try {
    const data = await listItems();
    return { body: JSON.stringify(data) };
  } catch (err) {
    return { error: err };
  }
};
I have a requirement where I want DynamoDB TTL to put expired items into a DynamoDB Stream and then send them to a Lambda, and I am able to achieve this. Now, any time an item expires, it triggers my Lambda and I can see the items in the Lambda's console.log by inspecting the event.
Question - I want to do some processing based on the items that are expiring, and for that I need to make an API call to a certain endpoint, passing information about the items. I searched a lot on Google and even checked the Lambda blueprints, but I could not find a basic example where data received by a Lambda is sent to a REST endpoint. Can someone guide me with this? All I find on Google is how to integrate API Gateway with Lambda. I am a beginner, so I need some guidance here.
Thanks!
It's pretty simple for a Node.js Lambda to call a REST API. It doesn't matter what your backend is coded in as long as it follows basic REST patterns. As an example, a very simple Node Lambda might look like:
var https = require('https');

exports.handler = (event, context, callback) => {
  var params = {
    host: "example.com",
    path: "/api/v1/yourmethod"
  };
  var req = https.request(params, function(res) {
    let data = '';
    console.log('STATUS: ' + res.statusCode);
    res.setEncoding('utf8');
    res.on('data', function(chunk) {
      data += chunk;
    });
    res.on('end', function() {
      console.log("DONE");
      console.log(JSON.parse(data));
    });
  });
  req.end();
};
Obviously your code would contain a bit more, as you have to process the DynamoDB event, but these are the basics for a GET.
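Since the question is about passing information about the expiring items, here is a minimal sketch of a POST using the same https module; the host, path, and the shape pulled from the stream event are placeholder assumptions:

var https = require('https');

exports.handler = (event, context, callback) => {
  // Hypothetical: pull the expired items' keys out of the DynamoDB Streams event.
  var body = JSON.stringify(event.Records.map((r) => r.dynamodb.Keys));

  var params = {
    host: "example.com",          // placeholder endpoint
    path: "/api/v1/yourmethod",
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Content-Length": Buffer.byteLength(body)
    }
  };

  var req = https.request(params, function(res) {
    console.log('STATUS: ' + res.statusCode);
    res.on('data', () => {});               // drain the response
    res.on('end', () => callback(null, 'done'));
  });
  req.on('error', callback);
  req.write(body);
  req.end();
};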
I'm trying to make an API POST request in my Lambda function, but in the AWS console, using Node.js, I cannot import API. Here is what I am trying:
console.log('Loading function');
const AWS = require('aws-sdk');
const translate = new AWS.Translate({ apiVersion: '2017-07-01' });
var API = require('aws-amplify');

exports.handler = async (event, context) => {
  try {
    const params = {
      SourceLanguageCode: 'en', /* required */
      TargetLanguageCode: 'es', /* required */
      Text: 'Hello World', /* required */
    };
    const data = await translate.translateText(params).promise();
    createSite(data.TranslatedText);
  } catch (err) {
    console.log(err, err.stack);
  }

  function createSite(site) {
    return API.post("sites", "/sites", {
      body: site
    });
  }
};
I have also tried import...
I think you may be looking at front-end browser based JavaScript examples, which aren't always going to work in a back-end AWS Lambda NodeJS runtime environment. It appears you are trying to use this library, which states it is "a JavaScript library for frontend and mobile developers", which probably isn't what you want to use on AWS Lambda. It appears you also did not include that library in your AWS Lambda function's deployment.
I suggest using the AWS Amplify client in the AWS SDK for NodeJS, which is automatically included in your Lambda function's runtime environment. You would create an Amplify client like so:
var amplify = new AWS.Amplify();
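As a hedged sketch of using that client (note this service client manages Amplify Console apps; method names such as listApps come from the AWS SDK's Amplify service API):

const AWS = require('aws-sdk');
const amplify = new AWS.Amplify();

exports.handler = async () => {
  // Example call against the Amplify service API.
  const result = await amplify.listApps({ maxResults: 10 }).promise();
  return result.apps.map((app) => app.name);
};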
Using AWS, I have followed an example of a Lambda function using the Serverless Framework. It is working as expected, but now I wonder what the best way of caching the response is.
My final version of this will consist of one or more JSON objects that will be retrieved on a regular basis.
The client side will call an API that will retrieve the already-cached data.
So which AWS service should I use to actually implement the cache?
If it's a static bit of JSON, I'd simply import/return it from within the function rather than enable caching, but hey, it's your API!
To answer your question, you can use caching within API Gateway to do so; documentation can be found here.
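As a sketch, enabling a stage cache through the AWS SDK could look like this (the REST API ID and stage name are placeholders; the patch paths come from API Gateway's stage update API):

const AWS = require('aws-sdk');
const apigateway = new AWS.APIGateway();

apigateway.updateStage({
  restApiId: 'abc123',   // placeholder
  stageName: 'prod',     // placeholder
  patchOperations: [
    { op: 'replace', path: '/cacheClusterEnabled', value: 'true' },
    { op: 'replace', path: '/cacheClusterSize', value: '0.5' }   // size in GB
  ]
}).promise().then(() => console.log('Stage cache enabled'));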
Update:
I'd actually misread the question, so apologies for that. Whilst that caching works, what the OP is asking is where to store the data retrieved - if you're retrieving it from an external API, you can just write it to S3 like so:
import AWS from 'aws-sdk'

export default (event, context, callback) => {
  let s3Bucket = new AWS.S3({ params: { Bucket: 'YOUR_BUCKET_NAME' } })
  let documentName = 'someName'
  let fileExtension = 'json'

  let s3data = {
    Key: `${documentName}.${fileExtension}`,
    // event.body arrives as a string; parse it, pick the object, and re-serialize,
    // since putObject expects a string or Buffer body
    Body: JSON.stringify(JSON.parse(event.body).someJsonObject),
  }

  s3Bucket.putObject(s3data, (err, s3result) => {
    if (err) {
      console.log(err)
      callback(err, null)
    } else {
      console.log('S3 added', s3result)
      callback(null, 'done')
    }
  })
}
Then you just need to read the object back in your serving endpoint.
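A minimal read-back sketch, assuming the same bucket and key names as in the snippet above:

import AWS from 'aws-sdk'

const s3 = new AWS.S3()

// Serving endpoint: return the cached JSON object from S3.
export default (event, context, callback) => {
  s3.getObject({ Bucket: 'YOUR_BUCKET_NAME', Key: 'someName.json' }, (err, data) => {
    if (err) return callback(err)
    callback(null, {
      statusCode: 200,
      headers: { 'Content-Type': 'application/json' },
      body: data.Body.toString('utf8'), // Body is a Buffer
    })
  })
}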