How to invoke an API from Lambda - amazon-web-services

I have a requirement where I want DynamoDB TTL to put data into a DynamoDB Stream (when items expire) and then send it to a Lambda, and I am able to achieve this. Now, any time an item expires it triggers my Lambda, and I can see the items in the Lambda's console.log output by inspecting the event.
Question - I want to do some processing based on the items that are expiring. For that to happen I need to make an API call to a certain endpoint, passing information about the items. I searched a lot on Google and even checked the Lambda blueprints, but I could not find a basic example where data received by a Lambda is sent to a REST endpoint. Can someone guide me with this? All I find on Google is how to integrate API Gateway with Lambda. I am a beginner, so I need some guidance here.
Thanks!

It's pretty simple for a Node.js Lambda to call a REST API. It doesn't matter what your backend is coded in as long as it follows basic REST patterns. As an example, a very simple Node Lambda might look like:
var https = require('https');

exports.handler = (event, context, callback) => {
    var params = {
        host: "example.com",
        path: "/api/v1/yourmethod"
    };
    var req = https.request(params, function(res) {
        let data = '';
        console.log('STATUS: ' + res.statusCode);
        res.setEncoding('utf8');
        res.on('data', function(chunk) {
            data += chunk;
        });
        res.on('end', function() {
            console.log("DONE");
            console.log(JSON.parse(data));
        });
    });
    req.end();
};
Obviously your code would contain a bit more, as you have to process the DynamoDB event, but these are the basics for a GET.
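For a POST the pattern is only slightly more work: serialize a body, set the method and headers, and `req.write()` the body before `req.end()`. The sketch below is an assumption about the shape of the stream event (TTL deletions arrive as `REMOVE` records in the DynamoDB Stream), and the host, path, and field names are placeholders just like in the GET example:

```javascript
// Hedged sketch: build the options and JSON body for a POST carrying
// the keys of expired items. Endpoint and payload field names are
// placeholders, not part of the original answer.
function buildPostRequest(event) {
    // TTL expirations show up in the stream as REMOVE records
    var expired = (event.Records || [])
        .filter(function (r) { return r.eventName === 'REMOVE'; })
        .map(function (r) { return r.dynamodb.Keys; });

    var body = JSON.stringify({ expiredItems: expired });
    return {
        body: body,
        params: {
            host: "example.com",
            path: "/api/v1/yourmethod",
            method: "POST",
            headers: {
                "Content-Type": "application/json",
                "Content-Length": Buffer.byteLength(body)
            }
        }
    };
}

// In the handler you would then do:
//   var prepared = buildPostRequest(event);
//   var req = https.request(prepared.params, handleResponse);
//   req.write(prepared.body);
//   req.end();
```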

Related

Postman test script - how to call an api twice to simulate 409 error

I am trying to run a few automated tests using the Postman tool. For regular scenarios, I understand how to write pre-request and test scripts. What I do not know (and am trying to understand) is how to write scripts for checking a 409 error (let us call it a duplicate resource check).
I want to run a create-resource API call like the one below, then run it again and ensure that the 2nd invocation really returns a 409 error.
POST /myservice/books
Is there a way to run the same API call twice and check the return value of the 2nd invocation? If yes, how do I do that? One crude way of achieving this could be to create a dependency between two tests, where the first one creates a resource and the second one uses the same payload once again to create the same resource. But I am looking for a single test that does the end-to-end check.
Postman doesn't really provide a standard way for this, but it is flexible enough. I realized that we have to write JavaScript code in the Pre-request tab to make our own HTTP request (using the pm.sendRequest method) and store the resulting data in environment variables for use by the main API call.
Here is a sample:
var phone = pm.variables.replaceIn("{{$randomPhoneNumber}}");
console.log("phone:", phone);

var baseURL = pm.variables.replaceIn("{{ROG_SERVER}}:{{ROG_PORT}}{{ROG_BASE_URL}}");
var usersURL = pm.variables.replaceIn("{{ROG_SERVICE}}/users");
var otpURL = `${baseURL}/${phone}/_otp_x`;

// Payload for partner creation
const payload = {
    "name": pm.variables.replaceIn("{{username}}"),
    "phone": phone,
    "password": pm.variables.replaceIn("{{$randomPassword}}"),
};
console.log("user payload:", payload);

function getOTP(a, callback) {
    // Get an OTP
    pm.sendRequest(otpURL, function(err, response) {
        if (err) throw err;
        var jsonData = response.json();
        pm.expect(jsonData).to.haveOwnProperty('otp');
        pm.environment.set("otp", jsonData.otp);
        pm.environment.set("phone", phone);
        pm.environment.set("username", "{{$randomUserName}}");
        if (callback) callback(jsonData.otp);
    });
}

// Get an OTP
getOTP("a", otp => {
    console.log("OTP received:", otp);
    payload.partnerRef = pm.variables.replaceIn("{{$randomPassword}}");
    payload.otp = otp;

    // Create a partner user with the OTP.
    let reqOpts = {
        url: usersURL,
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(payload)
    };
    pm.sendRequest(reqOpts, (err, response) => {
        console.log("response?", response);
        pm.expect(response).to.have.property('code', 201);
    });

    // Get a new OTP for the main request to be executed.
    getOTP();
});
I did it in my test block. Create your normal request as you would send it, then in your tests validate that the original works; after that you can send the second request and validate its response.
You can also use pre- and post-request scripting to do something similar, or have one test after the other in the collection (they run sequentially) to do the same testing.
For instance, I sent an API call here to create records. As I need the Key_ values to delete them, I can make a call to GET /foo at my API.
pm.test("Response should be 200", function () {
    pm.response.to.be.ok;
    pm.response.to.have.status(200);
});

pm.test("Parse Key_ values and send DELETE from original request response", function () {
    var jsonData = JSON.parse(responseBody);
    jsonData.forEach(function (TimeEntryRecord) {
        console.log(TimeEntryRecord.Key_);
        const DeleteURL = pm.variables.get('APIHost') + '/bar/' + TimeEntryRecord.Key_;
        pm.sendRequest({
            url: DeleteURL,
            method: 'DELETE',
            header: { 'Content-Type': 'application/json' },
            body: { mode: 'raw', raw: JSON.stringify(TimeEntryRecord) }
        }, function (err, res) {
            console.log("Sent Delete: " + DeleteURL);
        });
    });
});
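To get closer to the single end-to-end test the question asks for, a Tests-tab script can also re-send its own request and assert that the duplicate is rejected. The sketch below runs only inside the Postman sandbox (the `pm` object is not available elsewhere) and assumes the endpoint really does return 409 for a duplicate create:

```javascript
// Tests tab of the POST /myservice/books request.
// First invocation (the request this script is attached to) should create:
pm.test("first create succeeds", function () {
    pm.expect(pm.response.code).to.be.oneOf([200, 201]);
});

// Re-send the exact same request and expect a duplicate-resource error:
pm.sendRequest({
    url: pm.request.url.toString(),
    method: pm.request.method.toString(),
    header: { 'Content-Type': 'application/json' },
    body: pm.request.body ? pm.request.body.toString() : undefined
}, function (err, res) {
    pm.test("second identical create returns 409", function () {
        pm.expect(err).to.be.null;
        pm.expect(res).to.have.property('code', 409);
    });
});
```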

aws xray not monitoring dynamo dax client in nodejs

I recently started using DynamoDB DAX within my Node Lambda function. However, with the 'amazon-dax-client' framework I can no longer transparently capture the HTTP requests made by the framework, like so:
const AWS = AWSXRay.captureAWS(require('aws-sdk'));
const dynamoDB = AWSXRay.captureAWSClient(new AWS.DynamoDB(defaults.db.config));
I know I could create an async capture, but I am wondering if there is a better way to do this, i.e. whether someone has managed to capture requests made with the DAX client in the same way as with the DynamoDB client from the AWS SDK.
DAX doesn't currently support XRay, because DAX doesn't use the standard AWS SDK HTTP client (it doesn't use HTTP at all).
The team has received other requests for XRay support so it's certainly something we're considering.
While there is no official X-Ray support from the DAX team, I wrote a little snippet as a workaround.
// `xray` here is the aws-xray-sdk-core module and `documentClient` is a
// DynamoDB DocumentClient (DAX-backed or not), both created elsewhere.
const db = new Proxy(documentClient, {
    get: (target: any, prop: any) => {
        if (typeof target[prop] === "function") {
            return (...args: any) => {
                const segment = xray
                    .getSegment()
                    ?.addNewSubsegment(`DynamoDB::${prop}`);
                const request = target[prop](...args);
                const promise = request
                    .promise()
                    .then((response: any) => {
                        segment?.close();
                        return response;
                    })
                    .catch((err: any) => {
                        segment?.close();
                        return Promise.reject(err);
                    });
                return {
                    ...request,
                    promise: () => promise,
                };
            };
        }
        return target[prop];
    },
});
const response = await db.query(...).promise();
Tested in AWS Lambda in a VPC private subnet with an AWS X-Ray service endpoint.
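Stripped of the AWS specifics, the snippet above is just the "wrap every method via a Proxy" pattern. A minimal plain-JavaScript sketch of the same idea follows, with the subsegment open/close replaced by a call log and the promise handling elided (no X-Ray or DAX objects involved):

```javascript
// Wrap every method of an object in before/after hooks using a Proxy.
// The hooks record call names, standing in for addNewSubsegment()/close().
const calls = [];

function instrument(target) {
    return new Proxy(target, {
        get(obj, prop) {
            if (typeof obj[prop] === "function") {
                return (...args) => {
                    calls.push(`start:${String(prop)}`); // ~ addNewSubsegment
                    const result = obj[prop](...args);
                    calls.push(`end:${String(prop)}`);   // ~ segment.close()
                    return result;
                };
            }
            return obj[prop];
        },
    });
}

const client = instrument({ query: (n) => n * 2 });
const answer = client.query(21); // answer === 42; calls holds start/end entries
```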

AWS - Caching the response of a scheduled lambda function

Using AWS, I have followed an example of a lambda function using the serverless framework. It is working as expected, but now I wonder what the best way of caching the response is.
My final version of this will consist of one or more json objects that will be retrieved on a regular basis.
The client side will call an api that will retrieve the already cached data.
So what AWS service should I implement to actually make the cache?
If it's a static bit of JSON I'd simply import/return it from within the function rather than enable caching, but hey, it's your API!
To answer your question, you can use caching within API Gateway to do this; the documentation can be found here
Update:
I'd actually misread the question, so apologies for that. Whilst that caching works, what the OP is asking is where to store the retrieved data. If you're retrieving it from an external API you can just write it to S3 like so:
import AWS from 'aws-sdk'

export default (event, context, callback) => {
    let s3Bucket = new AWS.S3({ params: { Bucket: 'YOUR_BUCKET_NAME' } })
    let documentName = 'someName'
    let fileExtension = 'json'
    let s3data = {
        Key: `${documentName}.${fileExtension}`,
        Body: JSON.stringify(event.body.someJsonObject), // Body must be a string/Buffer, not a parsed object
    }
    s3Bucket.putObject(s3data, (err, s3result) => {
        if (err) {
            console.log(err)
            callback(err, null)
        } else {
            console.log('S3 added', s3result)
            callback(null, 'done')
        }
    })
}
Then you just need to read the object back in your serving endpoint.

Can't push data to Firebase from within an Alexa Skill hosted on AWS Lambda

I have a database in Firebase to which I'm trying to write some data from within my Alexa Skill. The Node.js code for that skill sits inside an AWS Lambda function and when that code is run I want to push some data to Firebase.
I've tested the code that connects to and pushes to Firebase outside of Lambda and it works exactly as expected. Below is that code:
var firebase = require('firebase');

firebase.initializeApp({
    databaseURL: 'https://myapp.firebaseio.com',
    serviceAccount: './myapp.json',
});

var cart = firebase.database().ref('/cart');
console.log(cart);

cart.push({
    item: 'apples', quantity: '1', amount: '0'
}, function(error) {
    if (error) {
        console.log("Data could not be saved." + error);
    } else {
        console.log("Data saved successfully.");
    }
});
This same code doesn't push anything to the database instance when executed from within the Lambda function. I read online that Lambda timeout limit could be a reason for this, so I increased the timeout limit to a minute, and it still doesn't run as expected. I've also tried using the Firebase REST API instead of their Node.js SDK and that didn't work either. What is the correct way to push data to Firebase from within AWS Lambda?
I think I know why this happens; I had a similar issue and this is what I've done.
If you want to write some data into your database, you need to make sure that you don't call this.emit(*****) until you are done. As soon as you return the response to the user, the thread gets closed and your information doesn't get saved.
The easiest way to solve this problem is to return the response to the user once you get confirmation that the information has been saved.
In case of Firebase something like this:
function writeUserData(userId) {
    // Get a key for a new post.
    var userKey = db.ref('users/').push().key;
    var reference = db.ref('users/' + userKey).set({
        user: userId
    });
    reference.then(() => {
        alexa.emit(':ask', "Works");
    },
    (err) => {
        alexa.emit(':ask', "Does not work");
    });
}
I couldn't get anything saved until I started doing it like this.
Hope it helps.
I've run into this too and the only way I've figured out how to get it to work is to delay the lambda callback function in the handler. Try this and let me know if it works.
var firebase = require('firebase');

firebase.initializeApp({
    databaseURL: 'https://myapp.firebaseio.com',
    serviceAccount: './myapp.json',
});

exports.handler = (event, context, callback) => {
    var cart = firebase.database().ref('/cart');
    console.log(cart);
    cart.push({
        item: 'apples', quantity: '1', amount: '0'
    }, function(error) {
        if (error) {
            console.log("Data could not be saved." + error);
            setTimeout(() => {
                callback(error);
            }, 2000);
        } else {
            console.log("Data saved successfully.");
            setTimeout(() => {
                callback(null, 'success');
            }, 2000);
        }
    });
};

Django equivalent for Node.js's Bluebird Promises [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 7 years ago.
I have a complex use case of promises in an Express (Node) server application, and now I'm being asked to migrate this server to Django. Basically my server (let's call it "A") is an OAuth2 client for another server (let's call it "B") and so A can request resources from B through B's API. Likewise, server A offers its own API which is intended to be consumed through ajax from javascript code in the browser. Let me show you the following picture to make things clearer:
My server A is working like a middleware between the browser and server B. So when the browser makes a call to one of A's API functions, A in turn makes several calls to B's API and based on those results A returns its own stuff to the browser.
So, in terms of code, I was doing something like this in Node.js (the code is just a simplification):
var express = require('express');
var router = express.Router();
var request = require('request-promise');
var Promise = require('bluebird');
...

// The following are helper functions
function getStuff1(req, res, params) {
    return request.get({
        uri: "http://externalserver.com/api/whatever...",
        headers: {
            'Authorization': 'Bearer ' + req.user.accessToken // <-- notice that I'm using the user's access token (OAuth2)
        }
    }).then(function(input) {
        // use params and input and compute stuff
        return stuff;
    }).catch(function(error) {
        if (error.statusCode == 401) { // the accessToken has expired, we need to refresh it
            return refreshOAuthToken(req, res)
                .then(function() {
                    return getStuff1(req, res, params); // after refreshing the accessToken, we recursively call getStuff1 again
                })
                .catch(function(err) {
                    throw(err);
                });
        } else {
            throw(error);
        }
    });
}

function getStuff2(req, res, params) { ... }
function getStuff3(req, res, params) { ... }
function getStuff4(req, res, params) { ... }
function getStuff5(req, res, params) { ... }
...

function refreshOAuthToken(req, res) {
    return request.post({
        uri: "http://otherserver/oauth/token",
        form: {
            'client_id': oauthClientId,
            'client_secret': oauthClientSecret,
            'grant_type': 'refresh_token',
            'refresh_token': req.user.refreshToken // we're using the user's refresh token
        }
    })
    .then(function(body) {
        var jsonResponse = JSON.parse(body);
        req.user.accessToken = jsonResponse.access_token;
        req.user.refreshToken = jsonResponse.refresh_token;
    })
    .catch(function(error) {
        throw(error);
    });
}

// The following is an actual API function
function apiFunction1(req, res) {
    // first we compute params1 somehow
    var params1 = req.whatever;
    getStuff1(req, res, params1)
        .then(function(stuff1) {
            // do something with stuff1 and compute params2
            return getStuff2(req, res, params2);
        })
        .then(function(stuff2) {
            // do something with stuff2 and compute params3
            return getStuff3(req, res, params3);
        })
        .then(function(stuff3) {
            // now make 2 asynchronous calls at the same time
            var promise4 = getStuff4(req, res, params4);
            var promise5 = getStuff5(req, res, params5);
            return Promise.all([promise4, promise5]); // we combine 2 promises into 1 with Promise.all
        })
        .then(function(results) {
            var stuff4 = results[0];
            var stuff5 = results[1];
            // do something with stuff4 and stuff5, and compute the final answer
            var answer = doSomethingWith(stuff4, stuff5);
            res.send(answer); // finally we send the answer to the client
        })
        .catch(function(error) {
            res.status(401).send({ error: error }); // in case of any error, we send it back to the client
        });
}

router.get('/api-function-1', apiFunction1);
module.exports = router;
This router is imported later like so:
var api = require('./routes/api');
app.use('/api', api);
So as you can see I'm doing a lot of requests to B which include refreshing OAuth2 tokens and making calls to B's API. Now the browser's javascript can call A's API function like so:
$.ajax('/api/api-function-1' + extra_params, {
    dataType: 'json',
    type: 'GET'
})
.done(doSomething)
.fail(handleError);
So what is the best way to achieve something like this in Django? I'm new to Django and python in general so I'm very open to any suggestion. Does Django have some equivalent for Node's bluebird library for promises? Any help regarding the OAuth2 part is also very welcomed.
Django conforms to, and is usually served using, the WSGI standard. WSGI and the default Django deployment have a completely different execution model compared to Node.
Node employs an event loop. Requests come in and are put on a single event loop. Bluebird (promises) allows you to put an event on the event loop and register an action to perform when that event completes. Django doesn't have a concept of an event loop, and doesn't have an equivalent to promises/futures (by default). In Django a request comes in and is executed synchronously. There is a pool of workers, and when a request comes in a single worker handles executing the code until it is finished. There are no events registered on an event loop.
Django code will look like:
# make an authenticated request using oauth user token
# if request fails make another request to refresh token
# remake request
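Those comments can be fleshed out as a small synchronous Python sketch. Here the HTTP calls are abstracted behind plain callables: `do_request` and `refresh_token` are stand-ins for e.g. `requests` calls to server B and to the token endpoint; none of this is a real Django API.

```python
def fetch_with_refresh(do_request, refresh_token, max_retries=1):
    """Call do_request(); on a 401, refresh the token and retry.

    do_request() returns a (status_code, body) pair; refresh_token()
    updates whatever credential store do_request reads from.
    """
    for _ in range(max_retries + 1):
        status, body = do_request()
        if status != 401:
            return body
        refresh_token()  # access token expired: get a new one, then retry
    raise RuntimeError("still unauthorized after refreshing the token")
```

Because WSGI workers execute the view synchronously, the Promise.all fan-out in the Node version simply becomes two sequential calls (or a thread pool / background tasks if the parallelism actually matters).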