Piping an API Gateway response to the client through a Lambda handler - amazon-web-services

I have a REST API using AWS API Gateway. The API is handled by a custom Lambda function. I have a /prompts endpoint in my API, for which the Lambda function calls the OpenAI API, sends it the prompt, and streams the result to the user as it is being generated (which can take a few seconds).
I'm able to stream and handle the response from OpenAI's API in my Lambda function.
I would now like to re-stream / pipe that response to the client.
My question is: how do I do that?
Is there a way to simply pipe the stream being received from the OpenAI API to my client?
My Lambda function is:
let content = '';
try {
  const res = await openai.createCompletion({
    ...params,
    stream: true,
  }, { responseType: 'stream' });
  res.data.on('data', data => {
    const lines = data.toString().split('\n').filter(line => line.trim() !== '');
    for (const line of lines) {
      const message = line.replace(/^data: /, '');
      if (message === '[DONE]') {
        // store the response to DynamoDB
        storeRecord(content);
        return content;
      }
      try {
        const parsed = JSON.parse(message);
        content += parsed.choices[0].text;
        // ****** I want to send content to the front-end client... *******
      } catch (error) {
        console.error('Could not JSON parse stream message', message, error);
      }
    }
  });
} catch (error) {
  if (error.response?.status) {
    console.error(error.response.status, error.message);
    error.response.data.on('data', data => {
      const message = data.toString();
      try {
        const parsed = JSON.parse(message);
        console.error('An error occurred during OpenAI request: ', parsed);
      } catch (error) {
        console.error('An error occurred during OpenAI request: ', message);
      }
    });
  } else {
    console.error('An error occurred during OpenAI request', error);
  }
}
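For what it's worth, a REST API in API Gateway buffers the Lambda output, so it can't relay chunks as they arrive. At the time of writing, the usual way to stream from a Node.js Lambda is to put the function behind a Lambda Function URL with the RESPONSE_STREAM invoke mode and wrap the handler in awslambda.streamifyResponse. Below is a minimal sketch of that approach, reusing openai, params, and storeRecord from the question:
// awslambda.streamifyResponse is a global provided by the Node.js Lambda
// runtime; it only streams when the function is invoked through a Function
// URL configured with InvokeMode RESPONSE_STREAM, not a REST API integration.
exports.handler = awslambda.streamifyResponse(async (event, responseStream) => {
  let content = '';
  const res = await openai.createCompletion({
    ...params,
    stream: true,
  }, { responseType: 'stream' });
  await new Promise((resolve, reject) => {
    res.data.on('data', data => {
      const lines = data.toString().split('\n').filter(line => line.trim() !== '');
      for (const line of lines) {
        const message = line.replace(/^data: /, '');
        if (message === '[DONE]') {
          storeRecord(content); // helper from the question
          return resolve();
        }
        try {
          const parsed = JSON.parse(message);
          content += parsed.choices[0].text;
          // write each token to the client as soon as it is parsed
          responseStream.write(parsed.choices[0].text);
        } catch (error) {
          console.error('Could not JSON parse stream message', message, error);
        }
      }
    });
    res.data.on('error', reject);
  });
  responseStream.end();
});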

Using AWS Lambda Console to send push using SNS

I have tried every solution I could find online, with no luck.
What I am trying to do is simply use an AWS Lambda function (through the AWS console) to fetch a user's FCM token from, let's say, DynamoDB (not included in the question), use that token to create an EndpointArn, and send a push notification to that specific device.
I tested sending via the SNS console and the push reaches the device successfully, but I failed to get it to the device using the Lambda function, although it returns a success status and a message ID.
Here is the code I used
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set region
AWS.config.update({region: 'us-east-1'});
const sns = new AWS.SNS();
const sampleMessage = {
  "GCM": {
    "notification": {
      "body": "Sample message for Android endpoints",
      "title": "Title Test"
    }
  }
};
exports.handler = async (event) => {
  const snsPayload = JSON.stringify(sampleMessage);
  const response = {
    statusCode: 200,
    body: JSON.stringify('Hello from Lambda!'),
  };
  const params = {
    PlatformApplicationArn: '<Platform Arn>',
    Token: '<FCM Token>'
  };
  try {
    const endpointData = await sns.createPlatformEndpoint(params).promise();
    const paramsMessage = {
      Message: snsPayload,
      TargetArn: endpointData.EndpointArn
    };
    var publishTextPromise = await sns.publish(paramsMessage).promise();
    response.MessageId = publishTextPromise.MessageId;
    response.result = 'Success';
  }
  catch (e) {
    console.log(e.stack);
    response.result = 'Error';
  }
  return response;
};
After some trial and error, I figured out the solution to my own question:
1. The GCM part of the payload should be a string, not a JSON object.
2. The message params should include an attribute that explicitly sets the MIME type of the payload to JSON.
Taking all that into consideration:
const GCM_data = {
  'notification': {
    'body': 'Hellow from lambda function',
    'title': 'Notification Title'
  }
};
const data = {
  "GCM": JSON.stringify(GCM_data)
};
const snsPayload = JSON.stringify(data);
and the params should look like:
const paramsMessage = {
  Message: snsPayload,
  TargetArn: endpointData.EndpointArn,
  MessageStructure: 'json'
};
and this will work :)

Kinesis Data Stream GetRecords giving Junk Data

While doing GetRecords from a Kinesis Data Stream, it intermittently gives values like ����Y�ݽ#��5���I$ݔѵ�����ս��aQ���̽�9������ռ��. I have a large volume of data, and the junk data comes in between the correct data.
My code is Node.js in Lambda and is triggered by SQS (which is a DLQ of an upstream Lambda that processes the Kinesis stream). This Lambda is expected to run the reprocessing logic for the messages that failed in the upstream Lambda.
Any suggestion would be very helpful. Thanks.
Sample data:
{ "Attribute1" : "Value with over 400 charcters available ����Y�ݽ#��5���I$ݔѵ�����ս��aQ���̽�9������ռ�� . to make it work" , "Attribute2" : "value2" , "Attribute3" : "value3"}
var getKinesisRecords = async function (shardIterator) {
  var getParams = {
    ShardIterator: shardIterator,
    Limit: 1
  };
  await kinesis.getRecords(getParams, function(err, result) {
    if (err) { console.log(err, err.stack); return null; } // an error occurred
    else {
      try {
        for (var record in result.Records) {
          var data = result.Records[record].Data;
          console.log("Kinesis Record data: " + data);
        }
      } catch (err) {
        console.log("Error while getting Record from Kinesis");
        console.log(err);
        return null;
      }
      if (result.NextShardIterator) getKinesisRecords(result.NextShardIterator);
    } // successful response
  }).promise();
  //Process the record
};
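One thing worth ruling out on the consumer side: with the Node.js SDK, each record's Data comes back as a Buffer, and concatenating it into a string decodes it as UTF-8 implicitly. A small diagnostic sketch (the helper name is hypothetical) that decodes explicitly and logs a hex preview whenever the bytes are not valid UTF-8, which helps distinguish genuinely binary payloads (for example compressed or KPL-aggregated records) from a decoding problem in the consumer:
var decodeRecordData = function (data) {
  // Data is a Buffer from getRecords (base64-encoded in Lambda event payloads)
  var buf = Buffer.from(data);
  var text = buf.toString('utf8');
  if (text.indexOf('\uFFFD') !== -1) {
    // replacement characters mean the bytes are not valid UTF-8;
    // a hex preview shows what the payload actually contains
    console.log('Non-UTF-8 bytes, hex preview: ' + buf.slice(0, 32).toString('hex'));
  }
  return text;
};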

Self invoking lambda invocation timing out

We're trying to develop a self-invoking lambda to process S3 files in chunks. The lambda role has the policies needed for the invocation attached.
Here's the code for the self-invoking lambda:
export const processFileHandler: Handler = async (
  event: S3CreateEvent,
  context: Context,
  callback: Callback,
) => {
  let bucket = loGet(event, 'Records[0].s3.bucket.name');
  let key = loGet(event, 'Records[0].s3.object.key');
  let totalFileSize = loGet(event, 'Records[0].s3.object.size');
  const lastPosition = loGet(event, 'position', 0);
  const nextRange = getNextSizeRange(lastPosition, totalFileSize);
  context.callbackWaitsForEmptyEventLoop = false;
  let data = await loadDataFromS3ByRange(bucket, key, nextRange);
  await database.connect();
  log.debug(`Successfully connected to the database`);
  const docs = await getParsedDocs(data, lastPosition);
  log.debug(`upserting ${docs.length} records to database`);
  if (docs.length) {
    try {
      // upserting logic
      log.debug(`total documents added: ${docs.length}`);
    } catch (err) {
      await recurse(nextRange.end, event, context);
      log.debug(`error inserting docs: ${JSON.stringify(err)}`);
    }
  }
  if (nextRange.end < totalFileSize) {
    log.debug(`Last ${context.getRemainingTimeInMillis()} milliseconds left`);
    if (context.getRemainingTimeInMillis() < 6000) {
      log.debug(`Less than 6000 milliseconds left`);
      log.debug(`Invoking next iteration`);
      await recurse(nextRange.end, event, context);
      callback(null, {
        message: `Lambda timed out processing file, please continue from LAST_POSITION: ${nextRange.start}`,
      });
    }
  } else {
    callback(null, { message: `Successfully completed the chunk processing task` });
  }
};
where recurse invokes the same Lambda. Everything else works as expected; it just times out whenever execution reaches this invocation request:
const recurse = async (position: number, event: S3CreateEvent, context: Context) => {
  let newEvent = Object.assign(event, { position });
  let request = {
    FunctionName: context.invokedFunctionArn,
    InvocationType: 'Event',
    Payload: JSON.stringify(newEvent),
  };
  let resp = await lambda.invoke(request).promise();
  console.log('Invocation complete', resp);
  return resp;
};
This is the stack trace logged to CloudWatch:
{
  "errorMessage": "connect ETIMEDOUT 63.32.72.196:443",
  "errorType": "NetworkingError",
  "stackTrace": [
    "Object._errnoException (util.js:1022:11)",
    "_exceptionWithHostPort (util.js:1044:20)",
    "TCPConnectWrap.afterConnect [as oncomplete] (net.js:1198:14)"
  ]
}
It is not a good idea to create a self-invoking Lambda function. In case of an error (which could also be a bad handler call on the AWS side), a Lambda function might re-run several times, which is very hard to monitor and debug.
I would suggest using Step Functions instead. I believe this tutorial can help: Iterating a Loop Using Lambda.
Off the top of my head, if you prefer not to deal with Step Functions, you could create a Lambda trigger for an SQS queue. Then you pass a message to the queue whenever you want to run the Lambda function another time, as sketched below.
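A minimal sketch of that SQS pattern, assuming the same Lambda is configured as the queue's trigger; the queue URL variable and message fields here are illustrative, not from the question:
const AWS = require('aws-sdk');
const sqs = new AWS.SQS();
// Instead of invoking itself, the function enqueues the next chunk's position;
// the queue then triggers the same Lambda with this message as its event.
const enqueueNextChunk = async (position, bucket, key) => {
  return sqs.sendMessage({
    QueueUrl: process.env.CHUNK_QUEUE_URL, // hypothetical configuration value
    MessageBody: JSON.stringify({ bucket, key, position }),
  }).promise();
};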

API Gateway -> Lambda -> DynamoDB using Cognito. HTTP POST -> Unable to read response but returns a code 200

Scenario:
I make an HTTP POST request (using an Authorizer as a header parameter from Cognito).
When I try to fetch/read the query response, it triggers the error event. However, in the browser I can see two HTTP POST responses with a 200 code, one of them returning the valid response. For example, if I make the request via Postman, I receive the data correctly in a single response.
Problem:
I am unable to print the result because the error event fires without valid response data.
Browser images:
https://i.postimg.cc/MTMsxZjw/Screenshot-1.png
https://i.postimg.cc/3RstwMgv/Screenshot-2.png
Lambda code:
'use strict';
var AWS = require('aws-sdk'),
    documentClient = new AWS.DynamoDB.DocumentClient();
exports.handler = function index(event, context, callback) {
  var params = {
    TableName: "data-table"
  };
  documentClient.scan(params, function(err, data) {
    if (err) {
      callback(err, null);
    } else {
      console.log(JSON.stringify(data.Items));
      callback(null, data.Items);
    }
  });
};
Client side JS code:
function requestData(pickupLocation) {
  $.ajax({
    type: 'POST',
    url: _config.api.invokeUrl,
    headers: {
      Authorization: authToken,
    },
    data: "{}",
    cache: false,
    success: completeRequest,
    error: errorRequest
  });
}
function completeRequest(response) {
  alert("hello");
  alert(response.d);
}
function errorRequest(response) {
  alert("hello1");
  alert(response.status + ' ' + response.statusText);
}
Based on further clarification in the comments, this looks like API Gateway has CORS disabled, or enabled with incorrect header values returned.
The solution is to re-enable CORS through API Gateway and, in the advanced options, add Access-Control-Allow-Origin to the response headers (if not already on by default).
If you're proxying the response, you need to follow a specific format, as described here:
'use strict';
console.log('Loading hello world function');
exports.handler = async (event) => {
  let name = "you";
  let city = 'World';
  let time = 'day';
  let day = '';
  let responseCode = 200;
  console.log("request: " + JSON.stringify(event));
  // This is a simple illustration of app-specific logic to return the response.
  // Although only 'event.queryStringParameters' are used here, other request data,
  // such as 'event.headers', 'event.pathParameters', 'event.body', 'event.stageVariables',
  // and 'event.requestContext' can be used to determine what response to return.
  //
  if (event.queryStringParameters && event.queryStringParameters.name) {
    console.log("Received name: " + event.queryStringParameters.name);
    name = event.queryStringParameters.name;
  }
  if (event.pathParameters && event.pathParameters.proxy) {
    console.log("Received proxy: " + event.pathParameters.proxy);
    city = event.pathParameters.proxy;
  }
  if (event.headers && event.headers['day']) {
    console.log("Received day: " + event.headers.day);
    day = event.headers.day;
  }
  if (event.body) {
    let body = JSON.parse(event.body);
    if (body.time)
      time = body.time;
  }
  let greeting = `Good ${time}, ${name} of ${city}. `;
  if (day) greeting += `Happy ${day}!`;
  let responseBody = {
    message: greeting,
    input: event
  };
  // The output from a Lambda proxy integration must be
  // of the following JSON object. The 'headers' property
  // is for custom response headers in addition to standard
  // ones. The 'body' property must be a JSON string. For
  // base64-encoded payload, you must also set the 'isBase64Encoded'
  // property to 'true'.
  let response = {
    statusCode: responseCode,
    headers: {
      "x-custom-header": "my custom header value"
    },
    body: JSON.stringify(responseBody)
  };
  console.log("response: " + JSON.stringify(response));
  return response;
};
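One caveat: with a proxy integration, the console's CORS settings only cover the OPTIONS preflight, so the Access-Control-Allow-Origin header has to be returned by the function itself. A variant of the response object above with that header added:
let corsResponse = {
  statusCode: responseCode,
  headers: {
    "x-custom-header": "my custom header value",
    "Access-Control-Allow-Origin": "*" // or the specific origin of your client
  },
  body: JSON.stringify(responseBody)
};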
If you are using Chrome, you probably need a CORS plugin.

Is it possible to extract contents of a Cloudwatch log from a subscription

I have a Lambda that subscribes to a CloudWatch log stream. This all works tickety-boo, i.e. when the log stream is written to, the Lambda receives a notification. Now, is there a way of receiving the contents of the log, or a section of the log, with the notification, or do I then have to query the log stream to garner the information that I need?
Regards
Angus
Yes you can. Here's how to do it with a Node.js Lambda:
var zlib = require('zlib');
exports.handler = function(input, context) {
  // decode input from base64 (Buffer.from, not new Buffer.from)
  var zippedInput = Buffer.from(input.awslogs.data, 'base64');
  // decompress the input
  zlib.gunzip(zippedInput, function(error, buffer) {
    if (error) { context.fail(error); return; }
    // parse the input from JSON
    var payload = JSON.parse(buffer.toString('utf8'));
    // ignore control messages
    if (payload.messageType === 'CONTROL_MESSAGE') {
      return null;
    }
    // print the timestamp and message of each log event
    payload.logEvents.forEach(function(logEvent) {
      console.log(logEvent.timestamp + ' ' + logEvent.message);
    });
  });
};
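For reference, the decompressed payload has roughly this shape (the values here are illustrative):
{
  "messageType": "DATA_MESSAGE",
  "owner": "123456789012",
  "logGroup": "/aws/lambda/example",
  "logStream": "2024/01/01/[$LATEST]abcdef",
  "subscriptionFilters": ["example-filter"],
  "logEvents": [
    { "id": "0", "timestamp": 1700000000000, "message": "example log line" }
  ]
}
so the log events arrive with the notification itself, and there is no need to query the log stream again.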