How can I get log content in AWS Lambda from CloudWatch

I have this basic lambda that posts an image to a web server.
From the events in CloudWatch, I can successfully log anything that happens in that Lambda function:
From this log group (the Lambda function's), I clicked on Stream to AWS Lambda, chose a new Lambda function in which I expect to receive my logs, and didn't set any filters at all so I would get all the logs.
The Lambda is triggered properly, but when I persist what I receive in the event and context objects, I have all the CloudWatch log stream information but I don't see any of the logs.
What I get:
Do I need to specify a filter to see any logs at all? In the filter section, if I don't set any filters and click on test filter, I get all the logs in the preview window, which seems to mean it should send all the logs to my Lambda function. Also, it looked to me like the logs were that unreadable stream in awslogs, and that it was Base64, but I didn't get any results trying to convert it.

Yes, the logs are gzipped and base64-encoded, as mentioned by jarmod.
Sample Node.js code for decoding them inside a Lambda function:
var zlib = require('zlib');

exports.handler = (input, context, callback) => {
    // The CloudWatch Logs payload arrives base64-encoded and gzipped
    var payload = Buffer.from(input.awslogs.data, 'base64');
    zlib.gunzip(payload, function(e, result) {
        if (e) {
            context.fail(e);
        } else {
            result = JSON.parse(result.toString());
            console.log(result);
            callback(null, result);
        }
    });
};
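Once decoded, the payload is a JSON document with metadata (logGroup, logStream, subscriptionFilters) and a logEvents array holding the actual log lines. As a rough sketch, you could read them inside the else branch above like this:
result.logEvents.forEach(function(logEvent) {
    // each entry carries an id, a timestamp in milliseconds, and the raw log message
    console.log(logEvent.timestamp, logEvent.message);
});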

Related

Lex: The server encountered an error processing lambda

I'm developing a chatbot on AWS Lex, and I want to use a Lambda function to branch my intent.
In order to do so, I created a Lambda as follows:
exports.handler = async (event) => {
    console.log(event); // capture Lex params
    /*
    let { name, slots } = event.currentIntent
    if (slots.MeetingType.toLowerCase() === 'on-line') {
        return {
            dialogAction: {
                type: "ElicitSlot",
                intentName: name,
                slotToElicit: "InvitationLink",
                slots
            }
        }
    }
    return {
        dialogAction: {
            type: "Delegate",
            slots
        }
    }
    */
};
But as you can see, even when the function does nothing but log Lex output, I'm getting this error message in Lex:
An error has occurred: The server encountered an error processing the
Lambda response
Any help would be appreciated.
Because you are trying to build a Lex chatbot using JavaScript, refer to this use case in the AWS SDK for JavaScript Developer Guide, which walks you through it:
Building an Amazon Lex chatbot
Once you get this working, you can port the logic to a Lambda function.
Amazon Lex is giving you this error message because the Lambda function has failed during execution.
Enable CloudWatch logging for your Lambda function and check the logs after Lex has called it. The logs should give you more specific details about what caused the code to fail, and from there you should have a better idea of how to resolve the issue.
Feel free to post the output from the logs if you need more assistance with debugging the issue.
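Note also that the handler as posted returns undefined, and Lex rejects any reply that does not match its response format, which by itself produces this exact error. For illustration only, a minimal valid Lex (V1) response while debugging could look like the sketch below, not your final branching logic:
exports.handler = async (event) => {
    console.log(event); // capture Lex params
    // Lex requires a well-formed dialogAction in every response;
    // returning nothing triggers "error processing the Lambda response"
    return {
        dialogAction: {
            type: "Close",
            fulfillmentState: "Fulfilled",
            message: {
                contentType: "PlainText",
                content: "Placeholder reply while debugging"
            }
        }
    };
};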

GCP cloud build VIEW RAW logs link

I have written a small Cloud Function in GCP which is subscribed to a Pub/Sub event. Whenever a Cloud Build is triggered, the function posts a message to a Slack channel over a webhook.
In the event payload we get lots of details such as the trigger name, branch name, and variable details, but I am most interested in the build logs URL.
Currently, the build logs URL in the response looks like: logUrl: https://console.cloud.google.com/cloud-build/builds/899-08sdf-4412b-e3-bd52872?project=125205252525252
which requires GCP console access to check the logs.
In the console, however, there is a View Raw option. Is it possible to get that direct URL in the event response, so that I can send it straight to Slack and anyone can access the raw logs without GCP console access?
In your Cloud Build event message, you need to extract 2 values from the JSON message:
logsBucket
id
The raw file is stored here
<logsBucket>/log-<id>.txt
So, you can get it easily in your function with the Cloud Storage client library (the preferred solution) or with a simple HTTP GET call to the storage API.
If you need more guidance, let me know your dev language and I will send you a piece of code.
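For illustration, a minimal sketch of the client-library approach in Node.js; build is assumed to be the parsed Pub/Sub message, and stripping a gs:// prefix from logsBucket is an assumption about the payload format:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function getRawLog(build) {
    // The raw build log lives at <logsBucket>/log-<id>.txt
    const bucketName = build.logsBucket.replace('gs://', ''); // assumption: logsBucket may carry a gs:// prefix
    const fileName = 'log-' + build.id + '.txt';
    const [contents] = await storage.bucket(bucketName).file(fileName).download();
    return contents.toString();
}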
As @guillaume blaquiere suggested, here is the piece of code used in the Cloud Function to generate the signed URL of the Cloud Build logs:
var filename = 'log-' + build.id + '.txt';
var file = gcs.bucket(BUCKET_NAME).file(filename);

const getURL = async () => {
    return new Promise((resolve, reject) => {
        // Generate a time-limited read URL for the raw log file
        file.getSignedUrl({
            action: 'read',
            expires: Date.now() + 76000000
        }, (err, url) => {
            if (err) {
                console.error(err);
                return reject(err);
            }
            resolve(url);
        });
    });
};

const signedUrl = await getURL();
If anyone is looking for the whole code, please follow this link: https://github.com/harsh4870/Cloud-build-slack-notification/blob/master/singedURL.js

Is it possible to make AWS WebSocket + Lambda constantly monitor DynamoDB and send responses to the client?

I have a serverless project: AWS + Angular on the frontend. Currently, I get the data when the page is initialized and refresh the data when the "update" button is pressed. However, I want to monitor changes in the table constantly. In Firebase there is the onSnapShot() method, which sends the new data when a collection is updated.
I want to make something similar with AWS. However, in the official documentation I do not see how to do it correctly.
So here are 2 questions:
How can I connect to the WebSocket with the aws-sdk? Currently, I can connect only from the terminal with a wscat -c myurl call. Or shall I simply send an HTTP POST to the WebSocket URL?
Is it possible to invoke it again via the callback URL? I want to get data from DynamoDB when the page initializes and then invoke it again and again (with a callback URL).
My Lambda function looks like this:
const AWS = require('aws-sdk');
const db = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event, context) => {
    let params = {
        TableName: "documents"
    };
    let respond = await db.scan(params).promise();
    return respond;
};
On the front-end I have:
ngOnInit(): void {
    AWS.config.credentials = new AWS.Credentials({
        accessKeyId: '//mykey', secretAccessKey: '//mysecretkey'
    });
    AWS.config.update({
        region: '//myregion'
    });
    this.updateTable(); // triggers a POST request to API Gateway => Lambda and receives a response with data
}
From my understanding, you will need to set up a DynamoDB stream and a Lambda function that responds to the database CRUD events and sends the updated data to the WebSocket connections, if the event data matches the criteria (a document id, for example), through AWS.ApiGatewayManagementApi (see https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/ApiGatewayManagementApi.html).
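As a rough sketch, a stream-triggered Lambda could fan records out to open connections like this; getConnectionIds and WS_ENDPOINT are hypothetical stand-ins for however your $connect route stores connection ids and for your WebSocket endpoint:
const AWS = require('aws-sdk');

const api = new AWS.ApiGatewayManagementApi({
    endpoint: process.env.WS_ENDPOINT // hypothetical, e.g. "abc123.execute-api.us-east-1.amazonaws.com/production"
});

// Hypothetical helper: the $connect route would persist connection ids
// (e.g. in their own table) and this would read them back
async function getConnectionIds() {
    return [];
}

// Triggered by the DynamoDB stream on the "documents" table
exports.handler = async (event) => {
    for (const record of event.Records) {
        if (record.eventName !== 'INSERT' && record.eventName !== 'MODIFY') continue;
        const data = JSON.stringify(record.dynamodb.NewImage);
        for (const connectionId of await getConnectionIds()) {
            await api.postToConnection({ ConnectionId: connectionId, Data: data }).promise();
        }
    }
};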

S3 putObject event - older version received

I am setting up a CloudWatch event to trigger on S3 PutObject and call a Lambda function. I am able to trigger the function successfully, and here is the sample code that I am trying to run.
exports.handler = function(event, context, callback) {
    console.log("Incoming Event: ", event);
    const bucket = event.Records[0].s3.bucket.name;
    const filename = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
    const message = `File is uploaded in - ${bucket} -> ${filename}`;
    console.log(message);
    callback(null, message);
};
I am getting an error because the event data does not contain the property "Records". I checked the AWS docs, and the event data should contain "Records". The version shown in the documentation is "eventVersion": "2.2", but in the event data I am getting eventVersion: '1.07'.
Is there some additional configuration needed to make this work?
Here is what my CloudWatch event looks like:
You've configured CloudTrail API events. The format of those events is different to the event notifications generated from S3 (the docs you linked to).
If you go to the S3 bucket and apply an event trigger there, it will be in the format you expected. See Configuring Event Notifications.
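For reference, a small guard that accepts either shape; treat the CloudTrail field names (detail.requestParameters.bucketName / key) as a sketch of that format rather than a guaranteed schema:
exports.handler = async (event) => {
    if (event.Records) {
        // Native S3 event notification (the format in the S3 docs)
        const { bucket, object } = event.Records[0].s3;
        return `s3 notification: ${bucket.name}/${object.key}`;
    }
    if (event.detail && event.detail.requestParameters) {
        // CloudTrail API-call event delivered via CloudWatch Events
        const { bucketName, key } = event.detail.requestParameters;
        return `cloudtrail event: ${bucketName}/${key}`;
    }
    throw new Error('Unrecognized event shape');
};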

AWS Lambda and AWS API Gateway: How to send a binary file?

I have a Lambda function which fetches a file from S3 using the input key in the event and needs to send it back to the client. I am using the following function to get the file from S3:
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

function getObject(key) {
    var params = {
        Bucket: "my_bucket",
        Key: key
    };
    return new Promise(function (resolve, reject) {
        s3.getObject(params, function (err, data) {
            if (err) {
                return reject(err);
            }
            resolve(data.Body);
        });
    });
}
If I send the response of this promise (a Buffer) to context.succeed, it is displayed as a JSON array on the front end. How can I send it as a file? The files can be either ZIP or HTTP Archive (HAR) files. The S3 keys contain the appropriate extension. I am guessing it has something to do with the "Integration Response" in API Gateway, but I am not able to figure out what to change.
Good news, you can now handle binary input and output for API Gateway (announcement and documentation).
Basically, nothing changes in your Lambda Function, but you can now set the contentHandling API Gateway Integration property to CONVERT_TO_BINARY.
Unfortunately, the official AWS examples showcase only the HTTP backend for API Gateway, as the AWS Lambda support does not seem complete yet. For example, I haven't managed to return gzipped content from AWS Lambda, although it should be possible thanks to the new binary support and the $util.base64Decode() mapping utility.
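For the Lambda side with a proxy integration, a commonly used pattern is to return the body base64-encoded and flag it as such; this is a sketch that assumes binary media types are enabled on the API and reuses the getObject function from the question:
exports.handler = async (event) => {
    const key = event.queryStringParameters.key; // hypothetical way of passing the key in
    const body = await getObject(key); // the Buffer resolved by the question's getObject
    return {
        statusCode: 200,
        headers: { 'Content-Type': 'application/zip' },
        isBase64Encoded: true, // API Gateway decodes this back to binary for the client
        body: body.toString('base64')
    };
};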