GCP Cloud Build "View Raw" logs link - google-cloud-platform

I have written a small Cloud Function in GCP which is subscribed to a Pub/Sub event. Whenever any Cloud Build is triggered, the function posts a message into a Slack channel over a webhook.
In the response we get lots of details like the trigger name, branch name, and variable details, but I am more interested in the build logs URL.
Currently the build logs URL in the response looks like: logUrl: https://console.cloud.google.com/cloud-build/builds/899-08sdf-4412b-e3-bd52872?project=125205252525252
which requires GCP console access to check the logs.
In the console there is a "View Raw" option. Is it possible to get that direct URL in the event response, so that I can send it straight to Slack and anyone can access the logs directly without having GCP console access?

In your Cloud Build event message, you need to extract 2 values from the JSON message:
logsBucket
id
The raw file is stored here:
<logsBucket>/log-<id>.txt
So, you can get it easily in your function with the Cloud Storage client library (preferred solution) or with a simple HTTP GET call to the Storage API.
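For instance, here is a minimal Node.js sketch of the client-library route; it assumes the @google-cloud/storage package and that logsBucket arrives as a gs:// URI whose scheme needs stripping:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

// build is the parsed Cloud Build event message.
async function getRawLog(build) {
    // Assumption: logsBucket looks like "gs://<bucket>"; the client wants the bare name.
    const bucketName = build.logsBucket.replace('gs://', '');
    const [contents] = await storage
        .bucket(bucketName)
        .file(`log-${build.id}.txt`)
        .download();
    return contents.toString('utf8');
}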
If you need more guidance, let me know your dev language, I will send you a piece of code.

As @guillaume blaquiere suggested, here is the piece of code used in the Cloud Function to generate the signed URL of the Cloud Build logs:
// gcs is a @google-cloud/storage client; BUCKET_NAME is the build's logsBucket
// (without the gs:// prefix) and build is the parsed Cloud Build message.
const { Storage } = require('@google-cloud/storage');
const gcs = new Storage();

var filename = 'log-' + build.id + '.txt';
var file = gcs.bucket(BUCKET_NAME).file(filename);
const getURL = async () => {
    return new Promise((resolve, reject) => {
        file.getSignedUrl({
            action: 'read',
            expires: Date.now() + 76000000 // roughly 21 hours from now, in ms
        }, (err, url) => {
            if (err) {
                console.error(err);
                return reject(err); // return so we don't also resolve on error
            }
            resolve(url);
        });
    });
};
const signedUrl = await getURL();
If anyone is looking for the whole code, please follow this link: https://github.com/harsh4870/Cloud-build-slack-notification/blob/master/singedURL.js


Using AWS SDK functions in Lambda

I would like to use AWS Lambda to restart an app server in an elastic beanstalk instance.
After extensive googling and not finding anything, I finally found an article (which I have since lost) that described how to do what I want. I used it to set up a Lambda function and an EventBridge cron schedule to run the function. Looking at the CloudWatch logs for the EventBridge rule, I can see the function is running successfully. However, it is definitely not restarting my app server. When I use the console button to restart, the data I pulled in with another cron gets updated and I can see the date on my main page update. This does not happen when the Lambda function runs, though it should.
Here is my function:
const AWS = require('aws-sdk');
const eb = new AWS.ElasticBeanstalk({apiVersion: '2010-12-01'});
exports.handler = async (event) => {
    const params = {
        EnvironmentName: 'my-prod-environment'
    };
    eb.restartAppServer(params, function(err, data) {
        if (err) {
            console.log(err, err.stack);
            return err;
        }
        else {
            console.log(data);
            return data;
        }
    });
};
The logs, unfortunately, tell me nothing. Despite the console.log statements, no error or data appears in my logs, so I don't know if the function is even completing. Does anyone have any idea why this won't work?
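One likely explanation, offered as a hedged sketch rather than a confirmed fix: the handler is async but never waits on the callback-style restartAppServer call, so Lambda can tear down the execution before the request (and its console.log) ever runs. Awaiting the SDK's promise form would rule that out:
const AWS = require('aws-sdk');
const eb = new AWS.ElasticBeanstalk({ apiVersion: '2010-12-01' });

exports.handler = async (event) => {
    const params = { EnvironmentName: 'my-prod-environment' };
    // .promise() turns the SDK call into a promise the handler can await,
    // so the function does not return before the restart request completes.
    const data = await eb.restartAppServer(params).promise();
    console.log(data);
    return data;
};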

Is it possible to make an AWS WebSocket + Lambda function constantly monitor DynamoDB and send responses to the client?

I have a serverless project: AWS + Angular on the frontend. Currently, I get the data when the page is initialized and refresh the data when an "update" button is pressed. However, I want to monitor changes in the table constantly. Firebase has an onSnapshot() method, which sends the new data whenever a collection is updated.
I want to make something similar with AWS. However, in the official documentation I do not see how to do it correctly.
So here are 2 questions:
How can I connect to the WebSocket with aws-sdk? (Currently, I can connect only from the terminal with a wscat -c myurl call. Or shall I simply send an http.Post to the WebSocket URL?)
Is it possible to pass invoke in the callback URL? I want to get data from DynamoDB when the page initializes and then invoke it again and again (with a callback URL).
My Lambda function looks like this:
// db is assumed to be a DynamoDB DocumentClient, e.g.:
// const AWS = require('aws-sdk');
// const db = new AWS.DynamoDB.DocumentClient();
exports.handler = async (event, context) => {
    let params = {
        TableName: "documents"
    };
    let respond = await db.scan(params).promise();
    return respond;
};
On the front-end I have:
ngOnInit(): void {
    AWS.config.credentials = new AWS.Credentials({
        accessKeyId: '//mykey', secretAccessKey: '//mysecretkey'
    });
    AWS.config.update({
        region: '//myregion'
    });
    this.updateTable(); // triggers a POST request to API Gateway => Lambda and receives a response with data.
}
From my understanding, you will need to set up a DynamoDB stream and a Lambda function that responds to the database CRUD events and sends the updated data to the WebSocket connections, if the event data matches your criteria (a document id, for example), through AWS.ApiGatewayManagementApi. (FYI: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/ApiGatewayManagementApi.html)
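A minimal sketch of that stream-handler Lambda, under stated assumptions: the endpoint value and the "connections" table where $connect stored the WebSocket connection ids are hypothetical placeholders.
const AWS = require('aws-sdk');
const db = new AWS.DynamoDB.DocumentClient();

// Assumption: your WebSocket API's connection endpoint, taken from the stage settings.
const api = new AWS.ApiGatewayManagementApi({
    endpoint: 'https://myapiid.execute-api.myregion.amazonaws.com/prod'
});

// Hypothetical helper: read the stored WebSocket connection ids
// (e.g. from a table that the $connect route wrote them into).
async function getConnectionIds() {
    const res = await db.scan({ TableName: 'connections' }).promise();
    return res.Items.map((item) => item.connectionId);
}

exports.handler = async (event) => {
    // DynamoDB Streams delivers the changed items in event.Records.
    for (const record of event.Records) {
        if (record.eventName !== 'INSERT' && record.eventName !== 'MODIFY') continue;
        const newImage = record.dynamodb.NewImage;
        const connectionIds = await getConnectionIds();
        // Push the changed item to every open WebSocket connection.
        await Promise.all(connectionIds.map((ConnectionId) =>
            api.postToConnection({
                ConnectionId,
                Data: JSON.stringify(newImage)
            }).promise()
        ));
    }
};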

How can I get log content in AWS Lambda from CloudWatch

I have this basic Lambda that posts an image to a web server.
From the events in CloudWatch, I can successfully log anything that happens in that Lambda function.
From this log group (the Lambda function's), I clicked on "Stream to AWS Lambda", chose a new Lambda function in which I expect to receive my logs, and didn't set any filters at all, so I should get all logs.
The Lambda is triggered properly, but when I persist what I received in the event and context objects, I have all the CloudWatch log stream information but I don't see any of the logs.
Do I need to specify a filter to see any logs at all? Because in the filter section, if I don't set any filters and click "Test filter", I get all the logs in the preview window, which seems to mean it should send all the logs to my Lambda function. Also, it looked to me like the logs were that unreadable stream in awslogs, and that it was Base64-encoded, but I didn't get any results trying to convert it.
Yes, the logs are gzipped and Base64-encoded, as mentioned by jarmod.
Sample code in Node.js for extracting them in a Lambda:
var zlib = require('zlib');
exports.handler = (input, context, callback) => {
    // The log payload arrives Base64-encoded, then gzipped.
    var payload = Buffer.from(input.awslogs.data, 'base64');
    zlib.gunzip(payload, function(e, result) {
        if (e) {
            context.fail(e);
        } else {
            result = JSON.parse(result.toString());
            console.log(result);
        }
    });
};

Alexa Skill SDK : Requested skill's response error

I have followed a tutorial to get this working.
My Alexa skill is built with invocation, intents and utterances set up.
My Lambda function is set up.
My endpoint's default region is:
arn:aws:lambda:us-east-1:(myID found in AWS Support Center):function:myLearn
The myLearn function in Lambda has the Alexa Skills Kit trigger set up with my correct Skill ID copied from the skill.
My HelloIntent doesn't have a slot. I'm just trying to get a response from the invocation.
My code, running on Node.js 6.10 with a handler called index.handler, is as follows:
var Alexa = require("alexa-sdk");
var handlers = {
    "HelloIntent": function () {
        this.response.speak("Hello, It's Me.");
        this.emit(':responseReady');
    },
    "LaunchRequest": function () {
        this.response.speak("Welcome to my thing I got going on.");
        this.emit(':responseReady');
    }
};
exports.handler = function(event, context, callback){
    var alexa = Alexa.handler(event, context);
    alexa.registerHandlers(handlers);
    alexa.execute();
};
I've read that there are issues with zips, but I didn't upload anything - I just changed the default index.js file... and my handler isn't named anything different - it's index.handler.
When I run the test in the Alexa console I get the usual:
"There was a problem with the requested skill's response"
My JSON output is null.
And when I go to my logs in CloudWatch:
Unable to import module 'index': Error at Function.Module._resolveFilename
I did a search for this, and many of the errors were about how users uploaded their zips and there was a conflict between the handler name and the js file.
It looks like you might have created the Lambda function from the AWS Console and not included the alexa-sdk module. To fix this, you can start from one of the provided Alexa blueprints that include the alexa-sdk, then overwrite the blueprint's code with your own. Or you can package your code in a .zip file that includes the alexa-sdk module and upload the package through the web console. Here is a video I did a while back that explains the issue: https://youtu.be/cFzAIhsldbs - I'm pretty sure that's your problem. I hope this helps.
You can also try using a "speechOutput" variable to store your response and then use the emit function.
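For illustration, a hedged sketch of that suggestion against the alexa-sdk v1 handler style used above; the ':tell' emit is the SDK's usual shortcut for speaking a response:
var handlers = {
    "HelloIntent": function () {
        // Store the response text in a variable, then emit it.
        var speechOutput = "Hello, It's Me.";
        this.emit(':tell', speechOutput);
    }
};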

AWS Lambda and AWS API Gateway: How to send a binary file?

I have a Lambda function which fetches a file from S3 using the input key in the event and needs to send it back to the client. I am using the following function to get the file from S3:
// Assumes an S3 client, e.g.:
// const AWS = require('aws-sdk');
// const s3 = new AWS.S3();
function getObject(key){
    var params = {
        Bucket: "my_bucket",
        Key: key
    };
    return new Promise(function (resolve, reject){
        s3.getObject(params, function (err, data){
            if(err){
                return reject(err); // return so we don't also resolve on error
            }
            resolve(data.Body);
        });
    });
}
If I send the resolved value of this promise (a Buffer) to context.succeed, it is displayed as a JSON array on the front end. How can I send it as a file? The files can be either ZIP or HTTP Archive (HAR) files, and the S3 keys contain the appropriate extension. I am guessing it has something to do with the "Integration Response" in API Gateway, but I'm not able to figure out where to change it.
Good news: you can now handle binary input and output for API Gateway (announcement and documentation).
Basically, nothing changes in your Lambda function, but you can now set the contentHandling API Gateway integration property to CONVERT_TO_BINARY.
Unfortunately, the official AWS examples showcase only the HTTP API Gateway backend, as the AWS Lambda support does not seem complete yet. For example, I haven't managed to return gzipped content from AWS Lambda yet, although it should be possible thanks to the new binary support and the $util.base64Decode() mapping utility.
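To make the flow concrete, here is a hedged sketch (not an official AWS example) of what the Lambda side could hand API Gateway: return the S3 object's bytes as a Base64 string, and let an integration response with contentHandling set to CONVERT_TO_BINARY decode it back into the raw file for the client.
// Reuses the getObject(key) helper from the question above.
exports.handler = async (event) => {
    // Assumption: event.key carries the S3 object key.
    const body = await getObject(event.key);
    // API Gateway's CONVERT_TO_BINARY setting decodes this Base64 string
    // back into the original binary file before returning it to the client.
    return body.toString('base64');
};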