GCS: Manually trigger "object created" event - google-cloud-platform

I have a Cloud Function that is triggered when new objects are created in a GCS bucket.
Sometimes things go wrong and the function fails. I know I can enable automatic retry, but it would still be nice to be able to trigger an "object created" event for existing objects during development/debugging. How do I do that?

Example of how to simulate a file upload event trigger:
Use a function that prints the event, so you have a template of what the event data looks like:
def hello_gcs(event, context):
    """Triggered by a change to a Cloud Storage bucket.
    Args:
        event (dict): Event payload.
        context (google.cloud.functions.Context): Metadata for the event.
    """
    print(event)
Get the JSON result from the logs.
Use the Testing tab to resend the same JSON object whenever you want to test (the log result can take some time to appear).
You have to format the JSON before using it, because it must use " instead of '. An online JSON formatter, or a small script like the one below, can do that.
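If you'd rather not use a website, a minimal sketch like this converts the single-quoted dict that print(event) leaves in the logs into valid JSON; the logged string here is a shortened, hypothetical example:
import ast
import json

# Hypothetical, shortened example of what print(event) leaves in the logs.
logged = "{'bucket': 'my-bucket', 'name': 'file.png', 'size': '452941'}"

# Parse the Python dict literal, then re-emit it as double-quoted JSON.
payload = ast.literal_eval(logged)
print(json.dumps(payload, indent=2))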
JSON example:
{
    "bucket": "<bucket name>",
    "contentType": "image/png",
    "crc32c": "a1/tEw==",
    "etag": "9999999999999999/UCEAE=",
    "generation": "9999999999999999",
    "id": "<bucket name>/<file name>",
    "kind": "storage#object",
    "md5Hash": "9999999999999999==",
    "mediaLink": "https://www.googleapis.com/download/storage/v1/b/<bucket name>/o/<file name>?generation=9999999999999999&alt=media",
    "metageneration": "1",
    "name": "Screenshot 2022-02-10 6.09.37 PM.png",
    "selfLink": "https://www.googleapis.com/storage/v1/b/<bucket name>/o/<file name>",
    "size": "452941",
    "storageClass": "STANDARD",
    "timeCreated": "2022-02-11T10:22:01.919Z",
    "timeStorageClassUpdated": "2022-02-11T10:22:01.919Z",
    "updated": "2022-02-11T10:22:01.919Z"
}
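If you want the payload to describe an existing object rather than copying fields by hand, a sketch along these lines can build it from the object's real metadata using the google-cloud-storage client; the bucket and object names are placeholders:
from google.cloud import storage
import json

# Placeholders: point these at an existing bucket/object you want to replay.
client = storage.Client()
blob = client.bucket("my-bucket").get_blob("path/to/existing-file.png")

payload = {
    "kind": "storage#object",
    "id": f"{blob.bucket.name}/{blob.name}",
    "bucket": blob.bucket.name,
    "name": blob.name,
    "contentType": blob.content_type,
    "size": str(blob.size),
    "generation": str(blob.generation),
    "metageneration": str(blob.metageneration),
    "storageClass": blob.storage_class,
    "timeCreated": blob.time_created.isoformat(),
    "updated": blob.updated.isoformat(),
}

# Paste this double-quoted JSON into the Testing tab.
print(json.dumps(payload, indent=2))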

Related

How to add s3 trigger function in the lambda function to read the file

I need to add an S3 trigger to the Lambda function in the source code itself, instead of creating the trigger in the AWS console. I need that trigger to read a file when it is uploaded to a particular folder of an S3 bucket. I have already done this by creating the S3 trigger in the console with the help of a prefix. Can someone help me create this S3 trigger in the Lambda function source code itself? Below is the source code of the Lambda function for reading the file.
import json
import urllib.parse
import boto3

print('Loading function')

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # print("Received event: " + json.dumps(event, indent=2))
    # Get the object from the event and show its content type
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    try:
        response = s3.get_object(Bucket=bucket, Key=key)
        print("CONTENT TYPE: " + response['ContentType'])
        return response['ContentType']
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e
The AWS Lambda function would presumably only be run when the trigger detects a new object.
However, you are asking how to create the trigger from within the AWS Lambda function, yet that function will only run once a trigger exists.
So it is a cart-before-the-horse situation.
Instead, you can create the trigger in Amazon S3 from the AWS Lambda console or via an API call, but this is done as part of the definition of the Lambda function. It can't be done from within the source code of the function, since the function isn't run until the trigger already exists!

How can I trigger a Lambda function using boto3?

I have an S3 bucket, and this is the path where I will upload files: dev/uploads/excel. Now I want to add a trigger to invoke my already-made Lambda function. Is there any specific code I have to run once to enable the trigger for this function using boto3, or do I need to paste it somewhere? I am confused about how it will work.
You need to add an S3 trigger on the Lambda function and handle the S3 event in your code.
To create the S3 trigger, select the Add trigger option in the Lambda console.
Since you want to trigger off new uploads, you can create this event to trigger off the PUT event.
For the prefix, you can add the path you want: dev/uploads/excel
A Lambda example in Python:
import urllib.parse
import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # print("Received event: " + json.dumps(event, indent=2))
    # Get the object from the event and show its content type
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    try:
        response = s3.get_object(Bucket=bucket, Key=key)
        print("CONTENT TYPE: " + response['ContentType'])
        return response['ContentType']
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e
Also, there are a lot of docs explaining this, such as: Tutorial: Using an Amazon S3 trigger to invoke a Lambda function.
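If you do want to create the trigger once from code with boto3 rather than in the console, a sketch along these lines should work; the bucket name and Lambda ARN are placeholders, and the Lambda must already allow S3 to invoke it:
import boto3

s3 = boto3.client('s3')

# Placeholders: replace with your bucket name and your Lambda function's ARN.
s3.put_bucket_notification_configuration(
    Bucket='my-bucket',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [
            {
                'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:111111111111:function:my-function',
                'Events': ['s3:ObjectCreated:Put'],
                # Only fire for keys under the dev/uploads/excel prefix.
                'Filter': {
                    'Key': {
                        'FilterRules': [
                            {'Name': 'prefix', 'Value': 'dev/uploads/excel'}
                        ]
                    }
                },
            }
        ]
    },
)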

Can I create Slack subscriptions to an AWS SNS topic?

I'm trying to create an SNS topic in AWS and subscribe a Lambda function to it that will send notifications to Slack apps/users.
I did read this article -
https://aws.amazon.com/premiumsupport/knowledge-center/sns-lambda-webhooks-chime-slack-teams/
that describes how to do it using this lambda code:
#!/usr/bin/python3.6
import urllib3
import json

http = urllib3.PoolManager()

def lambda_handler(event, context):
    url = "https://hooks.slack.com/services/xxxxxxx"
    msg = {
        "channel": "#CHANNEL_NAME",
        "username": "WEBHOOK_USERNAME",
        "text": event['Records'][0]['Sns']['Message'],
        "icon_emoji": ""
    }
    encoded_msg = json.dumps(msg).encode('utf-8')
    resp = http.request('POST', url, body=encoded_msg)
    print({
        "message": event['Records'][0]['Sns']['Message'],
        "status_code": resp.status,
        "response": resp.data
    })
but the problem is that, in that implementation, I have to create a Lambda function for every user.
I want to subscribe multiple Slack apps/users to one SNS topic.
Is there a way of doing that without creating a lambda function for each one?
You really DON'T need Lambda. SNS and Slack are enough.
I found a way to integrate AWS SNS with Slack WITHOUT AWS Lambda or AWS Chatbot. With this approach you can confirm the subscription easily.
This video shows all the steps clearly:
https://www.youtube.com/watch?v=CszzQcPAqNM
Steps to follow:
1. Create a Slack channel or use an existing one.
2. Create a workflow, selecting Webhook.
3. Create a variable named "SubscribeURL". The name is very important.
4. Add that variable to the message body of the workflow, then publish the workflow and get the URL.
5. Add that URL as a subscription to the SNS topic. You will see the subscription URL in the Slack channel.
6. Follow the URL and complete the subscription.
7. Go back to the workflow and change the "SubscribeURL" variable to "Message".
8. Then publish a message to SNS. You will see the message in the Slack channel.
Hi, I would say you should go for a for loop over a list of all the users. Either state them manually in the Lambda or get them with an API call to Slack, e.g. this one: https://api.slack.com/methods/users.list
#!/usr/bin/python3.6
import urllib3
import json

http = urllib3.PoolManager()

def lambda_handler(event, context):
    userlist = ["name1", "name2"]
    for user in userlist:
        url = "https://hooks.slack.com/services/xxxxxxx"
        msg = {
            "channel": "#" + user,  # not sure if the hash has to be here
            "username": "WEBHOOK_USERNAME",
            "text": event['Records'][0]['Sns']['Message'],
            "icon_emoji": ""
        }
        encoded_msg = json.dumps(msg).encode('utf-8')
        resp = http.request('POST', url, body=encoded_msg)
        print({
            "message": event['Records'][0]['Sns']['Message'],
            "status_code": resp.status,
            "response": resp.data
        })
Another solution is to set up email addresses for the Slack users, see:
https://slack.com/help/articles/206819278-Send-emails-to-Slack
Then you can just add those emails as subscribers to the SNS topic. You can filter the messages each receiver gets with a subscription filter policy.
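For the email route, a rough boto3 sketch of such a subscription with a filter policy could look like this; the topic ARN, the Slack channel's email address, and the message attribute name are all placeholders:
import boto3
import json

sns = boto3.client('sns')

# Placeholders: your topic ARN and the email address Slack generated for the channel.
sns.subscribe(
    TopicArn='arn:aws:sns:us-east-1:111111111111:my-topic',
    Protocol='email',
    Endpoint='my-channel-aaaa1111@example.slack.com',
    Attributes={
        # Only deliver messages published with a matching "team" message attribute.
        'FilterPolicy': json.dumps({'team': ['backend']})
    },
)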

Tagging an EMR cluster via an AWS Lambda triggered by a CloudWatch event rule

I need to catch the RunJobFlow event in my CloudWatch event rule in order to tag starting AWS EMR clusters.
I'm looking for this event because I need the username and account information.
Any idea?
Thanks
Calls to the ListClusters, DescribeCluster, and RunJobFlow actions generate entries in CloudTrail log files.
Every log entry contains information about who generated the request. For example, if a request is made to create and run a new job flow (RunJobFlow), CloudTrail logs the user identity of the person or service that made the request.
https://docs.aws.amazon.com/emr/latest/ManagementGuide/logging_emr_api_calls.html#understanding_emr_log_file_entries
Here is a sample snippet to get the username using Python Boto3.
import boto3

cloudtrail = boto3.client("cloudtrail")

response = cloudtrail.lookup_events(
    LookupAttributes=[
        {
            'AttributeKey': 'EventName',
            'AttributeValue': 'RunJobFlow'
        }
    ],
)

for event in response.get("Events"):
    print(event.get("Username"))
The username and cluster details can be retrieved from the RunJobFlow event itself. An easier solution would be to use a CloudWatch event rule with a Lambda function as a target to fetch this info; further action can then be taken as required. Example below:
Event pattern to be used with the CloudWatch event rule
{
    "source": ["aws.elasticmapreduce"],
    "detail": {
        "eventName": ["RunJobFlow"]
    }
}
Lambda code snippet
def lambda_handler(event, context):
    # print("Received event: " + json.dumps(event, indent=2))
    user = event['detail']['userIdentity']['userName']
    cluster_id = event['detail']['responseElements']['jobFlowId']
    region = event['region']
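The snippet stops before the tagging itself; a sketch of that last step using the EMR add_tags API might look like the following (the tag key 'created_by' is just an example):
import boto3

def lambda_handler(event, context):
    # Pull the launching user and the new cluster id from the RunJobFlow event.
    user = event['detail']['userIdentity']['userName']
    cluster_id = event['detail']['responseElements']['jobFlowId']
    region = event['region']

    # Tag the cluster with the user who launched it.
    emr = boto3.client('emr', region_name=region)
    emr.add_tags(
        ResourceId=cluster_id,
        Tags=[{'Key': 'created_by', 'Value': user}],
    )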

Boto3 - Create S3 'object created' notification to trigger a lambda function

How do I use boto3 to simulate the Add Event Source action on the AWS GUI Console's Event Sources tab?
I want to programmatically create a trigger such that if an object is created in MyBucket, it will call the MyLambda function (qualified with an alias).
The relevant API call that I see in the boto3 documentation is create_event_source_mapping, but it states explicitly that it is only for the AWS pull model, while I think that S3 belongs to the push model. Anyway, I tried using it but it didn't work.
Scenarios:
Passing a prefix filter would be nice too.
I was looking at the wrong side. This is configured on the S3 side:
import boto3

s3 = boto3.resource('s3')
bucket_name = 'mybucket'
bucket_notification = s3.BucketNotification(bucket_name)

response = bucket_notification.put(
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [
            {
                'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:033333333:function:mylambda:staging',
                'Events': [
                    's3:ObjectCreated:*'
                ],
            },
        ]
    }
)
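Note that S3 will reject the notification configuration unless the Lambda function's resource policy allows S3 to invoke it. If you want to set that up with boto3 as well, a sketch like the following could be run first; the statement id, account id, and bucket name are placeholders:
import boto3

lambda_client = boto3.client('lambda')

# Placeholders: function name/alias, statement id, bucket ARN, and owning account.
lambda_client.add_permission(
    FunctionName='mylambda:staging',
    StatementId='allow-s3-invoke',
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::mybucket',
    SourceAccount='111111111111',
)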