Tagging an EMR cluster via an AWS Lambda function triggered by a CloudWatch event rule - amazon-web-services

I need to catch the RunJobFlow event in my CloudWatch event rule in order to tag AWS EMR clusters as they start.
I'm looking for this event because I need the username and account information.
Any idea?
Thanks

Calls to the ListClusters, DescribeCluster, and RunJobFlow actions generate entries in CloudTrail log files.
Every log entry contains information about who generated the request. For example, if a request is made to create and run a new job flow (RunJobFlow), CloudTrail logs the user identity of the person or service that made the request.
https://docs.aws.amazon.com/emr/latest/ManagementGuide/logging_emr_api_calls.html#understanding_emr_log_file_entries
Here is a sample snippet to get the username using Python Boto3.
import boto3

cloudtrail = boto3.client("cloudtrail")

response = cloudtrail.lookup_events(
    LookupAttributes=[
        {
            'AttributeKey': 'EventName',
            'AttributeValue': 'RunJobFlow'
        }
    ],
)

for event in response.get("Events"):
    print(event.get("Username"))
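If there are many RunJobFlow calls, lookup_events returns results in pages; a paginator can walk through all of them. A minimal sketch, assuming default credentials and region:

import boto3

cloudtrail = boto3.client("cloudtrail")

# Page through all RunJobFlow events rather than only the first batch.
paginator = cloudtrail.get_paginator("lookup_events")
for page in paginator.paginate(
    LookupAttributes=[
        {"AttributeKey": "EventName", "AttributeValue": "RunJobFlow"}
    ]
):
    for event in page["Events"]:
        print(event.get("Username"), event.get("EventId"))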

The username and cluster details can be retrieved from the RunJobFlow event itself. An easier solution is to use a CloudWatch event rule with a Lambda function as the target to fetch this information; further action can then be taken as required. Example below:
Event Pattern to be used with Cloudwatch event rule
{
    "source": ["aws.elasticmapreduce"],
    "detail": {
        "eventName": ["RunJobFlow"]
    }
}
Lambda code snippet
def lambda_handler(event, context):
    # print("Received event: " + json.dumps(event, indent=2))
    user = event['detail']['userIdentity']['userName']
    cluster_id = event['detail']['responseElements']['jobFlowId']
    region = event['region']
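To actually tag the starting cluster, the extracted values can then be passed to the EMR AddTags API. A minimal sketch of that last step, assuming the handler above; the tag key "CreatedBy" is an illustrative choice, not something mandated by EMR:

import boto3

def tag_cluster(cluster_id, user, region):
    # Tag the newly started cluster with the name of the user who launched it.
    emr = boto3.client('emr', region_name=region)
    emr.add_tags(
        ResourceId=cluster_id,
        Tags=[
            {'Key': 'CreatedBy', 'Value': user}  # illustrative tag key
        ]
    )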

Related

GCS: Manually trigger "object created" event

I have a Cloud Function that is being triggered by new objects creation in a GCS bucket.
Sometimes things go wrong and the GCF fails. I know that I can enable automatic retry. Still, it would be nice to be able to trigger the "object created" event for existing objects during development/debugging. How do I do that?
Example of how to simulate a file upload event trigger:
Use a function to print the event payload, so you have a template of what the event data looks like:
def hello_gcs(event, context):
    """Triggered by a change to a Cloud Storage bucket.
    Args:
        event (dict): Event payload.
        context (google.cloud.functions.Context): Metadata for the event.
    """
    print(event)
Get the JSON result from the logs.
Use the Testing tab to resend the same JSON object whenever you want to test (log results can take some time).
You have to format the JSON before using it, because it must use double quotes (") instead of single quotes ('); an online JSON formatter can help.
JSON example:
{
    "bucket": "<bucket name>",
    "contentType": "image/png",
    "crc32c": "a1/tEw==",
    "etag": "9999999999999999/UCEAE=",
    "generation": "9999999999999999",
    "id": "<bucket name>/<file name>",
    "kind": "storage#object",
    "md5Hash": "9999999999999999==",
    "mediaLink": "https://www.googleapis.com/download/storage/v1/b/<bucket name>/o/<file name>?generation=9999999999999999&alt=media",
    "metageneration": "1",
    "name": "Screenshot 2022-02-10 6.09.37 PM.png",
    "selfLink": "https://www.googleapis.com/storage/v1/b/<bucket name>/o/<file name>",
    "size": "452941",
    "storageClass": "STANDARD",
    "timeCreated": "2022-02-11T10:22:01.919Z",
    "timeStorageClassUpdated": "2022-02-11T10:22:01.919Z",
    "updated": "2022-02-11T10:22:01.919Z"
}
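If you would rather build that payload programmatically from an existing object (for example, to replay old files through the function), a minimal sketch using the google-cloud-storage client could look like the following; the bucket and object names are placeholders, and only the fields the function actually reads need to be filled in:

import json
from google.cloud import storage

def build_gcs_event(bucket_name, blob_name):
    # Fetch the object's metadata and shape it like an "object created" event payload.
    client = storage.Client()
    blob = client.bucket(bucket_name).get_blob(blob_name)
    return {
        "kind": "storage#object",
        "bucket": bucket_name,
        "name": blob.name,
        "contentType": blob.content_type,
        "size": str(blob.size),
        "generation": str(blob.generation),
        "metageneration": str(blob.metageneration),
        "storageClass": blob.storage_class,
        "timeCreated": blob.time_created.isoformat(),
        "updated": blob.updated.isoformat(),
        "md5Hash": blob.md5_hash,
        "crc32c": blob.crc32c,
        "etag": blob.etag,
    }

# Paste the printed JSON into the Cloud Functions Testing tab.
print(json.dumps(build_gcs_event("my-bucket", "my-file.png"), indent=2))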

Can I create Slack subscriptions to an AWS SNS topic?

I'm trying to create an SNS topic in AWS and subscribe a Lambda function to it that will send notifications to Slack apps/users.
I read this article:
https://aws.amazon.com/premiumsupport/knowledge-center/sns-lambda-webhooks-chime-slack-teams/
It describes how to do it using this Lambda code:
#!/usr/bin/python3.6
import urllib3
import json

http = urllib3.PoolManager()

def lambda_handler(event, context):
    url = "https://hooks.slack.com/services/xxxxxxx"
    msg = {
        "channel": "#CHANNEL_NAME",
        "username": "WEBHOOK_USERNAME",
        "text": event['Records'][0]['Sns']['Message'],
        "icon_emoji": ""
    }
    encoded_msg = json.dumps(msg).encode('utf-8')
    resp = http.request('POST', url, body=encoded_msg)
    print({
        "message": event['Records'][0]['Sns']['Message'],
        "status_code": resp.status,
        "response": resp.data
    })
but the problem with that implementation is that I have to create a Lambda function for every user.
I want to subscribe multiple Slack apps/users to one SNS topic.
Is there a way of doing that without creating a lambda function for each one?
You really DON'T need Lambda. SNS and Slack alone are enough.
I found a way to integrate AWS SNS with Slack WITHOUT AWS Lambda or AWS Chatbot. With this approach you can confirm the subscription easily.
The following video shows all the steps clearly.
https://www.youtube.com/watch?v=CszzQcPAqNM
Steps to follow:
Create a Slack channel or use an existing one.
Create a workflow, selecting Webhook as the trigger.
Create a variable named "SubscribeURL". The name is very important.
Add that variable to the message body of the workflow, then publish the workflow and get the URL.
Add that URL as a subscription to the SNS topic (a boto3 sketch follows these steps). You will see the subscription URL in the Slack channel.
Follow the URL and complete the subscription.
Go back to the workflow and change the "SubscribeURL" variable to "Message".
Then publish a message to SNS. You will see the message in the Slack channel.
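If you prefer to create the subscription programmatically rather than in the console, a minimal boto3 sketch could look like this; the topic ARN and webhook URL are placeholders:

import boto3

sns = boto3.client('sns')

# Subscribe the Slack workflow webhook (an HTTPS endpoint) to the topic.
sns.subscribe(
    TopicArn='arn:aws:sns:us-east-1:123456789012:my-topic',   # placeholder ARN
    Protocol='https',
    Endpoint='https://hooks.slack.com/workflows/xxxxxxx',     # placeholder webhook URL
)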
Hi, I would say you should use a for loop and make a list of all the users. Either state them manually in the Lambda, or get them with an API call to Slack, e.g. this one: https://api.slack.com/methods/users.list
#!/usr/bin/python3.6
import urllib3
import json

http = urllib3.PoolManager()

def lambda_handler(event, context):
    userlist = ["name1", "name2"]
    for user in userlist:
        url = "https://hooks.slack.com/services/xxxxxxx"
        msg = {
            "channel": "#" + user,  # not sure if the hash has to be here
            "username": "WEBHOOK_USERNAME",
            "text": event['Records'][0]['Sns']['Message'],
            "icon_emoji": ""
        }
        encoded_msg = json.dumps(msg).encode('utf-8')
        resp = http.request('POST', url, body=encoded_msg)
        print({
            "message": event['Records'][0]['Sns']['Message'],
            "status_code": resp.status,
            "response": resp.data
        })
Another solution is to set up email addresses for the Slack users; see:
https://slack.com/help/articles/206819278-Send-emails-to-Slack
Then you can just add the email addresses as subscribers to the SNS topic. You can filter the messages each receiver gets with a subscription filter policy.
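A minimal boto3 sketch of such an email subscription with a filter policy; the topic ARN, email address, and the "team" message attribute are illustrative placeholders:

import json
import boto3

sns = boto3.client('sns')

# Subscribe a Slack channel's email address and only deliver messages whose
# "team" message attribute equals "platform" (illustrative filter policy).
sns.subscribe(
    TopicArn='arn:aws:sns:us-east-1:123456789012:my-topic',      # placeholder ARN
    Protocol='email',
    Endpoint='my-channel-aaaa1111@myworkspace.slack.com',        # placeholder address
    Attributes={
        'FilterPolicy': json.dumps({'team': ['platform']})
    },
)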

Can we integrate AWS Lambda with AWS IoT and perform operations in IoT using this Lambda function?

The scenario is like this:
I have a microservice which invokes a Lambda function whose role will be to delete things from AWS IoT.
Is there a way I can perform operations in AWS IoT using the Lambda function?
Any article or blog regarding this would be a huge help, as I'm not able to find any integration document on the web.
I found a way to do several operations in AWS IoT from a Lambda function by means of the boto3 APIs.
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/iot.html#IoT.Client.delete_thing_group
The above link describes the APIs that help in this case.
A sample Lambda function script to delete a thing from IoT is:
import json
import boto3

def lambda_handler(event, context):
    thing_name = event['thing_name']
    delete_thing(thing_name=thing_name)

def delete_thing(thing_name):
    c_iot = boto3.client('iot')
    print(" DELETING {}".format(thing_name))
    try:
        r_principals = c_iot.list_thing_principals(thingName=thing_name)
    except Exception as e:
        print("ERROR listing thing principals: {}".format(e))
        r_principals = {'principals': []}
    print("r_principals: {}".format(r_principals))
    for arn in r_principals['principals']:
        cert_id = arn.split('/')[1]
        print(" arn: {} cert_id: {}".format(arn, cert_id))
        r_detach_thing = c_iot.detach_thing_principal(thingName=thing_name, principal=arn)
        print("DETACH THING: {}".format(r_detach_thing))
        r_upd_cert = c_iot.update_certificate(certificateId=cert_id, newStatus='INACTIVE')
        print("INACTIVE: {}".format(r_upd_cert))
        r_del_cert = c_iot.delete_certificate(certificateId=cert_id, forceDelete=True)
        print(" DEL CERT: {}".format(r_del_cert))
    r_del_thing = c_iot.delete_thing(thingName=thing_name)
    print(" DELETE THING: {}\n".format(r_del_thing))
And the input for this Lambda function will be:
{
    "thing_name": "MyIotThing"
}
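For completeness, the microservice could invoke this function with boto3 as follows; the function name here is a placeholder:

import json
import boto3

lambda_client = boto3.client('lambda')

# Synchronously invoke the cleanup function with the thing to delete.
response = lambda_client.invoke(
    FunctionName='delete-iot-thing',            # placeholder function name
    InvocationType='RequestResponse',
    Payload=json.dumps({'thing_name': 'MyIotThing'}),
)
print(response['StatusCode'])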

Extracting EC2InstanceId from SNS/SQS Auto Scaling message

I'm using Python Boto3 code. When an instance is terminated from an Auto Scaling group, it notifies SNS, which publishes the message to SQS. A Lambda function is also triggered when SNS is notified; it executes a Boto3 script to grab the message from SQS.
I am using reference code from Sending and Receiving Messages in Amazon SQS.
Here is the code snippet:
if messages.get('Messages'):
    m = messages.get('Messages')[0]
    body = m['Body']
    print('Received and deleted message: %s' % body)
The result is:
START RequestId: 1234-xxxxxxxx Version: $LATEST
{
"Type" : "Notification",
"MessageId" : "d1234xxxxxx",
"TopicArn" : "arn:aws:sns:us-east-1:xxxxxxxxxx:AutoScale-Topic",
"Subject" : "Auto Scaling: termination for group \"ASG\"",
"Message" : "{\"Progress\":50,\"AccountId\":\"xxxxxxxxx\",\"Description\":\"Terminating EC2 instance: i-123456\",\"RequestId\":\"db-xxxxx\",\"EndTime\":\"2017-07-13T22:17:19.678Z\",\"AutoScalingGroupARN\":\"arn:aws:autoscaling:us-east-1:360695249386:autoScalingGroup:fef71649-b184xxxxxx:autoScalingGroupName/ASG\",\"ActivityId\":\"db123xx\",\"EC2InstanceId\":\"i-123456\",\"StatusCode\"\"}",
"Timestamp" : "2017-07-",
"SignatureVersion" : "1",
"Signature" : "",
"SigningCertURL" : "https://sns.us-east-1.amazonaws.com/..",
"UnsubscribeURL" : "https://sns.us-east-1.amazonaws.com/
}
I only need the EC2InstanceId of the terminated instance, not the whole message. How can I extract the ID?
If your goal is to execute an AWS Lambda function (having the EC2 Instance ID as a parameter), there is no need to also publish the message to an Amazon SQS queue. In fact, this would be unreliable because you cannot guarantee that the message being retrieved from the SQS queue matches the invocation of your Lambda function.
Fortunately, when Auto Scaling sends an event to SNS and SNS then triggers a Lambda function, SNS passes the necessary information directly to the Lambda function.
Start your Lambda function with this code (or similar):
import json

def lambda_handler(event, context):
    # Dump the event to the log, for debugging purposes
    print("Received event: " + json.dumps(event, indent=2))

    # Extract the EC2 instance ID from the Auto Scaling event notification
    message = event['Records'][0]['Sns']['Message']
    autoscalingInfo = json.loads(message)
    ec2InstanceId = autoscalingInfo['EC2InstanceId']
Your code then has the EC2 Instance ID, without having to use Amazon SQS.
The instance id is in the message. It's raw JSON, so you can parse it with the json package and get the information.
import json

if messages.get('Messages'):
    m = messages.get('Messages')[0]
    body = json.loads(m['Body'])
    notification_message = json.loads(body["Message"])
    print('instance id is: %s' % notification_message["EC2InstanceId"])

Boto3 - Create S3 'object created' notification to trigger a lambda function

How do I use boto3 to simulate the Add Event Source action on the AWS GUI Console in the Event Sources tab?
I want to programmatically create a trigger such that if an object is created in MyBucket, it will call MyLambda function (qualified with an alias).
The relevant API call that I see in the Boto3 documentation is create_event_source_mapping, but it states explicitly that it is only for the AWS pull model, while I think that S3 belongs to the push model. Anyway, I tried using it, but it didn't work.
Passing a prefix filter would be nice too.
I was looking at the wrong side. This is configured on S3:
import boto3

s3 = boto3.resource('s3')
bucket_name = 'mybucket'
bucket_notification = s3.BucketNotification(bucket_name)
response = bucket_notification.put(
    NotificationConfiguration={'LambdaFunctionConfigurations': [
        {
            'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:033333333:function:mylambda:staging',
            'Events': [
                's3:ObjectCreated:*'
            ],
        },
    ]}
)
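For the prefix filter mentioned in the question, a 'Filter' key can be added to the same configuration; note also that the Lambda function needs a resource policy allowing S3 to invoke it, or the put call will be rejected. A minimal sketch, where the statement ID and the 'uploads/' prefix are illustrative placeholders:

import boto3

bucket_name = 'mybucket'
lambda_arn = 'arn:aws:lambda:us-east-1:033333333:function:mylambda:staging'

# Allow S3 to invoke the Lambda function before configuring the notification.
lambda_client = boto3.client('lambda')
lambda_client.add_permission(
    FunctionName=lambda_arn,
    StatementId='s3-invoke-mylambda',            # placeholder statement id
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::' + bucket_name,
)

# Same notification configuration as above, restricted to a key prefix.
s3 = boto3.resource('s3')
s3.BucketNotification(bucket_name).put(
    NotificationConfiguration={'LambdaFunctionConfigurations': [
        {
            'LambdaFunctionArn': lambda_arn,
            'Events': ['s3:ObjectCreated:*'],
            'Filter': {
                'Key': {
                    'FilterRules': [
                        {'Name': 'prefix', 'Value': 'uploads/'}  # placeholder prefix
                    ]
                }
            },
        },
    ]}
)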