I have a requirement to launch a number of Lambda functions on the ObjectCreated event in a number of S3 buckets. But the architecture of my application requires modularity, so I have to create two different templates: one for my bucket creation and another for the Lambdas. As I see it, one way to achieve this is by using the SNS service.
SNS
We create the SNS topic in the bucket-creation template and feed the ObjectCreated event into it through the NotificationConfiguration property of the S3 bucket. In the Lambda template we can subscribe the Lambda to the above-mentioned SNS topic, and the Lambda function will be called on the S3 ObjectCreated event.
But again, the architecture does not allow using SNS.
Possible way
Is it at all possible to do this without using SNS and without compromising on modularity, for example by making two separate templates for buckets and Lambdas and using their notification configuration in a third template to complete the chain?
Final Question
Given that I cannot use SNS and I want modularity, how can I call my Lambda functions on the S3 event? Is it even possible with my restrictions?
Thank you.
You could trigger your functions straight from S3 using events in the bucket properties. http://docs.aws.amazon.com/lambda/latest/dg/with-s3-example.html
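The S3-direct approach can still be kept modular across CloudFormation templates by exporting the function ARN from the Lambda stack and importing it in the bucket stack. A minimal sketch, where all resource, export, and bucket names are placeholders (the Lambda stack must be deployed first, since the notification requires the invoke permission to exist):

```yaml
# --- Lambda stack: exports the function ARN and grants S3 invoke permission ---
Resources:
  ProcessorFunction:
    Type: AWS::Lambda::Function
    Properties: { ... }            # code, role, runtime as usual
  S3InvokePermission:
    Type: AWS::Lambda::Permission
    Properties:
      FunctionName: !Ref ProcessorFunction
      Action: lambda:InvokeFunction
      Principal: s3.amazonaws.com
      SourceArn: arn:aws:s3:::my-upload-bucket
Outputs:
  ProcessorFunctionArn:
    Value: !GetAtt ProcessorFunction.Arn
    Export: { Name: ProcessorFunctionArn }

# --- Bucket stack: wires ObjectCreated to the imported function ARN ---
Resources:
  UploadBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: my-upload-bucket
      NotificationConfiguration:
        LambdaConfigurations:
          - Event: s3:ObjectCreated:*
            Function: !ImportValue ProcessorFunctionArn
```

This keeps the two templates independent without any SNS topic in between.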
You could also use a CloudWatch Event Rule to trigger your functions. To do so:
Go to your AWS Console and select Services > CloudWatch.
Select Rules under Events on the left.
Select Create Rule.
Leave Event Pattern selected.
Select Simple Storage Service (S3) from Service Name drop down.
Select Object Level Operations from Event Type drop down.
Select Specific operation(s).
Select PutObject from drop down.
Select Specific bucket(s) by name.
Enter bucket names.
Select + Add target on the right.
Select Lambda function to trigger.
Select Configure details at the bottom of the page.
Enter a rule name.
Finish by selecting Create rule.
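The console steps above amount to roughly this event pattern (the bucket name is a placeholder); note that object-level S3 API calls only reach CloudWatch Events if a CloudTrail trail is recording data events for the bucket:

```json
{
  "source": ["aws.s3"],
  "detail-type": ["AWS API Call via CloudTrail"],
  "detail": {
    "eventSource": ["s3.amazonaws.com"],
    "eventName": ["PutObject"],
    "requestParameters": { "bucketName": ["my-upload-bucket"] }
  }
}
```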
Related
I need to fetch information from a Lambda function and remove an existing trigger (EventBridge) from this Lambda using the CLI (a script needs to do that).
I tried to use list-event-source-mappings and delete-event-source-mapping, but without success.
It seems like EventBridge isn't supported yet (it only shows me SQS, Kinesis, DynamoDB, MQ, MSK), but maybe I am wrong and there is a solution?
Edit:
I have a working Lambda function with an associated trigger from an EventBridge rule that was already deleted in the past. The rule no longer exists in my account, but I still see it under my Lambda's triggers (the console even says that this rule cannot be found any more because it is deleted). I want to CLEAN it up using the CLI. I wish to DELETE the association (trigger) from my Lambda, not to delete the EventBridge TARGET, which is the Lambda itself.
The APIs you are looking for are in the EventBridge events client:
aws events list-rule-names-by-target --target-arn <lambda-arn>
aws events list-targets-by-rule --rule <rule-name-from-previous>
aws events remove-targets --rule <rule-name-from-previous> --ids <target-id-from-previous>
Note: The terminology is a bit confusing. An Event Source Mapping is the technical term for the particular polling-type Lambda integration pattern that handles the sources you mention. It is not related to EventBridge events.
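A scripted version of that three-step cleanup might look like the following boto3 sketch. This is an untested outline (the helper name is mine, and the calls need AWS credentials to actually run):

```python
def ids_for_arn(targets, target_arn):
    """Pick the target Ids whose Arn matches the Lambda we want detached."""
    return [t["Id"] for t in targets if t["Arn"] == target_arn]

def detach_lambda_from_rules(lambda_arn):
    import boto3  # imported lazily; the calls below need AWS credentials
    events = boto3.client("events")
    # Find every rule that still points at this Lambda...
    for rule in events.list_rule_names_by_target(TargetArn=lambda_arn)["RuleNames"]:
        targets = events.list_targets_by_rule(Rule=rule)["Targets"]
        ids = ids_for_arn(targets, lambda_arn)
        # ...and remove only the targets that reference it.
        if ids:
            events.remove_targets(Rule=rule, Ids=ids)
```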
You should be able to use events command:
aws events list-rule-names-by-target --target-arn <target_arn>
This will list the names of the rules that are associated with the specified target_arn. You can then use the aws events describe-rule command to get more information about each rule, including its ARN, schedule, and event pattern.
aws events describe-rule --name <rule_name>
Now to remove a trigger for a Lambda function in EventBridge:
aws events remove-targets --rule <rule_name> --ids <target_id>
The target_id is the unique identifier for the trigger that you want to remove, and the rule_name is the name of the rule that the trigger is associated with.
I have a DynamoDB table that triggers a Lambda function by enabling DynamoDB Streams. This was set up from the DynamoDB console. I would, however, like to be able to point the trigger to a specific version/alias of the Lambda function. Most other AWS services allow you to specify the Lambda ARN, where you can tag the version or alias on at the end, like arn:aws:lambda:<region>:<account-id>:function:<function-name>:<version-or-alias>.
However, when adding a trigger to the DynamoDB table, it only allows you to select the Lambda function name from a list, and there seems to be no way to use a version/alias.
There also does not seem to be a CLI/API command to do the same.
Has anyone had any success doing this?
We can attach a different trigger to each alias we have for a Lambda function. In order to do this, we just have to go to the Lambda console, select our function, and create a new alias.
After the alias is created, we will have the option to attach new triggers:
On this page we just have to press the + Add trigger button and search for DynamoDB. After we select DynamoDB, we are prompted to select the table for which we have the stream:
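The same wiring can also be done from the API rather than the console: the Lambda APIs accept an alias-qualified function name. A hedged boto3 sketch, with the names and the stream ARN as placeholders (untested against a live account):

```python
def qualified_name(function_name, alias):
    """Lambda APIs accept "name:alias" to target a specific alias."""
    return f"{function_name}:{alias}"

def attach_stream_to_alias(stream_arn, function_name, alias):
    import boto3  # imported lazily; the call needs AWS credentials
    return boto3.client("lambda").create_event_source_mapping(
        EventSourceArn=stream_arn,
        FunctionName=qualified_name(function_name, alias),
        StartingPosition="LATEST",  # required for DynamoDB streams
    )
```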
There is another way that might be useful for you as well, which is as follows.
Open the AWS Console, search for the DynamoDB service, and open it.
Click your table and click on the Triggers option.
You can then see the Lambdas which are linked with the stream. Click on the Lambda whose version/alias you want to change, then click the Edit/Test trigger button.
You will be redirected to the Lambda service page, where you can deploy the specific version of the Lambda you need. The stream will then call that specific version.
Short Way:
Open the AWS Console, search for the Lambda service, and open your Lambda function which is triggered by the stream.
Deploy the specific version as you need.
Hope that might be helpful for you!
I am new to AWS Lambda.
I would like to seek advice from the experts here.
I understand that Lambda is activated based on a trigger.
If I wanted to send a timed HTTP request (for example, send an HTTP request 4 hours later), are there any recommendations on how to do it?
Yes, you can configure scheduled AWS Lambda triggers using CloudWatch Events.
Tutorial: Schedule AWS Lambda Functions Using CloudWatch Events
To create a rule using the console
Open the CloudWatch console at https://console.aws.amazon.com/cloudwatch/
In the navigation pane, choose Events, Create rule.
For Event Source, do the following:
a. Choose Schedule.
b. Choose Fixed rate of and specify the schedule interval (for example, 5 minutes).
For Targets, choose Add target, Lambda function.
For Function, select the Lambda function that you created.
Choose Configure details.
For Rule definition, type a name and description for the rule.
Choose Create rule.
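For a one-off call N hours later (rather than a repeating interval), one approach is to compute a single-occurrence cron expression for the rule's schedule. A sketch, assuming the standard CloudWatch/EventBridge cron fields (minutes, hours, day-of-month, month, day-of-week, year, all in UTC):

```python
from datetime import datetime, timedelta

def one_shot_cron(run_at):
    """Build a cron() expression that fires once, at run_at (UTC)."""
    return (f"cron({run_at.minute} {run_at.hour} {run_at.day} "
            f"{run_at.month} ? {run_at.year})")

# e.g. fire 4 hours from now; the rule can be deleted after it fires
print(one_shot_cron(datetime.utcnow() + timedelta(hours=4)))
```

The resulting expression is what you would paste into the rule's Schedule field (or pass to PutRule) instead of a fixed rate.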
I want to create a Lambda function using Java or Python so that whenever a new S3 bucket gets created, it enables default encryption (AES256) and server-access logging on that bucket, if they were not enabled when the bucket was created.
You can use a CloudWatch Event rule here.
Go to CloudWatch and, under Events, select Rules.
Create a new rule.
Select Event Pattern, then Specific operation(s), and select CreateBucket.
Now in the next column you can click Add target and select the Lambda you want to trigger (create one if you don't already have it).
Now write your Lambda using Java and use the SDK to make whatever changes you require.
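As a rough sketch of such a handler (in Python, which the question also allows): it assumes the rule delivers the CloudTrail-style CreateBucket event, and my-log-bucket is a placeholder for your logging target. Untested against a live account:

```python
def bucket_from_event(event):
    """CloudTrail-style CreateBucket events carry the bucket name here."""
    return event["detail"]["requestParameters"]["bucketName"]

def handler(event, context):
    import boto3  # imported lazily; the calls below need AWS credentials
    bucket = bucket_from_event(event)
    s3 = boto3.client("s3")
    # Turn on AES256 default encryption for the new bucket.
    s3.put_bucket_encryption(
        Bucket=bucket,
        ServerSideEncryptionConfiguration={
            "Rules": [{"ApplyServerSideEncryptionByDefault":
                       {"SSEAlgorithm": "AES256"}}]
        },
    )
    # Turn on server-access logging to a central log bucket (placeholder).
    s3.put_bucket_logging(
        Bucket=bucket,
        BucketLoggingStatus={"LoggingEnabled": {
            "TargetBucket": "my-log-bucket",
            "TargetPrefix": f"{bucket}/",
        }},
    )
```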
I need to start a Lambda Function when an object has been created on an S3 Bucket. I found 2 solutions to do this.
Using AWS::S3::Bucket NotificationConfiguration.
Using a CloudWatch AWS::Events::Rule.
They both seem to do exactly the same thing, which is to track specific changes and launch a Lambda function when they happen. I could not find any information on which one should be used. I'm using a CloudFormation template to provision the Lambda, the S3 bucket, and the trigger.
Which one should I use to call a Lambda on Object level changes and why?
Use the first one, because:
A push model is much better than a pull model. Push means you send data when you get it, instead of polling something at some interval. This is the era of push notifications all around us: you don't check Facebook every 5 minutes to see whether someone has liked your picture or replied to your comment.
In terms of cost and efforts also, S3 event notification wins the race.
CloudWatch would have been the best option if you didn't have S3 notifications, but since you have them, that's the best choice. Besides, if you have the feature in the service itself, why would you go for an alternative solution like CloudWatch rules?