How to trigger a specific AWS Lambda version from AWS DynamoDB Streams

I have a DynamoDB table that triggers a Lambda function by enabling DynamoDB Streams. This was set up from the DynamoDB console. I would, however, like to be able to point the trigger to a specific version/alias of the Lambda function. Most other AWS services allow you to specify the Lambda ARN, where you can tag the version or alias onto the end, like arn:aws:lambda:<region>:<account-id>:function:<function-name>:<version/alias>.
However, when adding a trigger to the DynamoDB table, it only allows you to select the Lambda function name from a list, and there seems to be no way to use a version/alias.
There also does not seem to be a CLI/API command to do the same.
Has anyone had any success doing this?

We can attach a different trigger to each alias we have for a Lambda function. To do this, we just have to go to the Lambda console, select our function, and create a new alias.
After the alias is created, we will have the option to attach new triggers.
On this page we just have to press the + Add trigger button and search for DynamoDB. After we select DynamoDB, we are prompted to select the table for which we have the stream enabled.
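If you would rather not click through the console, the same wiring can be scripted: a DynamoDB stream is connected to Lambda through an event source mapping, and the function you pass to it can be qualified with a version or alias. Below is a minimal boto3 sketch; the stream ARN, function name, and PROD alias are placeholders for your own values.

```python
import boto3

lambda_client = boto3.client("lambda")

# Placeholders: replace with your own stream ARN, function name, and alias.
STREAM_ARN = "arn:aws:dynamodb:us-east-1:123456789012:table/MyTable/stream/2020-01-01T00:00:00.000"
FUNCTION_WITH_ALIAS = "my-function:PROD"  # "name:alias" or a fully qualified function ARN

# Create the mapping between the DynamoDB stream and the aliased function.
response = lambda_client.create_event_source_mapping(
    EventSourceArn=STREAM_ARN,
    FunctionName=FUNCTION_WITH_ALIAS,
    StartingPosition="LATEST",
    BatchSize=100,
)
print(response["UUID"])  # keep this UUID if you later want to update or delete the mapping
```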

Another way that might be useful for you is the following.
Open the AWS Console, search for the DynamoDB service, and open it.
Click your table and click on the Triggers option.
There you can see the Lambdas that are linked to the stream. Click the Lambda whose version/alias you want to change, then click the Edit/Test trigger button.
You will be redirected to the Lambda service page, where you can deploy the specific version of the Lambda you want. The stream will then call that specific version.
Short way:
Open the AWS Console, search for the Lambda service, and open the function that is triggered by the stream.
Deploy the specific version you need.
Hope that helps!

Related

Enable/disable or delete an AWS Lambda trigger on s3:ObjectCreated:* using Python

I have a Lambda function that triggers whenever a new object is created (s3:ObjectCreated:*) in a bucket A-prod. This Lambda processes the file and saves the result in B-prod.
Now I want to restore this data on QA. I am writing a script for this, and I don't want to process the files again. In my case I will copy B-prod to B-qa and A-prod to A-qa via a Python script.
It would be great if I could disable the Lambda trigger temporarily, but that is not possible, as AWS no longer allows disabling it (the option is greyed out in the AWS console).
I can delete the trigger from the AWS console, but I can't find how to do it from Python or the CLI.
(Screenshot: Lambda function trigger in the AWS console)
"can't find how to do it from python or cli."
In boto3 you use put_bucket_notification_configuration to provide an empty notification configuration, which removes the trigger.
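As a rough sketch of what that could look like (the bucket name is a placeholder): save the current configuration, overwrite it with an empty one while the copy script runs, then put the saved configuration back.

```python
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "A-prod"  # placeholder bucket name

# Save the current notification configuration so it can be restored later.
current = s3.get_bucket_notification_configuration(Bucket=BUCKET)
current.pop("ResponseMetadata", None)
with open("notification_backup.json", "w") as f:
    json.dump(current, f)

# Disable the trigger by writing an empty configuration.
s3.put_bucket_notification_configuration(Bucket=BUCKET, NotificationConfiguration={})

# ... run the copy script here ...

# Restore the original configuration.
with open("notification_backup.json") as f:
    saved = json.load(f)
s3.put_bucket_notification_configuration(Bucket=BUCKET, NotificationConfiguration=saved)
```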
For anyone looking for a temporary disable from AWS console without deleting the trigger, here's a work-around. Note that this is not meant as a permanent solution. I had to disable temporarily and it works.
Go to S3 > Buckets > your-bucket > Properties
Scroll down to "Event Notifications"
You should see your Lambda Function in Destination,
Choose Edit,
Uncheck "All Object Create Events",
Check another event type that will not occur in your process, such as "Object Tagging",
Save.
When finished, re-enable the trigger by checking "All Object Create Events" again and unchecking the other event.

Trigger Lambda function when S3 buckets get created

I want to create a Lambda function using Java or Python so that whenever a new S3 bucket gets created, it enables default encryption (AES256) and server access logging on that bucket, if they were not enabled when the bucket was created.
You can use a CloudWatch Events rule here.
Go to CloudWatch and, under Events, select Rules.
Create a new rule.
Select Event Pattern, then Specific operation(s), and then select CreateBucket.
In the next column, click Add target and select the Lambda you want to trigger (create one if you don't already have it).
Now write your Lambda using Java and use the SDK to make whatever changes you require.
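For illustration, here is a minimal Python sketch of such a handler (the question allows Java or Python). It assumes the rule matches CreateBucket calls recorded by CloudTrail, so the bucket name arrives in requestParameters; the log-target bucket is a placeholder.

```python
import boto3

s3 = boto3.client("s3")
LOG_TARGET_BUCKET = "my-access-logs-bucket"  # placeholder: bucket that receives access logs

def lambda_handler(event, context):
    # For CloudTrail-based CloudWatch Events, the new bucket name is in requestParameters.
    bucket = event["detail"]["requestParameters"]["bucketName"]

    # Enable default encryption (SSE-S3 / AES256).
    s3.put_bucket_encryption(
        Bucket=bucket,
        ServerSideEncryptionConfiguration={
            "Rules": [
                {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
            ]
        },
    )

    # Enable server access logging into the placeholder target bucket.
    s3.put_bucket_logging(
        Bucket=bucket,
        BucketLoggingStatus={
            "LoggingEnabled": {
                "TargetBucket": LOG_TARGET_BUCKET,
                "TargetPrefix": f"{bucket}/",
            }
        },
    )
    return {"bucket": bucket, "encryption": "AES256", "logging": "enabled"}
```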

Invoke AWS Lambda function that is Written in Django App

I have written some cron jobs in my Django app and I want to schedule these jobs using the AWS Lambda service. Can someone please recommend a good approach to get this done?
I will answer this based on the question's topic rather than the body, since I am not sure what the OP means by "I want to schedule these jobs using AWS Lambda".
If all you want is to trigger your Lambda function from a cron-like schedule, you can use CloudWatch Events. You can specify regular cron expressions or the built-in rate expressions AWS provides; for example, rate(1 minute) will run your function every minute. The docs show how to trigger a Lambda function via CloudWatch Events; see the cron/rate documentation for all the available options.
CloudWatch Events is only one of many options to trigger your Lambda function. Your function can react to a whole range of AWS events, including S3, SQS, SNS, API Gateway, etc. The full list of supported event sources is in the docs. Just pick one that fits your needs and you are good to go.
EDIT AFTER OP'S UPDATE:
Yes, what you're looking for is CloudWatch Events. Once you have the Lambda that polls your database in place, you can just create a rule in CloudWatch Events and have your Lambda be triggered by it. The steps are outlined below, followed by a boto3 sketch of the same setup.
Go to CloudWatch, click on Events, and choose Schedule as the event source
(make sure to set up your own cron expression or select one of the pre-defined rate values).
On the right-hand side, choose your Lambda function accordingly.
Click on "Configure Details" when you are done, give the rule a name, leave the "Enabled" box checked, and finally click on Create.
Go back to your Lambda function and you should see that it is now triggered by CloudWatch Events (shown in the column on the left-hand side).
Your Lambda is now configured properly and will execute once a day.
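If you prefer scripting over the console, here is a rough boto3 equivalent of the steps above; the rule name, schedule expression, and function ARN are placeholders.

```python
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

RULE_NAME = "daily-django-jobs"                                               # placeholder
FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:my-cron-fn"    # placeholder

# Create (or update) a scheduled rule that fires once a day.
rule = events.put_rule(Name=RULE_NAME, ScheduleExpression="rate(1 day)", State="ENABLED")

# Allow CloudWatch Events to invoke the function.
lambda_client.add_permission(
    FunctionName=FUNCTION_ARN,
    StatementId=f"{RULE_NAME}-invoke",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)

# Point the rule at the function.
events.put_targets(Rule=RULE_NAME, Targets=[{"Id": "1", "Arn": FUNCTION_ARN}])
```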

executing lambda on s3 bucket ObjectCreated event in cloudformation

I have a requirement to launch a number of Lambda functions on the ObjectCreated event in a number of S3 buckets. But the architecture of my application requires modularity, so I have to create two different templates: one for the bucket creation and another for the Lambdas. One way to achieve this is by using the SNS service.
SNS
We create the SNS topic in the bucket-creation template and feed the ObjectCreated event to it through the NotificationConfiguration property of the S3 bucket. In the Lambda template we can subscribe the Lambda to the above-mentioned SNS topic, and the Lambda function will be called on the S3 ObjectCreated event.
But again, the architecture does not allow using SNS.
Possible way
Is it at all possible to do this without using SNS and without compromising on modularity, e.g. by making two separate templates for buckets and Lambdas and using their notification configuration in a third template to complete the chain?
Final Question
I cannot use SNS and I want modularity. How can I call my Lambda functions on the S3 event? Is it even possible with my restrictions?
Thank you.
You could trigger your functions straight from S3 using events in the bucket properties. http://docs.aws.amazon.com/lambda/latest/dg/with-s3-example.html
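As a rough illustration of that approach outside of CloudFormation, here is a boto3 sketch that wires a bucket's ObjectCreated events directly to a function (no SNS in between); the bucket name, function ARN, and statement ID are placeholders.

```python
import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

BUCKET = "my-modular-bucket"                                                          # placeholder
FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:on-object-created"     # placeholder

# Let S3 invoke the function for events coming from this bucket.
lambda_client.add_permission(
    FunctionName=FUNCTION_ARN,
    StatementId="s3-object-created-invoke",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn=f"arn:aws:s3:::{BUCKET}",
)

# Point the bucket's ObjectCreated notifications straight at the function.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {"LambdaFunctionArn": FUNCTION_ARN, "Events": ["s3:ObjectCreated:*"]}
        ]
    },
)
```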
You could also use a CloudWatch Event Rule to trigger your functions (a boto3 sketch of the same rule follows these steps). To do so:
Go to your AWS Console and select Services > CloudWatch.
Select Rules under Events on the left.
Select Create Rule.
Leave Event Pattern selected.
Select Simple Storage Service (S3) from Service Name drop down.
Select Object Level Operations from Event Type drop down.
Select Specific operation(s).
Select PutObject from drop down.
Select Specific bucket(s) by name.
Enter bucket names.
Select + Add target on the right.
Select Lambda function to trigger.
Select Configure details at the bottom of the page.
Enter a rule name.
Finish by selecting Create rule.
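Scripted, the same rule might look roughly like the boto3 sketch below. Note that object-level S3 operations only reach CloudWatch Events if CloudTrail data events are enabled for the bucket; the rule name, bucket, and function ARN are placeholders, and the Lambda permission grant is the same add_permission call shown in the scheduled-rule sketch earlier.

```python
import json
import boto3

events = boto3.client("events")

RULE_NAME = "s3-putobject-rule"                                                       # placeholder
BUCKET = "my-modular-bucket"                                                          # placeholder
FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:on-object-created"     # placeholder

# Event pattern matching PutObject calls on the bucket, as recorded by CloudTrail.
pattern = {
    "source": ["aws.s3"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["s3.amazonaws.com"],
        "eventName": ["PutObject"],
        "requestParameters": {"bucketName": [BUCKET]},
    },
}

events.put_rule(Name=RULE_NAME, EventPattern=json.dumps(pattern), State="ENABLED")

# Attach the Lambda as the rule's target (remember to grant events.amazonaws.com
# permission to invoke it, as in the earlier scheduled-rule example).
events.put_targets(Rule=RULE_NAME, Targets=[{"Id": "1", "Arn": FUNCTION_ARN}])
```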

Run AWS Lambda code when creating a new AWS EC2 instance

I'd like to run some code using Lambda on the event that I create a new EC2 instance. Looking at the blueprint config-rule-change-triggered, I have the ability to run code depending on various configuration changes, but not when an instance is created. Is there a way to do what I want? Or have I misunderstood the use case of Lambda?
We had a similar requirement a couple of days back (users were supposed to get emails whenever a new instance was launched).
1) Go to CloudWatch, then select Rules.
2) Select the service name (EC2 in your case), then select "EC2 Instance State-change Notification".
3) Then select "pending" in the "Specific state" dropdown.
4) Click on the Add target option and select your Lambda function.
That's it; whenever a new instance gets launched, CloudWatch will trigger your Lambda function (a sketch of the function side follows below).
Hope it helps!
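A minimal Python sketch of the Lambda side, assuming the rule above is the event source and that an SNS topic (placeholder ARN) delivers the email notification.

```python
import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:new-instance-alerts"  # placeholder topic

def lambda_handler(event, context):
    # EC2 Instance State-change Notification events carry the instance id and state in "detail".
    instance_id = event["detail"]["instance-id"]
    state = event["detail"]["state"]

    sns.publish(
        TopicArn=TOPIC_ARN,
        Subject="New EC2 instance launched",
        Message=f"Instance {instance_id} entered state '{state}'.",
    )
    return {"instance_id": instance_id, "state": state}
```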
You could do this by inserting code into your EC2 instance launch user data and having that code explicitly invoke a Lambda function, but that's not the best way to do it.
A better way is to use a combination of CloudTrail and Lambda. If you enable CloudTrail logging (every account should have this enabled, all the time, in all regions), CloudTrail will log to S3 all of the API calls made in your account. You then connect this to Lambda by configuring S3 to publish events to Lambda. Your Lambda function will receive an S3 event, can then retrieve the API logs, find RunInstances API calls, and do whatever work you need as a consequence of the new instance being launched.
I don't see a notification trigger for instance startup; however, what you can do is write a startup script and pass it in via user data. That startup script would need to download and install the AWS CLI, then authenticate to SNS and publish a message to a pre-configured topic. The startup script would authenticate to SNS and whatever other AWS services are needed via your IAM role, so you would need to give the IAM role permission to do whatever you want the script to do. This can be done in the IAM console.
That topic would then have your Lambda function subscribed to it, which would then execute. This is similar to the article below (though the author is doing something similar for shutdown, not startup).
http://rogueleaderr.com/post/48795010760/how-to-notifyemail-yourself-when-an-ec2-instance
If you are putting the EC2 instances into an autoscale group, I believe there is a trigger that gets fired when the autoscale group launches a new instance, so you could take advantage of that.
I hope that helps.
If you have CloudTrail enabled, then you can have S3 PutObject events on the trail bucket trigger a Lambda function. The Lambda function parses the object that is passed to it and, if it finds a RunInstances event, runs your code.
I do exactly this to notify certain users when a new instance is launched. With Lambda/Python, it is ~20 lines of code.
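For reference, a rough Python sketch of that kind of handler; the trail bucket and object key come from the S3 event, and the action taken on a match is left as a placeholder.

```python
import gzip
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # The S3 event points at the newly written CloudTrail log object (gzipped JSON).
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    trail = json.loads(gzip.decompress(body))

    # Look for RunInstances calls and act on them (placeholder action: just log them).
    for entry in trail.get("Records", []):
        if entry.get("eventName") == "RunInstances":
            print("New instance launched:", json.dumps(entry.get("responseElements", {})))
```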