Suddenly, the Lambda functions in my AWS account are missing, and when I try to create new functions, the "Create function" button keeps loading.
The AWS Cloud9 UI changed a lot and I cannot find how to create a Lambda function from it. What I have done is create a Lambda function and upload the Lambda from the folder, but it doesn't show anything. How do I do it?
I'm new to AWS and trying to create a Lambda function that creates a launch template daily. We have a Lambda function that currently creates daily AMIs of the EC2 instances we are running. Is there any way to automatically run a Lambda function daily that creates a launch template from the latest AMI as soon as the daily AMI creation has completed, and also deletes the old launch templates after a certain number of days?
So you create an event rule as described in this question: is-it-possible-to-get-or-generate-event-for-ami-availability-without-polling.
Then hook the Lambda function that creates the launch template to that event.
I checked the other solution for the CreateImage and RegisterImage events; it seems they are triggered immediately, as soon as you make the call.
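For reference, an EventBridge (CloudWatch Events) pattern matching those two EC2 API calls, assuming the rule matches them as recorded by CloudTrail, might look like:

```json
{
  "source": ["aws.ec2"],
  "detail-type": ["AWS API Call via CloudTrail"],
  "detail": {
    "eventSource": ["ec2.amazonaws.com"],
    "eventName": ["CreateImage", "RegisterImage"]
  }
}
```

Note this fires when the API call is made, not when the AMI actually becomes available, which is why the linked question's approach matters if you need to wait for availability.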
I want to create a Lambda function using Java or Python so that whenever a new S3 bucket is created, it enables default encryption (AES256) and server-access logging on that bucket, if they were not enabled when the bucket was created.
You can use a CloudWatch Events rule here.
Go to CloudWatch and, under Events, select Rules.
Create a new rule.
Select Event Pattern, specific operation(s), and then select CreateBucket.
In the next column, click Add target and select the Lambda function you want to trigger (create one if you don't already have it).
Now write your Lambda in Java and use the SDK to make whatever changes you require.
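If you go the Python route instead, a minimal sketch of such a handler could look like the following. The log-bucket name is hypothetical, and the event shape assumes a CloudTrail-based CreateBucket rule as set up above:

```python
LOG_BUCKET = "my-access-logs-bucket"  # hypothetical pre-existing bucket for access logs


def bucket_name_from_event(event):
    """Pull the new bucket's name out of a CloudTrail-based CreateBucket event."""
    return event["detail"]["requestParameters"]["bucketName"]


def lambda_handler(event, context):
    import boto3  # imported lazily so the module also loads without the AWS SDK

    bucket = bucket_name_from_event(event)
    s3 = boto3.client("s3")

    # Turn on default SSE-S3 (AES256) encryption for the new bucket.
    s3.put_bucket_encryption(
        Bucket=bucket,
        ServerSideEncryptionConfiguration={
            "Rules": [
                {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
            ]
        },
    )

    # Turn on server-access logging, delivered to the pre-existing log bucket.
    s3.put_bucket_logging(
        Bucket=bucket,
        BucketLoggingStatus={
            "LoggingEnabled": {"TargetBucket": LOG_BUCKET, "TargetPrefix": f"{bucket}/"}
        },
    )
    return {"bucket": bucket}
```

The Lambda's execution role will also need `s3:PutEncryptionConfiguration` and `s3:PutBucketLogging` permissions for this to succeed.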
I'm using CloudFormation to create Lambda resources, and I have several Python scripts as well as one JS script in a lambda-resources folder. Would it be okay to pass the same file location for every Lambda function and just specify unique handlers? When my stack is created, it looks like it only creates one Lambda function.
Yes, this is definitely one way to accomplish what you're looking to do.
You'll need to create a zipped version of your lambda-resources folder and upload it via the Lambda service (or to S3), then reference it as the code location for each Lambda function.
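A sketch of what that can look like in a CloudFormation template, with hypothetical artifact, handler, and role names, where both functions point at the same zipped artifact but declare different handlers:

```yaml
Resources:
  PythonWorker:
    Type: AWS::Lambda::Function
    Properties:
      Runtime: python3.12
      Handler: worker.handler            # worker.py inside lambda-resources.zip
      Code:
        S3Bucket: my-artifacts-bucket    # hypothetical
        S3Key: lambda-resources.zip
      Role: !GetAtt LambdaRole.Arn

  NodeWorker:
    Type: AWS::Lambda::Function
    Properties:
      Runtime: nodejs18.x
      Handler: index.handler             # index.js in the same zip
      Code:
        S3Bucket: my-artifacts-bucket
        S3Key: lambda-resources.zip
      Role: !GetAtt LambdaRole.Arn
```

Each function gets its own resource (and CloudWatch log group); only the deployment artifact is shared.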
I've been working with CloudFormation YAML for a while and have found it to be comprehensive - until now. I'm struggling to use SAM/CloudFormation to create a Lambda function that is triggered whenever an object is added to an existing S3 bucket.
All of the examples I've seen so far seem to require that you create the bucket in the same CloudFormation script that creates the Lambda function. This doesn't work for me, because we have a design goal of using CloudFormation to redeploy our entire stack to different regions or AWS accounts and quickly stand up our application. S3 bucket names must be globally unique, so if I create the bucket in CloudFormation, the script will break when I try to deploy it to a different region/account. I could probably get around this by including the account name/region in the bucket name, but that's just not desirable from a bucket-sprawl perspective.
So, does anyone have a solution for creating a Lambda function in CloudFormation that is triggered by objects being written to an existing S3 bucket?
Thanks!
According to the SAM team, this is impossible: it is something the underlying CloudFormation service can't do.
There is a possible workaround: implement a custom resource that triggers a separate Lambda function to modify the existing bucket and link it to the Lambda function you want to deploy.
Since "implement a custom resource" isn't very specific: there is an AWS GitHub repo with scaffold code to help write it, and then you declare something like the following in your template (where LambdaToBucket is the custom-resource function you wrote). I've found that you need to configure two things in that function: a bucket notification configuration on the bucket (telling S3 to notify Lambda about changes), and a Lambda permission on the target function (allowing invocations from S3).
Resources:
  JoinLambdaToBucket:
    Type: Custom::JoinLambdaToExistingBucket
    Properties:
      ServiceToken: !GetAtt LambdaToBucket.Arn
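For completeness, the custom-resource handler itself might be sketched as below, covering the two configurations mentioned above. The property and statement names are hypothetical, and it assumes the `cfnresponse` helper module is available (AWS bundles it for inline CloudFormation functions):

```python
def notification_config(function_arn):
    """Build the bucket-notification payload that routes new-object events to a function."""
    return {
        "LambdaFunctionConfigurations": [
            {"LambdaFunctionArn": function_arn, "Events": ["s3:ObjectCreated:*"]}
        ]
    }


def handler(event, context):
    # boto3 and cfnresponse are imported lazily so the module loads outside Lambda too.
    import boto3
    import cfnresponse

    try:
        props = event["ResourceProperties"]
        bucket = props["BucketName"]        # hypothetical property names, passed
        function_arn = props["FunctionArn"]  # in from the template alongside ServiceToken

        if event["RequestType"] in ("Create", "Update"):
            # 1. Lambda permission: allow S3 to invoke the target function.
            boto3.client("lambda").add_permission(
                FunctionName=function_arn,
                StatementId="AllowS3Invoke",
                Action="lambda:InvokeFunction",
                Principal="s3.amazonaws.com",
                SourceArn=f"arn:aws:s3:::{bucket}",
            )
            # 2. Bucket notification: tell the existing bucket to notify the function.
            boto3.client("s3").put_bucket_notification_configuration(
                Bucket=bucket,
                NotificationConfiguration=notification_config(function_arn),
            )
        cfnresponse.send(event, context, cfnresponse.SUCCESS, {})
    except Exception as exc:
        cfnresponse.send(event, context, cfnresponse.FAILED, {"Error": str(exc)})
```

A real handler would also need to clean up on Delete and handle any notification configuration the bucket already has (the put call replaces the whole configuration), but this shows the shape of the workaround.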