I have created an S3 bucket and made it public using a SAM template. Is there any way I could upload objects to the bucket from the template?
I'm not familiar with SAM, but I know that CloudFormation cannot populate the contents of a bucket.
One workaround is to create a CloudFormation Custom Resource, which triggers an AWS Lambda function during the stack deployment. The Lambda function can then copy files, such as copying them between S3 buckets.
I have written such a function. If you do it well, you can pass a list of the files to copy as parameters within the CloudFormation template, so that the same function can be reused across templates. Writing your first Custom Resource function can be challenging, since it needs to 'return' differently from normal functions: instead of simply returning a value, it must report success or failure to a pre-signed S3 URL that CloudFormation supplies in the event.
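For illustration, here is a minimal sketch of such a function in Python. The property names (SourceBucket, DestBucket, Files) are assumptions that would have to match whatever the template declares on the custom resource; the key part is the PUT to the event's ResponseURL.

```python
# Minimal sketch of a Lambda-backed custom resource that copies files.
# Property names (SourceBucket, DestBucket, Files) are illustrative; they
# must match whatever you declare on the custom resource in the template.
import json
import urllib.request

import boto3

s3 = boto3.client("s3")

def send_response(event, context, status, reason=""):
    # Custom resources don't "return" normally: CloudFormation waits for a
    # JSON document to be PUT to the pre-signed ResponseURL in the event.
    body = json.dumps({
        "Status": status,                      # "SUCCESS" or "FAILED"
        "Reason": reason or f"See {context.log_stream_name}",
        "PhysicalResourceId": context.log_stream_name,
        "StackId": event["StackId"],
        "RequestId": event["RequestId"],
        "LogicalResourceId": event["LogicalResourceId"],
    }).encode()
    req = urllib.request.Request(event["ResponseURL"], data=body, method="PUT")
    req.add_header("Content-Type", "")
    urllib.request.urlopen(req)

def handler(event, context):
    try:
        props = event["ResourceProperties"]
        if event["RequestType"] in ("Create", "Update"):
            for key in props["Files"]:
                s3.copy_object(
                    Bucket=props["DestBucket"],
                    Key=key,
                    CopySource={"Bucket": props["SourceBucket"], "Key": key},
                )
        send_response(event, context, "SUCCESS")
    except Exception as exc:
        send_response(event, context, "FAILED", reason=str(exc))
```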
Related
I have a Lambda function created using the existing blueprint "s3-get-object".
Using the AWS Lambda Java SDK, I am trying to get the S3 bucket name associated with the Lambda function.
I used the LambdaAsyncClient.listFunctionEventInvokeConfigs(...) method, but that did not help. Please advise how to achieve this.
I need to create an Amazon S3 bucket and a Lambda function using CloudFormation. I have the jar file on my local machine. If I write resources for the S3 bucket and the Lambda function in a single template, I have to provide the S3 bucket and key in the Lambda resource. Stack creation fails, as the jar file doesn't exist in the bucket. So, does this mean that I have to create a bucket separately using a template, upload the jar file, and then create a Lambda function using another template?
Is there a way to create both the resources using one template?
For some resource properties that require an Amazon S3 location (a bucket name and filename), you can specify local references. Instead of manually uploading the files to an S3 bucket and then adding the location to your template, you can specify local artifacts in your template and then use the aws cloudformation package command to quickly upload them.
You can find more info here: Uploading Local Artifacts to an S3 Bucket
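As a sketch, a Lambda resource can reference the local jar directly and let the package command upload it and rewrite the reference. The paths and names below are illustrative:

```yaml
# Sketch of a template fragment using a local artifact (paths illustrative).
Resources:
  MyFunction:
    Type: AWS::Lambda::Function
    Properties:
      Handler: com.example.Handler::handleRequest  # illustrative handler
      Runtime: java11
      Role: !GetAtt MyFunctionRole.Arn             # role defined elsewhere
      Code: ./target/function.jar                  # local path, no S3Bucket/S3Key
# `aws cloudformation package --template-file template.yaml \
#    --s3-bucket my-artifact-bucket --output-template-file packaged.yaml`
# uploads the jar and rewrites Code to the resulting S3 location.
```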
So, does this mean that I have to create a bucket separately using a template, upload the jar file, and then create a lambda function using another template?
Yes and no.
Normally when you create a bucket it will be empty. You can't populate it with vanilla CloudFormation; normally you would have to upload the jar file manually (e.g. using the AWS CLI, an SDK, or the console).
For a more advanced solution that keeps everything inside CloudFormation, you would have to look at creating your own Custom Resource, which would upload the jar file for you as part of the stack. For this, your jar would need to be available online so that it can be downloaded into your bucket.
So if you are just starting with CloudFormation, creating a custom resource will probably be too difficult at first.
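If you do go down that path later, the core of the Create handler could be as small as the sketch below. The URL, bucket, and key are illustrative, and the custom-resource response signalling (described in the answers above) is omitted:

```python
# Sketch of the Create-handler core for such a custom resource: it streams a
# publicly available jar into the bucket. Names/URL are illustrative, and the
# ResponseURL plumbing is omitted for brevity.
import urllib.request

import boto3

s3 = boto3.client("s3")

def copy_jar_into_bucket(bucket, key, source_url):
    with urllib.request.urlopen(source_url) as resp:
        # upload_fileobj streams the download straight into S3
        s3.upload_fileobj(resp, bucket, key)

# e.g. copy_jar_into_bucket("my-artifact-bucket", "app/function.jar",
#                           "https://example.com/releases/function.jar")
```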
I am creating S3 buckets using AWS SAM and I want them to be populated with files after the SAM deploy. Is there a way to populate the S3 buckets with files by default from SAM? An idea I had was to trigger a Lambda when the SAM application is deployed that could populate the bucket.
Look into using a CloudFormation Custom Resource.
This allows you to invoke your own code (i.e. a Lambda function) during CloudFormation stack creation, update, and deletion events. I have seen people use this to populate an S3 bucket, as well as to ensure all files are deleted from the bucket on teardown (since CloudFormation will fail to delete a bucket that still has files in it).
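The delete case is the part people often miss. A hedged sketch of how such a function might handle it in Python (the BucketName property is illustrative, and the usual custom-resource response signalling is omitted):

```python
# Sketch of the Delete branch of such a custom resource: empty the bucket so
# CloudFormation can delete it. The BucketName property is illustrative, and
# the ResponseURL signalling shown in other answers is omitted here.
import boto3

s3 = boto3.resource("s3")

def handler(event, context):
    if event["RequestType"] == "Delete":
        bucket = s3.Bucket(event["ResourceProperties"]["BucketName"])
        bucket.objects.all().delete()  # batch-deletes every object
    # ... handle Create/Update (e.g. populate the bucket), then signal
    # SUCCESS/FAILED to event["ResponseURL"] as usual.
```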
Is there a way to add a trigger to a Lambda function in CloudFormation for S3 events, where the S3 bucket already exists (i.e., it is not created by said template)?
I have tried to find an example of this online, but it appears that the only way to set this trigger in CloudFormation is by using the bucket's notification configuration.
CloudFormation cannot do this directly. However, CloudFormation Custom Resources can call Lambda functions, and Lambda functions can do whatever you program them to do. You could write a Lambda function which creates or deletes some resource based on whatever logic you want.
See more:
AWS Lambda-backed Custom Resources - AWS CloudFormation
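For instance, the custom resource's Lambda could attach the notification to the pre-existing bucket itself. This is a sketch under assumed names, not something CloudFormation provides natively:

```python
# Sketch: code a custom resource could run to wire an existing bucket to a
# Lambda function. Bucket name and function ARN are illustrative. The target
# function also needs a resource policy letting S3 invoke it (see
# lambda add_permission with Principal s3.amazonaws.com).
import boto3

s3 = boto3.client("s3")

def add_s3_trigger(bucket_name, function_arn):
    # Note: this call REPLACES the bucket's whole notification configuration,
    # so merge with the existing configuration in real code.
    s3.put_bucket_notification_configuration(
        Bucket=bucket_name,
        NotificationConfiguration={
            "LambdaFunctionConfigurations": [{
                "LambdaFunctionArn": function_arn,
                "Events": ["s3:ObjectCreated:*"],
            }]
        },
    )
```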
I'm trying to create a Lambda function that will be triggered by any change made to any bucket in the S3 console. Is there a way to tie all create events from every bucket in S3 to my Lambda function?
It appears that in the creation of a Lambda function, you can only select one S3 bucket. Is there a way to do this programmatically, if not in the Lambda console?
There is at least one way: you can set up an S3 event notification for each bucket you want to monitor, all pointing to a single SQS queue.
That SQS queue can then be the event source for your lambda function.
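As a sketch of the wiring with boto3 (the queue ARN and bucket names are placeholders, and the queue's access policy must also allow S3 to send messages):

```python
# Sketch: point several buckets' create events at one SQS queue, which is
# then configured as the Lambda's event source. Names/ARNs are illustrative.
import boto3

s3 = boto3.client("s3")

QUEUE_ARN = "arn:aws:sqs:us-east-1:123456789012:s3-events"  # illustrative

for bucket in ["bucket-a", "bucket-b", "bucket-c"]:         # illustrative
    s3.put_bucket_notification_configuration(
        Bucket=bucket,
        NotificationConfiguration={
            "QueueConfigurations": [{
                "QueueArn": QUEUE_ARN,
                "Events": ["s3:ObjectCreated:*"],
            }]
        },
    )

# The queue is then attached to the Lambda, e.g. with
# boto3.client("lambda").create_event_source_mapping(
#     EventSourceArn=QUEUE_ARN, FunctionName="my-function")
```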
If you are uploading to S3 with an AWS SDK, there is a workaround: set up an API Gateway endpoint that triggers the Lambda whenever an upload to S3 succeeds.
By passing the bucket name and object key to the Lambda, you can also specify the destination bucket dynamically.
This will also be helpful with nested prefixes, e.g.
bucket/users/avatars/user1.jpg
bucket/users/avatars/thumbnails/user1-thumb.jpg
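A rough sketch of that client-side flow in Python; the API Gateway URL is a placeholder:

```python
# Sketch: upload to S3 with the SDK, then notify a Lambda through an API
# Gateway endpoint once the upload has succeeded. The URL is a placeholder.
import json
import urllib.request

import boto3

s3 = boto3.client("s3")
ENDPOINT = "https://example.execute-api.us-east-1.amazonaws.com/prod/uploaded"

def upload_and_notify(bucket, key, path):
    s3.upload_file(path, bucket, key)  # raises on failure
    body = json.dumps({"bucket": bucket, "key": key}).encode()
    req = urllib.request.Request(
        ENDPOINT, data=body,
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)        # POSTs bucket/key to the Lambda

# e.g. upload_and_notify("bucket", "users/avatars/user1.jpg", "./user1.jpg")
```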
Yes, you can. Assuming you only want to trigger the Lambda when new objects are created in a few buckets, you can do that via the AWS Console, the CLI, boto3, or another SDK.
If new buckets are created over time and you also want to add them as event sources for the Lambda, you can create a CloudTrail API event source to trigger another Lambda that programmatically adds these new buckets as event sources for the original Lambda.
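A hedged sketch of that second Lambda in Python, assuming a CloudTrail CreateBucket event delivered via an EventBridge rule; the target function ARN is a placeholder:

```python
# Sketch: Lambda triggered by a CloudTrail CreateBucket event (via an
# EventBridge rule) that adds the new bucket as an event source for the
# original Lambda. TARGET_ARN is illustrative.
import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

TARGET_ARN = "arn:aws:lambda:us-east-1:123456789012:function:original"

def handler(event, context):
    bucket = event["detail"]["requestParameters"]["bucketName"]
    # Allow S3 to invoke the original function from this bucket...
    lambda_client.add_permission(
        FunctionName=TARGET_ARN,
        StatementId=f"s3-{bucket}",
        Action="lambda:InvokeFunction",
        Principal="s3.amazonaws.com",
        SourceArn=f"arn:aws:s3:::{bucket}",
    )
    # ...then point the bucket's create events at it (this replaces the
    # bucket's existing notification configuration, so merge in real code).
    s3.put_bucket_notification_configuration(
        Bucket=bucket,
        NotificationConfiguration={
            "LambdaFunctionConfigurations": [{
                "LambdaFunctionArn": TARGET_ARN,
                "Events": ["s3:ObjectCreated:*"],
            }]
        },
    )
```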