Delete old versions of Lambda functions uploaded through the CloudFormation package command

I am using aws cloudformation package to automatically package and upload Lambda function code to an S3 bucket.
When I upload a new version of the Lambda function, previous objects remain in the S3 bucket. Is there any way to automatically remove "old" versions when uploading a new one?

Sadly, there is no way to do this with just the aws cloudformation package command. You would have to chain it with another command (such as aws s3api delete-object) to delete the previous versions yourself.
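For example, here is a minimal sketch that keeps only the most recently uploaded artifact; the bucket name my-artifacts-bucket is illustrative, and this assumes the bucket holds nothing but these packaged artifacts:

# List every key except the newest one, then delete each of them
aws s3api list-objects-v2 --bucket my-artifacts-bucket \
    --query 'sort_by(Contents, &LastModified)[:-1].Key' --output text |
tr '\t' '\n' |
while read -r key; do
    aws s3api delete-object --bucket my-artifacts-bucket --key "$key"
done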

Related

How to use CloudFormation to update AWS Glue Jobs

We have many AWS Glue jobs and we only update the job code, which consists of scripts stored in S3.
The problem is that CloudFormation cannot tell when to update our Glue jobs and when not to, because all of the CloudFormation template's parameters remain the same after a script change; the script location still points to the same S3 object.
You can use the CloudFormation package command. It lets you reference local files in your Git repository as the scripts for Glue jobs; you just run the package command each time before you deploy to CloudFormation, as sketched below.
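A minimal sketch of that workflow (the template, bucket, and stack names are illustrative). Because package uploads the local script under a content-hash key, a changed script produces a new ScriptLocation, which is what lets CloudFormation detect the update:

# Upload local ScriptLocation files to S3 and rewrite the template
aws cloudformation package \
    --template-file glue-template.yaml \
    --s3-bucket my-artifacts-bucket \
    --output-template-file packaged.yaml
# Deploy the rewritten template
aws cloudformation deploy \
    --template-file packaged.yaml \
    --stack-name my-glue-stack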
As this is similar to packaging Lambda code, you can parameterize the Glue script path and keep different versions of the Glue script file.
The Glue resource in the CloudFormation template has a "Command" property that takes a "JobCommand" value, which includes a "ScriptLocation" attribute; make that a template parameter so the script location is dynamic:
{
"Name" : String,
"PythonVersion" : String,
"ScriptLocation" : String
}
You can probably set up a CI/CD pipeline using AWS CodePipeline or a third-party CI/CD tool with the steps below:
Pull the new code from your SCM (such as GitHub) and deploy it to S3
Update the CloudFormation stack with the new S3 script path (using versions like v1, v2, etc.), as in the sketch after this list
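That update step might look roughly like this (the parameter name ScriptLocation and the other names are hypothetical and must match your template):

# Point the stack at the new script version
aws cloudformation deploy \
    --template-file glue-template.yaml \
    --stack-name my-glue-stack \
    --parameter-overrides ScriptLocation=s3://my-script-bucket/scripts/v2/job.py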

How to use AWS CLI within AWS Lambda?

I want to copy data from an S3 bucket in one account and Region to another account and Region, which is why I want a Lambda function, triggered by a new entry in the source S3 bucket, to use the AWS CLI to run aws s3 sync.
So I tried using the techniques given here: https://bezdelev.com/hacking/aws-cli-inside-lambda-layer-aws-s3-sync/
Basically
Install AWS CLI in a local virtual environment
Package the AWS CLI and all its dependencies into a zip file
Create a Lambda layer (roughly as in the sketch below)
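Roughly, those packaging steps look like this (directory and layer names are illustrative, and the layout is only an approximation of the linked article's):

# Install the CLI into a virtual environment
python3 -m venv venv
source venv/bin/activate
pip install awscli
# Copy the CLI and its dependencies into the python/ layout a layer expects
mkdir -p layer/python
cp -r venv/lib/python*/site-packages/* layer/python/
(cd layer && zip -r ../awscli-layer.zip .)
# Publish the zip as a Lambda layer
aws lambda publish-layer-version \
    --layer-name awscli \
    --zip-file fileb://awscli-layer.zip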
However, even after I add the layer, I still see the error ModuleNotFoundError: No module named 'awscli'.

How to create a CloudFormation template from a SAM project?

I am trying to convert a SAM project to a CloudFormation template in order to call
cloudformation.createStack()
to create multiple stacks when a Lambda is invoked. So far I can upload the SAM project with
sam build
sam package
But the S3 upload is too big and I am getting errors. What are the steps to correctly upload the CloudFormation template?
These prerequisites need to be met before continuing:
Install the SAM CLI.
Create an Amazon S3 bucket to store the serverless code artifacts that the SAM template generates. At a minimum, you will need permission to put objects into the bucket.
The permissions applied to your IAM identity must include iam:ListPolicies.
You must have AWS credentials configured either via the AWS CLI or in your shell's environment via the AWS_* environment variables.
Git installed.
Python 3.x installed.
(Optional) Install Python's virtualenvwrapper.
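With those prerequisites in place, a minimal flow looks roughly like this (the bucket and stack names are illustrative). Note that the packaged template still carries the SAM transform, so creating a stack from it needs the CAPABILITY_AUTO_EXPAND (and usually CAPABILITY_IAM) capability:

sam build
sam package \
    --s3-bucket my-artifacts-bucket \
    --output-template-file packaged.yaml
# packaged.yaml can now be used with createStack / create-stack
aws cloudformation create-stack \
    --stack-name my-stack \
    --template-body file://packaged.yaml \
    --capabilities CAPABILITY_IAM CAPABILITY_AUTO_EXPAND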
Source: https://www.packetmischief.ca/2020/12/30/converting-from-aws-sam-to-cloudformation/

How to write an AWS CodePipeline only for updated files?

I am very new to AWS.
I have a task where I need to write a CodePipeline that copies files from CodeCommit to an S3 bucket, but the condition is that only updated/deleted files should get copied. I have already written a pipeline, but it copies all of the files to the S3 bucket.
I tried writing AWS Lambda + Python code, but it behaves the same way. I used the chunk of code from here:
https://medium.com/@michael.niedermayr/using-aws-codecommit-and-lambda-for-automatic-code-deployment-to-s3-bucket-b35aa83d029b
Any help or suggestions appreciated.
Instead of using the S3 deploy action to copy your files, use a CodeBuild action in your CodePipeline.
In the CodeBuild buildspec, use the aws s3 sync command with the --delete switch to sync files to the S3 bucket, as in the sketch below. Make sure the CodeBuild service role has permission to write to the S3 bucket.
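For example, the key command in the buildspec's build phase might be (the bucket name is illustrative):

# Mirror the checked-out source into the bucket; --delete also removes
# objects that no longer exist in the source
aws s3 sync . s3://my-deploy-bucket --delete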

Why does sam package publish the artifacts to a bucket?

As part of packaging the SAM application, the application is published to an S3 bucket, as shown below:
sam package --template-file sam.yaml --s3-bucket mybucket --output-template-file output.yaml
Why does sam package provide the --s3-bucket option? Is this option mandatory?
What is the purpose of publishing the artifacts to an S3 bucket?
The --s3-bucket option of the sam package command is mandatory. What the command does is take your local code, upload it to S3, and return a transformed template in which the source location of your local code has been replaced by the URI of the object (the zipped code) in the S3 bucket.
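Roughly, the command automates manual steps like these (the bucket name and key are illustrative):

# Zip the function code and upload it to S3
zip -r function.zip src/
aws s3 cp function.zip s3://mybucket/3f1e2d
# sam package then writes output.yaml, which is sam.yaml with the local
# CodeUri path replaced by s3://mybucket/3f1e2d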
The main advantage of uploading the artifact to S3 is that it is faster to deploy code that already sits within the AWS network than to send it over the Internet during deployment.
Another thing is that plain CloudFormation lets you inline Lambda function code without pushing it to S3, but there are limitations to this approach (inline code via the ZipFile property, for instance, is capped at 4096 bytes). If your Lambda function needs external libraries that are not part of the AWS-provided Lambda environment for a particular runtime, or your function's code is big, then you still need to zip your function's code together with its dependencies and upload it to S3 before continuing. SAM just makes this easier for you so that you don't have to do this step manually.