Versioning AWS Lambda via terraform and gitlab yml script

I am sure there are multiple ways AWS Lambda can be versioned/published, but I am trying to do it in a particular way and need some help.
I have a dotnet core Lambda project called "MyTTL".
Now in the GitLab YML script I have code that pushes the Lambda package to an S3 bucket, like below (pseudo script).
GITLAB SCRIPT
variables:
  OUTPUT_FILE_PATH: '$CI_PROJECT_DIR/bin/Release/netcoreapp3.1/MyTTL.zip'
script:
  - dotnet lambda package
  - aws s3 cp $OUTPUT_FILE_PATH s3://$S3_BUCKET/
The above script works fine and uploads MyTTL.zip to the S3 bucket.
Now in Terraform I have the below script to reference that Lambda:
resource "aws_lambda_function" "lambda" {
s3_bucket = "My S3 BUCKET"
s3_key = "protected/sample/${var.artifact_version}.zip"
source_code_hash = "${filebase64sha256("${var.artifact_version}.zip")}"
}
As you can see I want to pass a version (artifact_version) to this module, so that I can tell which Lambda version a particular client is running on.
Question - How do I make sure that every dotnet lambda package run creates a new zip version, so that old Terraform can still point to the old version of the Lambda code, and I can point Terraform at the new Lambda version for different clients at will?
Manual (lame) solution - I make the code change in my dotnet core project, let the GitLab script publish it to S3, then download the zip, rename it to the version I want, upload it back to S3, and later reference it in Terraform.

variables:
  OUTPUT_FILE_PATH: '$CI_PROJECT_DIR/bin/Release/netcoreapp3.1/MyTTL.zip'
script:
  - dotnet lambda package
  - aws s3 cp $OUTPUT_FILE_PATH s3://$S3_BUCKET/MyTTL${CI_COMMIT_SHORT_SHA}.zip
Now you have different versions of your Lambda project, named with the hash of your commit, and you no longer need to download anything - just change the hash in the name you reference. That hash will always be unique on every commit.

Related

AWS CDK accessing parameters when deploying stacks on the pipeline via yaml, typescript and nodejs

I'm fairly new to AWS and using the CDK but have been working on a project which deploys to AWS via a pipeline using yaml for the cf-template and later a node script to run cdk deploy on a set of stack files that have been written in Typescript.
In the cf-template yaml where the cdk-toolkit is being defined there's a bucket resource with name X. After the toolkit has been created/updated in the pipeline, the cdk deploy command is executed to deploy some stacks and workers, which should live in bucket X. They aren't automatically being uploaded there however, so I've tried using the --parameters flag to specify X as below.
cdk deploy --toolkit-stack-name my-toolkit --parameters uploadBucketName=X --ci --require-approval never
When I do this I get the following error in the pipeline for the first stack that gets deployed:
Parameters: [uploadBucketName] do not exist in the template
I assumed this meant that in the MyFirstStack.ts file it was missing a parameter definition as suggested by the AWS documentation, but it's not clear to me why this is necessary or how it's supposed to be used when it's the cdk deploy command which provides a value for this parameter. I tried adding it per the docs
const uploadBucketName = new CfnParameter(this, "uploadBucketName", {
  type: "String",
  description: "The name of the Amazon S3 bucket where uploaded files will be stored.",
});
but not sure if this is really the right thing to do, and it doesn't work besides - I still get the same error.
Does anyone have any ideas where I'm going wrong?

Both SAM and CDK can make a stack - does SAM include CDK, or does CDK include SAM?

I want to make two buckets (x, y) in S3 and make a Lambda.
My goal is: uploading a file to S3 (x) triggers the Lambda, and the Lambda then creates a file and puts it in S3 (y).
Currently, I am developing the Lambda function with SAM.
I deploy the Lambda function with SAM.
And I made the two S3 buckets with CDK.
Then I manually add the trigger and the IAM policy that lets the Lambda access S3.
However, I want to do this all automatically.
So my ideas are:
Can SAM make the two S3 buckets as part of the stack, so I don't need CDK anymore?
Can CDK include the SAM development environment?
Any other way?
What is the best practice for this purpose?
My solution is:
Local development with SAM.
AWS deployment is carried out by CDK only; SAM doesn't do anything for deployment.
My folder structure is below:
cdk/
    bin/
    lib/
    cdk.json
    etc.
samproj/
    helloworld/app.py
    samconfig.toml
    template.yaml
    etc.
For local development, in the samproj directory, do something like this (per the tutorial):
sam local invoke "HelloWorldFunction" -e events/event.json
And AWS deployment is done by the CDK project.
Just make the Lambda directly from the samproj/helloworld directory in the stack:
export class CdkVrBaseStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);
    const lambda_ = new lambda.Function(this, 'TestLambda', {
      functionName: 'testLambda',
      runtime: lambda.Runtime.PYTHON_3_9,
      code: lambda.Code.fromAsset('samproj/helloworld'),
      // handler is module.function inside the asset, i.e. the handler in app.py
      handler: 'app.lambda_handler',
      timeout: cdk.Duration.seconds(300),
    });
  }
}
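The handler in samproj/helloworld/app.py would then do the S3(x)-to-S3(y) work described in the question. A minimal sketch, assuming the output bucket name is passed in a TARGET_BUCKET environment variable (that variable name and the processed/ key prefix are just illustrative):
import os
import urllib.parse

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # Output bucket "y", assumed to be passed in via an environment variable.
    target_bucket = os.environ["TARGET_BUCKET"]

    for record in event["Records"]:
        source_bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Read the uploaded object from bucket "x" ...
        body = s3.get_object(Bucket=source_bucket, Key=key)["Body"].read()

        # ... then write the (possibly transformed) result to bucket "y".
        s3.put_object(Bucket=target_bucket, Key=f"processed/{key}", Body=body)

The CDK stack would also need to grant the function read access to bucket x and write access to bucket y, and add the S3 event notification - the parts that were previously being done by hand.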

Automate Lambda deployments using S3 buckets zip bundled code

Details - I have a CircleCI job that makes a zip of my Lambda code and uploads it to S3 (we just keep updating the version of the same S3 object, e.g. code.zip; we don't change the name).
Now I have AWS CDK code where I define my Lambda and make use of the S3 zip file via this URL: https://docs.aws.amazon.com/cdk/api/latest/docs/#aws-cdk_aws-lambda.Code.html#static-fromwbrbucketbucket-key-objectversion.
Issue - Now I want an automated deployment so that whenever a new version of code.zip is uploaded to S3, all Lambdas using it are automatically updated with the latest code.
Please suggest!
I can think of 2 solutions for this:
Have a step after you upload the latest code to S3 that updates your Lambda function, like below:
aws lambda update-function-code \
  --function-name your_function_name \
  --s3-bucket your_bucket_name \
  --s3-key your_code.zip
Create another Lambda function with an S3 "object created" event trigger (or whatever event suits you); you can even filter on the .zip suffix.
And in your Lambda function, which will be triggered by the S3 upload, you can call the same update-function-code API to update your target Lambda function, as sketched below.
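A rough sketch of that second option - an S3-triggered "deployer" Lambda that refreshes the target function's code with boto3. The FUNCTION_TO_UPDATE environment variable is just an assumed way of passing the target function's name:
import os

import boto3

lambda_client = boto3.client("lambda")

def lambda_handler(event, context):
    # Name of the Lambda whose code should be refreshed (assumed env var).
    function_name = os.environ["FUNCTION_TO_UPDATE"]

    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Point the target function at the newly uploaded zip in S3.
        lambda_client.update_function_code(
            FunctionName=function_name,
            S3Bucket=bucket,
            S3Key=key,
            Publish=True,  # optionally publish a new version as well
        )

The deployer function's role needs lambda:UpdateFunctionCode on the target function for this call to succeed.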

aws codepipeline update lambda function source using s3 object

I am using Terraform to create all the infra (CodePipeline, Lambda, buckets) on AWS.
Currently, I've created a pipeline that builds the source zip file and puts it on an S3 bucket, but the Lambda still keeps using the older source. So I update the URL manually in the AWS console and it works.
Now I want to automate the flow, but the available solutions are:
AWS SAM + CFT
Codebuild Stage to update the source using AWS CLI
Create a lambda that updates the source
Code Deploy + AWS SAM + CFT
I am not willing to use CFT at all since all of our code is in terraform and CFT requires me to create new lambdas instead of using old ones.
Is there any other, simpler way to update the Lambda source through CodePipeline?
The preferred way to deploy a Lambda via CodePipeline is using a CloudFormation deploy action [1]. Since you are not looking to use CloudFormation, the next option could be to run your terraform plan/apply commands from within a CodeBuild job that is part of the pipeline. You will need to give the CodeBuild role the permissions required for resource creation (or export credentials in environment variables for Terraform to use via the method in [2]) and install the Terraform binary within the install phase of the buildspec.
Ref:
[1] Building a Continuous Delivery Pipeline for a Lambda Application with AWS CodePipeline - https://docs.aws.amazon.com/lambda/latest/dg/build-pipeline.html
[2] How to retrieve Secret Manager data in buildspec.yaml

Storing AWS Lambda Function code directly in S3 Bucket

AWS Lambda Functions have an option to enter the code uploaded as a file from S3. I have a successfully running lambda function with the code taken as a zip file from an S3 Bucket, however, any time you would like to update this code you would need to either manually edit the code inline within the lambda function or upload a new zip file to S3 and go into the lambda function and manually re-upload the file from S3. Is there any way to get the lambda function to link to a file in S3 so that it will automatically update its function code when you update the code file (or zip file) contained in S3?
Lambda doesn't actually reference the S3 code when it runs--just when it sets up the function. It is like it takes a copy of the code in your bucket and then runs the copy. So while there isn't a direct way to get the lambda function to automatically run the latest code in your bucket, you can make a small script to update the function code using SDK methods. I don't know which language you might want to use, but for example, you could write a script to call the AWS CLI to update the function code. See https://docs.aws.amazon.com/cli/latest/reference/lambda/update-function-code.html
Updates a Lambda function's code.
The function's code is locked when you publish a version. You can't
modify the code of a published version, only the unpublished version.
See also: AWS API Documentation
See 'aws help' for descriptions of global parameters.
Synopsis
update-function-code
  --function-name <value>
  [--zip-file <value>]
  [--s3-bucket <value>]
  [--s3-key <value>]
  [--s3-object-version <value>]
  [--publish | --no-publish]
  [--dry-run | --no-dry-run]
  [--revision-id <value>]
  [--cli-input-json <value>]
  [--generate-cli-skeleton <value>]
You could do similar things using Python or PowerShell as well, such as using
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/lambda.html#Lambda.Client.update_function_code
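For instance, a minimal boto3 sketch of such a script (the function name, bucket, and key below are placeholders):
import boto3

lambda_client = boto3.client("lambda")

# Placeholder values - substitute your own function name, bucket, and key.
response = lambda_client.update_function_code(
    FunctionName="my-function",
    S3Bucket="my-deployment-bucket",
    S3Key="code.zip",
    Publish=True,  # publish a new immutable version of the function
)

print(response["FunctionArn"], response.get("Version"))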
You can set up an AWS CodeDeploy pipeline to get your code built and deployed on each commit to your code repository (GitHub, Bitbucket, etc.).
CodeDeploy is a deployment service that automates application
deployments to Amazon EC2 instances, on-premises instances, serverless
Lambda functions, or Amazon ECS services.
Also, I wanted to add: if you want to go a more unattended route for deploying your updated code to the Lambda, use this flow in your CodePipeline:
Source -> CodeBuild (npm install, zipping, etc.) -> S3 Upload (sourcecode.zip in the S3 bucket) -> CodeBuild (another build just for aws lambda update-function-code)
Make sure the role for the last stage has both s3:GetObject and lambda:UpdateFunctionCode permissions attached to it.