How to set up automated deployment from CodeCommit to a Lambda function? - amazon-web-services

I'm trying to "deploy" the content of a CodeCommit repository to a Lambda function (not an application).
In this particular case it's a simple copy/paste from source to destination.
I'm struggling to find a solution that doesn't involve setting up another Lambda function. From what I understand, there is a solution using CodeBuild and CloudFormation.
Does anyone have a solution for this? Alternatively, can you point to any good documentation?
P.S:
I found this question that seems to answer mine, but the links in the relevant answer are outdated.

You can build a CodeCommit pipeline with a CodeBuild job, where your CodeCommit repository contains a SAM template like the one below and you run
sam build && sam deploy
from the CodeBuild job.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: A sample SAM template for deploying Lambda functions.
Resources:
  # Details about the myDateTimeFunction Lambda function
  myDateTimeFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: myDateTimeFunction.handler
      Runtime: nodejs12.x
      # Creates an alias named "live" for the function, and automatically publishes when you update the function.
      AutoPublishAlias: live
      DeploymentPreference:
        # Specifies the deployment configuration
        Type: Linear10PercentEvery2Minutes
This documentation page describes the same setup: CodeCommit with rolling deployments for Lambda functions.
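For reference, a minimal buildspec sketch for that CodeBuild job could look like the following; the stack name, bucket name, and the extra sam deploy flags are placeholders and assumptions, not part of the original answer:
version: 0.2
phases:
  install:
    commands:
      # Install the SAM CLI if the build image does not already include it
      - pip install aws-sam-cli
  build:
    commands:
      - sam build
      - sam deploy --stack-name <stackname> --s3-bucket <bucketname> --capabilities CAPABILITY_IAM --no-confirm-changeset --no-fail-on-empty-changeset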

This is the solution that worked for me.
I set up a pipeline with CodeCommit as the source and a Build phase (no Deploy phase).
The Build phase reads a buildspec.yml file, which in turn references a SAM template called template.yml. The SAM stack is created via CloudFormation.
I created an S3 bucket to hold the build artifacts.
Here is the sample buildspec.yml file:
version: 0.2
phases:
  install:
    commands:
      - echo Nothing to do in the install phase...
  pre_build:
    commands:
      - echo Nothing to do in the pre_build phase...
  build:
    commands:
      - aws cloudformation package --template-file template.yml
        --s3-bucket <bucketname>
        --output-template-file newtemplate.yml
      - aws cloudformation deploy --stack-name <stackname>
        --capabilities CAPABILITY_IAM
        --template-file newtemplate.yml
        --role-arn arn:aws:iam::<account number>:role/CloudFormationServiceRole
  post_build:
    commands:
      - echo Build completed
Here is the sample template.yml file:
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: CloudFormation Stack for the lambda function
Resources:
  # Details about the Lambda function
  <StackName>:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: nodejs12.x
      CodeUri: src/
      # Creates an alias named "live" for the function, and automatically publishes when you update the function.
      AutoPublishAlias: live
      DeploymentPreference:
        # Specifies the deployment configuration
        Type: AllAtOnce
The file structure is:
.
├── src/
│   ├── node_modules/
│   └── index.js
├── buildspec.yml
└── template.yml
Make sure you set the correct IAM policies on the CloudFormation and CodeBuild roles.
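As a rough illustration only (not from the original answer), the CodeBuild service role might need statements along these lines; the bucket name, account number, and statement IDs are placeholders:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ArtifactBucketAccess",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::<bucketname>/*"
    },
    {
      "Sid": "CloudFormationAccess",
      "Effect": "Allow",
      "Action": "cloudformation:*",
      "Resource": "*"
    },
    {
      "Sid": "PassCloudFormationServiceRole",
      "Effect": "Allow",
      "Action": "iam:PassRole",
      "Resource": "arn:aws:iam::<account number>:role/CloudFormationServiceRole"
    }
  ]
}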

Related

AWS SAM package Parameter CodeUri refers to a file or folder that does not exist

I am trying to deploy a Lambda function to AWS via CodePipeline.
In the pipeline I have a CodeBuild step which uses SAM to build the function and package it into an S3 bucket.
I keep getting the following error:
Error: Unable to upload artifact LambdaMultiRegion referenced by CodeUri parameter of LambdaMultiRegion resource.
Parameter CodeUri of resource LambdaMultiRegion refers to a file or folder that does not exist /codebuild/output/src259622855/src/.aws-sam/build/LambdaMultiRegion
I have confirmed that the bucket does exist and CodeBuild's IAM role has access to it.
My buildspec file looks like this:
version: 0.2
phases:
  install:
    runtime-versions:
      python: 3.7
  build:
    commands:
      - sam build
      - sam package --s3-bucket poc-lambdamultiregion --output-template-file outputtemplate.yml
artifacts:
  files:
    - template.yml
    - outputtemplate.yml
template file
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Python function
Resources:
  LambdaMultiRegion:
    Type: AWS::Serverless::Function
    Properties:
      Handler: Index.lambda_handler
      Runtime: python3.7
      FunctionName: PoC-LambdaMultiRegion
      CodeUri: ./src
Any idea why SAM cannot put the package in the S3 bucket?

How can I update my existing Lambda (not created by CloudFormation) through CloudFormation, when setting up the pipeline through CloudFormation?

I am setting up a pipeline that uses a CloudFormation stack by creating a change set and executing it. But the first run creates another Lambda function, and there is no way to update or deploy the existing one.
buildspec.yml
version: 0.1
phases:
  install:
    commands:
      - echo "nothing to do in install phase"
  pre_build:
    commands:
      - mvn clean install
  build:
    commands:
      - aws cloudformation package --template-file samTemplate.yaml --s3-bucket saurabh-lambda-pipeline --output-template-file outputSamTemplate.yaml
artifacts:
  type: zip
  files:
    - samTemplate.yaml
    - outputSamTemplate.yaml
samTemplate.yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: 'AWS::Serverless-2016-10-31'
Description: CD Lambda
Resources:
  testLambda:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: testLambda
      Handler: com.test.handler.calculator::handleRequest
      Runtime: java8
      CodeUri: target/emi-calculator.jar
      AutoPublishAlias: prod
      Description: 'Lambda function for CD'
      MemorySize: 128
      Timeout: 30
      Events:
        getAZsAPI:
          Type: Api
          Properties:
            Path: /calculator
            Method: post
      Environment:
        Variables:
          calculatorType: 30
You can add existing resources to an existing CloudFormation stack: AWS Console > CloudFormation > open your stack > Actions > Import existing resources.
https://aws.amazon.com/de/blogs/aws/new-import-existing-resources-into-a-cloudformation-stack/
Then compare your local CloudFormation template with the one in CloudFormation.
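As a rough sketch (assuming the existing function is called testLambda and its code is already deployed), the template you import against would declare the function with a DeletionPolicy, since resource import works on the plain AWS::Lambda::Function type and requires one:
Resources:
  testLambda:
    Type: AWS::Lambda::Function
    DeletionPolicy: Retain   # required on resources being imported
    Properties:
      FunctionName: testLambda
      Handler: com.test.handler.calculator::handleRequest
      Runtime: java8
      Role: <ARN of the existing execution role>
      Code:
        S3Bucket: <bucket holding the currently deployed jar>
        S3Key: <key of the currently deployed jar>
You may be able to switch back to the SAM AWS::Serverless::Function form in later updates, but verify that in a test stack first.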

Cannot build and deploy Go Lambda using AWS CodePipeline - BundleType must be either YAML or JSON

I am trying to build the simplest of Lambda functions in Go using AWS CodePipeline. Despite playing with it for about two weeks, I still haven't managed to get it deployed.
main.go
package main

import (
	"context"

	"github.com/aws/aws-lambda-go/lambda"
)

func HandleRequest(ctx context.Context) (string, error) {
	return "Hello from Go!", nil
}

func main() {
	lambda.Start(HandleRequest)
}
buildspec.yml
version: 0.2
env:
  variables:
    S3_BUCKET: dlp-queuetime
    PACKAGE: dlp-queuetime-fetcher
phases:
  install:
    runtime-versions:
      golang: 1.12
    commands:
      # AWS CodeBuild Go images use /go for the $GOPATH so copy the src code into that dir structure
      - mkdir -p "/go/src/$(dirname ${PACKAGE})"
      - ln -s "${CODEBUILD_SRC_DIR}" "/go/src/${PACKAGE}"
      # Print all environment variables (handy for AWS CodeBuild logs)
      - env
      # Install Lambda Go
      - go get github.com/aws/aws-lambda-go/lambda
  pre_build:
    commands:
      # Make sure we're in the project directory within our GOPATH
      - cd "/go/src/${PACKAGE}"
      # Fetch all dependencies
      - go get -t ./...
  build:
    commands:
      # Build our Go app
      - go build -o main
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  type: zip
  files:
    - appspec.yml
    - main
appspec.yml
version: 0.0
Resources:
  - dlpQueueTimeFetcher:
      Type: AWS::Lambda::Function
      Properties:
        Name: "dlpQueueTimeFetcher"
        Alias: "v0"
        CurrentVersion: "1"
        TargetVersion: "2"
During the deployment CodeDeploy throws the following error: Action execution failed - BundleType must be either YAML or JSON.
It seems like CodeDeploy cannot find my appspec.yml file despite it being defined in the artifacts section of my buildspec. What am I doing wrong here?
The problem you are facing is a well-known issue when connecting CodePipeline with CodeDeploy for Lambda deployments: CodeDeploy expects a YAML or JSON appspec file, whereas the artifact presented by CodePipeline is a zip file containing the appspec:
https://forums.aws.amazon.com/thread.jspa?messageID=864336
CodePipeline: CodeDeploy reports "BundleType must be either YAML or JSON"
For now, you can use CloudFormation as a Deployment tool for your Lambda function in your Pipeline. The basic idea to deploy a Lambda function will be as follows:
Create a SAM template for your Lambda function
A basic SAM template looks like:
AWSTemplateFormatVersion: '2010-09-09'
Transform: 'AWS::Serverless-2016-10-31'
Resources:
  FunctionName:
    Type: 'AWS::Serverless::Function'
    Properties:
      Handler: index.handler
      Runtime: nodejs6.10
      CodeUri: ./code
Add a directory "code" and keep the Lambda code files in this directory.
Run the command to package and upload:
$ aws cloudformation package --template-file template.yaml --output-template packaged.yaml --s3-bucket {your_S3_bucket}
Deploy the package:
$ aws cloudformation deploy --template-file packaged.yaml --stack-name stk1 --capabilities CAPABILITY_IAM
You can keep the template and code (the first two steps) in CodeCommit/GitHub and run the package command in a CodeBuild step. For the deploy step, I recommend doing it via a CloudFormation action in CodePipeline that is fed the "packaged.yaml" file as its input artifact.
The above process is detailed here: https://docs.aws.amazon.com/en_us/lambda/latest/dg/build-pipeline.html
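If you define the pipeline itself in CloudFormation, that last deploy stage could be sketched roughly as below; the artifact, stack, and role names are placeholders, not taken from the original answer:
- Name: Deploy
  Actions:
    - Name: DeployLambdaStack
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: CloudFormation
        Version: '1'
      InputArtifacts:
        - Name: BuildOutput
      Configuration:
        ActionMode: CREATE_UPDATE
        StackName: <your-stack-name>
        TemplatePath: BuildOutput::packaged.yaml
        Capabilities: CAPABILITY_IAM
        RoleArn: <CloudFormation service role ARN>
      RunOrder: 1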

AWS CodePipeline CodeBuild SAM Lambda

I have a pipeline for Lambda (Node.js) auto-deployment from CodeCommit.
My buildspec.yml
version: 0.1
phases:
  install:
    commands:
      - npm install
      - aws cloudformation package --template-file samTemplate.yaml --s3-bucket codepipeline-551 --output-template-file outputSamTemplate.yaml
artifacts:
  type: zip
  files:
    - samTemplate.yaml
    - outputSamTemplate.yaml
My samTemplate.yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: 'AWS::Serverless-2016-10-31'
Description: An AWS Serverless Specification template describing your function.
Resources:
  KashIoTLambda:
    Type: 'AWS::Serverless::Function'
    Properties:
      Handler: lambda.handler
      Runtime: nodejs8.10
      CodeUri: ./
      Description: ''
      MemorySize: 128
      Timeout: 3
      Role: 'arn:aws:iam::1234:role/abc-backend'
      Events:
        Api1:
          Type: Api
          Properties:
            Path: '/{proxy+}'
            Method: OPTIONS
        Api2:
          Type: Api
          Properties:
            Path: /MyResource
            Method: ANY
      Environment:
        Variables:
          REGION: ap-south-1
The policy attached to the CodeBuild role gives access to all resources and all S3 actions.
However, I get the following error in the build logs:
[Container] 2019/01/03 13:25:39 Running command npm install
added 122 packages in 3.498s
[Container] 2019/01/03 13:25:44 Running command aws cloudformation package --template-file samTemplate.yaml --s3-bucket codepipeline-551 --output-template-file outputSamTemplate.yaml
Unable to upload artifact ./ referenced by CodeUri parameter of ABCLambda resource.
An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
[Container] 2019/01/03 13:25:46 Command did not exit successfully aws cloudformation package --template-file samTemplate.yaml --s3-bucket codepipeline-551 --output-template-file outputSamTemplate.yaml exit status 255
[Container] 2019/01/03 13:25:46 Phase complete: INSTALL Success: false
[Container] 2019/01/03 13:25:46 Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: aws cloudformation package --template-file samTemplate.yaml --s3-bucket codepipeline-551 --output-template-file outputSamTemplate.yaml. Reason: exit status 255
Check the policies/permissions on the bucket itself.
If you were like me, you may have selected the default location option for the artifact store in the pipeline settings when you first set it up, in which case there may be policies on the S3 bucket itself blocking access.
For me, once I removed the bucket policy, the "unable to upload artifact" error was resolved and the build went on to complete successfully.
I hope this helps you.
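If you want to check this from the command line first, you can inspect and, if necessary, remove the bucket policy (bucket name taken from the question above):
aws s3api get-bucket-policy --bucket codepipeline-551
aws s3api delete-bucket-policy --bucket codepipeline-551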

How to create and zip a docker container for AWS Lambda

I'm trying to create and then zip a Docker container to upload to S3 to be run by an AWS Lambda function. I was trying to work off an article but the instructions are sparse (https://github.com/abhisuri97/auto-alt-text-lambda-api).
I've installed Docker and the Amazon Linux image but I don't know how to create a Docker container that contains the github repo, and then zip it so that it can be accessed by Lambda.
This is what I've tried to piece together from other tutorials:
git clone https://github.com/abhisuri97/auto-alt-text-lambda-api.git
cd auto-alt-text-lambda-api
docker run -v -it amazonlinux:2017.12
zip -r -9 -q ~/main.zip
Any help would be greatly appreciated.
The instructions aren't clear, but I suspect the reference to Docker is just for testing. You don't need Docker to run an AWS Lambda function. You will, however, need an API Gateway API to invoke the Lambda function over HTTPS.
I'd recommend starting with a CloudFormation stack using the AWS Serverless Application Model (https://docs.aws.amazon.com/lambda/latest/dg/serverless_app.html).
Create an S3 bucket for the zip file and create a CloudFormation template similar to:
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  LambdaFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: application.predict
      Runtime: python2.7
      Events:
        HttpGet:
          Type: Api
          Properties:
            Path: '/auto-alt-text-api'
            Method: get
Package the Lambda function with:
aws cloudformation package --template-file template.yaml --output-template-file template-out.yaml --s3-bucket <your-bucket> --s3-prefix <your-prefix>
Then deploy it with:
aws cloudformation deploy --template-file template-out.yaml --stack-name auto-alt-text-lambda-api-stack --capabilities CAPABILITY_IAM
You will probably have to add IAM roles and Lambda permissions to the template for the application to work properly.
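As a sketch of what that might look like (the policy names below are examples, not taken from the article), SAM lets you attach permissions directly to the function via the Policies property:
LambdaFunction:
  Type: AWS::Serverless::Function
  Properties:
    Handler: application.predict
    Runtime: python2.7
    Policies:
      - AWSLambdaBasicExecutionRole     # managed policy, referenced by name
      - S3ReadPolicy:                   # SAM policy template
          BucketName: <your-bucket>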