I have recently set up a basic CodePipeline on AWS (following this guide: https://docs.aws.amazon.com/lambda/latest/dg/build-pipeline.html) which is triggered when there is a new commit to the CodeCommit repository.
But even after the pipeline executes successfully, the Lambda function is not updated.
My buildspec.yml:
version: 0.2
phases:
  install:
    runtime-versions:
      nodejs: 12
  build:
    commands:
      - npm install
      - export BUCKET=xx-test
      - aws cloudformation package --template-file template.yaml --s3-bucket $BUCKET --output-template-file outputtemplate.yml
artifacts:
  type: zip
  files:
    - template.yml
    - outputtemplate.yml
My template.yaml:
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
  helloWorld
  API Gateway connectivity helloWorld
Globals:
  Function:
    Timeout: 3
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: ./
      Handler: app.lambdaHandler
      Runtime: nodejs12.x
      Events:
        HelloWorld:
          Type: Api
          Properties:
            Path: /hello
            Method: get
Is there any additional configuration that needs to be done?
From what you've posted, it seems that your last action is CHANGE_SET_REPLACE. If so, this would explain why there are no updates to your Lambda function: that action only creates a change set, but does not execute it. In other words, it does not apply it.
You need to add a new action after the CHANGE_SET_REPLACE action, called CHANGE_SET_EXECUTE. This action takes the change set created by CHANGE_SET_REPLACE and actually applies it to your stack.
How to add such an action is described in the "Complete the deployment stage" step of the tutorial you've linked:
Change sets let you preview the changes that are made before making them, and add approval stages. Add a second action that executes the change set to complete the deployment.
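If your pipeline is defined in CloudFormation rather than through the console, the extra action might look roughly like the sketch below (the stack name and change set name are placeholders and must match the ones used by your CHANGE_SET_REPLACE action); in the console it is simply a second CloudFormation deploy action with the action mode set to execute a change set.

# Sketch of a second action appended to the existing Deploy stage's Actions list.
# StackName and ChangeSetName below are placeholders.
- Name: ExecuteChangeSet
  ActionTypeId:
    Category: Deploy
    Owner: AWS
    Provider: CloudFormation
    Version: '1'
  Configuration:
    ActionMode: CHANGE_SET_EXECUTE
    ChangeSetName: pipeline-changeset
    StackName: my-lambda-stack
  RunOrder: 2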
I am deploying a serverless application to AWS. I have an environment variable in my SAM template, ENV: 'DEV'. When I deploy to AWS, I specify a template parameter to change the value to PROD. I can see in the SAM deploy log that the parameter override worked, but when I look at the function in the Lambda console it still shows DEV, as in the template.
How do I make it override the value on deploy?
Template Yaml:
Resources:
  GetWeatherFunction:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: get-weather
      CodeUri: get-weather/
      Handler: app.lambda_handler
      Runtime: python3.7
      Timeout: 30
      Architectures:
        - x86_64
      Policies: AWSLambdaBasicExecutionRole
      Environment:
        Variables:
          ENV: 'DEV'
Deploy Window:
Deploy Log (some information changed for privacy, none of it relevant to the issue):
"C:\Program Files\Amazon\AWSSAMCLI\bin\sam.cmd" deploy --template-file C:\Users\User\PycharmProjects\Company\.aws-sam\build\packaged-template.yaml --stack-name MyProject --s3-bucket my-lambda-functions --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM --no-execute-changeset --parameter-overrides \"ENV\"=\"PROD\"
Deploying with following values
===============================
Stack name : MyProject
Region : us-east-1
Confirm changeset : False
Disable rollback : False
Deployment s3 bucket : my-lambda-functions
Capabilities : ["CAPABILITY_IAM", "CAPABILITY_NAMED_IAM"]
Parameter overrides : {"ENV": "PROD"}
Signing Profiles : {}
Lambda Console:
The "Template Parameters" field maps to the CloudFormation template parameters rather than an individual Lambda's environment variables.
You'll need to add a Parameter definition to the top of your template:
Parameters:
  EnvironmentName:
    Type: String
    Default: DEV
And then you can refer to it anywhere in your template, for example:
Resources:
  GetWeatherFunction:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: get-weather
      ...
      Environment:
        Variables:
          ENV:
            Ref: EnvironmentName
Then, in the screen above, you'll need to supply the EnvironmentName parameter; it should actually be detected automatically once the parameter has been defined in the template.
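If you ever deploy from the command line instead of the deploy window, the same idea applies: override the parameter, not the variable. A sketch based on the command in your log (path shortened, all other flags unchanged, only the --parameter-overrides value differs):

sam deploy --template-file .aws-sam\build\packaged-template.yaml --stack-name MyProject --s3-bucket my-lambda-functions --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM --no-execute-changeset --parameter-overrides EnvironmentName=PROD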
I'm trying to build a CI/CD pipeline that supports the very simple process laid out below. I am trying to do this all in AWS (i.e. avoiding GitHub Actions), and I do not want to have to manually zip code or transfer anything.
Target process:
Push code to the GitHub repository.
AWS updates the code within the existing Lambda function and updates the $latest alias accordingly.
Progress so far
I have been able to link AWS CodePipeline to GitHub. When code is pushed to the repository, the pipeline triggers and a compressed file that contains the contents from GitHub is added to an S3 bucket.
Long term I will likely be interested in pre- and post-deployment testing, approvals, etc etc... but for now I just want a simple setup as described above.
Challenge
I cannot fathom how to actually update the Lambda function now that I have this compressed file in S3. I've tried various Build/Deploy actions from within the CodePipeline console, but I get various errors. I'm not even entirely sure this entire approach is the best way to go about what I want to do!
Ask
Is this a valid approach to implementing this kind of CI/CD pipeline? If not, please suggest an alternative and justify why you think it's better.
How do you automatically take the code from within the compressed S3 file and get it into the Lambda function?
Thanks for your help!
Richard
What you could do is include an AWS SAM (CloudFormation) template in your repository. In a build step, you can then use the AWS SAM build/package commands, which produce a packaged.yaml CloudFormation template. That template is then usable with the CloudFormation deployment actions in CodePipeline.
This is part of a CloudFormation template that sets up such a flow; some things are omitted for brevity:
CodeBuildProject:
  Type: AWS::CodeBuild::Project
  Properties:
    Name: codebuildproject
    Description: Package and Deploy
    Artifacts:
      Type: CODEPIPELINE
    Environment:
      Type: LINUX_CONTAINER
      ComputeType: BUILD_GENERAL1_SMALL
      Image: aws/codebuild/amazonlinux2-x86_64-standard:3.0
      EnvironmentVariables:
        - Name: IAC_BUCKET
          Type: PLAINTEXT
          Value: !Sub iac-${AWS::Region}-${AWS::AccountId} # Bucket needed for SAM deployment
    ServiceRole: !Ref CodeBuildServiceRole
    Source:
      Type: CODEPIPELINE
      BuildSpec: |
        version: 0.2
        phases:
          install:
            runtime-versions:
              python: 3.8
            commands:
              - 'pip install --upgrade --user aws-sam-cli'
          build:
            commands:
              - sam build
              - sam package --s3-bucket $IAC_BUCKET --output-template-file packaged.yaml
        artifacts:
          files:
            - 'packaged.yaml'

Pipeline:
  Type: AWS::CodePipeline::Pipeline
  Properties:
    ArtifactStore:
      Location: !Sub "codepipeline-${AWS::Region}-${AWS::AccountId}"
      Type: S3
    Name: deployment-pipeline
    RoleArn: !GetAtt PipelineExecutionRole.Arn
    Stages:
      - Name: Source
        Actions:
          - YourGithubSourceAction
      - Name: Package
        Actions:
          - Name: SamPackage
            ActionTypeId:
              Category: Build
              Owner: AWS
              Provider: CodeBuild
              Version: '1'
            Configuration:
              ProjectName: !Ref CodeBuildProject
            InputArtifacts:
              - Name: SourceZip
            OutputArtifacts:
              - Name: samArtifact
            RunOrder: 1
      - Name: Deployment
        Actions:
          - Name: CreateChangeSet
            ActionTypeId:
              Category: Deploy
              Owner: AWS
              Provider: CloudFormation
              Version: '1'
            Configuration:
              ActionMode: "CHANGE_SET_REPLACE"
              ChangeSetName: !Sub "${ApplicationName}-${Environment}-changeset"
              Capabilities: CAPABILITY_NAMED_IAM
              StackName: your-stack-name
              RoleArn: !GetAtt PipelineExecutionRole.Arn
              ParameterOverrides: !Sub '{ "Environment" : "${Environment}" }'
              TemplatePath: 'samArtifact::packaged.yaml'
            InputArtifacts:
              - Name: samArtifact
            RunOrder: 1
          - Name: ExecuteChangeSet
            ActionTypeId:
              Category: Deploy
              Owner: AWS
              Provider: CloudFormation
              Version: '1'
            Configuration:
              ActionMode: CHANGE_SET_EXECUTE
              ChangeSetName: !Sub ${ApplicationName}-${Environment}-changeset
              StackName: your-stack-name
            RunOrder: 2
Be sure to have a look at AWS SAM if you're not familiar with it to see all the possibilities and how to construct your template itself.
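For instance, a minimal template.yaml checked into the repository could be as small as the sketch below (the function name, handler, and runtime here are purely illustrative, not something from your setup):

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler      # assumes an exported handler in app.js
      Runtime: nodejs12.x
      CodeUri: ./
      Timeout: 10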
We have pipeline code that does CodeCommit, CodeBuild, and creates and deploys a changeset, deploying to other AWS accounts. This codebase has Lambda functions in the template.yaml along with a Java codebase. During the build phase it creates 3 JAR files. How do I package these files when producing the template.yml? It works fine with a single Lambda function in template.yml; the challenge here is that we have multiple Lambda functions inside the template.yml.
aws cloudformation package --template-file template.yml --s3-bucket $S3Bucket --s3-prefix packages-$EnvironmentName --output-template-file template-export.yml
How do I set the CodeUri? It should be something like s3://bucketname/packages-/pack1/
Transform: AWS::Serverless-2016-10-31
Description: Outputs the time
Resources:
  TimeFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: firstsample/firstsample.handler # firstsample.js file is in the firstsample directory
      Role: !GetAtt BasicAWSLambdaRole.Arn
      Runtime: java11
      CodeUri: s3://test/packages/jar1/
  SecondSampleFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: secondsample.handler # didn't have to include the secondsample directory
      Role: !GetAtt BasicAWSLambdaRole.Arn
      Runtime: java11
      CodeUri: s3://test/packages/jar2/
I want to create a continuous delivery pipeline for a Lambda function.
As shown in the docs, the custom environment variables of an AWS::CodeBuild::Project can be used in buildspec.yaml like:
aws cloudformation package --template-file template.yaml --s3-bucket $MYEVVARKEY --output-template-file outputtemplate.yaml
I wanted to use those CodeBuild project environment variables in the repository's SAM template as well. As shown below, I tried with dollar signs, but the value was not resolved; it was taken as literal text:
# template.yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  TimeFunction:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: $MY_FN_NAME_ENV_VAR
      Role: $MY_ROLE_ARN_ENV_VAR
      Handler: index.handler
      Runtime: nodejs8.10
      CodeUri: ./
So, is it possible to use CodeBuild project environment variables in a SAM template, and if so, what notation is required to achieve that?
CloudFormation can't refer to environment variables, whether it's SAM or plain CloudFormation. What you can do is pass environment variables as parameters via the shell in the CodeBuild buildspec.yaml file (--parameters ParameterKey=name,ParameterValue=${MY_ENV_VAR}).
Remember to add a corresponding parameter to your Parameters section.
If you use aws cloudformation deploy, then you should use --parameter-overrides, which is a slightly simpler form:
--parameter-overrides \
YourParam=${YOUR_ENV_VAR} \
Foo=Bar \
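Putting both pieces together, a rough sketch (the parameter name FnName and the stack name my-fn-stack are made up for illustration; $MYEVVARKEY and $MY_FN_NAME_ENV_VAR are the CodeBuild environment variables from your question):

# template.yaml (sketch)
Parameters:
  FnName:
    Type: String

Resources:
  TimeFunction:
    Type: AWS::Serverless::Function
    Properties:
      FunctionName: !Ref FnName
      Handler: index.handler
      Runtime: nodejs8.10
      CodeUri: ./

# buildspec.yaml build phase (sketch)
build:
  commands:
    - aws cloudformation package --template-file template.yaml --s3-bucket $MYEVVARKEY --output-template-file outputtemplate.yaml
    - aws cloudformation deploy --template-file outputtemplate.yaml --stack-name my-fn-stack --capabilities CAPABILITY_IAM --parameter-overrides FnName=${MY_FN_NAME_ENV_VAR}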
Currently I am able to deploy a Lambda by pushing to GitHub. An API Gateway is also deployed automatically, but only because it is defined as an event in the Lambda's YAML file:
AWSTemplateFormatVersion: '2010-09-09'
Transform: 'AWS::Serverless-2016-10-31'
Description: Identifies paragraphs in documents and links to the law
Resources:
  LambdaParagraphLinker:
    Type: 'AWS::Serverless::Function'
    Properties:
      Handler: LambdaParagraphLinker.lambda_handler
      Runtime: python3.6
      CodeUri: ./
      Description: Identifies paragraphs in documents and links to the law
      MemorySize: 512
      Timeout: 10
      Events:
        Api:
          Type: Api
          Properties:
            Path: /LambdaParagraphLinker
            Method: ANY
How can I deploy an API Gateway using a Swagger file?
Hands down the best way to do this in CodePipeline is by using the Serverless Framework (https://serverless.com/). This replaces every super complicated hack-job and workaround I've previously used. Way less complicated, IMO.
Create a CodePipeline, link it to your source and a CodeBuild project, set a few permissions, done.
# serverless.yml
service: my-api
provider:
  name: aws
  runtime: python2.7

functions:
  hello:
    handler: handler.hello
    events:
      - http:
          path: api/v1/message
          method: post
# buildspec.yml
version: 0.2
phases:
  install:
    commands:
      # BUILD
      - sudo apt-get update -y
  build:
    commands:
      - echo $environment
      - serverless package --stage $environment --region us-east-1
      - serverless deploy --stage $environment --region us-east-1
Or torture yourself by doing one of the options below...
You can do this with CloudFormation from within CodePipeline. Export the Swagger spec from the API Gateway console and place it in the CloudFormation template.
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  PlayersAPI:
    Type: AWS::ApiGateway::RestApi
    Properties:
      Name: MyApi
      Description: API Description
      Body:
        "SWAGGER HERE"
Hooking this up to Lambda is a little bit cumbersome, but I can describe the steps. First create a CodePipeline project with source, build, and deploy steps.
The source step should be standard, from GitHub or CodeCommit.
The build step should output a zip file and use a buildspec.yml, something like this...
# buildspec.yml
version: 0.1
phases:
  install:
    commands:
      # BUILD
      - zip -r lambda.zip . -x *.git*
artifacts:
  files:
    - '**/*.zip'
    - '**/*.yml'
  discard-paths: no
Have the build step export an artifact MyAppBuild (or whatever you want to call it)
The final pipeline step is creating the Lambda function in this repo as a standalone function through the console (it's reusable):
https://github.com/tkntobfrk/codepipeline-lambda-s3
This Lambda function downloads the pipeline artifact (the zipped Lambda function) and updates the target function using boto.
After these steps you could add another step as a CloudFormation deploy step, and connect it to the Lambda function that you've just deployed.
If you are dealing with multiple environments, you could create the Lambda functions and an API Gateway CloudFormation template for each environment, then run them in sequence.
stage 1: src
stage 2: build
stage 3: deploy lambda test, deploy gateway api cloudformation test
stage 4: validate test
stage 5: deploy lambda prod, deploy gateway api cloudformation prod
Using straight AWS Serverless like this works too. However, you need to use a standard artifact location for the URIs. The DefinitionUri: for the API can be the exported Swagger from the API Gateway console.
# cloudformation.yml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  MySimpleFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python2.7
      CodeUri: s3://somebucket/somezip.zip
  MyAPI:
    Type: AWS::Serverless::Api
    Properties:
      StageName: prod
      DefinitionUri: s3://somebucket/somezip.zip
AWS::Serverless::Api
https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#awsserverlessapi
You can find Swagger docs all over the place, and docs on the API Gateway extensions are in the developer guide. I would start by going into the API Gateway console and looking at the API that Lambda creates for you. You can go to the 'Stages' page and, for any stage, export the API as Swagger.
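If you prefer the CLI, the same export can be done with something along these lines (the REST API id, stage name, and output file name below are placeholders, not values from your setup):

aws apigateway get-export --rest-api-id a1b2c3d4e5 --stage-name prod --export-type swagger swagger.json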