GitLab pipeline to invoke a Lambda function created using Terraform - amazon-web-services

stages:
  - validate
  - plan
  - apply
  - invoke
image:
  name: hashicorp/terraform:light
  entrypoint:
    - '/usr/bin/env'
    - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
before_script:
  - set TF_VAR_AWS_ACCESS_KEY=${AWS_ACCESS_KEY_ID}
  - set TF_VAR_AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
  - rm -rf .terraform
  - terraform --version
  - terraform init
validate:
  stage: validate
  script:
    - terraform validate
plan:
  stage: plan
  script:
    - terraform plan -out "planfile"
  dependencies:
    - validate
  artifacts:
    paths:
      - planfile
apply:
  stage: apply
  script:
    - terraform apply -input=false "planfile"
  dependencies:
    - plan
invoke:
  stage: invoke
  script:
    - set func_name = ${CI_COMMIT_REF_NAME}-"hello-world"
    - aws lambda invoke --function-name '$func-name' response.json
  dependencies:
    - apply
In the above script, in the invoke stage I want to pass the Lambda function name as an argument, prefixed with the GitLab branch name.
I am getting a function argument syntax error.
Could someone help me find a way to pass the function name argument, prefixed with the GitLab branch name, to the AWS Lambda invoke CLI command?

The GitLab branch name environment variable is CI_COMMIT_BRANCH: https://docs.gitlab.com/ee/ci/variables/predefined_variables.html#:~:text=branches%20or%20tags.-,CI_COMMIT_BRANCH,-12.6
You don't need to create any extra variables.
invoke:
  stage: invoke
  script:
    - aws lambda invoke --function-name "${CI_COMMIT_BRANCH}-hello-world" response.json
  dependencies:
    - apply
If you want to use a variable for the name:
invoke:
  stage: invoke
  script:
    - func_name="${CI_COMMIT_REF_NAME}-hello-world"
    - aws lambda invoke --function-name "$func_name" response.json
  dependencies:
    - apply
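One more thing worth checking: the pipeline-level image is hashicorp/terraform:light, which does not ship the AWS CLI, so the invoke job may fail with "aws: command not found" before the argument is even parsed. A minimal, hedged sketch of giving that one job its own image (amazon/aws-cli is just one possible choice; its default entrypoint has to be cleared so the script can run):
invoke:
  stage: invoke
  image:
    name: amazon/aws-cli
    entrypoint: [""]
  script:
    - aws lambda invoke --function-name "${CI_COMMIT_BRANCH}-hello-world" response.json
  dependencies:
    - apply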

Related

Resolve AWS::SSM::Parameter::Value<String> in a SAM template to use it in the Docker build phase of a pipeline

I'm creating a serverless app using SAM, and I have a Lambda function that uses a Docker image in order to download private PyPI packages from a private server.
Solution implemented:
Create a parameter stored in AWS with the name /app_name/pypi/user, and reference it in the template using a parameter of type AWS::SSM::Parameter::Value<String> (reference: https://youtu.be/fL3ToMdoXDw?t=915)
Create a pipeline using the following command: sam pipeline init --bootstrap
Deploy the pipeline: sam deploy -t codepipeline.yaml --stack-name test-params-pipeline --capabilities=CAPABILITY_IAM --profile development
Problem
The Build phase in the pipeline fails because I'm getting the name of the parameter instead of the parameter's value, so pip install -r requirements.txt breaks the process.
I believe that this could be the reason:
For transforms, such as AWS::Include and AWS::Serverless, AWS CloudFormation doesn't resolve dynamic references before invoking any transforms. Rather, AWS CloudFormation passes the literal string of the dynamic reference to the transform. Dynamic references (including those inserted into the processed template as the result of a transform) are resolved when you execute the change set using the template.
https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/dynamic-references.html#dynamic-references-considerations
Considerations:
The Dockerfile works if I run locally using --parameter-overrides ParameterKey=PypiUser,ParameterValue=value_username
The solution shouldn't send sensitive information through the repository
template.yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Parameters:
  PypiUser:
    Type: AWS::SSM::Parameter::Value<String>
    Default: /app_name/pypi/user
Resources:
  EventJs:
    Type: AWS::Serverless::Function
    Events:
      HttpApiEvent:
        Type: HttpApi
        Properties:
          ApiId: !Ref HttpApiSdk
          Method: POST
          Path: /event-js
          PayloadFormatVersion: "2.0"
    Properties:
      PackageType: Image
    Metadata:
      Dockerfile: lambda_handlers/create_js_event/Dockerfile
      DockerContext: ./
      DockerTag: latest
      DockerBuildArgs:
        PYPI_USER:
          Ref: PypiUser
Dockerfile
FROM public.ecr.aws/lambda/python:3.9
ARG PYPI_USER
COPY lambda_handlers/create_js_event/create_event.py lambda_handlers/create_js_event/requirements.txt ./
RUN python3.9 -m pip install -r requirements.txt -t .
CMD ["create_event.lambda_handler"]
and the requirements.txt file
aws-lambda-powertools[pydantic]
--extra-index-url https://${PYPI_USER}:${PYPI_PASSWORD}@domain.co/simple
name-private-package

Which default .gitlab-ci.yml is needed to install Hello World on AWS Lambda?

I tried many combinations; none worked. The latest was:
image: node:14
before_script:
  - apk add zip
stages:
  - build
build:
  stage: build
  script:
    - echo "Building Lambda function..."
    - echo "console.log('Hello World');" > index.js
    - zip index.zip index.js
    - echo "Deploying Lambda function to AWS..."
    - aws configure set aws_access_key_id [AWS_ACCESS_KEY_ID]
    - aws configure set aws_secret_access_key [AWS_SECRET_ACCESS_KEY]
    - aws lambda create-function --function-name video-promotion --runtime nodejs14 --handler index.handler --zip-file fileb://index.zip
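For comparison, a hedged sketch of how that attempt might be adjusted: node:14 is Debian-based, so packages come from apt-get rather than apk, the Lambda runtime identifier is nodejs14.x, and create-function requires an execution role (the LAMBDA_ROLE_ARN variable below is a placeholder you would define yourself, not something GitLab provides):
image: node:14
before_script:
  - apt-get update && apt-get install -y zip awscli
stages:
  - build
build:
  stage: build
  script:
    - echo "console.log('Hello World');" > index.js
    - zip index.zip index.js
    # AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION are read from CI/CD variables by the AWS CLI
    - aws lambda create-function --function-name video-promotion --runtime nodejs14.x --handler index.handler --role "$LAMBDA_ROLE_ARN" --zip-file fileb://index.zip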

How to configure CircleCI IaC (Terraform) pipeline to use multiple repositories?

I am trying to run Terraform code through a CircleCI IaC pipeline to provision an S3 bucket in AWS.
I have Terraform code to provision an S3 bucket (s3.tf) inside a repo named terraform
I have runtime variables in an s3.tfvars file in a repo named tfvars
So I would like to do these steps in my IaC pipeline:
Clone terraform repo
Clone tfvars repo
Run terraform init
Run terraform plan
Run terraform apply
I have a config.yaml that looks like the one below. I am not sure how to clone two repos (terraform and tfvars) in a CircleCI pipeline. Any pointers on how to do this?
version: '2.1'
parameters:
  ENV:
    type: string
    default: ""
orbs:
  terraform: 'circleci/terraform@2.1'
workflows:
  deploy_infrastructure:
    jobs:
      - terraform/init:
          path: .
      - terraform/validate:
          path: .
          checkout: true
          context: terraform
      - terraform/plan:
          path: .
          checkout: true
          context: terraform
          persist-workspace: true
          requires:
            - terraform/validate
          workspace: parameters.ENV
      - terraform/apply:
          attach-workspace: true
          context: terraform
          filters:
            branches:
              only: 'circleci-project-setup'
          requires:
            - terraform/plan
This solved the issue:
version: '2.1'
orbs:
  terraform: 'circleci/terraform@2.1'
jobs:
  single-job-lifecycle:
    executor: terraform/default
    steps:
      - checkout
      - run:
          command: >-
            GIT_SSH_COMMAND='ssh -vv -i ~/.ssh/id_rsa'
            git clone https://<url>/Tfvars.git
          name: GIT Clone TFvars repository
      - terraform/init:
          path: .
      - terraform/validate:
          path: .
      - run:
          name: "terraform plan"
          command: terraform plan -var-file "./Tfvars/tfvars/dev/s3.tfvars"
      - run:
          name: "terraform apply"
          command: terraform apply -auto-approve -var-file "./Tfvars/tfvars/dev/s3.tfvars"
    working_directory: ~/src
workflows:
  single-job-lifecycle:
    jobs:
      - single-job-lifecycle

Send argument to yml anchor for a step in bitbucket-pipelines.yml

I would like to send arguments when I call an anchor in Bitbucket Pipelines.
Here is the file I am using; I have to use an after-script because I need to push to a specific S3 bucket:
definitions:
  steps:
    - step: &node-build
        name: Build React app
        image: node:lts-alpine
        script:
          - npm install --no-optional
          - npm run build
        artifacts:
          - build/**
    - step: &aws-ecr-s3
        name: AWS S3 deployment
        image: amazon/aws-cli
        script:
          - aws configure set aws_access_key_id "${AWS_KEY}"
          - aws configure set aws_secret_access_key "${AWS_SECRET}"
pipelines:
  branches:
    master:
      - step: *node-build
      - step:
          <<: *aws-ecr-s3
          after-script:
            - aws s3 cp ./build s3://my-app-site-dev --recursive
    staging:
      - step: *node-build
      - step:
          <<: *aws-ecr-s3
          after-script:
            - aws s3 cp ./build s3://my-app-site-uat --recursive
I am trying to do something like the following to not have to use that after-script part
definitions:
  steps:
    - step: &node-build
        name: Build React app
        image: node:lts-alpine
        script:
          - npm install --no-optional
          - npm run build
        artifacts:
          - build/**
    - step: &aws-ecr-s3 $FIRST-ARGUMENT
        name: AWS S3 deployment
        image: amazon/aws-cli
        script:
          - aws configure set aws_access_key_id "${AWS_KEY}"
          - aws configure set aws_secret_access_key "${AWS_SECRET}"
          - aws s3 cp ./build s3://${FIRST-ARGUMENT} --recursive
pipelines:
  branches:
    master:
      - step: *node-build
      - step: *aws-ecr-s3 my-app-site-dev
    staging:
      - step: *node-build
      - step: *aws-ecr-s3 my-app-site-uat
To the best of my knowledge, you can only override particular values of YAML anchors. Attempts to 'pass arguments' won't work.
Instead, Bitbucket Pipelines provides Deployments, an ad-hoc way to assign different values to your variables depending on the environment. You'll need to create two deployments (say, dev and uat) and use them when referring to a step:
pipelines:
  branches:
    master:
      - step: *node-build
      - step:
          <<: *aws-ecr-s3
          deployment: uat
    staging:
      - step: *node-build
      - step:
          <<: *aws-ecr-s3
          deployment: dev
More on Bitbucket Deployments:
https://support.atlassian.com/bitbucket-cloud/docs/variables-and-secrets/#Deployment-variables
https://support.atlassian.com/bitbucket-cloud/docs/set-up-and-monitor-deployments/
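With deployments in place, the shared step itself can read the bucket name from a deployment variable instead of an after-script. A hedged sketch, assuming a variable named S3_BUCKET is defined on each deployment environment:
definitions:
  steps:
    - step: &aws-ecr-s3
        name: AWS S3 deployment
        image: amazon/aws-cli
        script:
          - aws configure set aws_access_key_id "${AWS_KEY}"
          - aws configure set aws_secret_access_key "${AWS_SECRET}"
          # S3_BUCKET comes from the deployment environment selected for this step
          - aws s3 cp ./build "s3://${S3_BUCKET}" --recursive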

Cannot build and deploy Go Lambda using AWS CodePipeline - BundleType must be either YAML or JSON

I am trying to build the most simple of Lambda functions in Go using AWS CodePipeline. Despite playing with it for about 2 weeks I still haven't managed to get it deployed.
main.go
package main

import (
    "context"

    "github.com/aws/aws-lambda-go/lambda"
)

func HandleRequest(ctx context.Context) (string, error) {
    return "Hello from Go!", nil
}

func main() {
    lambda.Start(HandleRequest)
}
buildspec.yml
version: 0.2
env:
  variables:
    S3_BUCKET: dlp-queuetime
    PACKAGE: dlp-queuetime-fetcher
phases:
  install:
    runtime-versions:
      golang: 1.12
    commands:
      # AWS CodeBuild Go images use /go for the $GOPATH, so copy the src code into that dir structure
      - mkdir -p "/go/src/$(dirname ${PACKAGE})"
      - ln -s "${CODEBUILD_SRC_DIR}" "/go/src/${PACKAGE}"
      # Print all environment variables (handy for AWS CodeBuild logs)
      - env
      # Install Lambda Go
      - go get github.com/aws/aws-lambda-go/lambda
  pre_build:
    commands:
      # Make sure we're in the project directory within our GOPATH
      - cd "/go/src/${PACKAGE}"
      # Fetch all dependencies
      - go get -t ./...
  build:
    commands:
      # Build our Go app
      - go build -o main
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  type: zip
  files:
    - appspec.yml
    - main
appspec.yml
version: 0.0
Resources:
  - dlpQueueTimeFetcher:
      Type: AWS::Lambda::Function
      Properties:
        Name: "dlpQueueTimeFetcher"
        Alias: "v0"
        CurrentVersion: "1"
        TargetVersion: "2"
During the deployment CodeDeploy throws the following error: Action execution failed - BundleType must be either YAML or JSON.
It seems like CodeDeploy cannot find my appspec.yml file despite it being defined in the artifacts section of my buildspec. What am I doing wrong here?
The problem you are facing is well known when connecting CodePipeline with CodeDeploy for Lambda deployment: CodeDeploy is looking for a YAML or JSON appspec file, whereas the artifact presented by CodePipeline is a zip file containing the appspec:
https://forums.aws.amazon.com/thread.jspa?messageID=864336
CodePipeline: CodeDeploy reports "BundleType must be either YAML or JSON"
For now, you can use CloudFormation as a deployment tool for your Lambda function in your pipeline. The basic idea to deploy a Lambda function is as follows:
1. Create a SAM template of your Lambda function.
2. A basic SAM template looks like:
AWSTemplateFormatVersion: '2010-09-09'
Transform: 'AWS::Serverless-2016-10-31'
Resources:
  FunctionName:
    Type: 'AWS::Serverless::Function'
    Properties:
      Handler: index.handler
      Runtime: nodejs6.10
      CodeUri: ./code
3. Add a directory "code" and keep the Lambda code files in this directory.
4. Run the command to package and upload:
$ aws cloudformation package --template-file template.yaml --output-template packaged.yaml --s3-bucket {your_S3_bucket}
5. Deploy the package:
$ aws cloudformation deploy --template-file packaged.yaml --stack-name stk1 --capabilities CAPABILITY_IAM
You can keep the template code (steps 1-2) in CodeCommit/GitHub and do step 4 in a CodeBuild step. For step 5, I recommend doing it via a CloudFormation action in CodePipeline that is fed the "packaged.yaml" file as an input artifact.
The above process is detailed here: https://docs.aws.amazon.com/en_us/lambda/latest/dg/build-pipeline.html
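As a rough illustration of step 4 inside CodeBuild, a buildspec along these lines could produce the packaged template as an output artifact for the later CloudFormation action (a hedged sketch; {your_S3_bucket} is a placeholder for your own artifact bucket):
version: 0.2
phases:
  build:
    commands:
      # Upload the code to S3 and rewrite CodeUri references in the template
      - aws cloudformation package --template-file template.yaml --output-template packaged.yaml --s3-bucket {your_S3_bucket}
artifacts:
  files:
    # Hand the processed template to the CloudFormation deploy action
    - packaged.yaml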