Lambda function calling CodePipeline and passing parameters. Possible?

Has anyone ever tried, or does anyone have examples of, a Python-based Lambda function where I call a specific CodePipeline pipeline, but pass it variables in such a way that these variables are used during the CodeBuild process?
I typically set environment variables during the CodeBuild process, so these are what I want to affect.

It is not possible to pass parameters to CodePipeline using the start_pipeline_execution call from boto3 (or any other AWS SDK), because the CodePipeline API does not support it. Maybe you will find a workaround here.
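For illustration, starting a pipeline from a Python Lambda with boto3 looks roughly like this; note that there is nothing here to carry custom values through to a CodeBuild stage (the pipeline name is a placeholder):

import boto3

codepipeline = boto3.client("codepipeline")

def lambda_handler(event, context):
    # Kicks off the pipeline from its source stage; there is no argument
    # for passing custom variables down to later stages such as CodeBuild.
    response = codepipeline.start_pipeline_execution(
        name="my-pipeline"  # placeholder pipeline name
    )
    return response["pipelineExecutionId"]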

Related

AWS Lambda CI/CD process

I am trying to understand the correct way to set up my project on AWS so that I can ultimately have CI/CD for the Lambda functions, and also to ingrain good practices.
My application is quite simple: an API that calls Lambda functions based on users' requests.
I have deployed the application using AWS SAM. For that, I used a SAM template that referenced local paths to the Lambda functions' code and created the necessary AWS resources (API Gateway and Lambda). It was necessary to use local paths for the Lambda functions because SAM does not allow using existing S3 buckets for S3 event triggers (see here), and I deploy a Lambda function that watches the S3 bucket for updated code and triggers Lambda updates.
Now I have to push my Lambda code to GitHub, and have GitHub push the Lambda functions' code to the S3 bucket (and correct prefix) created during the SAM deploy. What I would like is a way to do that automatically upon a GitHub push.
What is the preferred way to achieve that? I could not find clear information in the AWS documentation. Also, if you see a clear flaw in my process, don't hesitate to point it out.
What you're looking to do is a standard CI/CD pipeline.
The steps of your pipeline will be (more or less): Pull code from GitHub -> Build/Package -> Deploy
You want this pipeline to be triggered upon a push to GitHub; this can be done by setting up a webhook which will then trigger the pipeline.
The last two steps are supported by SAM, which I think you have already used, so it will be a matter of triggering the same from the pipeline.
These capabilities are supported by most CI/CD tools. If you want to keep everything in AWS, you could use CodePipeline, which also supports GitHub integration. Nevertheless, Jenkins is perfectly fine and suitable for your use case as well.
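If you do go with CodePipeline, the GitHub webhook can also be registered through its API; a rough sketch with boto3, where the webhook/pipeline/action names and the secret are placeholders:

import boto3

codepipeline = boto3.client("codepipeline")

# Register a webhook so that a push to the branch starts the pipeline.
codepipeline.put_webhook(
    webhook={
        "name": "my-github-webhook",          # placeholder
        "targetPipeline": "my-sam-pipeline",  # placeholder pipeline name
        "targetAction": "Source",             # name of the GitHub source action
        "filters": [
            {"jsonPath": "$.ref", "matchEquals": "refs/heads/{Branch}"}
        ],
        "authentication": "GITHUB_HMAC",
        "authenticationConfiguration": {"SecretToken": "my-secret"},  # placeholder
    }
)

# Ask GitHub (through CodePipeline) to create the webhook on the repository.
codepipeline.register_webhook_with_third_party(webhookName="my-github-webhook")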
There are a lot of ways you can do it, so it eventually depends on how you decide to do it and what tools you are comfortable with. If you want to use native AWS tools, then CodePipeline might be useful.
You can use CDK for that:
https://aws.amazon.com/blogs/developer/cdk-pipelines-continuous-delivery-for-aws-cdk-applications/
If you are not familiar with CDK and would prefer CloudFormation, then this can get you started:
https://docs.aws.amazon.com/codepipeline/latest/userguide/tutorials-github-gitclone.html

Is there a way to resolve local module dependencies when passing a Lambda function inline to an AWS CloudFormation template?

I have a Node.js Lambda function that needs to be passed 'inline' into the AWS CloudFormation template.
This seems to be working with 'AWS::Lambda::Function' using the Code: ZipFile property, following the official AWS guide.
However, my Lambda function also has some local dependencies that are currently stored in local subfolders and are required and used by my Lambda. They currently look like the following in my Lambda code:
var dep = require('../config/local_dependencies.json');
var other_dep = require('./lib/other_dependencies.json');
Is there a way to pass those dependencies into the AWS CloudFormation template and have them used by my Lambda function while it is stored inline in the template?
Usually, this problem can be solved using S3 buckets by putting all the Lambda files/dependencies there and passing the path to the template. Due to certain local constraints, I cannot approach it that way. Is there another way to solve this problem?

AWS Multiple Lambda Functions in one zip

I'm using CloudFormation to create Lambda resources, and I have several Python scripts as well as one JS script in a lambda-resources folder. Would it be okay to pass the same file location for every Lambda function and just specify unique handlers? When my Lambda function is created, it looks like it only creates one Lambda function.
Yes, this is definitely one way to accomplish what you're looking to do.
You'll need to create a zipped version of your lambda-resources folder and upload it via the Lambda service or even to S3, then reference it as the file location for each Lambda function.
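To illustrate the idea with boto3 (the same applies in CloudFormation, where every AWS::Lambda::Function points at the same Code location but a different Handler); the function names, role ARN and handler modules below are placeholders:

import boto3

lambda_client = boto3.client("lambda")

# One zip of the whole lambda-resources folder, shared by every function.
with open("lambda-resources.zip", "rb") as f:
    package = f.read()

for name, handler in [
    ("first-function", "first_script.handler"),    # placeholder handlers
    ("second-function", "second_script.handler"),
]:
    lambda_client.create_function(
        FunctionName=name,
        Runtime="python3.12",
        Role="arn:aws:iam::123456789012:role/lambda-role",  # placeholder role ARN
        Handler=handler,            # unique handler per function
        Code={"ZipFile": package},  # same package for all of them
    )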

CodePipeline and CloudFormation parameters

I am using CodePipeline to deploy my SAM (lambda etc) application referencing https://docs.aws.amazon.com/lambda/latest/dg/build-pipeline.html.
The "issue" now is that my CloudFormation template has some parameters inside, and CodePipeline requires that I set these. I could do so via parameter overrides.
But is this the correct way? I actually only want them set once at the start, and I'd rather have users set them in CloudFormation and have CodePipeline follow those values.
The stack is already created, so why is it that CodePipeline complains that I need them set?
The input parameters are required by CloudFormation to perform the update.
Template configuration is the recommended way to specify the input parameters. You could create a template configuration file of input parameters for the customers to use.
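The template configuration file itself is just a JSON document with a Parameters block that the CloudFormation action reads from its input artifact; a minimal sketch (parameter names and values are placeholders), generated here with Python purely for illustration:

import json

# Template configuration consumed by CodePipeline's CloudFormation action.
# "Parameters" maps the template's parameter names to the values to use.
template_config = {
    "Parameters": {
        "Stage": "prod",          # placeholder parameter
        "TableName": "my-table",  # placeholder parameter
    }
}

with open("template-configuration.json", "w") as f:
    json.dump(template_config, f, indent=2)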
A possible solution is to create a custom Lambda function which will be invoked from CodePipeline using an Invoke action.
As a parameter to such a Lambda you would specify the CloudFormation stack name. The Lambda will then load the CloudFormation parameters from the existing stack and create an output artifact from them (using the appropriate AWS SDK). That artifact will be used as an input to the CloudFormation deployment.
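A rough sketch of such a Lambda, assuming the stack name arrives via the Invoke action's UserParameters; writing the resulting JSON into the pipeline's artifact store (job["data"]["outputArtifacts"]) is omitted for brevity:

import json
import boto3

cloudformation = boto3.client("cloudformation")
codepipeline = boto3.client("codepipeline")

def lambda_handler(event, context):
    job = event["CodePipeline.job"]
    job_id = job["id"]
    try:
        # The Invoke action passes the stack name as UserParameters.
        stack_name = job["data"]["actionConfiguration"]["configuration"]["UserParameters"]

        # Read the existing stack's current parameter values.
        stack = cloudformation.describe_stacks(StackName=stack_name)["Stacks"][0]
        parameters = {
            p["ParameterKey"]: p["ParameterValue"]
            for p in stack.get("Parameters", [])
        }

        # In a real pipeline, write this JSON to the output artifact's S3
        # location so the CloudFormation deploy action can consume it.
        print(json.dumps({"Parameters": parameters}))

        codepipeline.put_job_success_result(jobId=job_id)
    except Exception as exc:
        codepipeline.put_job_failure_result(
            jobId=job_id,
            failureDetails={"type": "JobFailed", "message": str(exc)},
        )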
Another solution is to create a CodeBuild project which does the same thing.
It's a bit complex, but unfortunately it seems that CodePipeline always needs the full set of parameters.

Add trigger to lambda function automatically

I managed to deploy an AWS Lambda function using Travis; however, I also need to add a trigger to it (in my case it's Kinesis). Has anyone done it? If there is no out-of-the-box way to do it with Travis, I suppose I need to add a script using the AWS CLI? Has anyone done it and could share some advice, or point to a reference I could take a look at?
I primarily wanted to add the trigger with Travis, but Terraform makes it much simpler.
So I can create my IAM roles, Kinesis stream, and the event source mapping between Kinesis and Lambda using Terraform:
https://www.terraform.io/docs/providers/aws/r/lambda_event_source_mapping.html
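For reference, the same mapping could also be created via the SDK/CLI route mentioned in the question; a minimal boto3 sketch (stream ARN and function name are placeholders):

import boto3

lambda_client = boto3.client("lambda")

# Wire an existing Kinesis stream to an existing Lambda function,
# the equivalent of Terraform's aws_lambda_event_source_mapping resource.
lambda_client.create_event_source_mapping(
    EventSourceArn="arn:aws:kinesis:us-east-1:123456789012:stream/my-stream",  # placeholder
    FunctionName="my-function",   # placeholder
    StartingPosition="LATEST",
    BatchSize=100,
)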
If you have a different way that you believe is better, please do not hesitate to add it here. Thanks.