How to upload zip file from S3 to Lambda using CodeDeploy - amazon-web-services

Since I'm not able to comment on the question "How to deploy ziped lambda code from s3 using code deploy service alone?", I need to ask it again. I'm using AWS CodeBuild to build my code from CodeCommit into artifacts. I'd like to use those artifacts as input for my Lambda functions. I'M NOT USING SAM.
I have an appspec.yml in my repo and can build it as a second artifact in my bucket, so the deploy action is able to find and "deploy" it, but the main problem remains: the code doesn't get updated to the contents of the zip file. This happens regardless of whether I use a zip or a directory, and regardless of whether I place the appspec inside my unzipped artifact or next to my zipped artifact. I've found this video: https://youtu.be/mYcRPKeCPXc. But I just don't get how his code gets updated without the zip file being mentioned anywhere.
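For context, and this is an assumption on my part rather than something stated in the question: an appspec for the CodeDeploy Lambda platform only describes a traffic shift between two already-published versions of a function behind an alias, so it never points at a zip of code; something else has to publish the new code first. A minimal sketch with hypothetical names:
version: 0.0
Resources:
  - MyFunction:
      Type: AWS::Lambda::Function
      Properties:
        Name: my-function          # hypothetical function name
        Alias: live                # alias whose traffic is shifted
        CurrentVersion: 1          # version currently behind the alias
        TargetVersion: 2           # version to shift traffic to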

Related

How to write an AWS CodePipeline only for updated files?

I am very new to AWS.
I have a task where I need to write a CodePipeline that copies files from CodeCommit to an S3 bucket, but the condition is that only updated/deleted files should get copied. I have already written a pipeline, but it copies all the files to the S3 bucket.
I tried writing AWS Lambda + Python code, but it behaves the same way. I used the code from this article:
https://medium.com/@michael.niedermayr/using-aws-codecommit-and-lambda-for-automatic-code-deployment-to-s3-bucket-b35aa83d029b
Any help or suggestions appreciated.
Instead of using the S3 deploy action to copy your files, use a CodeBuild action in your CodePipeline.
In the CodeBuild buildspec, use the 'aws s3 sync' command with the '--delete' switch to sync files to the S3 bucket. Make sure the CodeBuild service role has permission to write to the S3 bucket.
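For illustration, a minimal buildspec sketch of that step (the bucket variable is a placeholder, not from the answer):
version: 0.2
phases:
  build:
    commands:
      # Mirror the repo contents to the bucket; --delete also removes objects
      # that no longer exist in the source, so deletions propagate too.
      - aws s3 sync . s3://${S3_BUCKET} --delete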

Deploy a specific directory (tree) of a GitHub repo to an AWS S3 bucket using AWS CodePipeline

I have a requirement to deploy a specific tree of my GitHub repo to a folder in an AWS S3 bucket. By following the linked tutorial, I was able to deploy the whole repo to the S3 bucket, but I need to deploy only one directory (tree).
Say I have a tree called foo inside myRepo on GitHub. I want to deploy foo to myBucket in AWS S3.
PS: foo contains folders, subfolders, and files.
Kindly help, thanks in advance!
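For what it's worth, a sketch using the same CodeBuild-action approach as the previous answer, syncing only the foo directory (the bucket name is taken from the question, everything else is assumed):
version: 0.2
phases:
  build:
    commands:
      # Upload only the foo tree (with its subfolders and files) into the bucket.
      - aws s3 sync foo s3://myBucket/foo --delete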

How can I download/pull Lambda code to a local machine from the command line?

I am using the sam deploy command with the AWS SAM command line tool to deploy.
Now I made some changes with the web IDE in the AWS Console.
How can I pull the changes to the local machine, so that the next sam deploy command won't override them? (I am looking for something similar to a git pull I guess)
To do this you will need the AWS CLI; the first step is to call get-function.
This returns a pre-signed URL under Code > Location in the response; if you then download that URL (using a CLI tool such as curl), you get a zip file containing the contents of the Lambda function.
The command would look similar to the below:
curl $(aws lambda get-function --function-name $FUNCTION_NAME --output text --query "Code.[Location]")
You should have a single source of truth for your source code. And that should really be your source control repository (Git). If you make changes to your source code in the web IDE then you should copy those changes into your Git repo.
To your original question, to download a Lambda function's source code from the command line, you would use the aws lambda get-function command to download information about the function. Part of the information included in the response is a URL to download the function's deployment package, which is valid for 10 minutes. Then you could download the deployment package at that URL using something like curl.
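A minimal sketch of that flow, assuming a function named my-function (the name and output paths are placeholders):
URL=$(aws lambda get-function --function-name my-function --query 'Code.Location' --output text)   # pre-signed URL, valid for 10 minutes
curl -o my-function.zip "$URL"                 # download the deployment package
unzip -o my-function.zip -d my-function-src    # extract it locally to diff or merge back into Git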

"BundleType must be either YAML or Json" Error using Jenkins and AWS CodeDeploy

I am trying to deploy revisions to my AWS lambda functions using Jenkins and the AWS CodeDeploy add-on. I am able to build the project successfully and upload a zip of the project to an S3 bucket. At this point I receive the error:
BundleType must be either YAML or JSON
I have an appspec.yml file in my code directory. I am unsure if I need to instruct Jenkins to do something different, or if I need to instruct AWS to unzip the file and use it.
Currently, CodeDeploy Lambda deployments only take a YAML or JSON file as the deployment revision input (which is just your AppSpec file). The CodeDeploy Jenkins plugin would need to be updated to support uploading a YAML or JSON file without zipping it: https://github.com/jenkinsci/aws-codedeploy-plugin/blob/master/src/main/java/com/amazonaws/codedeploy/AWSCodeDeployPublisher.java#L230
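As a workaround sketch (application, deployment group, bucket, and key names below are placeholders, not from the answer), you could upload the appspec.yaml to S3 yourself and point CodeDeploy at it with bundleType=YAML instead of a zip:
aws deploy create-deployment \
  --application-name my-lambda-app \
  --deployment-group-name my-deployment-group \
  --revision 'revisionType=S3,s3Location={bucket=my-bucket,key=appspec.yaml,bundleType=YAML}'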

Deploying React Website to s3 Bucket

Newish to AWS, very new to CI/CD, and I have a question re: deploying a React website to an S3 bucket.
I've got me a Git repo that contains a React web app.
I've setup an AWS CodePipeline project, which polls the master branch of the repo looking for commits.
It then triggers the AWS CodeBuild project, which builds the React app as defined in the buildspec.yml.
In the example/tutorial I followed, the buildspec.yml had the following...
post_build:
  commands:
    - aws s3 cp dist s3://${S3_BUCKET} --recursive
...which copies the build output to the destination S3 Bucket.
It all works great, however this assumes that the S3 Bucket is already there.
Question: Which step should be responsible for creating the destination S3 Bucket and what should I be using to do so?
I'm thinking that perhaps it should be a CodeDeploy step with another CloudFormation template.
Just after a little guidance before going down the wrong route :)
Many thanks.
Ok, so I think I found the tutorial you were referring to: https://www.karelbemelmans.com/2017/01/deploying-a-hugo-website-to-amazon-s3-using-aws-codebuild/
Can you specify two post_build commands? You could just create the bucket first, which might fail if the bucket already exists, but who cares, right? Or you could check whether the bucket exists and only create it if it doesn't.
Here's the s3 command you need to create a bucket:
https://docs.aws.amazon.com/cli/latest/reference/s3api/create-bucket.html
There's an API for list buckets but I can't post it because this new user doesn't have 10 reputation yet unfortunately.
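A rough sketch of that check-then-create idea as extra post_build commands (the bucket and region variables are placeholders; note that in us-east-1 the create-bucket call must omit the LocationConstraint):
post_build:
  commands:
    # Create the bucket only if it does not already exist, then copy the build output.
    - aws s3api head-bucket --bucket ${S3_BUCKET} || aws s3api create-bucket --bucket ${S3_BUCKET} --region ${AWS_REGION} --create-bucket-configuration LocationConstraint=${AWS_REGION}
    - aws s3 cp dist s3://${S3_BUCKET} --recursive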
Good luck,
-Asaf