How can I download/pull lambda code to a local machine from command line? - amazon-web-services

I am using the sam deploy command with the AWS SAM command line tool to deploy.
Now I made some changes with the web IDE in the AWS Console.
How can I pull the changes to the local machine, so that the next sam deploy command won't override them? (I am looking for something similar to a git pull I guess)

To do this you will need the AWS CLI; the starting point is the get-function command. It returns a pre-signed URL in the Code > Location field of the response. If you download that URL (using a CLI tool such as curl), you get a zip file containing the contents of the Lambda function. The full command would look similar to the below:
curl $(aws lambda get-function --function-name $FUNCTION_NAME --output text --query "Code.[Location]")

You should have a single source of truth for your source code. And that should really be your source control repository (Git). If you make changes to your source code in the web IDE then you should copy those changes into your Git repo.
As to your original question: to download a Lambda function's source code from the command line, use the aws lambda get-function command, which returns information about the function. Part of the response is a URL for downloading the function's deployment package, valid for 10 minutes. You can then download the deployment package at that URL using something like curl.
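As an illustrative sketch (not part of the original answers), the same two steps, fetching the pre-signed URL with get-function and then downloading and unzipping the package, could be scripted with boto3. The pull_lambda helper and any function name you pass it are hypothetical:

```python
# Sketch only: pull a Lambda function's current code to a local directory.
# Requires boto3 and configured AWS credentials to actually run.
import io
import urllib.request
import zipfile


def code_location(get_function_response):
    # The pre-signed URL sits under Code -> Location in the
    # get-function response and is valid for about 10 minutes.
    return get_function_response["Code"]["Location"]


def pull_lambda(function_name, dest_dir):
    import boto3  # imported lazily; needs AWS credentials at call time

    client = boto3.client("lambda")
    url = code_location(client.get_function(FunctionName=function_name))
    with urllib.request.urlopen(url) as resp:
        zipfile.ZipFile(io.BytesIO(resp.read())).extractall(dest_dir)
```

Remember this only downloads what is currently deployed; you would still diff and commit the result into your repo by hand.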

Related

How to upload zip file from S3 to Lambda using CodeDeploy

Since I'm not able to comment on the question "How to deploy zipped lambda code from S3 using the CodeDeploy service alone?", I need to ask it again. I'm using AWS CodeBuild to turn my code from CodeCommit into artifacts. I'd like to use those artifacts as input for my Lambda functions. I'M NOT USING SAM.
I have an appspec.yml in my repo and can build it as a second artifact in my bucket, so CodeDeploy is able to find and "deploy" it, but the main problem remains: the code doesn't get updated to the contents of the zip file. This happens regardless of whether I'm using a zip or a directory, and regardless of whether I place the appspec inside my unzipped artifact or next to my zipped artifact. I've found this video: https://youtu.be/mYcRPKeCPXc, but I just don't get how his code is being updated without his zip file being mentioned anywhere.

How can I update my AWS Lambda function from VSCode?

So I have an AWS Lambda function written in NodeJS, but I am tired of coding in the AWS Console, and of having to manually zip my code in VSCode and manually upload it in the AWS Console.
I know that I can update my function with aws lambda update-function-code --function-name myFunction --zip-file "fileb://myZipFile". But how can I zip it and launch this command every time I save my work in VSCode?
Also, I am on Windows.
You can't do this without some additional work.
A few options are:
use the Run on Save VS Code extension and configure a custom command to run when a file is saved
create a SAM project and install the AWS Toolkit for VS Code extension to provide deployment assistance
create a package.json that includes a script for zip/deployment and use the NPM extension for VS Code to execute the deploy script
build a CI/CD solution: use VS Code to commit and push your code, then the pipeline takes over and deploys it
use a shell script, or a Makefile with a target, that zips and deploys and then simply execute it, manually or otherwise, in the VS Code in-built terminal
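The package.json option could look roughly like this sketch, using PowerShell's Compress-Archive since the asker is on Windows; the function name and file names are taken from the question or made up:

```json
{
  "name": "my-function",
  "version": "1.0.0",
  "scripts": {
    "zip": "powershell -Command \"Compress-Archive -Force -Path index.js -DestinationPath myZipFile.zip\"",
    "deploy": "npm run zip && aws lambda update-function-code --function-name myFunction --zip-file fileb://myZipFile.zip"
  }
}
```

With the NPM extension (or a plain terminal), npm run deploy then zips and uploads in one step.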
I use a script like the one below and run it whenever I need to update:
echo "Building zip file"
zip -rq testfunction.zip testfunctionfolder/
echo "update Lambda function"
FUNCTION_ARN=$(aws lambda update-function-code \
--function-name testfunction \
--zip-file fileb://testfunction.zip \
--query 'FunctionArn' \
--output text)
echo "Lambda function updated with ARN ${FUNCTION_ARN}"

Use travis CI to automatically trigger AWS lambda function?

How can I make Travis CI automatically trigger an AWS Lambda function after all tests have passed and the Travis CI build succeeds? Please note the GitHub repo is public.
Background
The bigger problem I'm solving is that I have travis CI on a repo. Each time I push, after everything passes, I manually run a lambda which sets off processes in AWS. I will be open sourcing the repo so anyone can contribute, so I want to avoid having to run the lambda manually, but instead have it automatically triggered whenever a pull request is merged successfully.
You could update your Travis CI build to invoke the Lambda with the aws-cli, as long as you install it in your Travis build. Here is an example:
aws lambda invoke --function-name awesome-function --payload '{"some":"data", "targetState": true}' /dev/stdout
breakdown:
aws lambda invoke is the basic aws-cli command we want to run
--function-name specifies which function to run
--payload specifies the event data to invoke the function with
/dev/stdout specifies that we want the output of the invocation to enter our terminal output
here's the documentation: https://docs.aws.amazon.com/cli/latest/reference/lambda/invoke.html
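In a .travis.yml this might look like the following sketch; the language, function name, and payload are placeholders, and the AWS keys are assumed to be set as encrypted environment variables in the Travis CI repository settings:

```yaml
language: node_js
node_js:
  - "12"
install:
  - pip install --user awscli
script:
  - npm test
after_success:
  # AWS_ACCESS_KEY_ID / AWS_SECRET_AW_KEY come from encrypted
  # Travis environment variables, never from the public repo.
  - aws lambda invoke --function-name awesome-function --payload '{"some":"data", "targetState": true}' /dev/stdout
```

Because after_success only runs when the build passes, the Lambda is triggered exactly in the "all tests green" case the question asks about.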
Just putting down my idea here; I wanted to check whether this is possible or not, assuming your container has internet access.
Why not use a curl command to do a POST with an appropriate payload to the API Gateway endpoint? The Lambda can sit behind the API Gateway.
If it's going to be a public repo, we don't want to store any credentials in any Docker image/container. Create an IAM user for the container with a policy that allows invoking Lambda only, and then use the aws cli option.
curl -X POST -H "Content-Type: application/json" -d '{"xyz":"testing","abc":"random stuff"}' https://tst.nhsd.io/restapi/Xyzxyz/testing/
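The invoke-only IAM policy mentioned above could look roughly like this sketch (the account ID, region, and function ARN are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:my-function"
    }
  ]
}
```

Scoping the Resource to the single function ARN keeps a leaked key from being able to invoke anything else.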

"BundleType must be either YAML or Json" Error using Jenkins and AWS CodeDeploy

I am trying to deploy revisions to my AWS lambda functions using Jenkins and the AWS CodeDeploy add-on. I am able to build the project successfully and upload a zip of the project to an S3 bucket. At this point I receive the error:
BundleType must be either YAML or JSON
I have an appspec.yml file in my code directory. I am unsure if I need to instruct Jenkins to do something different, or if I need to instruct AWS to unzip the file and use it.
Currently, CodeDeploy Lambda deployments only take a YAML or JSON file as the deployment revision input (which is just your AppSpec file). The CodeDeploy Jenkins plugin needs to be updated to support uploading a YAML or JSON file without zipping it: https://github.com/jenkinsci/aws-codedeploy-plugin/blob/master/src/main/java/com/amazonaws/codedeploy/AWSCodeDeployPublisher.java#L230
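For reference, a Lambda AppSpec file is just a small YAML (or JSON) document describing a version shift, which is why CodeDeploy rejects a zip bundle here. A minimal sketch, with every name and version number a placeholder, looks like:

```yaml
version: 0.0
Resources:
  - myLambdaFunction:
      Type: AWS::Lambda::Function
      Properties:
        Name: "myLambdaFunction"
        Alias: "live"
        CurrentVersion: "1"
        TargetVersion: "2"
```

Note there is no code location in it at all: for Lambda, CodeDeploy only moves an alias between two already-published versions, so the function code has to be updated (e.g. with update-function-code) before the deployment.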

Amazon Web Services: NoCredentialsError: Unable to locate credentials

I am using the Amazon Web Services CLI. I use a makefile to build my lambda project and upload it to AWS Lambda. I am on a Windows machine and use PowerShell to call make.
I try to delete my lambda function with the following lines:
AWS_PATH = /cygdrive/c/Users/TestBox/AppData/Roaming/Python/Scripts/aws
AWS_WIN_PATH = $(shell cygpath -aw ${AWS_PATH})
AWS_REGION = eu-west-2
lambda_delete:
	$(AWS_WIN_PATH) lambda delete-function --function-name LambdaTest --region $(AWS_REGION) --debug
I get this error..
NoCredentialsError: Unable to locate credentials
Unable to locate credentials. You can configure credentials by running "aws configure".
Running aws configure list prints out a valid default profile.
I think the problem is that I am using GNU make installed by Cygwin on a Windows machine, and using PowerShell to call make.
So when ~/.aws/credentials is evaluated by aws, the path to the credentials looks like "/cygdrive/c/users/testbox/.aws/credentials" instead of "C:\Users\TestBox\.aws\credentials". I think :)
I had the same problem with the path to aws itself and had to use $(shell cygpath -aw ${AWS_PATH}) to convert it to a path windows python could use.
Is there any way to pass the credentials directly to lambda delete-function, or indirectly through a path to a file? I can't seem to think of a way, because the code that searches for the credentials is internal to botocore.
Is there a way around this that you know of?
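One possible workaround (my own assumption, not from the answers here): the AWS CLI also honours environment variables such as AWS_SHARED_CREDENTIALS_FILE, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY, so you can point it at an explicit Windows-style credentials path and sidestep the Cygwin "~" translation entirely. The paths and region below are hypothetical:

```shell
# Sketch: tell the AWS CLI exactly where the credentials file lives,
# instead of letting it expand ~ under Cygwin. Paths are placeholders.
export AWS_SHARED_CREDENTIALS_FILE='C:\Users\TestBox\.aws\credentials'
export AWS_DEFAULT_REGION='eu-west-2'
# aws lambda delete-function --function-name LambdaTest --debug
```

In a makefile you would set these on the recipe line (or via the make environment) before invoking aws.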
Alternative solution, consider using AWS SAM templates
Use AWS SAM templates to deploy your Lambda functions and AWS resources using CloudFormation.
Edit your SAM template and define your AWS resources, for example your Lambda functions and the path to your code.
Run aws cloudformation package to package and upload your local code to S3.
Run aws cloudformation deploy to provision and update AWS resources with the updated code on S3.
This works in CMD/PowerShell without the make hassle. You will also have the benefit of having your resources versioned as code, and you won't need to worry about tracking or adding new AWS API calls in your makefile.
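A minimal SAM template for a single function might look like this sketch; the handler, runtime, code path, and names are assumptions, not from the question:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  LambdaTest:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: nodejs12.x
      CodeUri: ./src
```

Then, for example: aws cloudformation package --template-file template.yml --s3-bucket my-bucket --output-template-file packaged.yml, followed by aws cloudformation deploy --template-file packaged.yml --stack-name my-stack --capabilities CAPABILITY_IAM.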
More complex serverless frameworks for reference:
AWS Chalice https://github.com/aws/chalice
Django/Flask + Lambda https://github.com/Miserlou/Zappa
Cross cloud serverless solution https://github.com/serverless/serverless