Custom build step for SAM CLI

When using the AWS SAM CLI to build a serverless application, it magically locates dependencies and installs them all as the "build" step. For example, with a Node.js application:
$> sam build
Building resource 'HelloWorldFunction'
Running NodejsNpmBuilder:NpmPack
Running NodejsNpmBuilder:CopyNpmrc
Running NodejsNpmBuilder:CopySource
Running NodejsNpmBuilder:NpmInstall
Running NodejsNpmBuilder:CleanUpNpmrc
Build Succeeded
Built Artifacts : .aws-sam/build
Built Template : .aws-sam/build/template.yaml
Commands you can use next
=========================
[*] Invoke Function: sam local invoke
[*] Deploy: sam deploy --guided
$>
Looking at the official documentation, they're happy to simply treat it as magic, saying that it:
iterates through the functions in your application, looks for a manifest file (such as requirements.txt) that contains the dependencies, and automatically creates deployment artifacts that you can deploy to Lambda
But what if I have a dependency beyond just those specified in the manifest file? What if my code depends on a compiled binary file, or a static data file?
I would like to add additional build steps so that when I run sam build it compiles these files or copies them appropriately. Is there any way to do this?

sam build runs npm install under the hood. So if you insert your own script into a lifecycle hook such as preinstall in package.json, sam build will also execute that step.
package.json
{
  ...
  "scripts": {
    "preinstall": "cp -r ../../../common ./"
  }
  ...
}
The preinstall script above is a hack that copies the common directory from the root folder of the sam init-ed project into the zip of each Lambda handler, so that it can be referenced from each of them.
You should also create a symbolic link in the local Lambda handler directory, like ln -s ../common ./common, so that local runs and the deployed Lambda work with the same code.
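A quick sketch of that setup, assuming a handler directory named hello-world (the name is illustrative):

cd hello-world               # hypothetical handler directory
ln -s ../common ./common     # sam local invoke resolves the shared code via the link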

You will need to wrap this command in another custom command and add the steps you need to it.
You can create a Makefile with multiple targets that satisfy your requirements.
I haven't used sam build before; I usually have a make target for that purpose.
You can give it a try with this bootstrap template, which is more efficient than using sam build: https://github.com/healthbridgeltd/nodejs-sam-bootstrap
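For illustration, a minimal Makefile sketch in that spirit; the target names, paths, and extra steps are hypothetical, not taken from that template:

.PHONY: build assets deploy

# Custom steps run before delegating to `sam build`.
build: assets
	sam build

assets:
	# compile binaries or copy static data files that the handlers need
	cp -r common/ hello-world/common/

deploy: build
	sam deploy --guided

Running make build then performs the custom steps first and calls sam build afterwards.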

Related

Sonarcloud c++ docker cmake

I was trying to integrate SonarCloud into my build.
I have created a free account on sonarcloud.io and added the necessary steps to the build pipeline.
When I ran the pipeline, I got this error:
ERROR: Error during SonarScanner execution java.lang.IllegalStateException: java.nio.file.NoSuchFileException: /home/vsts/work/1/s/bw-outputs/build-wrapper-dump.json
The process '/home/vsts/work/_tasks/SonarCloudAnalyze_ce096e50-6155-4de8-8800-4221aaeed4a1/1.20.0/sonar-scanner/bin/sonar-scanner' failed with exit code 1
I also tried with a .properties file:
sonar.projectKey=jfzlma0838_dockersample
sonar.projectName=dockersample
sonar.projectVersion=1.0
sonar.sources=app
# The build-wrapper output dir
sonar.cfamily.build-wrapper-output=bw-outputs
# Encoding of the source files
sonar.sourceEncoding=UTF-8
full repo here (master)
The most likely cause of this error is that you did not run the Build Wrapper before the analysis.
Step 1. Download the Build Wrapper for your platform (Linux, macOS, or Windows).
Step 2. Unzip them and push them to your repository.
Step 3. Add their path to the environment variable PATH, where $newPath in the script below is the directory of the unzipped Build Wrapper. You can use the following PowerShell script:
Write-Host "##vso[task.setvariable variable=PATH;]${env:PATH};$newPath";
Note that the path of repository is $(System.DefaultWorkingDirectory) in Azure DevOps.
Step 4. Execute the Build Wrapper around your actual build. See the SonarSource Build Wrapper documentation for detailed steps.
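For example, a sketch of the wrapper invocation for a CMake build on Linux; the build commands are illustrative, but the output directory must match the sonar.cfamily.build-wrapper-output property from the question (bw-outputs):

cmake -S . -B build
build-wrapper-linux-x86-64 --out-dir bw-outputs cmake --build build/

This is the step that produces bw-outputs/build-wrapper-dump.json, the file the scanner reported as missing.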

AWS C# Lambda function code not deployed after successful deployment

I am trying to deploy a C# .NET Core 2.0 Lambda function, created in Visual Studio, to AWS Lambda.
I am using these commands on the command line:
dotnet lambda package -c Release -f netcoreapp2.0
which creates the Release folder with the zip deployment file.
After that I issue:
dotnet lambda deploy-function -fn AWSLambda1
And the function was created on AWS.
But when I open the Lambda function in the console, there is no code in it.
When I try to upload the zip deployment file manually, it does not work and the code is not deployed.
Please help.
Thanks.
Got the same issue: it uploads the function but not the code. Also tried overwriting an existing Lambda, no joy.
OK, I think I figured this out. When you publish a dotnet Lambda project from the CLI, by default it creates a DLL; the deploy function then zips and uploads the DLL to AWS Lambda. Naturally you can't then inspect individual code files, as they are compiled into the DLL. Maybe there's some option to upload the raw code files.
Lambda deployment via the command line:
Step 1: dotnet tool install -g Amazon.Lambda.Tools
Step 2: dotnet lambda deploy-serverless
Note: Step 2 is the whole-stack deployment command; it is required for the first deployment.
Step 3: If you want to deploy a specific Lambda, use the command below.
dotnet lambda deploy-function Getdata
Note: Getdata is a function name, as mentioned in the Resources section of the serverless.template file.
Finally, add the appropriate configuration to aws-lambda-tools-defaults.json.
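The configuration itself is not included in the original post; purely as an illustration, a typical aws-lambda-tools-defaults.json looks roughly like this (every value below is a placeholder):

{
  "profile": "default",
  "region": "us-east-1",
  "configuration": "Release",
  "framework": "netcoreapp2.0",
  "template": "serverless.template",
  "s3-bucket": "my-deployment-bucket"
}

The dotnet lambda commands read these defaults, so you don't have to repeat them as command-line options.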

GitHub Cloud Build Integration with multiple cloudbuild.yamls in monorepo

GitHub's Google Cloud Build integration does not detect a cloudbuild.yaml or Dockerfile if it is not in the root of the repository.
When using a monorepo that contains multiple cloudbuild.yamls, how can GitHub's Google Cloud Build integration be configured to detect the correct cloudbuild.yaml?
File paths:
services/api/cloudbuild.yaml
services/nginx/cloudbuild.yaml
services/websocket/cloudbuild.yaml
Cloud Build integration output: (screenshot omitted)
You can do this by adding a cloudbuild.yaml in the root of your repository with a single gcr.io/cloud-builders/gcloud step. This step should:
Traverse each subdirectory or use find to locate additional cloudbuild.yaml files.
For each found cloudbuild.yaml, fork and submit a build by running gcloud builds submit.
Wait for all the forked gcloud commands to complete.
There's a good example of one way to do this in the root cloudbuild.yaml within the GoogleCloudPlatform/cloud-builders-community repo.
If we strip out the non-essential parts, basically you have something like this:
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    for d in */; do
      config="${d}cloudbuild.yaml"
      if [[ ! -f "${config}" ]]; then
        continue
      fi
      echo "Building $d ... "
      (
        gcloud builds submit $d --config=${config}
      ) &
    done
    wait
We are migrating to a mono-repo right now, and I haven't found any CI/CD solution that handles this well.
The key is to not only detect changes, but also any services that depend on that change. Here is what we are doing (see the sketch below):
Requiring every service to have a Makefile with a build command.
Putting a cloudbuild.yaml at the root of the monorepo.
We then run a custom build step with this little tool (old, but it still seems to work): https://github.com/jharlap/affected, which lists all packages that have changed and all packages that depend on those packages, and so on.
Then the shell script runs make build on any service that is affected by the change.
So far it is working well, but I totally understand if this doesn't fit your workflow.
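A sketch of what that shell script might look like; it assumes affected prints one affected package directory per line, which may differ from the tool's actual output format:

#!/usr/bin/env bash
set -euo pipefail

# Assumption: `affected` emits one affected package path per line.
affected | while read -r pkg; do
  if [ -f "$pkg/Makefile" ]; then
    echo "Building $pkg ..."
    make -C "$pkg" build
  fi
done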
Another option many people use is Bazel. It's not the simplest tool, but it is especially great if you have many different languages or build processes across your monorepo.
You can create a build trigger for your repository. When setting up a trigger with cloudbuild.yaml for the build configuration, you need to provide the path to the cloudbuild.yaml within the repository.
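For example, you could create one trigger per service, each pointing at its own config file; the repo name and owner below are placeholders:

gcloud beta builds triggers create github \
  --repo-name=my-repo \
  --repo-owner=my-org \
  --branch-pattern='^master$' \
  --build-config=services/api/cloudbuild.yaml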

exec format error when running AWS Golang Lambda

I have a Go application, structured like this:
cmd/reports/main.go
main.go imports the internal/reports package and has a single function, main(), which delegates the call to the aws-lambda-go lambda.Start() function.
The code is built by running these commands (snippet):
cd internal/reports && go build handler.go
cd ../..
go build -o reports ../cmd/reports/main.go && chmod +x reports && zip reports.zip reports
reports.zip is uploaded to AWS Lambda, which in turn throws an error when the Test button is pressed:
{
"errorMessage": "fork/exec /var/task/reports: exec format error",
"errorType": "PathError"
}
reports is set as Lambda's Handler.
Also, the code is built on an Ubuntu 14.04 machine, as part of the aws/codebuild/ubuntu-base:14.04 Docker image, on AWS CodeBuild. There should be no environment issues here, even though the error suggests a cross-platform problem.
Any ideas?
You have to build with GOARCH=amd64 GOOS=linux.
Wherever you build your binary, the binary for Lambda runs on Amazon Linux.
So, try this build command:
GOARCH=amd64 GOOS=linux go build handler.go
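Adapted to the layout in the question, the whole packaging sequence would look roughly like this (a sketch, assuming the binary is built from cmd/reports):

GOOS=linux GOARCH=amd64 go build -o reports ./cmd/reports/main.go
chmod +x reports
zip reports.zip reports

The point is that GOOS/GOARCH must target Lambda's execution environment no matter where the build runs.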
The issue is that the main() function is not declared in the main package, which is mandatory per the Go language spec.

Claudia.js with multiple AWS lambda functions

I'm using the claudia.js CLI to deploy functions and a web API to AWS Lambda and API Gateway.
My project file structure is as follows:
functions
--function1
---- node_modules
---- package.json
---- index.js
---- claudia.json
--function2
---- node_modules
---- package.json
---- index.js
---- claudia.json
The problem is that in order to deploy a new version I have to run claudia update in every function folder, once per function. Is there a way to tell claudia.js to update all my functions at once?
Rather than getting Claudia.js to do the work, use a tool to run Claudia.js.
Most monorepo tools will suffice, such as Lerna, but there is a gamut of less opinionated tools if you don't care for what Lerna offers; Lolaus is pretty low-level.
With Lerna you would need to use the prescribed repo structure, get linked node_modules, and lerna run deploy would run the npm deploy script of each package that has it.
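For that to work, each function's package.json would carry a deploy script; a minimal sketch:

{
  "scripts": {
    "deploy": "claudia update"
  }
}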
With Lolaus you would search for all of your functions and then run an arbitrary command in each directory: lolaus "*/*/claudia.json" claudia update
We have a Lambda repo with multiple separate Lambdas, each in its own subfolder.
lambdas
|_lambda1
|___main.js
|___main.spec.js
|___claudia.json
|___package.json
|_lambda2
|___main.js
|___main.spec.js
|___claudia.json
|___package.json
|_helpers
|_test.sh
|_deploy.sh
We use npm and a bash script to iterate over each Lambda and run a consistent set of npm/eslint commands on them. If that passes the build process, we run a claudia command the same way on each Lambda. There is some cut and paste involved.
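A sketch of what such a deploy.sh could look like given the layout above; the details are assumptions, not the poster's actual script:

#!/usr/bin/env bash
set -euo pipefail

# Run `claudia update` in every lambda folder that has a claudia.json.
for dir in lambdas/*/; do
  if [ -f "${dir}claudia.json" ]; then
    echo "Deploying ${dir} ..."
    (cd "$dir" && npm ci && npx claudia update)
  fi
done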