Why is AWS Lambda not updating even though the GitLab CI/CD pipeline gets a green checkmark? - amazon-web-services

Following this tutorial, https://docs.gitlab.cn/14.0/ee/user/project/clusters/serverless/aws.html#serverless-framework
Created a function in AWS Lambda called create-promo-animation.
Created /src/handler.js:
"use strict";
module.exports.hello = async (event) => {
return {
statusCode: 200,
body: JSON.stringify(
{
message: "Your function executed successfully!",
},
null,
2
),
};
};
Created .gitlab-ci.yml:

stages:
  - deploy

production:
  stage: deploy
  before_script:
    - npm config set prefix /usr/local
    - npm install -g serverless
  script:
    - serverless deploy --stage production --verbose
  environment: production
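(A minimal diagnostic sketch, not part of the tutorial: adding a serverless info step after the deploy prints the deployed service, stage, region, and function names, which can then be compared against what the AWS console shows.)

  script:
    - serverless deploy --stage production --verbose
    # hypothetical extra step: print what was actually deployed
    - serverless info --stage production --verbose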
Created serverless.yml:

service: gitlab-example
provider:
  name: aws
  runtime: nodejs14.x
functions:
  create-promo-animation:
    handler: src/handler.hello
    events:
      - http: GET hello
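(For reference, a sketch of the expanded form that the shorthand http event above is equivalent to in the Serverless Framework:)

functions:
  create-promo-animation:
    handler: src/handler.hello
    events:
      - http:
          path: hello   # same as the shorthand "GET hello"
          method: get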
I pushed to GitLab and the pipeline ran fine (green checkmark), but the code is not updating in AWS. Why?

Related

Lambda works locally, but not when deployed

I'm using the Serverless Framework and my endpoint works locally with sls offline, but when I sls deploy it to AWS I get 502 Bad Gateway in Postman. If I go to the AWS Lambda console and run a test event to see what comes up, I get:
{
  "errorType": "Runtime.UserCodeSyntaxError",
  "errorMessage": "SyntaxError: Unexpected token '??='",
  "trace": [
    "Runtime.UserCodeSyntaxError: SyntaxError: Unexpected token '??='",
    "    at _loadUserApp (/var/runtime/UserFunction.js:222:13)",
    "    at Object.module.exports.load (/var/runtime/UserFunction.js:300:17)",
    "    at Object.<anonymous> (/var/runtime/index.js:43:34)",
    "    at Module._compile (internal/modules/cjs/loader.js:1085:14)",
    "    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1114:10)",
    "    at Module.load (internal/modules/cjs/loader.js:950:32)",
    "    at Function.Module._load (internal/modules/cjs/loader.js:790:12)",
    "    at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:75:12)",
    "    at internal/main/run_main_module.js:17:47"
  ]
}
There's nothing in my code that has '??=' in it, so as a process of elimination I made my endpoint super simple and returned a response instantly, i.e.:
router.get('/my-api', async (req, res: Response) => {
  return res.status(200).json({ "message": "success" });
});
but I still get the same error.
serverless.yaml
service: my-service

plugins:
  - serverless-webpack
  - serverless-offline

custom:
  env:
    default: Sandbox
    prod: Production
  AWS_REGION: us-east-1
  apigwBinary:
    types: # list of mime-types
      - '*/*'
  contentCompression: 14000
  webpack:
    webpackConfig: ./webpack.config.js
    packager: 'npm'
    includeModules:
      forceExclude:
        - aws-sdk
        - dotenv

provider:
  name: aws
  versionFunctions: false
  region: ${opt:region, 'us-east-1'}
  stage: ${opt:stage, 'staging'}
  environment:
    STAGE: ${self:provider.stage}

functions:
  myApi:
    handler: src/handler.service
    events:
      - http:
          path: route/my-api
          method: GET
          cors: true
handler.ts
import { APIGatewayProxyEvent, Context, ProxyResult } from 'aws-lambda';
import awsServerlessExpress from 'aws-serverless-express';
import app from './app';

const server = awsServerlessExpress.createServer(app);

export function service(event: APIGatewayProxyEvent, context: Context) {
  return awsServerlessExpress.proxy(server, event, context);
}
So, as a recap: it works locally, but not after sls deploy to AWS, whether I call it from Postman or run a Lambda test event in the AWS console. If it works locally, why isn't it working in AWS? I appreciate any help!

Serverless framework is ignoring CLI options

I'm trying to dynamically pass in options to be resolved when deploying my functions with serverless, but they're always null or hit the fallback.
custom:
  send_grid_api: ${opt:sendgridapi, 'missing'}
  SubscribedUsersTable:
    name: !Ref UsersSubscriptionTable
    arn: !GetAtt UsersSubscriptionTable.Arn
  bundle:
    linting: false

provider:
  name: aws
  lambdaHashingVersion: 20201221
  runtime: nodejs12.x
  memorySize: 256
  stage: ${opt:stage, 'dev'}
  region: us-west-2
  environment:
    STAGE: ${self:provider.stage}
    SEND_GRID_API_KEY: ${self:custom.send_grid_api}
I've also tried:
environment:
  STAGE: ${self:provider.stage}
  SEND_GRID_API_KEY: ${opt:sendgridapi, 'missing'}
Both yield 'missing', but why?

sls deploy --stage=prod --sendgridapi=xxx

It also fails if I try with a space instead of =.
Edit: Working Solution
In my GitHub Actions template, I defined the following:

- name: create env file
  run: |
    touch .env
    echo SEND_GRID_API_KEY=${{ secrets.SEND_GRID_KEY }} >> .env
    ls -la
    pwd
In addition, I explicitly set the working directory for this stage like so:
working-directory: /home/runner/work/myDir/myDir/
In my serverless.yml I added the following:
environment:
  SEND_GRID_API_KEY: ${env:SEND_GRID_API_KEY}
sls will read the contents of the file and load them properly.
opt is for serverless' CLI options. These are part of serverless, not your own code.
You can instead use...
provider:
  ...
  environment:
    ...
    SEND_GRID_API_KEY: ${env:SEND_GRID_API_KEY}
And pass the value as an environment variable in your deploy step.
- name: Deploy
  run: sls deploy --stage=prod
  env:
    SEND_GRID_API_KEY: "insert api key here"

localstack serverless deploy is not getting deployed

I am trying to deploy my serverless project locally with LocalStack and the serverless-localstack plugin. When I try to deploy it with serverless deploy, it throws an error and fails to create the CloudFormation stack. But I manage to create the same stack when I deploy the project to a real AWS environment. What is the possible issue here? I checked the answers to all the previous questions asked about similar issues; nothing seems to work.
docker-compose.yml
version: "3.8"
services:
localstack:
container_name: "serverless-localstack_main"
image: localstack/localstack
ports:
- "4566-4597:4566-4597"
environment:
- AWS_DEFAULT_REGION=eu-west-1
- EDGE_PORT=4566
- SERVICES=lambda,cloudformation,s3,sts,iam,apigateway,cloudwatch
volumes:
- "${TMPDIR:-/tmp/localstack}:/tmp/localstack"
- "/var/run/docker.sock:/var/run/docker.sock"
serverless.yml
service: serverless-localstack-test
frameworkVersion: '2'

plugins:
  - serverless-localstack

custom:
  localstack:
    debug: true
    host: http://localhost
    edgePort: 4566
    autostart: true
    lambda:
      mountCode: True
    stages:
      - local
    endpointFile: config.json

provider:
  name: aws
  runtime: nodejs12.x
  lambdaHashingVersion: 20201221
  stage: local
  region: eu-west-1
  deploymentBucket:
    name: deployment

functions:
  hello:
    handler: handler.hello
config.json (which has the endpoints):
{
  "CloudFormation": "http://localhost:4566",
  "CloudWatch": "http://localhost:4566",
  "Lambda": "http://localhost:4566",
  "S3": "http://localhost:4566"
}
Error in Localstack container
serverless-localstack_main | 2021-06-04T17:41:49:WARNING:localstack.utils.cloudformation.template_deployer: Error calling
<bound method ClientCreator._create_api_method.<locals>._api_call of
<botocore.client.Lambda object at 0x7f31f359a4c0>> with params: {'FunctionName':
'serverless-localstack-test-local-hello', 'Runtime': 'nodejs12.x', 'Role':
'arn:aws:iam::000000000000:role/serverless-localstack-test-local-eu-west-1-lambdaRole',
'Handler': 'handler.hello', 'Code': {'S3Bucket': '__local__', 'S3Key':
'/Users/charles/Documents/Practice/serverless-localstack-test'}, 'Timeout': 6,
'MemorySize': 1024} for resource: {'Type': 'AWS::Lambda::Function', 'Properties':
{'Code': {'S3Bucket': '__local__', 'S3Key':
'/Users/charles/Documents/Practice/serverless-localstack-test'}, 'Handler':
'handler.hello', 'Runtime': 'nodejs12.x', 'FunctionName': 'serverless-localstack-test-
local-hello', 'MemorySize': 1024, 'Timeout': 6, 'Role':
'arn:aws:iam::000000000000:role/serverless-localstack-test-local-eu-west-1-lambdaRole'},
'DependsOn': ['HelloLogGroup'], 'LogicalResourceId': 'HelloLambdaFunction',
'PhysicalResourceId': None, '_state_': {}}
I fixed that problem using this plugin: https://www.serverless.com/plugins/serverless-deployment-bucket
You need to make some adjustments in your files:
Update your docker-compose.yml; use the reference docker-compose from LocalStack (you can check it here).
Use a template that works correctly; the AWS docs page has several examples (you can check it here).
Run it with the following command:

aws cloudformation create-stack --endpoint-url http://localhost:4566 --stack-name samplestack --template-body file://lambda.yml --profile dev

You can also run LocalStack using Python with the following commands:

pip install localstack
localstack start
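A minimal sketch (an assumed layout, not from the original answer) of wiring that plugin into the serverless.yml above, so the deployment bucket named there is created before the stack deploys:

plugins:
  - serverless-localstack
  - serverless-deployment-bucket  # creates the bucket named below if it does not already exist

provider:
  deploymentBucket:
    name: deployment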

How to package executables in aws lambda function when using serverless framework?

I need to upload an executable file (i.e. wkhtmltopdf, to be exact) along with my function code to AWS Lambda. I'm using the Serverless Framework. I tried different ways, but the executable is not uploaded. The function works well when the code is zipped and uploaded via the AWS dashboard.
Given below is the directory structure of the function that needs to be uploaded:
node_modules
index.js
wkhtmltopdf
This is my serverless.yml
service: consult-payment-api
frameworkVersion: ">=1.1.0 <2.0.0"

package:
  individually: true

provider:
  name: aws
  region: us-west-2
  runtime: nodejs8.10
  stage: dev
  timeout: 300

functions:
  UserPackageCharge:
    handler: payment/module/chargePackage.create
    package:
      include:
        - packages/wkhtmltopdf
    events:
      - http:
          path: payment/module/package
          method: post
          cors:
            origin: '*'
            headers:
              - Content-Type
              - X-Amz-Date
              - Authorization
              - X-Api-Key
              - X-Amz-Security-Token
              - X-Amz-User-Agent
              - My-Custom-Header
This is my index.js (handler)
var wkhtmltopdf = require('wkhtmltopdf');
var MemoryStream = require('memorystream');

process.env['PATH'] = process.env['PATH'] + ':' + process.env['LAMBDA_TASK_ROOT'];

exports.handler = function(event, context) {
  var memStream = new MemoryStream();
  var html_utf8 = new Buffer(event.html_base64, 'base64').toString('utf8');
  wkhtmltopdf(html_utf8, event.options, function(code, signal) {
    context.done(null, { pdf_base64: memStream.read().toString('base64') });
  }).pipe(memStream);
};
But I still get the error 'Error: /bin/bash: wkhtmltopdf: command not found'.
How do I get this working with serverless?
I did get a version working.
Here's what I did:
1) Created a package.json and added:

  "dependencies": {
    "wkhtmltopdf": "^0.3.4",
    "memorystream": "^0.3.1"
  },

2) Ran npm install
3) Added wkhtmltopdf to the directory.
4) Added this in serverless.yml:

  package:
    include:
      - wkhtmltopdf

5) Added this in the lambda:

  var wkhtmltopdf = require('wkhtmltopdf');
  var MemoryStream = require('memorystream');

That's about it. Hope it helps.
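Putting those steps together, a minimal packaging sketch (an illustration assuming the binary sits at the service root next to index.js, as in the question's directory listing; the include path has to match wherever the file actually lives):

package:
  individually: true

functions:
  UserPackageCharge:
    handler: payment/module/chargePackage.create
    package:
      include:
        - wkhtmltopdf  # relative to the service root; adjust if the binary lives elsewhere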
Well, I can suggest an approach for Python, as that's what I've implemented recently in my project. I have all my Lambda scripts and dependent Python scripts in one zip and put those on my bastion server. To make them easier to execute and upload, I've implemented a cattle+click CLI, which ensures the correct versions of the zips are picked up; these are then uploaded to an S3 bucket location. When the Lambda is triggered by an S3 event, it looks for the required parameter file or the input file in the repository (which is nothing but an S3 bucket).

Create a getMethod using serverless framework

I use the Serverless Framework and AWS API Gateway.
I would like to create an API that I can hit from the terminal and that receives the parameter "name" and the parameter "type".
$ mkdir test-serverless
$ cd test-serverless
$ sls create --template aws-nodejs --name test
$ vi serverless.yml
$ cat serverless.yml
service: test

provider:
  name: aws
  runtime: nodejs6.10
  region: ap-northeast-1
functions:
testfunc:
handler: handler.func
events:
- http:
path: testpath
method: get
request:
querystrings:
name: true
type: true
headers:
Accept: application/json
$ sls deploy -v

With this command, the API was created successfully. However, none of the parameters were set, so I set them manually in the AWS console. I'm hoping there is a correct way to do this.
So, isn't it possible to eliminate that manual input with the Serverless Framework? If anyone knows how to write the yml so that the parameter settings are reflected in API Gateway, please let me know.
I want to easily hit the API with curl:

$ curl http://url.com/para?name=test&type=test
You need to add the correct indentation:

functions:
  testfunc:
    handler: handler.func
    events:
      - http:
          path: testpath
          method: get
          request:
            querystrings:
              name: true
              type: true
            headers:
              Accept: application/json
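Depending on the Serverless Framework version, the documented form nests these under request.parameters; a sketch (not from the original answer) in case the flatter form above is not picked up:

functions:
  testfunc:
    handler: handler.func
    events:
      - http:
          path: testpath
          method: get
          request:
            parameters:
              querystrings:
                name: true   # true marks the query string as required
                type: true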