Serverless offline in Golang

I'm trying to execute a Go Lambda locally with serverless-offline (SLS_DEBUG=* sls offline --stage dev --printOutput --useDocker),
but even though it runs, my Router function passed to lambda.Start doesn't seem to be called:
the "main working" log line is printed, but "Request info" never appears.
An error message also shows up in the serverless log.
Here is my main.go:
package main

import (
    "context"
    "database/sql"
    "net/http"
    "os"
    "saws/database"

    "github.com/aws/aws-lambda-go/events"
    "github.com/aws/aws-lambda-go/lambda"
    "github.com/joho/godotenv"
    "github.com/sirupsen/logrus"
)

type Response events.APIGatewayProxyResponse
type Request events.APIGatewayProxyRequest
type Ctx context.Context

func Router(ctx Ctx, req Request) (Response, error) {
    logrus.Error("Request info")
    logrus.Error(req)
    if req.Path == "/settings" {
        if req.HTTPMethod == "GET" {
            return HandleGetSettings(req, ctx)
        }
        if req.HTTPMethod == "POST" {
            return HandleSetSettings(req, ctx)
        }
    }
    return Response{
        StatusCode: http.StatusMethodNotAllowed,
        Body:       http.StatusText(http.StatusMethodNotAllowed),
    }, nil
}

func main() {
    logrus.Error("main working")
    lambda.Start(Router)
}
And here is my serverless.yml:
service: sss
frameworkVersion: '>=2.0.0'
variablesResolutionMode: 20210326
configValidationMode: error

provider:
  name: aws
  runtime: go1.x
  region: us-east-2
  profile: saws
  versionFunctions: false
  stage: ${opt:stage,'dev'}
  timeout: 30
  lambdaHashingVersion: 20201221
  apiGateway:
    shouldStartNameWithService: true
  apiKeys:
    - ${self:provider.stage}-${self:service}-apikey

custom:
  stages:
    - dev
    - prod

package:
  individually: true
  patterns:
    - '!./**'

functions:
  sync_saws:
    handler: bin/main
    package:
      patterns:
        - './bin/main'
    events:
      - http:
          path: /settings
          method: any
          integration: lambda

plugins:
  - serverless-stage-manager
  - serverless-offline
How can I solve this?

Related

Serverless middleware Error: Cannot find module 'notifications-dev-findLead'

After a lot of attempts I'm giving up and asking here.
This is my serverless.yml:
console: true
service: notifications
frameworkVersion: '3'
useDotenv: true
plugins:
  - serverless-offline
  - serverless-dotenv-plugin
  - serverless-plugin-typescript
  - serverless-middleware
custom:
  middleware:
    folderName: middleware # defaults to '.middleware'
    cleanFolder: false # defaults to 'true'
provider:
  name: aws
  runtime: nodejs14.x
  versionFunctions: false
  region: ${opt:region, 'sa-east-1'}
functions:
  findAllLeads:
    handler: src/server.findAllLeads
    events:
      - http:
          path: leads
          method: get
          cors: true # <-- CORS!
  findLead:
    handler: src/server.findLead
    middleware:
      pre:
        - middleware/middleware.myF
    events:
      - http:
          path: leads/{id}
          method: get
          cors:
            origin: "*"
  createLead:
    handler: src/server.createLead
    middleware:
      pre:
        - middleware/middleware.myF
    events:
      - http:
          path: leads
          method: post
          cors: true # <-- CORS!
and my project structure is something like:
middleware/
  middleware.ts
src/
  servers.ts  // with handlers
and my middleware.ts is
// export const middleware = (event: any) => {
//   console.log(event.body);
// };
export const myF = (event) => {
    // will do some logic if I make it work
    console.log(event.body);
};
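For what it's worth, the pre chain configured above can be pictured as plain function composition. The sketch below is not serverless-middleware's actual implementation, just a simplified model of the idea that each pre-handler runs, in order, before the main handler:

```javascript
// Simplified model of pre-middleware chaining (NOT serverless-middleware's real code):
// every pre-handler receives the same event/context before the main handler runs.
const withPre = (preHandlers, handler) => async (event, context) => {
    for (const pre of preHandlers) {
        await pre(event, context); // e.g. logging or validation, as in myF above
    }
    return handler(event, context);
};

// usage: a main handler wrapped with one logging pre-handler
const main = async (event) => ({ statusCode: 200, body: event.body });
const wrapped = withPre([(event) => console.log(event.body)], main);
```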
All I'm getting is
{"message": "Internal server error"}
as a response, along with the "Cannot find module" error in the logs.
Does anyone know how to fix this?

AWS Lambda function is returning response from a different Lambda function

I'm new to AWS and attempting to build an API for a basic course scheduling app. I am currently able to get the API running locally and able to invoke two functions. Function 1 is executing properly, but function 2 seems to be executing the code from function 1. Here is how I have my SAM app structured:
- sam-app
  | - events
  | - tests
  | - src
    | - api
      | - course
        | - AddCourse Lambda
    | app.js (Index Lambda, the default hello world sample function, mostly just used to check that the API is up)
The Index Lambda at app.js does a GET / and returns status code 200 and body with message "Hello World!" so long as the API is reachable.
The AddCourse Lambda is supposed to do the following via POST /courses:
try {
    console.log("Adding a new item...");
    await docClient.put(params).promise();
    response = {
        'statusCode': 200,
        'headers': {
            'Content-Type': "application/json"
        },
        'body': JSON.stringify({
            message: 'Successfully created item!'
        })
    }
} catch (err) {
    console.error(err);
    response = {
        'statusCode': 400,
        'headers': {
            'Content-Type': "application/json"
        },
        'body': JSON.stringify(err)
    }
}
Instead, it is returning status code 200 and body with message "Hello World!".
My template.yml seems to have the correct routes specified too:
Resources:
  Index:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src
      Handler: app.handler
      Runtime: nodejs14.x
      Policies: AmazonDynamoDBReadOnlyAccess
      PackageType: Image
      Events:
        GetEvent:
          Type: Api
          Properties:
            Path: /
            Method: get
    Metadata:
      DockerTag: nodejs14.x-v1
      DockerContext: ./src
      Dockerfile: Dockerfile
  AddCourse:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/api/course/Course-POST-CreateNewCourse
      Handler: index.lambdaHandler
      Runtime: nodejs14.x
      Policies: AmazonDynamoDBFullAccess
      PackageType: Image
      Events:
        GetEvent:
          Type: Api
          Properties:
            Path: /courses
            Method: post
    Metadata:
      DockerTag: nodejs14.x-v1
      DockerContext: ./src
      Dockerfile: Dockerfile
What could possibly be going on here? Is there something inherently wrong with how I structured my app?

503: Service Unavailable AWS Lambda with Github Probot

I'm trying to receive webhooks from GitHub with a Probot application, but every single time I try, I get a {"message":"Service Unavailable"} error.
GitHub sends this payload to an AWS Lambda function; from googling, I think this is an issue with not having enough capacity to handle the number of requests.
Either something is wrong with my code or there is an error in the configuration.
I'm using the Serverless Framework to deploy to AWS Lambda.
Here is the part where the code fails (no error messages in the logs, and the bot just quits):
const yamlFile = async (context) => {
    try {
        console.log("trying to get yaml");
        var yamlfile = await context.octokit.repos.getContent({
            owner: context.payload.repository.owner.login,
            repo: context.payload.repository.name,
            path: ".bit/config.yml",
        });
        console.log(yamlfile);
    } catch (e) {
        console.log("Error with getting content of yaml");
        console.log(e);
        return null;
    }
    console.log("got yaml, but no content yet");
    yamlfile = Buffer.from(yamlfile.data.content, 'base64').toString();
    console.log(yamlfile);
    let configyml;
    try {
        let fileContents = yamlfile;
        configyml = yaml.load(fileContents);
    } catch (e) {
        const issueBody = context.issue({
            title: "[ERROR] Please read",
            body: `There was an issue parsing the config file of this course. Please contact your counselor and send them the below error.\n${e}`,
        });
        await context.octokit.issues.create(issueBody);
        console.log("ERROR: " + e);
        return null;
    }
    console.log("returning configyml");
    return configyml;
}
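As an aside, the base64 decode step in the middle of yamlFile can be exercised in isolation with plain Node (no octokit involved), which helps rule it out when debugging:

```javascript
// GitHub's getContent returns file content base64-encoded;
// Buffer round-trips it back to UTF-8 text.
const encoded = Buffer.from("key: value\n", "utf8").toString("base64");
const decoded = Buffer.from(encoded, "base64").toString("utf8");
console.log(decoded); // prints "key: value"
```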
The function yamlFile() is being called in our main function here:
let currentStep = "";
let configData = await data.yamlFile(context);
console.log(configData);
if (configData == null) {
    console.log("null config data");
    return;
}
AWS Config
Timeout: 60 seconds
serverless.yml for Serverless framework:
service: <SERVICE NAME>
# app and org for use with dashboard.serverless.com
app: <APP NAME>
org: <ORG NAME>
frameworkVersion: "2"
useDotenv: true
provider:
  name: aws
  runtime: nodejs12.x
  lambdaHashingVersion: 20201221
  environment:
    APP_ID: ${param:APP_ID}
    PRIVATE_KEY: ${param:PRIVATE_KEY}
    WEBHOOK_SECRET: ${param:WEBHOOK_SECRET}
    NODE_ENV: production
    LOG_LEVEL: debug
  memorySize: 2048
functions:
  webhooks:
    handler: handler.webhooks
    memorySize: 2048
    events:
      - httpApi:
          path: /api/github/webhooks
          method: post

AWS SAM Deployment Configuration Issue

I have set up a simple serverless REST API using Node.js and AWS Lambda.
The deployment is done using AWS SAM.
Below is the SAM template:
AWSTemplateFormatVersion: '2010-09-09'
Transform: 'AWS::Serverless-2016-10-31'
Description: Serverless API With SAM
Resources:
  createUser:
    Type: AWS::Serverless::Function
    Properties:
      Handler: handler.create
      MemorySize: 128
      Runtime: nodejs12.x
      Timeout: 3
      Events:
        createUserApi:
          Type: Api
          Properties:
            Path: /users
            Method: post
  listUsers:
    Type: AWS::Serverless::Function
    Properties:
      Handler: handler.list
      MemorySize: 128
      Runtime: nodejs12.x
      Timeout: 3
      Events:
        listUserApi:
          Type: Api
          Properties:
            Path: /users
            Method: get
This works fine, but here is my observation about the stack it creates: it creates two AWS Lambda functions instead of one, and both list the same two API operations, createUser and listUsers.
Can it not contain only one Lambda function with these two handlers inside?
handler.js file:
const connectedToDb = require('./src/db/db.js');
const User = require('./src/model/user.js');

module.exports.create = async (event, context) => {
    console.log('create function called !!');
    try {
        console.log('before connecting to DB ');
        await connectedToDb();
        console.log('after connecting to DB ');
        const usr = new User(JSON.parse(event.body));
        console.log('saving a user now with ', event.body);
        await usr.save();
        return {
            statusCode: 200,
            body: JSON.stringify(usr)
        };
    } catch (err) {
        return {
            statusCode: err.statusCode || 500,
            body: 'Cannot Create Order'
        };
    }
};

module.exports.list = async (event, context) => {
    console.log('listing all users');
    try {
        await connectedToDb();
        const users = await User.find({});
        console.log('users are == ', users);
        return {
            statusCode: 200,
            body: JSON.stringify(users)
        };
    } catch (err) {
        return {
            statusCode: err || 500,
            body: JSON.stringify(err)
        };
    }
};

Getting Sequelize.js library to work on Amazon Lambda

So I'm trying to run a Lambda on Amazon and finally narrowed down the error by testing the Lambda in Amazon's testing console.
The error I got is this:
{
    "errorMessage": "Please install mysql2 package manually",
    "errorType": "Error",
    "stackTrace": [
        "new MysqlDialect (/var/task/node_modules/sequelize/lib/dialects/mysql/index.js:14:30)",
        "new Sequelize (/var/task/node_modules/sequelize/lib/sequelize.js:234:20)",
        "Object.exports.getSequelizeConnection (/var/task/src/twilio/twilio.js:858:20)",
        "Object.<anonymous> (/var/task/src/twilio/twilio.js:679:25)",
        "__webpack_require__ (/var/task/src/twilio/twilio.js:20:30)",
        "/var/task/src/twilio/twilio.js:63:18",
        "Object.<anonymous> (/var/task/src/twilio/twilio.js:66:10)",
        "Module._compile (module.js:570:32)",
        "Object.Module._extensions..js (module.js:579:10)",
        "Module.load (module.js:487:32)",
        "tryModuleLoad (module.js:446:12)",
        "Function.Module._load (module.js:438:3)",
        "Module.require (module.js:497:17)",
        "require (internal/module.js:20:19)"
    ]
}
Easy enough, so I have to install mysql2. So I added it to my package.json file.
{
    "name": "test-api",
    "version": "1.0.0",
    "description": "",
    "main": "handler.js",
    "scripts": {
        "test": "echo \"Error: no test specified\" && exit 0"
    },
    "keywords": [],
    "author": "",
    "license": "ISC",
    "devDependencies": {
        "aws-sdk": "^2.153.0",
        "babel-core": "^6.26.0",
        "babel-loader": "^7.1.2",
        "babel-plugin-transform-runtime": "^6.23.0",
        "babel-preset-es2015": "^6.24.1",
        "babel-preset-stage-3": "^6.24.1",
        "serverless-domain-manager": "^1.1.20",
        "serverless-dynamodb-autoscaling": "^0.6.2",
        "serverless-webpack": "^4.0.0",
        "webpack": "^3.8.1",
        "webpack-node-externals": "^1.6.0"
    },
    "dependencies": {
        "babel-runtime": "^6.26.0",
        "mailgun-js": "^0.13.1",
        "minimist": "^1.2.0",
        "mysql": "^2.15.0",
        "mysql2": "^1.5.1",
        "qs": "^6.5.1",
        "sequelize": "^4.31.2",
        "serverless": "^1.26.0",
        "serverless-plugin-scripts": "^1.0.2",
        "twilio": "^3.10.0",
        "uuid": "^3.1.0"
    }
}
I noticed when I do sls deploy however, it seems to only be packaging some of the modules?
Serverless: Package lock found - Using locked versions
Serverless: Packing external modules: babel-runtime#^6.26.0, twilio#^3.10.0, qs#^6.5.1, mailgun-js#^0.13.1, sequelize#^4.31.2, minimist#^1.2.0, uuid#^3.1.0
Serverless: Packaging service...
Serverless: Uploading CloudFormation file to S3...
Serverless: Uploading artifacts...
Serverless: Validating template...
Serverless: Updating Stack...
Serverless: Checking Stack update progress...
................................
Serverless: Stack update finished...
I think this is why it's not working. In short, how do I get mysql2 library to be packaged correctly with serverless so my lambda function will work with the sequelize library?
Please note that when I test locally my code works fine.
My serverless file is below
service: testapi

# Use serverless-webpack plugin to transpile ES6/ES7
plugins:
  - serverless-webpack
  - serverless-plugin-scripts
#  - serverless-domain-manager

custom:
  # Define the Stage or default to Staging.
  stage: ${opt:stage, self:provider.stage}
  webpackIncludeModules: true
  # Define Databases Here
  databaseName: "${self:service}-${self:custom.stage}"
  # Define Bucket Names Here
  uploadBucket: "${self:service}-uploads-${self:custom.stage}"
  # Custom Script setup
  scripts:
    hooks:
      # Script below will run schema changes to the database as necessary and update according to stage.
      'deploy:finalize': node database-schema-update.js --stage ${self:custom.stage}
  # Domain Setup
  # customDomain:
  #   basePath: "/"
  #   domainName: "api-${self:custom.stage}.test.com"
  #   stage: "${self:custom.stage}"
  #   certificateName: "*.test.com"
  #   createRoute53Record: true

provider:
  name: aws
  runtime: nodejs6.10
  stage: staging
  region: us-east-1
  environment:
    DOMAIN_NAME: "api-${self:custom.stage}.test.com"
    DATABASE_NAME: ${self:custom.databaseName}
    DATABASE_USERNAME: ${env:RDS_USERNAME}
    DATABASE_PASSWORD: ${env:RDS_PASSWORD}
    UPLOAD_BUCKET: ${self:custom.uploadBucket}
    TWILIO_ACCOUNT_SID: ""
    TWILIO_AUTH_TOKEN: ""
    USER_POOL_ID: ""
    APP_CLIENT_ID: ""
    REGION: "us-east-1"
    IDENTITY_POOL_ID: ""
    RACKSPACE_API_KEY: ""
  # Below controls permissions for lambda functions.
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:DescribeTable
        - dynamodb:UpdateTable
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
        - dynamodb:DeleteItem
      Resource: "arn:aws:dynamodb:us-east-1:*:*"

functions:
  create_visit:
    handler: src/visits/create.main
    events:
      - http:
          path: visits
          method: post
          cors: true
          authorizer: aws_iam
  get_visit:
    handler: src/visits/get.main
    events:
      - http:
          path: visits/{id}
          method: get
          cors: true
          authorizer: aws_iam
  list_visit:
    handler: src/visits/list.main
    events:
      - http:
          path: visits
          method: get
          cors: true
          authorizer: aws_iam
  update_visit:
    handler: src/visits/update.main
    events:
      - http:
          path: visits/{id}
          method: put
          cors: true
          authorizer: aws_iam
  delete_visit:
    handler: src/visits/delete.main
    events:
      - http:
          path: visits/{id}
          method: delete
          cors: true
          authorizer: aws_iam
  twilio_send_text_message:
    handler: src/twilio/twilio.send_text_message
    events:
      - http:
          path: twilio/sendtextmessage
          method: post
          cors: true
          authorizer: aws_iam
  # This function handles incoming calls and where to route it to.
  twilio_incoming_call:
    handler: src/twilio/twilio.incoming_calls
    events:
      - http:
          path: twilio/calls
          method: post
  twilio_failure:
    handler: src/twilio/twilio.twilio_failure
    events:
      - http:
          path: twilio/failure
          method: post
  twilio_statuschange:
    handler: src/twilio/twilio.statuschange
    events:
      - http:
          path: twilio/statuschange
          method: post
  twilio_incoming_message:
    handler: src/twilio/twilio.incoming_message
    events:
      - http:
          path: twilio/messages
          method: post
  twilio_whisper:
    handler: src/twilio/twilio.whisper
    events:
      - http:
          path: twilio/whisper
          method: post
      - http:
          path: twilio/whisper
          method: get
  twilio_start_call:
    handler: src/twilio/twilio.start_call
    events:
      - http:
          path: twilio/startcall
          method: post
      - http:
          path: twilio/startcall
          method: get

resources:
  Resources:
    uploadBucket:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: ${self:custom.uploadBucket}
    RDSDatabase:
      Type: AWS::RDS::DBInstance
      Properties:
        Engine: mysql
        MasterUsername: ${env:RDS_USERNAME}
        MasterUserPassword: ${env:RDS_PASSWORD}
        DBInstanceClass: db.t2.micro
        AllocatedStorage: '5'
        PubliclyAccessible: true
        # TODO: The Value of Stage is also available as a TAG automatically which I may use to replace this manually being put here.
        Tags:
          - Key: "Name"
            Value: ${self:custom.databaseName}
      DeletionPolicy: Snapshot
    DNSRecordSet:
      Type: AWS::Route53::RecordSet
      Properties:
        HostedZoneName: test.com.
        Name: database-${self:custom.stage}.test.com
        Type: CNAME
        TTL: '300'
        ResourceRecords:
          - {"Fn::GetAtt": ["RDSDatabase", "Endpoint.Address"]}
      DependsOn: RDSDatabase
UPDATE: I confirmed that running sls package --stage dev creates this in the zip folder that would eventually be uploaded to AWS. This confirms that serverless is, for some reason, not including the mysql2 module in the package. Why is this?
Webpack config file, as requested:
const slsw = require("serverless-webpack");
const nodeExternals = require("webpack-node-externals");

module.exports = {
    entry: slsw.lib.entries,
    target: "node",
    // Since 'aws-sdk' is not compatible with webpack,
    // we exclude all node dependencies
    externals: [nodeExternals()],
    // Run babel on all .js files and skip those in node_modules
    module: {
        rules: [
            {
                test: /\.js$/,
                loader: "babel-loader",
                include: __dirname,
                exclude: /node_modules/
            }
        ]
    }
};
Thanks to dashmug's comment, after some investigation on this page (https://github.com/serverless-heaven/serverless-webpack) I found a section on Forced Inclusion. I'll paraphrase it here.
Forced inclusion: Sometimes it might happen that you use dynamic requires in your code, i.e. you require modules that are only known at runtime. Webpack is not able to detect such externals and the compiled package will miss the needed dependencies. In such cases you can force the plugin to include certain modules by setting them in the forceInclude array property. However, the module must appear in your service's production dependencies in package.json.
# serverless.yml
custom:
  webpackIncludeModules:
    forceInclude:
      - module1
      - module2
So I simply did this...
webpackIncludeModules:
  forceInclude:
    - mysql
    - mysql2
Now it works! Hope this helps someone else with the same issue.
None of the previous suggestions helped me; I used this solution instead: https://github.com/sequelize/sequelize/issues/9489#issuecomment-493304014
The trick is to use the dialectModule property when constructing Sequelize.
import Sequelize from 'sequelize';
import mysql2 from 'mysql2'; // Needed to fix sequelize issues with WebPack

const sequelize = new Sequelize(
    process.env.DB_NAME,
    process.env.DB_USER,
    process.env.DB_PASSWORD,
    {
        dialect: 'mysql',
        dialectModule: mysql2, // Needed to fix sequelize issues with WebPack
        host: process.env.DB_HOST,
        port: process.env.DB_PORT
    }
);

export async function connectToDatabase() {
    console.log('Trying to connect via sequelize');
    await sequelize.sync();
    await sequelize.authenticate();
    console.log('=> Created a new connection.');
    // Do something
}
The above works for MySQL, but so far it is not working with Postgres.