I'm using the Serverless Framework and my endpoint works locally with sls offline, but when I sls deploy it to AWS I get a 502 Bad Gateway in Postman.
And if I go to the AWS Lambda console and run a test event to see what comes up, I get:
{
"errorType": "Runtime.UserCodeSyntaxError",
"errorMessage": "SyntaxError: Unexpected token '??='",
"trace": [
"Runtime.UserCodeSyntaxError: SyntaxError: Unexpected token '??='",
" at _loadUserApp (/var/runtime/UserFunction.js:222:13)",
" at Object.module.exports.load (/var/runtime/UserFunction.js:300:17)",
" at Object.<anonymous> (/var/runtime/index.js:43:34)",
" at Module._compile (internal/modules/cjs/loader.js:1085:14)",
" at Object.Module._extensions..js (internal/modules/cjs/loader.js:1114:10)",
" at Module.load (internal/modules/cjs/loader.js:950:32)",
" at Function.Module._load (internal/modules/cjs/loader.js:790:12)",
" at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:75:12)",
" at internal/main/run_main_module.js:17:47"
]
}
There's nothing in my code that contains ??= so, as a process of elimination, I made my endpoint super simple and returned a response instantly, i.e.
router.get('/my-api', async (req, res: Response) => {
return res.status(200).json({"message": "success"});
});
but I still get the same error.
serverless.yaml
service: my-service
plugins:
- serverless-webpack
- serverless-offline
custom:
env:
default: Sandbox
prod: Production
AWS_REGION: us-east-1
apigwBinary:
types: #list of mime-types
- '*/*'
contentCompression: 14000
webpack:
webpackConfig: ./webpack.config.js
packager: 'npm'
includeModules:
forceExclude:
- aws-sdk
- dotenv
provider:
name: aws
versionFunctions: false
region: ${opt:region, 'us-east-1'}
stage: ${opt:stage, 'staging'}
environment:
STAGE: ${self:provider.stage}
functions:
myApi:
handler: src/handler.service
events:
- http:
path: route/my-api
method: GET
cors: true
handler.ts
import { APIGatewayProxyEvent, Context, ProxyResult } from 'aws-lambda';
import awsServerlessExpress from 'aws-serverless-express';
import app from './app';
const server = awsServerlessExpress.createServer(app);
export function service(event: APIGatewayProxyEvent, context: Context) {
return awsServerlessExpress.proxy(server, event, context);
}
So, as a recap: it works locally but not after sls deploy to AWS, either in Postman or via a Lambda test event in the AWS console. If it works locally, why isn't it working in AWS? I appreciate any help!
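For anyone hitting the same thing: ??= (logical nullish assignment) is ES2021 syntax that V8 only ships unflagged from Node 15, so an older Lambda runtime throws Runtime.UserCodeSyntaxError while parsing the bundle, before any handler code runs. That is why the token can come from a bundled dependency rather than your own code, and why simplifying the route changes nothing; sls offline runs on whatever Node is installed locally, which is presumably new enough. Since the serverless.yaml above never sets a runtime, it falls back to the framework default, which at the time was an older Node. A hedged fix, assuming nodejs16.x suits the rest of the code, is to pin the runtime under provider:

provider:
  name: aws
  runtime: nodejs16.x   # ??= needs Node 15+; the default runtime predates it
  versionFunctions: false
  region: ${opt:region, 'us-east-1'}
  stage: ${opt:stage, 'staging'}

Alternatively, keep the existing runtime and have webpack/babel target that Node version so the modern syntax is transpiled away.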
Following this tutorial, https://docs.gitlab.cn/14.0/ee/user/project/clusters/serverless/aws.html#serverless-framework
Created a function in AWS Lambda called create-promo-animation
Created a /src/handler.js
"use strict";
module.exports.hello = async (event) => {
return {
statusCode: 200,
body: JSON.stringify(
{
message: "Your function executed successfully!",
},
null,
2
),
};
};
Created .gitlab-ci.yml
stages:
- deploy
production:
stage: deploy
before_script:
- npm config set prefix /usr/local
- npm install -g serverless
script:
- serverless deploy --stage production --verbose
environment: production
Created serverless.yml
service: gitlab-example
provider:
name: aws
runtime: nodejs14.x
functions:
create-promo-animation:
handler: src/handler.hello
events:
- http: GET hello
Pushed to GitLab, and the pipeline ran fine.
But the code is not updating in AWS. Why?
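One hedged explanation: serverless deploy doesn't touch functions you created by hand. It deploys through its own CloudFormation stack and names functions <service>-<stage>-<function>, so this pipeline creates and updates gitlab-example-production-create-promo-animation, while the manually created create-promo-animation function is never modified. A sketch of how to make the names line up (note that CloudFormation cannot adopt a pre-existing function of the same name, so the manually created one would typically have to be deleted first):

functions:
  create-promo-animation:
    name: create-promo-animation   # override the default <service>-<stage>-<function> name
    handler: src/handler.hello
    events:
      - http: GET hello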
I'm trying to receive webhooks from GitHub using a Probot application, but every single time I try this, I get a {"message":"Service Unavailable"} error.
GitHub sends this payload to an AWS Lambda function, and from googling I think this is an issue with not having enough nodes to handle the number of requests.
Either something is wrong with my code or there is an error in the configuration.
I'm using the Serverless Framework to upload to AWS Lambda.
Here is the part where the code fails (there are no error messages in the logs, and the bot just quits):
const yamlFile = async (context) => {
  let yamlfile; // hoisted so it is visible after the try block
  try {
    console.log("trying to get yaml");
    yamlfile = await context.octokit.repos.getContent({
      owner: context.payload.repository.owner.login,
      repo: context.payload.repository.name,
      path: ".bit/config.yml",
    });
    console.log(yamlfile);
  } catch (e) {
    console.log("Error with getting content of yaml");
    console.log(e);
    return null;
  }
  console.log("got yaml, but no content yet");
  // the contents API returns the file body base64-encoded
  yamlfile = Buffer.from(yamlfile.data.content, 'base64').toString();
  console.log(yamlfile);
  let configyml;
  try {
    configyml = yaml.load(yamlfile); // yaml = require('js-yaml')
  } catch (e) {
    const issueBody = context.issue({
      title: "[ERROR] Please read",
      body: `There was an issue parsing the config file of this course. Please contact your counselor and send them the below error.\n${e}`,
    });
    // await, otherwise the handler can return before this request completes
    await context.octokit.issues.create(issueBody);
    console.log("ERROR: " + e);
    return null;
  }
  console.log("returning configyml");
  return configyml;
}
The function yamlFile() is being called in our main function here:
let currentStep = ""
let configData = await data.yamlFile(context);
console.log(configData)
if (configData == null) {
console.log("null config data");
return
}
AWS Config
Timeout: 60 seconds
serverless.yml for Serverless framework:
service: <SERVICE NAME>
# app and org for use with dashboard.serverless.com
app: <APP NAME>
org: <ORG NAME>
frameworkVersion: "2"
useDotenv: true
provider:
name: aws
runtime: nodejs12.x
lambdaHashingVersion: 20201221
environment:
APP_ID: ${param:APP_ID}
PRIVATE_KEY: ${param:PRIVATE_KEY}
WEBHOOK_SECRET: ${param:WEBHOOK_SECRET}
NODE_ENV: production
LOG_LEVEL: debug
memorySize: 2048
functions:
webhooks:
handler: handler.webhooks
memorySize: 2048
events:
- httpApi:
path: /api/github/webhooks
method: post
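A hedged observation on the config above: API Gateway caps the integration timeout at roughly 29-30 seconds, and an httpApi route typically surfaces an integration timeout as a 503 {"message":"Service Unavailable"}, so the 60-second Lambda timeout cannot actually be used here. Any webhook delivery that takes longer than about 30 seconds to process will produce exactly this error even though the function itself keeps running. A sketch that keeps the function timeout under the gateway's cap (GitHub also expects webhook deliveries to be acknowledged quickly, so heavy work is better deferred to a separate invocation):

functions:
  webhooks:
    handler: handler.webhooks
    memorySize: 2048
    timeout: 25   # stay under API Gateway's ~30-second integration cap
    events:
      - httpApi:
          path: /api/github/webhooks
          method: post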
I am trying to deploy my serverless project locally with LocalStack and the serverless-localstack plugin. When I try to deploy it with serverless deploy, it throws an error and fails to create the CloudFormation stack. But I manage to create the same stack when I deploy the project to the real AWS environment. What is the possible issue here? I checked the answers in all the previous questions asked about similar issues; nothing seems to work.
docker-compose.yml
version: "3.8"
services:
localstack:
container_name: "serverless-localstack_main"
image: localstack/localstack
ports:
- "4566-4597:4566-4597"
environment:
- AWS_DEFAULT_REGION=eu-west-1
- EDGE_PORT=4566
- SERVICES=lambda,cloudformation,s3,sts,iam,apigateway,cloudwatch
volumes:
- "${TMPDIR:-/tmp/localstack}:/tmp/localstack"
- "/var/run/docker.sock:/var/run/docker.sock"
serverless.yml
service: serverless-localstack-test
frameworkVersion: '2'
plugins:
- serverless-localstack
custom:
localstack:
debug: true
host: http://localhost
edgePort: 4566
autostart: true
lambda:
mountCode: True
stages:
- local
endpointFile: config.json
provider:
name: aws
runtime: nodejs12.x
lambdaHashingVersion: 20201221
stage: local
region: eu-west-1
deploymentBucket:
name: deployment
functions:
hello:
handler: handler.hello
config.json (which has the endpoints)
{
"CloudFormation": "http://localhost:4566",
"CloudWatch": "http://localhost:4566",
"Lambda": "http://localhost:4566",
"S3": "http://localhost:4566"
}
Error in Localstack container
serverless-localstack_main | 2021-06-04T17:41:49:WARNING:localstack.utils.cloudformation.template_deployer: Error calling
<bound method ClientCreator._create_api_method.<locals>._api_call of
<botocore.client.Lambda object at 0x7f31f359a4c0>> with params: {'FunctionName':
'serverless-localstack-test-local-hello', 'Runtime': 'nodejs12.x', 'Role':
'arn:aws:iam::000000000000:role/serverless-localstack-test-local-eu-west-1-lambdaRole',
'Handler': 'handler.hello', 'Code': {'S3Bucket': '__local__', 'S3Key':
'/Users/charles/Documents/Practice/serverless-localstack-test'}, 'Timeout': 6,
'MemorySize': 1024} for resource: {'Type': 'AWS::Lambda::Function', 'Properties':
{'Code': {'S3Bucket': '__local__', 'S3Key':
'/Users/charles/Documents/Practice/serverless-localstack-test'}, 'Handler':
'handler.hello', 'Runtime': 'nodejs12.x', 'FunctionName': 'serverless-localstack-test-
local-hello', 'MemorySize': 1024, 'Timeout': 6, 'Role':
'arn:aws:iam::000000000000:role/serverless-localstack-test-local-eu-west-1-lambdaRole'},
'DependsOn': ['HelloLogGroup'], 'LogicalResourceId': 'HelloLambdaFunction',
'PhysicalResourceId': None, '_state_': {}}
I fixed that problem using this plugin: https://www.serverless.com/plugins/serverless-deployment-bucket
You need to make some adjustments in your files.
Update your docker-compose.yml: use the reference docker-compose from localstack (you can check it here).
Use a template that works correctly; the AWS docs page has several examples (you can check it here).
Run it with the next command: aws cloudformation create-stack --endpoint-url http://localhost:4566 --stack-name samplestack --template-body file://lambda.yml --profile dev
You can also run localstack using Python with the following commands:
pip install localstack
localstack start
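To tie that plugin back to the serverless.yml above: one hedged reading of why it helps is that plain serverless deploy assumes a custom-named deploymentBucket already exists, and in a fresh LocalStack container the bucket called deployment does not, so stack creation fails; the plugin creates the bucket on demand before deploying. A sketch of the change (the ordering alongside serverless-localstack is an assumption):

plugins:
  - serverless-localstack
  - serverless-deployment-bucket   # creates the named deploymentBucket if it is missing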
I have been trying to deploy a backend API service made with GraphQL and Express to Amazon Web Services. This is my folder structure:
graphql-api
- src
  - index.js
- serverless.yml
- index.js (this only contains an import statement of the src folder)
The serverless.yml looks like this
service: graphql-api
provider:
name: aws
runtime: nodejs12.x
lambdaHashingVersion: 20201221
functions:
api:
handler: src/index.handler
events:
- http:
path: graphql
method: ANY
cors: true
And I have exported this handler in my src/index.js file
const awsServerlessExpress = require('aws-serverless-express');
const app = require('./app');
const server = awsServerlessExpress.createServer(app);
exports.handler = (event, context) => awsServerlessExpress.proxy(server, event, context);
But when I run serverless deploy and hit the endpoint URL, https://nir4749aal.execute-api.us-east-1.amazonaws.com/dev/graphql,
I get a message saying "internal server error". I can't figure out what I am doing wrong. This is my first time trying to deploy to AWS. Any help or suggestions would be appreciated; thanks in advance.
-----UPDATE--------
These are the logs from Lambda:
{
"errorType": "Runtime.UserCodeSyntaxError",
"errorMessage": "SyntaxError: Cannot use import statement outside a module",
"stack": [
"Runtime.UserCodeSyntaxError: SyntaxError: Cannot use import statement outside a module",
" at _loadUserApp (/var/runtime/UserFunction.js:98:13)",
" at Object.module.exports.load (/var/runtime/UserFunction.js:140:17)",
" at Object.<anonymous> (/var/runtime/index.js:43:30)",
" at Module._compile (internal/modules/cjs/loader.js:1015:30)",
" at Object.Module._extensions..js (internal/modules/cjs/loader.js:1035:10)",
" at Module.load (internal/modules/cjs/loader.js:879:32)",
" at Function.Module._load (internal/modules/cjs/loader.js:724:14)",
" at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:60:12)",
" at internal/main/run_main_module.js:17:47"
]
}
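The stack trace means the artifact is being loaded as CommonJS, and the nodejs12.x runtime cannot parse an ESM import statement there. The likely culprit is a file in the deployed package that still uses import syntax, such as the root index.js that "only contains an import statement of the src folder". A minimal sketch of a CommonJS version of that root file (assuming it is meant to re-export src/index.js); the alternative is to transpile or bundle the code so no import statements survive into the artifact:

// index.js (project root) - use require instead of import,
// since the nodejs12.x runtime loads this file as CommonJS
module.exports = require('./src/index');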
So I'm trying to run a Lambda on Amazon, and I finally narrowed down the error by testing the Lambda in Amazon's testing console.
The error I got is this:
{
"errorMessage": "Please install mysql2 package manually",
"errorType": "Error",
"stackTrace": [
"new MysqlDialect (/var/task/node_modules/sequelize/lib/dialects/mysql/index.js:14:30)",
"new Sequelize (/var/task/node_modules/sequelize/lib/sequelize.js:234:20)",
"Object.exports.getSequelizeConnection (/var/task/src/twilio/twilio.js:858:20)",
"Object.<anonymous> (/var/task/src/twilio/twilio.js:679:25)",
"__webpack_require__ (/var/task/src/twilio/twilio.js:20:30)",
"/var/task/src/twilio/twilio.js:63:18",
"Object.<anonymous> (/var/task/src/twilio/twilio.js:66:10)",
"Module._compile (module.js:570:32)",
"Object.Module._extensions..js (module.js:579:10)",
"Module.load (module.js:487:32)",
"tryModuleLoad (module.js:446:12)",
"Function.Module._load (module.js:438:3)",
"Module.require (module.js:497:17)",
"require (internal/module.js:20:19)"
]
}
Easy enough: I have to install mysql2. So I added it to my package.json file.
{
"name": "test-api",
"version": "1.0.0",
"description": "",
"main": "handler.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 0"
},
"keywords": [],
"author": "",
"license": "ISC",
"devDependencies": {
"aws-sdk": "^2.153.0",
"babel-core": "^6.26.0",
"babel-loader": "^7.1.2",
"babel-plugin-transform-runtime": "^6.23.0",
"babel-preset-es2015": "^6.24.1",
"babel-preset-stage-3": "^6.24.1",
"serverless-domain-manager": "^1.1.20",
"serverless-dynamodb-autoscaling": "^0.6.2",
"serverless-webpack": "^4.0.0",
"webpack": "^3.8.1",
"webpack-node-externals": "^1.6.0"
},
"dependencies": {
"babel-runtime": "^6.26.0",
"mailgun-js": "^0.13.1",
"minimist": "^1.2.0",
"mysql": "^2.15.0",
"mysql2": "^1.5.1",
"qs": "^6.5.1",
"sequelize": "^4.31.2",
"serverless": "^1.26.0",
"serverless-plugin-scripts": "^1.0.2",
"twilio": "^3.10.0",
"uuid": "^3.1.0"
}
}
However, I noticed that when I do sls deploy, it seems to package only some of the modules:
Serverless: Package lock found - Using locked versions
Serverless: Packing external modules: babel-runtime#^6.26.0, twilio#^3.10.0, qs#^6.5.1, mailgun-js#^0.13.1, sequelize#^4.31.2, minimist#^1.2.0, uuid#^3.1.0
Serverless: Packaging service...
Serverless: Uploading CloudFormation file to S3...
Serverless: Uploading artifacts...
Serverless: Validating template...
Serverless: Updating Stack...
Serverless: Checking Stack update progress...
................................
Serverless: Stack update finished...
I think this is why it's not working. In short, how do I get the mysql2 library packaged correctly with serverless so my Lambda function will work with the sequelize library?
Please note that when I test locally, my code works fine.
My serverless file is below
service: testapi
# Use serverless-webpack plugin to transpile ES6/ES7
plugins:
- serverless-webpack
- serverless-plugin-scripts
# - serverless-domain-manager
custom:
#Define the Stage or default to Staging.
stage: ${opt:stage, self:provider.stage}
webpackIncludeModules: true
#Define Databases Here
databaseName: "${self:service}-${self:custom.stage}"
#Define Bucket Names Here
uploadBucket: "${self:service}-uploads-${self:custom.stage}"
#Custom Script setup
scripts:
hooks:
#Script below will run schema changes to the database as neccesary and update according to stage.
'deploy:finalize': node database-schema-update.js --stage ${self:custom.stage}
#Domain Setup
# customDomain:
# basePath: "/"
# domainName: "api-${self:custom.stage}.test.com"
# stage: "${self:custom.stage}"
# certificateName: "*.test.com"
# createRoute53Record: true
provider:
name: aws
runtime: nodejs6.10
stage: staging
region: us-east-1
environment:
DOMAIN_NAME: "api-${self:custom.stage}.test.com"
DATABASE_NAME: ${self:custom.databaseName}
DATABASE_USERNAME: ${env:RDS_USERNAME}
DATABASE_PASSWORD: ${env:RDS_PASSWORD}
UPLOAD_BUCKET: ${self:custom.uploadBucket}
TWILIO_ACCOUNT_SID: ""
TWILIO_AUTH_TOKEN: ""
USER_POOL_ID: ""
APP_CLIENT_ID: ""
REGION: "us-east-1"
IDENTITY_POOL_ID: ""
RACKSPACE_API_KEY: ""
#Below controls permissions for lambda functions.
iamRoleStatements:
- Effect: Allow
Action:
- dynamodb:DescribeTable
- dynamodb:UpdateTable
- dynamodb:Query
- dynamodb:Scan
- dynamodb:GetItem
- dynamodb:PutItem
- dynamodb:UpdateItem
- dynamodb:DeleteItem
Resource: "arn:aws:dynamodb:us-east-1:*:*"
functions:
create_visit:
handler: src/visits/create.main
events:
- http:
path: visits
method: post
cors: true
authorizer: aws_iam
get_visit:
handler: src/visits/get.main
events:
- http:
path: visits/{id}
method: get
cors: true
authorizer: aws_iam
list_visit:
handler: src/visits/list.main
events:
- http:
path: visits
method: get
cors: true
authorizer: aws_iam
update_visit:
handler: src/visits/update.main
events:
- http:
path: visits/{id}
method: put
cors: true
authorizer: aws_iam
delete_visit:
handler: src/visits/delete.main
events:
- http:
path: visits/{id}
method: delete
cors: true
authorizer: aws_iam
twilio_send_text_message:
handler: src/twilio/twilio.send_text_message
events:
- http:
path: twilio/sendtextmessage
method: post
cors: true
authorizer: aws_iam
#This function handles incoming calls and where to route it to.
twilio_incoming_call:
handler: src/twilio/twilio.incoming_calls
events:
- http:
path: twilio/calls
method: post
twilio_failure:
handler: src/twilio/twilio.twilio_failure
events:
- http:
path: twilio/failure
method: post
twilio_statuschange:
handler: src/twilio/twilio.statuschange
events:
- http:
path: twilio/statuschange
method: post
twilio_incoming_message:
handler: src/twilio/twilio.incoming_message
events:
- http:
path: twilio/messages
method: post
twilio_whisper:
handler: src/twilio/twilio.whisper
events:
- http:
path: twilio/whisper
method: post
- http:
path: twilio/whisper
method: get
twilio_start_call:
handler: src/twilio/twilio.start_call
events:
- http:
path: twilio/startcall
method: post
- http:
path: twilio/startcall
method: get
resources:
Resources:
uploadBucket:
Type: AWS::S3::Bucket
Properties:
BucketName: ${self:custom.uploadBucket}
RDSDatabase:
Type: AWS::RDS::DBInstance
Properties:
Engine : mysql
MasterUsername: ${env:RDS_USERNAME}
MasterUserPassword: ${env:RDS_PASSWORD}
DBInstanceClass : db.t2.micro
AllocatedStorage: '5'
PubliclyAccessible: true
#TODO: The Value of Stage is also available as a TAG automatically which I may use to replace this manually being put here..
Tags:
-
Key: "Name"
Value: ${self:custom.databaseName}
DeletionPolicy: Snapshot
DNSRecordSet:
Type: AWS::Route53::RecordSet
Properties:
HostedZoneName: test.com.
Name: database-${self:custom.stage}.test.com
Type: CNAME
TTL: '300'
ResourceRecords:
- {"Fn::GetAtt": ["RDSDatabase","Endpoint.Address"]}
DependsOn: RDSDatabase
UPDATE: I confirmed that running sls package --stage dev produces the zip (the artifact that would eventually be uploaded to AWS) without mysql2 in it. This confirms that serverless is not creating the package correctly with the mysql2 reference for some reason. Why is this?
Webpack config file, as requested:
const slsw = require("serverless-webpack");
const nodeExternals = require("webpack-node-externals");
module.exports = {
entry: slsw.lib.entries,
target: "node",
// Since 'aws-sdk' is not compatible with webpack,
// we exclude all node dependencies
externals: [nodeExternals()],
// Run babel on all .js files and skip those in node_modules
module: {
rules: [
{
test: /\.js$/,
loader: "babel-loader",
include: __dirname,
exclude: /node_modules/
}
]
}
};
Thanks to dashmug's comment, and after some investigation on this page (https://github.com/serverless-heaven/serverless-webpack), I found there is a section on Forced Inclusion. I'll paraphrase it here.
Forced inclusion: Sometimes it might happen that you use dynamic requires in your code, i.e. you require modules that are only known at runtime. Webpack is not able to detect such externals and the compiled package will miss the needed dependencies. In such cases you can force the plugin to include certain modules by setting them in the forceInclude array property. However, the module must appear in your service's production dependencies in package.json.
# serverless.yml
custom:
webpackIncludeModules:
forceInclude:
- module1
- module2
So I simply did this...
webpackIncludeModules:
forceInclude:
- mysql
- mysql2
Now it works! Hope this helps someone else with the same issue.
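For context on why forceInclude is needed at all: the externals: [nodeExternals()] line in the webpack config above marks every node_modules package as external, and serverless-webpack then re-packages only the externals it can detect statically in the code. Sequelize pulls in mysql2 with a dynamic require, so it never shows up in the "Packing external modules" list. A hedged alternative sketch is to whitelist mysql2 so webpack bundles it into the artifact instead (the option is called whitelist in webpack-node-externals 1.x, the version pinned in the package.json above; later major versions renamed it to allowlist):

// webpack.config.js - let webpack bundle mysql2 rather than leaving it external
externals: [nodeExternals({ whitelist: ["mysql2"] })],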
None of the previous answers helped me; I used this solution: https://github.com/sequelize/sequelize/issues/9489#issuecomment-493304014
The trick is to use the dialectModule property when constructing Sequelize, overriding how it loads the driver.
import Sequelize from 'sequelize';
import mysql2 from 'mysql2'; // Needed to fix sequelize issues with WebPack
const sequelize = new Sequelize(
process.env.DB_NAME,
process.env.DB_USER,
process.env.DB_PASSWORD,
{
dialect: 'mysql',
dialectModule: mysql2, // Needed to fix sequelize issues with WebPack
host: process.env.DB_HOST,
port: process.env.DB_PORT
}
)
export async function connectToDatabase() {
console.log('Trying to connect via sequelize')
await sequelize.sync()
await sequelize.authenticate()
console.log('=> Created a new connection.')
// Do something
}
So far this works with MySQL, but it is not working with Postgres.
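For anyone attempting the Postgres equivalent, the analogous approach, sketched below, would be to hand the pg driver to Sequelize through the same dialectModule property (this assumes pg, and the pg-hstore package Sequelize's Postgres dialect also needs, are listed in production dependencies and force-included like mysql2 was). As noted above, this did not work for the previous poster, so treat it as a starting point rather than a confirmed fix.

import Sequelize from 'sequelize';
import pg from 'pg'; // static import so webpack detects and bundles the driver

const sequelize = new Sequelize(
  process.env.DB_NAME,
  process.env.DB_USER,
  process.env.DB_PASSWORD,
  {
    dialect: 'postgres',
    dialectModule: pg, // same dialectModule trick as the MySQL example above
    host: process.env.DB_HOST,
    port: process.env.DB_PORT
  }
);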