I am using Symfony with FOSUserBundle, and my login fails at "login_check". I have tried rewriting my security.yaml:
security:
    access_denied_url: /login
    encoders:
        FOS\UserBundle\Model\UserInterface: bcrypt
    role_hierarchy:
        ROLE_ADMIN: ROLE_USER
        ROLE_SUPER_ADMIN: ROLE_ADMIN
        ROLE_BROKER: ROLE_BROKER
    providers:
        fos_userbundle:
            id: fos_user.user_provider.username
    firewalls:
        main:
            pattern: ^/
            form_login:
                provider: fos_userbundle
                csrf_token_generator: security.csrf.token_manager
                login_path: /login
                check_path: /login_check
                always_use_default_target_path: true
                default_target_path: /dashboard
                use_referer: true
            logout: true
            anonymous: true
    access_control:
        - { path: ^/login$, role: IS_AUTHENTICATED_ANONYMOUSLY }
        - { path: ^/apply, role: IS_AUTHENTICATED_ANONYMOUSLY }
        - { path: ^/, role: ROLE_USER }
So I'm trying to run a Lambda function on AWS, and I finally narrowed down the error by testing the Lambda in the AWS testing console.
The error I got is this:
{
  "errorMessage": "Please install mysql2 package manually",
  "errorType": "Error",
  "stackTrace": [
    "new MysqlDialect (/var/task/node_modules/sequelize/lib/dialects/mysql/index.js:14:30)",
    "new Sequelize (/var/task/node_modules/sequelize/lib/sequelize.js:234:20)",
    "Object.exports.getSequelizeConnection (/var/task/src/twilio/twilio.js:858:20)",
    "Object.<anonymous> (/var/task/src/twilio/twilio.js:679:25)",
    "__webpack_require__ (/var/task/src/twilio/twilio.js:20:30)",
    "/var/task/src/twilio/twilio.js:63:18",
    "Object.<anonymous> (/var/task/src/twilio/twilio.js:66:10)",
    "Module._compile (module.js:570:32)",
    "Object.Module._extensions..js (module.js:579:10)",
    "Module.load (module.js:487:32)",
    "tryModuleLoad (module.js:446:12)",
    "Function.Module._load (module.js:438:3)",
    "Module.require (module.js:497:17)",
    "require (internal/module.js:20:19)"
  ]
}
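For context, the stack trace points at a Sequelize constructor call inside getSequelizeConnection. A hypothetical sketch of what that helper likely does (this is not the actual twilio.js source, and the host variable is illustrative) would be:

// Hypothetical reconstruction of getSequelizeConnection, not the actual twilio.js source.
// Passing dialect: 'mysql' makes Sequelize require('mysql2') at runtime, which is why the
// deployed bundle must contain that package.
const Sequelize = require('sequelize');

exports.getSequelizeConnection = () =>
  new Sequelize(
    process.env.DATABASE_NAME,
    process.env.DATABASE_USERNAME,
    process.env.DATABASE_PASSWORD,
    {
      dialect: 'mysql',
      host: process.env.DATABASE_HOST // illustrative; the real host setting is not shown in the post
    }
  );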
Easy enough: I have to install mysql2. So I added it to my package.json file:
{
  "name": "test-api",
  "version": "1.0.0",
  "description": "",
  "main": "handler.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 0"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "devDependencies": {
    "aws-sdk": "^2.153.0",
    "babel-core": "^6.26.0",
    "babel-loader": "^7.1.2",
    "babel-plugin-transform-runtime": "^6.23.0",
    "babel-preset-es2015": "^6.24.1",
    "babel-preset-stage-3": "^6.24.1",
    "serverless-domain-manager": "^1.1.20",
    "serverless-dynamodb-autoscaling": "^0.6.2",
    "serverless-webpack": "^4.0.0",
    "webpack": "^3.8.1",
    "webpack-node-externals": "^1.6.0"
  },
  "dependencies": {
    "babel-runtime": "^6.26.0",
    "mailgun-js": "^0.13.1",
    "minimist": "^1.2.0",
    "mysql": "^2.15.0",
    "mysql2": "^1.5.1",
    "qs": "^6.5.1",
    "sequelize": "^4.31.2",
    "serverless": "^1.26.0",
    "serverless-plugin-scripts": "^1.0.2",
    "twilio": "^3.10.0",
    "uuid": "^3.1.0"
  }
}
However, I noticed that when I run sls deploy, it seems to package only some of the modules:
Serverless: Package lock found - Using locked versions
Serverless: Packing external modules: babel-runtime#^6.26.0, twilio#^3.10.0, qs#^6.5.1, mailgun-js#^0.13.1, sequelize#^4.31.2, minimist#^1.2.0, uuid#^3.1.0
Serverless: Packaging service...
Serverless: Uploading CloudFormation file to S3...
Serverless: Uploading artifacts...
Serverless: Validating template...
Serverless: Updating Stack...
Serverless: Checking Stack update progress...
................................
Serverless: Stack update finished...
I think this is why it's not working. In short, how do I get the mysql2 library packaged correctly with Serverless so that my Lambda function will work with the Sequelize library?
Please note that my code works fine when I test locally.
My serverless.yml file is below:
service: testapi

# Use serverless-webpack plugin to transpile ES6/ES7
plugins:
  - serverless-webpack
  - serverless-plugin-scripts
  # - serverless-domain-manager

custom:
  # Define the Stage or default to Staging.
  stage: ${opt:stage, self:provider.stage}
  webpackIncludeModules: true
  # Define Databases Here
  databaseName: "${self:service}-${self:custom.stage}"
  # Define Bucket Names Here
  uploadBucket: "${self:service}-uploads-${self:custom.stage}"
  # Custom Script setup
  scripts:
    hooks:
      # The script below will run schema changes to the database as necessary and update according to stage.
      'deploy:finalize': node database-schema-update.js --stage ${self:custom.stage}
  # Domain Setup
  # customDomain:
  #   basePath: "/"
  #   domainName: "api-${self:custom.stage}.test.com"
  #   stage: "${self:custom.stage}"
  #   certificateName: "*.test.com"
  #   createRoute53Record: true

provider:
  name: aws
  runtime: nodejs6.10
  stage: staging
  region: us-east-1
  environment:
    DOMAIN_NAME: "api-${self:custom.stage}.test.com"
    DATABASE_NAME: ${self:custom.databaseName}
    DATABASE_USERNAME: ${env:RDS_USERNAME}
    DATABASE_PASSWORD: ${env:RDS_PASSWORD}
    UPLOAD_BUCKET: ${self:custom.uploadBucket}
    TWILIO_ACCOUNT_SID: ""
    TWILIO_AUTH_TOKEN: ""
    USER_POOL_ID: ""
    APP_CLIENT_ID: ""
    REGION: "us-east-1"
    IDENTITY_POOL_ID: ""
    RACKSPACE_API_KEY: ""
  # The block below controls permissions for the Lambda functions.
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:DescribeTable
        - dynamodb:UpdateTable
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
        - dynamodb:DeleteItem
      Resource: "arn:aws:dynamodb:us-east-1:*:*"

functions:
  create_visit:
    handler: src/visits/create.main
    events:
      - http:
          path: visits
          method: post
          cors: true
          authorizer: aws_iam
  get_visit:
    handler: src/visits/get.main
    events:
      - http:
          path: visits/{id}
          method: get
          cors: true
          authorizer: aws_iam
  list_visit:
    handler: src/visits/list.main
    events:
      - http:
          path: visits
          method: get
          cors: true
          authorizer: aws_iam
  update_visit:
    handler: src/visits/update.main
    events:
      - http:
          path: visits/{id}
          method: put
          cors: true
          authorizer: aws_iam
  delete_visit:
    handler: src/visits/delete.main
    events:
      - http:
          path: visits/{id}
          method: delete
          cors: true
          authorizer: aws_iam
  twilio_send_text_message:
    handler: src/twilio/twilio.send_text_message
    events:
      - http:
          path: twilio/sendtextmessage
          method: post
          cors: true
          authorizer: aws_iam
  # This function handles incoming calls and where to route them to.
  twilio_incoming_call:
    handler: src/twilio/twilio.incoming_calls
    events:
      - http:
          path: twilio/calls
          method: post
  twilio_failure:
    handler: src/twilio/twilio.twilio_failure
    events:
      - http:
          path: twilio/failure
          method: post
  twilio_statuschange:
    handler: src/twilio/twilio.statuschange
    events:
      - http:
          path: twilio/statuschange
          method: post
  twilio_incoming_message:
    handler: src/twilio/twilio.incoming_message
    events:
      - http:
          path: twilio/messages
          method: post
  twilio_whisper:
    handler: src/twilio/twilio.whisper
    events:
      - http:
          path: twilio/whisper
          method: post
      - http:
          path: twilio/whisper
          method: get
  twilio_start_call:
    handler: src/twilio/twilio.start_call
    events:
      - http:
          path: twilio/startcall
          method: post
      - http:
          path: twilio/startcall
          method: get

resources:
  Resources:
    uploadBucket:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: ${self:custom.uploadBucket}
    RDSDatabase:
      Type: AWS::RDS::DBInstance
      Properties:
        Engine: mysql
        MasterUsername: ${env:RDS_USERNAME}
        MasterUserPassword: ${env:RDS_PASSWORD}
        DBInstanceClass: db.t2.micro
        AllocatedStorage: '5'
        PubliclyAccessible: true
        # TODO: The value of Stage is also available as a tag automatically, which I may use instead of putting it here manually.
        Tags:
          - Key: "Name"
            Value: ${self:custom.databaseName}
      DeletionPolicy: Snapshot
    DNSRecordSet:
      Type: AWS::Route53::RecordSet
      Properties:
        HostedZoneName: test.com.
        Name: database-${self:custom.stage}.test.com
        Type: CNAME
        TTL: '300'
        ResourceRecords:
          - {"Fn::GetAtt": ["RDSDatabase", "Endpoint.Address"]}
      DependsOn: RDSDatabase
UPDATE: I confirmed that running sls package --stage dev creates the same thing in the zip folder that would eventually be uploaded to AWS. This confirms that, for some reason, Serverless is not building the package with the mysql2 reference. Why is this?
The webpack config file, as requested:
const slsw = require("serverless-webpack");
const nodeExternals = require("webpack-node-externals");

module.exports = {
  entry: slsw.lib.entries,
  target: "node",
  // Since 'aws-sdk' is not compatible with webpack,
  // we exclude all node dependencies
  externals: [nodeExternals()],
  // Run babel on all .js files and skip those in node_modules
  module: {
    rules: [
      {
        test: /\.js$/,
        loader: "babel-loader",
        include: __dirname,
        exclude: /node_modules/
      }
    ]
  }
};
Thanks to dashmug's comment and some investigation on this page (https://github.com/serverless-heaven/serverless-webpack), I found that there is a section on Forced Inclusion. I'll paraphrase it here:
Forced inclusion: Sometimes it might happen that you use dynamic requires in your code, i.e. you require modules that are only known at runtime. Webpack is not able to detect such externals and the compiled package will miss the needed dependencies. In such cases you can force the plugin to include certain modules by setting them in the forceInclude array property. However, the module must appear in your service's production dependencies in package.json.
# serverless.yml
custom:
  webpackIncludeModules:
    forceInclude:
      - module1
      - module2
That is exactly what is happening here: webpack-node-externals keeps node_modules out of the bundle, and serverless-webpack re-packages only the externals it can detect statically (the list shown in the sls deploy output above). Because Sequelize requires mysql2 dynamically, it never makes that list. So I simply did this...
webpackIncludeModules:
  forceInclude:
    - mysql
    - mysql2
Now it works! Hope this helps someone else with the same issue.
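A note for anyone on a newer plugin version: in serverless-webpack v5 and later this option appears to have moved, so the equivalent setting is custom.webpack.includeModules.forceInclude rather than custom.webpackIncludeModules.forceInclude.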
None of the previous answers helped me, so I used this solution instead: https://github.com/sequelize/sequelize/issues/9489#issuecomment-493304014
The trick is to use the dialectModule property and hand the driver to Sequelize explicitly:
import Sequelize from 'sequelize';
import mysql2 from 'mysql2'; // Needed to fix sequelize issues with WebPack

const sequelize = new Sequelize(
  process.env.DB_NAME,
  process.env.DB_USER,
  process.env.DB_PASSWORD,
  {
    dialect: 'mysql',
    dialectModule: mysql2, // Needed to fix sequelize issues with WebPack
    host: process.env.DB_HOST,
    port: process.env.DB_PORT
  }
);

export async function connectToDatabase() {
  console.log('Trying to connect via sequelize');
  await sequelize.sync();
  await sequelize.authenticate();
  console.log('=> Created a new connection.');
  // Do something
}
So far this works for MySQL, but it is not working with Postgres.
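Importing mysql2 directly also makes the dependency statically visible to the bundler, and dialectModule tells Sequelize to use that module instead of requiring it by name at runtime. For completeness, here is a minimal sketch of how the helper above might be wired into a Lambda handler (the handler name, module path, and response shape are illustrative, not from the linked issue):

// Illustrative Lambda handler using the connectToDatabase() helper above.
// './db' is a hypothetical path to the module containing that helper.
import { connectToDatabase } from './db';

export async function handler(event) {
  await connectToDatabase();
  // ...run queries through the shared `sequelize` instance here...
  return {
    statusCode: 200,
    body: JSON.stringify({ message: 'Connected' })
  };
}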
I have a project with 2 DB connections (default and routeone). My config is like this:
dbal:
    default_connection: routeone
    types:
        point: ACME\Infrastructure\Resources\Types\PointType
    connections:
        default:
            driver: pdo_mysql
            host: "%database_host%"
            port: "%database_port%"
            dbname: "%database_name%"
            user: "%database_user%"
            password: "%database_password%"
            charset: UTF8
        routeone:
            driver: pdo_mysql
            host: "%database_routeone_host%"
            port: "%database_routeone_port%"
            dbname: "%database_routeone_name%"
            user: "%database_routeone_user%"
            password: "%database_routeone_password%"
            charset: UTF8
            mapping_types:
                enum: string
                point: point
orm:
    default_entity_manager: routeone
    auto_generate_proxy_classes: "%kernel.debug%"
    entity_managers:
        default:
            naming_strategy: doctrine.orm.naming_strategy.underscore
            auto_mapping: true
            mappings:
                ACME:
                    type: yml
                    dir: '%kernel.root_dir%/../src/ACME/Infrastructure/Resources/doctrine'
                    is_bundle: false
                    prefix: ACME\Domain\Entity
                    alias: ACME
        routeone:
            naming_strategy: doctrine.orm.naming_strategy.underscore
            auto_mapping: false
            mappings:
                RouteOne:
                    type: yml
                    dir: '%kernel.root_dir%/../src/ACME/Infrastructure/Resources/doctrine/RouteOne'
                    is_bundle: false
                    prefix: RouteOne
                    alias: RouteOne
I have control of the default connection, but not of routeone; I cannot change the structure of the existing database behind routeone.
I generate the YML mapping files with this command (NB: I have temporarily set the routeone connection as the default):
php bin/console doctrine:mapping:convert yml ./src/ACME/Infrastructure/Resources/doctrine/Routeone/ --from-database
This generates files in that path, such as AclClass.orm.yml:
AclClass:
    type: entity
    table: acl_class
    ....
But when I try to generate the entity classes with this command:
php bin/console doctrine:generate:entities src/ACME/Infrastructure/Resources/doctrine/RouteOne/ --path=src/ACME/Domain/Entity/RouteOne/
I get this error:
[Doctrine\Common\Persistence\Mapping\MappingException]
No mapping file found named 'AclClass.orm.yml' for class 'ACME\Domain\Entity\RouteOne\AclClass'.
What am I doing wrong?
I need to use two different entity managers, because I need to run $em->clear(); for performance reasons (batching of inserts).
My config.yml contains
doctrine:
    dbal:
        default_connection: default
        connections:
            default:
                driver: pdo_mysql
                host: "%database_host%"
                port: "%database_port%"
                dbname: "%database_name%"
                user: "%database_user%"
                password: "%database_password%"
                charset: UTF8
                options:
                    1001: true
    orm:
        auto_generate_proxy_classes: "%kernel.debug%"
        default_entity_manager: default
        entity_managers:
            default:
                connection: default
                naming_strategy: doctrine.orm.naming_strategy.underscore
                auto_mapping: true
            errorlog:
                connection: default
                naming_strategy: doctrine.orm.naming_strategy.underscore
                mappings:
                    AppBundle:
                        type: yml
                        dir: Entity
                        is_bundle: true
                        prefix: AppBundleEntity
                        alias: AppBundleEntity
However, I'm still getting an error:
[Doctrine\ORM\ORMException]
Unknown Entity namespace alias 'AppBundle'.
From looking at other questions, it seems the entities need to be separated into different folders at the very least. Is that the only way to use different entity managers in the same bundle?
As per @Cerad's comment, this works:
doctrine:
    dbal:
        default_connection: default
        connections:
            default:
                driver: pdo_mysql
                host: "%database_host%"
                port: "%database_port%"
                dbname: "%database_name%"
                user: "%database_user%"
                password: "%database_password%"
                charset: UTF8
                options:
                    1001: true
    orm:
        auto_generate_proxy_classes: "%kernel.debug%"
        default_entity_manager: default
        entity_managers:
            default:
                connection: default
                naming_strategy: doctrine.orm.naming_strategy.underscore
                mappings:
                    AppBundle: ~
            errorlog:
                connection: default
                naming_strategy: doctrine.orm.naming_strategy.underscore
                mappings:
                    AppBundle: ~
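With this configuration both entity managers map the same AppBundle entities. The second manager can then be fetched by name for the batching use case, e.g. $em = $this->getDoctrine()->getManager('errorlog'); in a controller (or via the doctrine.orm.errorlog_entity_manager service), and cleared independently with $em->clear();.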
I'm using Symfony2 and FOSUserBundle.
I have one firewall for login, one for assets, and a main one catching everything.
Still, I get the "no security.context" exception that is thrown when a route is not covered by any firewall and you try to access it with "is_granted" (seen and solved here).
The route is mydomain/de_DE/area, where the de_DE part is my {_locale}.
Here's my FOSUserBundle configuration from my config.yml.
firewalls:
    login_firewall:
        pattern: ^/(de_DE|de_CH)/(login|resetting)$
        anonymous: true
        form_login:
            provider: fos_userbundle
            login_path: fos_user_security_login
            check_path: fos_user_security_check
            csrf_provider: form.csrf_provider
        logout:
            path: fos_user_security_logout
    assets_localeless:
        pattern: ^/(compiled|web|js|css|_wdt|_profiler)/$
        anonymous: true
    main:
        pattern: ^/$
        anonymous: false
        form_login:
            provider: fos_userbundle
            login_path: fos_user_security_login
            check_path: fos_user_security_check
            csrf_provider: form.csrf_provider
        logout:
            path: fos_user_security_logout
access_control:
    - { path: ^/(compiled|web|js|css|_wdt|_profiler)$, role: IS_AUTHENTICATED_ANONYMOUSLY }
    - { path: ^/(de_DE|de_CH)/(login|resetting)$, role: IS_AUTHENTICATED_ANONYMOUSLY }
    - { path: ^/(de_DE|de_CH)/(my-admin|admin), role: ROLE_ADMIN }
    - { path: ^/(de_DE|de_CH)/$, role: ROLE_USER }
role_hierarchy:
    ROLE_ADMIN: ROLE_USER
    ROLE_SUPER_ADMIN: ROLE_ADMIN
EDIT / SOLUTION: My problem was the regex; I had misunderstood the tutorial on that point.
The pattern is plain regex, which is why none of my firewalls worked (see the answer).
The new setting is this:
firewalls:
    main:
        pattern: .
        anonymous: true
        form_login:
            provider: fos_userbundle
            login_path: fos_user_security_login
            check_path: fos_user_security_check
            csrf_provider: form.csrf_provider
        logout:
            path: fos_user_security_logout
access_control:
    - { path: '^/(compiled|web|js|css|_wdt|_profiler)([\w\d/_-]{0,})', role: IS_AUTHENTICATED_ANONYMOUSLY }
    - { path: '^/([\w]{0,})/(login|resetting|sale|imprint|contact)([\w\d/_-]{0,})', role: IS_AUTHENTICATED_ANONYMOUSLY }
    - { path: '^/([\w]{0,})/(my-admin|admin)([\w\d/_-]{0,})', role: ROLE_ADMIN }
    - { path: '^/([\w]{0,})/([\w\d/_-]{0,})', role: IS_AUTHENTICATED_FULLY }
There is no firewall configured for mydomain/de_DE/area ... that's why you have no security.context for that route.
$ means "end of string" in a regex. That's why ...
- { path: ^/(de_DE|de_CH)/$, role: ROLE_USER }
... will only match exactly ...
yourdomain/de_DE/
yourdomain/de_CH/
... and no other route.
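The same reasoning applies to the firewall patterns: ^/$ only ever matches the root path, and ^/(de_DE|de_CH)/(login|resetting)$ only matches those exact URLs. Dropping the trailing $ (for example pattern: ^/ for the main firewall, or ^/(de_DE|de_CH)/ in the access_control rules) makes the rule match every path with that prefix, which is also what the pattern: . workaround above achieves.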