Loopback uses sequential numbers for model IDs. Can I use my own ID generator on the server side? How do I go about doing that?
You can specify one of Loopback's built-in generators (guid, uuid, ...) as the default function for the id property in your model definition file.
Example with guid:
{
  "name": "ModelName",
  "base": "PersistedModel",
  "idInjection": false,
  "properties": {
    "id": {
      "type": "string",
      "id": true,
      "defaultFn": "guid"
    }
  },
  "validations": [],
  "relations": {},
  "acls": [],
  "methods": {}
}
As far as I know, you can't specify your own default function there yet. See the related GitHub issue.
If you want more advanced behavior (e.g. your own generator), you can create a models/model-name.js file and extend your model's constructor.
Yes, you would need to do a few things:
Set "idInjection": false in the corresponding model.json to turn off automatic id injection.
Add the property you want to your model, then mark it as the id, either by setting "id": true on the property in the model.json or by selecting the id radio button next to the property in the composer.
Generate and inject the id, probably with a "before save" operation hook (https://docs.strongloop.com/display/public/LB/Operation+hooks) or perhaps a mixin (https://docs.strongloop.com/display/public/LB/Defining+mixins).
If you use Loopback 4, this is the setting for generating a UUID as the primary key.
Inside your model, change this:
@property({
  type: 'string',
  id: true,
  defaultFn: 'uuidv4',
})
id?: string;
This is the way to generate a unique id in your table.
How do I define an "ENUM" type and put the values in the model itself? If that's not possible, the documentation mentions using enum like this:
https://loopback.io/doc/en/lb3/MySQL-connector.html#enum
But where should I put this code, as per best practice?
You could add custom validation to your model, so it would check whether the value you're passing is valid. You can find a more elaborate answer here:
Can I define a custom validation with options for Loopback?
You can do it this way with the MySQL connector:
"properties": {
"name": {
"type": "string",
"mysql": {
"columnName": "name",
"dataType": "ENUM('Daily', 'Week Days','Weekends','Monthly','Custom')",
"default": "Week Days"
}
}
}
In LoopBack 4, use an enum in the property's jsonSchema:
@property({
  type: 'string',
  required: true,
  jsonSchema: {
    enum: ['Daily', 'Week Days', 'Weekends', 'Monthly', 'Custom'],
  },
})
I have two models: Account and Customer both having an email address.
An Account can exist without a Customer and a Customer can exist without an Account.
However, an Account should return the related Customer record if this exists.
I was thinking about doing this by creating a hasOne relation on the Account using the unique identifier available in both records (the email address) as foreignKey.
Unfortunately this is not working.
These are my models:
Account
...
"properties": {
"username": {
"type": [
"string"
]
},
"email": {
"type": "string"
}
},
"validations": [],
"relations": {
"customer": {
"type": "hasOne",
"model": "Customer",
"foreignKey": "email"
}
}
...
Customer
...
"properties": {
"name": {
"type": [
"string"
]
},
"email": {
"type": "string"
}
},
"validations": [],
"relations": {}
...
By calling /api/account?filter={"include": ["customer"]} I don't get any additional information.
I don't understand if the problem is the foreignKey or the relation.
You could use an afterRemote hook to do the marshaling just before returning the requested instance.
However this won't be automatic, i.e. you still need to provide some sort of id to link the two instances together. In your case, if the email is such an id, then you would just search for a Customer instance with the same email as the Account instance.
The advantage is that you don't need to provide any extra filters or anything else to your query.
e.g.
Account.afterRemote('find', function(ctx, modelInstance, next) {
  // Here you can check whether the Account instance has a related Customer
  // via a regular find or findById; if you find the related instance,
  // add its data to ctx.result, which is the object that will be returned.
  Customer.find({where: {email: modelInstance.email}}, addCustomerDetails);

  function addCustomerDetails(err, linkedCustomer) {
    if (err) return next(err);
    // Add the Customer to the Account instance here
    ctx.result.customer = linkedCustomer;
    next();
  }
});
And of course, you can do the same in the Customer afterRemote hook, but instead searching for the linked Account instance email.
Your models are defined correctly.
Be sure you have a Customer instance with a matching email in the db.
And the correct form of the REST API call is: /api/account?filter[include]=customer
UPDATE
Loopback overwrites the type of email because of the relation. A hasOne relation should be set up over the id foreign key, not over other fields.
So if you want to fix the problem, add the following to the properties section of the Account definition:
"id": false,
"email": {
"type": "string",
"id": true
}
The foreignKey field is just an alias for the relation. Having an email property and setting email as foreignKey does not create any sort of link between the two.
Then it's simply a matter of using the REST API to instantiate the models, set up the relation and fetch the data:
create an account
POST api/accounts/
{
  "email": "account@bar.com"
}
create a related customer
POST api/accounts/1/foreignKey
{
  "email": "customer@bar.com"
}
Fetch the account and include the related customer
GET api/accounts/1?filter[include]=foreignKey
I want to create a model in loopback with very complex logic, impossible to map to any datasource. So I would like to generate only the CRUD method skeletons in JS and be able to simply override them, as explained here:
extend the CRUD method in LoopBack
From the outside it should be accessible as any REST API, with all the CRUDs and other methods, typical in loopback.
I would also apply ACLs, authorization and all the stuff to it, just as normal.
How should I proceed?
Is this case somewhere formally documented?
Are the CRUD methods officially documented, so I can safely override them?
You can create it with the lb model command. Be sure to select:
Datasource: (no datasource)
Model: Model
Expose: Yes
Common/server: Common
This will create the files inside common/models. You can also do this manually. A datasource-less model essentially consists of these file contents:
test.json
{
  "name": "test",
  "base": "Model",
  "idInjection": true,
  "options": {
    "validateUpsert": true
  },
  "properties": {},
  "validations": [],
  "relations": {},
  "acls": [],
  "methods": {}
}
test.js
'use strict';

module.exports = function(Test) {
  Test.greet = function(msg, cb) {
    cb(null, 'Greetings... ' + msg);
  };

  Test.remoteMethod('greet', {
    accepts: { arg: 'msg', type: 'string' },
    returns: { arg: 'greeting', type: 'string' }
  });
};
This will create a route called /test, with a function named "greet".
The loopback Node API is documented there.
Just override the methods as in the link you provided. You will need to match the Node API of the original method in your overridden method, but apart from that there are no restrictions. ACLs are decoupled from that, so nothing to worry about on that side.
However, I don't know how you plan to write a stateless loopback application without using a datasource, since that is where the state is stored. If your loopback application is not stateless, remember that it will not scale (you cannot start multiple instances in a cluster), and it will do nasty things when it crashes. Can't you just split or simplify your problem?
I have created models using the slc loopback:model tool. Now I want Loopback to create the corresponding MongoDB collections, that is, to perform auto-migration.
One of the models is a Client model whose base class is a User model. That means that client/models/client.json is just empty because all its properties (fields) are inherited from User:
{
  "name": "Client",
  "plural": "Clients",
  "base": "User",
  "idInjection": true,
  "properties": {},
  "validations": [],
  "relations": {},
  "acls": [],
  "methods": []
}
So I figured that if I run auto-migration, Loopback would find all the User properties and create the Client collection with them. But it doesn't! My Client collection has only an _id property.
Here is my code for auto-migration:
module.exports = function(app) {
  app.dataSources.mongodb.automigrate('Client', function(err) {
    if (err) throw err;
  });
};
My question:
Why doesn't Loopback use the User model's properties for my Client model? How do I auto-migrate so that Loopback creates the correct collection?
automigrate is used to map models onto tables, i.e. the model name becomes the table name and the model's properties become the table columns.
Since you are using MongoDB, automigrate only drops and recreates indexes, as described in the documentation. This is because MongoDB is schemaless.
So you can probably skip auto-migration and insert new documents directly.
What have you done to persist the User data in production? Is there an easy way to find the schema of the User model so it can be reproduced in a database?
(Preemptive note: discoverSchema finds the schema of the database, not the model.)
(Also, I know the docs say the User model can be persisted by setting the file property on the default db datasource, but I have security, scalability, and durability concerns with that.)
Set up a database.
Define a new datasource by editing ./server/datasources.json and adding your database, for example:
"mongodb_dev": {
"name": "mongodb_dev",
"connector": "mongodb",
"host": "127.0.0.1",
"database": "devDB",
"username": "devUser",
"password": "devPassword",
"port": 27017
}
Update your ./server/model-config.json so the built-in models use your new datasource:
{
  "_meta": {
    "sources": [
      "loopback/common/models",
      "loopback/server/models",
      "../common/models",
      "./models"
    ],
    "mixins": [
      "loopback/common/mixins",
      "loopback/server/mixins",
      "../common/mixins",
      "./mixins"
    ]
  },
  "User": {
    "dataSource": "mongodb_dev"
  },
  "AccessToken": {
    "dataSource": "mongodb_dev",
    "public": false
  },
  "ACL": {
    "dataSource": "mongodb_dev",
    "public": false
  },
  "RoleMapping": {
    "dataSource": "mongodb_dev",
    "public": false
  },
  "Role": {
    "dataSource": "mongodb_dev",
    "public": false
  }
}
Create a server/create-lb-tables.js file that migrates the built-in models to your database:
var server = require('./server');
var ds = server.dataSources.mongodb_dev; // <<< note the datasource name
var lbTables = ['User', 'AccessToken', 'ACL', 'RoleMapping', 'Role'];

ds.automigrate(lbTables, function(er) {
  if (er) throw er;
  console.log('Loopback tables [' + lbTables + '] created in ', ds.adapter.name);
  ds.disconnect();
});
Run the script
cd server
node create-lb-tables.js
This is a link to the official docs about putting built-in models on your db:
https://docs.strongloop.com/display/public/LB/Creating+database+tables+for+built-in+models
You should persist users into your chosen database via a connector.
The file property is only used to persist data to the filesystem and is NOT recommended for production. For production, you should use one of the connectors (MongoDB, MySQL, etc) to persist your data.
See the docs to find out what properties are part of the built-in User model or change the default database settings to persist the User model to the filesystem to see what properties are available in the JSON file output. If you don't understand all this, go through the tutorial series to get an understanding of all these concepts. Cheers. ;)
Since no one actually shared how to accomplish this, I built an example using the loopback-example-mysql repo from StrongLoop and tweaked two files:
model-config.json (change the built-in models from db to your datasource)
bin/automigrate.js (add additional automigrate functions for each model)
See how here: https://github.com/mikesparr/loopback-example-users-to-mysql
Good luck!
Create a persistent (MongoDB or any other) datasource using slc loopback:datasource.
Edit model-config.json to reference the datasource created above.
Restart the server.