loopback API 'where' filter with multiple conditions - loopbackjs

When using the LoopBack API, is the 'and' operator redundant in a 'where' filter with multiple conditions?
For example, I tested the following two queries and they return the same result:
<model>.find({ where: { <condition1>, <condition2> } });
<model>.find({ where: { and: [<condition1>, <condition2>] } });
To be more specific, suppose this is the table content:
name  value
----  -----
a     1
b     2
When I execute 'find()' using two different 'where' filters, I get the first record in both cases:
{ where: { name: 'a', value: 1 } }
{ where: { and: [ { name: 'a'}, { value: 1 } ] } }
I've read through the API documentation, but didn't find which logical operator is used when there are multiple conditions.
If 'and' is redundant, as my test suggests, I'd prefer not to use it. But I want to make sure whether this is true in general, or whether it just happens to work with PostgreSQL, which I'm using.

This is a valid query which could only be accomplished with an explicit and:
{
  "where": {
    "or": [
      { "and": [{ "classification": "adn" }, { "series": "2" }] },
      { "series": "3" }
    ]
  }
}
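In application code, the same nested filter is passed straight to find(); a quick sketch (the model name Item is illustrative):
Item.find({
  where: {
    or: [
      { and: [{ classification: 'adn' }, { series: '2' }] },
      { series: '3' }
    ]
  }
});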
EDIT: https://github.com/strongloop/loopback-filters/blob/master/index.js
function matchesFilter(obj, filter) {
  var where = filter.where;
  var pass = true;
  var keys = Object.keys(where);
  keys.forEach(function(key) {
    if (key === 'and' || key === 'or') {
      if (Array.isArray(where[key])) {
        if (key === 'and') {
          pass = where[key].every(function(cond) {
            return applyFilter({where: cond})(obj);
          });
          return pass;
        }
        if (key === 'or') {
          pass = where[key].some(function(cond) {
            return applyFilter({where: cond})(obj);
          });
          return pass;
        }
      }
    }
    if (!test(where[key], getValue(obj, key))) {
      pass = false;
    }
  });
  return pass;
}
It iterates through the keys of the where object looking for a failure, so it acts like an implicit and in your case.
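You can verify this behaviour in isolation with the loopback-filters package itself; a minimal sketch, assuming the package is installed and exports the filter function directly as its README shows:
// npm install loopback-filters
var applyFilter = require('loopback-filters');

var data = [{ name: 'a', value: 1 }, { name: 'b', value: 2 }];

// implicit and: every key must match
var implicit = applyFilter(data, { where: { name: 'a', value: 1 } });
// explicit and: same result
var explicit = applyFilter(data, { where: { and: [{ name: 'a' }, { value: 1 }] } });

console.log(implicit); // [ { name: 'a', value: 1 } ]
console.log(explicit); // [ { name: 'a', value: 1 } ]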
EDIT 2: https://github.com/strongloop/loopback-datasource-juggler/blob/cc60ef8202092ae4ed564fc7bd5aac0dd4119e57/test/relations.test.js
The loopback datasource juggler contains tests which use the implicit and format
PictureLink.findOne({where: {pictureId: anotherPicture.id, imageableType: 'Article'}}, ...)
// expected link:
// {pictureId: anotherPicture.id, imageableId: article.id, imageableType: 'Article'}

But I want to make sure whether this is true in general, or whether it just happens to work with PostgreSQL, which I'm using.
Is it true in general? No.
It appears that this is handled for PostgreSQL and MySQL (and probably other SQL databases) in SQLConnector. So, it is possible that connectors not based on SQLConnector (e.g. MongoDB) don't support this. However, given the many examples I've seen online, I would say it's safe to assume other connectors have implemented it this way, too.

Related

AdonisJS exists Validator

I'm following the official [documentation](https://legacy.adonisjs.com/docs/4.0/validator) and Indicative, but I couldn't find anything to help me.
I want to validate that the given param exists in the database.
So I tried:
app/Validators/myValidator
const { rule } = use('Validator')

get rules () {
  return {
    userId: 'required|integer|exists:MyDatabase.users,userId', // <-- this is what isn't working
    date: [
      rule('date'),
      rule('dateFormat', 'YYYY-MM-DD')
    ]
  }
}

// Getting the data to be validated
get data () {
  const params = this.ctx.params
  const { userId, date } = params
  return Object.assign({}, { userId }, { date })
}
It gives me the following error:
{
  "error": {
    "message": "select * from `MyDatabase`.`users`.`userId` where `undefined` = '2' limit 1 - ER_PARSE_ERROR: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '.`userId` where `undefined` = '2' limit 1' at line 1",
    "name": "ErrorValidator",
    "status": 40
  }
}
How should I properly indicate that I want to compare MyDatabase.users.userId to the given parameter?
After a bit of trial and error I stumbled upon the solution.
You just need to follow what is inside hooks.js and pass the values separated by commas, like so:
get rules () {
  return {
    userId: 'required|integer|exists:MyDatabase,MyTable,columntoCompare',
  }
}
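Applied to the question's own schema (database MyDatabase, table users, column userId), the validator would look something like this sketch, assuming the same imports as in the question:
get rules () {
  return {
    userId: 'required|integer|exists:MyDatabase,users,userId',
    date: [
      rule('date'),
      rule('dateFormat', 'YYYY-MM-DD')
    ]
  }
}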

Is it possible to build dynamic queries for Amplify Datastore?

I am looking to create a query-builder for my Amplify Datastore.
The function should process an array of conditions that need to be applied to the query and return the corresponding predicate.
This is easily done, if there is only one filter, but I would like to be able to process any amount of filters.
My goal is to be able to write the queries like so:
DataStore.query(Post, queryBuilder(filters))
Where I can pass an array of filters with a filter looking like this:
filter = {
  connector: 'or' | 'and',
  property: 'rating',
  predicate: 'gt',
  value: 4
}
and have the query builder return the predicate in the format shown below.
I have tried to chain and return multiple functions in the query builder, but I was not able to figure out a pattern for how to create the correct predicate function.
For reference, this is how queries are built according to the docs: https://docs.amplify.aws/lib/datastore/data-access/q/platform/js#predicates
const posts = await DataStore.query(Post, c => c.rating("gt", 4));
and for multiple conditions:
const posts = await DataStore.query(Post, c =>
c.rating("gt", 4).status("eq", PostStatus.PUBLISHED)
);
Let's say we have the model:
type Post @model {
  id: ID!
  category: String
  city: String
  content: String
}
And we want to query and filter by city and category with a dynamic number of variables. Then we can write a function like this:
const fetchData = async props => {
  /*
  More configurable wrapper for DataStore.query calls
  #param props: {model: Model, criteria: [{fieldId, predicate, value}]}.
  */
  try {
    let criteria;
    if (props.criteria && typeof props.criteria === 'object') {
      criteria = c => {
        props.criteria.forEach(item => {
          const predicate = item.predicate || 'eq';
          c[item.fieldId](predicate, item.value);
        });
        return c;
      };
    } else {
      criteria = props.criteria;
    }
    return await DataStore.query(props.model, criteria);
  } catch (e) {
    throw new Error(e);
  }
}
So now if we want to execute this we can pass the parameters:
// where Post = models.Post
const myResult = fetchData({
  model: Post,
  criteria: [
    {
      fieldId: 'category',
      predicate: 'eq',
      value: 'news'
    },
    {
      fieldId: 'city',
      predicate: 'eq',
      value: 'SomeCityName'
    }
  ]
})
Unfortunately, I do not know of a way to also query linked relationships as you would with a direct GraphQL API query while using DataStore, and the method I presented only uses an implicit AND between criteria.
I don't know if this has changed since you asked the question but, based on the documents, it looks like multiple conditions have an implicit and, but you can explicitly chain them with or/and/not:
const posts = await DataStore.query(Post, c => c.or(
c => c.rating("gt", 4).status("eq", PostStatus.PUBLISHED)
));
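Building on the wrapper above, one could also honor the filter's connector field; a hedged sketch (fetchDataWithConnector is a hypothetical name, and this assumes the pre-v5 DataStore predicate API shown in the docs):
// Hypothetical extension: applies criteria inside c.or(...) when
// connector === 'or', otherwise chains them for an implicit and.
const fetchDataWithConnector = async ({ model, criteria, connector = 'and' }) => {
  return DataStore.query(model, c => {
    const apply = builder => {
      criteria.forEach(({ fieldId, predicate = 'eq', value }) => {
        builder[fieldId](predicate, value);
      });
      return builder;
    };
    return connector === 'or' ? c.or(inner => apply(inner)) : apply(c);
  });
};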

how can I get order '2' < '10' with filter in loopback?

My data looks like ['2', '13', '13A', '14-1']. How can I get the correct order with a filter? Thanks, everyone.
IIUC, you are storing numbers (2, 10, etc.) as strings ('2', '10', etc.) in your database.
LoopBack relies on the database to perform ordering (sorting).
Here are a few things to try:
1. Modify your model definition to store the property as a number. LoopBack is smart and will coerce string values provided by the user (REST API clients) to numbers before they are stored in the database. This would be my preferred solution, because it does not require any complex code in your application and preserves performance.
2. Depending on the database you are using, it may be possible to configure it to treat string values as numbers for sorting. This is not LoopBack specific, so I can't really help you with that.
3. As a last resort, you can sort the records in-memory; LoopBack already does this for location-based queries when the database does not support them. The idea is to tell the database to return all records matching the filter criteria, and then apply order, limit, skip and other options inside your Node.js process. Please note this comes with a severe performance hit and will work only for reasonably-sized data.
As for the 3rd option: implementation-wise, you need to override the find method in your model class.
// common/models/my-model.js
module.exports = function(MyModel) {
  MyModel.on('modelRemoted', () => {
    MyModel._findRaw = MyModel.find;
    MyModel.find = findWithCustomSort;
  });
}

function findWithCustomSort(filter, options, cb) {
  if (!cb) {
    if (typeof options === 'function') {
      cb = options;
      options = undefined;
    } else if (!options && typeof filter === 'function') {
      cb = filter;
      filter = undefined;
    }
  }
  filter = filter || {}; // guard against a missing filter argument
  const dbFilter = {
    where: filter.where,
    include: filter.include,
    fields: filter.fields,
  };
  if (cb) {
    this._findRaw(dbFilter, options, (err, found) => {
      if (err) return cb(err);
      else cb(null, sortResults(filter, found));
    });
  } else {
    return this._findRaw(dbFilter, options)
      .then(found => sortResults(filter, found));
  }
}

function sortResults(filter, data) {
  // implement your sorting rules, don't forget about "limit", "skip", etc.
}
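For the natural ordering asked about here ('2' before '13A'), sortResults could lean on localeCompare with the numeric option; a sketch assuming filter.order is a single clause like 'name ASC':
// A sketch only: handles one order clause plus limit/skip;
// localeCompare with numeric:true yields '2' < '10' < '13A'.
function sortResults(filter, data) {
  if (filter.order) {
    const [field, dir = 'ASC'] = filter.order.split(' ');
    const sign = dir.toUpperCase() === 'DESC' ? -1 : 1;
    data.sort((a, b) => sign * String(a[field])
      .localeCompare(String(b[field]), undefined, { numeric: true }));
  }
  const skip = filter.skip || 0;
  const limit = filter.limit === undefined ? data.length : filter.limit;
  return data.slice(skip, skip + limit);
}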
UPDATE
Is there a way to use sql for query in custom method?
Yes, you can execute any SQL by using the MyModel.dataSource.connector.execute function, see Executing native SQL. There is one catch though: this method is callback-based, so you cannot use the Promise API or async/await directly.
const idValue = 1;
MyModel.dataSource.connector.execute(
  'SELECT * FROM MyModel WHERE id=?',
  [idValue],
  (err, results) => {
    if (err) console.error('query failed', err);
    else console.log('found data', results);
  });
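If you prefer promises anyway, you can wrap the callback-based method yourself; a sketch using Node's util.promisify (the ORDER BY clause is illustrative and depends on your database):
const { promisify } = require('util');
const execute = promisify(
  MyModel.dataSource.connector.execute.bind(MyModel.dataSource.connector));

async function findNaturallySorted() {
  // Illustrative SQL; adjust the ORDER BY to your database's capabilities
  return execute('SELECT * FROM MyModel ORDER BY LENGTH(name), name', []);
}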

graphql appsync query with boolean filter

I need to query all incomplete projects; upon completion, a project is given a status change (Completed) plus a boolean isComplete == true.
I'm working through AWS AppSync to test the queries before I hard-code them into my app, but this one doesn't seem to work. I want all projects where isComplete == false or isComplete == null; the boolean logic below (the input1 variable) returns 0 results. Here is a sample item:
{"__typename":{"S":"Project"},"addressLine1":{"S":"321 Faith Cir"},"city":{"S":"Perris"},"createdAt":{"S":"2019-03-05T01:01:39.513Z"},"currentOwner":{"S":"pgres52"},"dateRequired":{"S":"2019-03-13-07:00"},"id":{"S":"89a5-42ef7efef8fb"},"status":{"S":"Created"},"statusLastChangedAt":{"S":"2019-03-05T01:01:39.513Z"}}
{
  "input1": {
    "isComplete": {
      "ne": true
    }
  }
}
query listNonCompleteProjects($input1: ModelProjectFilterInput) {
  listProjects(filter: $input1, limit: 20) {
    items {
      id
      currentOwner
      addressLine1
      city
      dateRequired
      isComplete
      statusLastChangedAt
    }
    nextToken
  }
}
Solved! This post partially helped: Prisma.io: How do I filter items with certain fields being null?
I was able to get it to work with an additional parameter, status (a string):
query listNonCompleteProjects($input1: ModelProjectFilterInput) {
  listProjects(filter: $input1, limit: 20) {
    items {
      ...
    }
  }
}
with the variables:
"input1": {
  "and": [
    { "status": { "notContains": "Complete" } },
    { "isComplete": { "ne": true } }
  ]
}
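Depending on when your schema was generated, the generated ModelBooleanInput type may also expose attributeExists, which would match missing values directly without the extra status field; a hedged sketch:
"input1": {
  "or": [
    { "isComplete": { "ne": true } },
    { "isComplete": { "attributeExists": false } }
  ]
}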

Advanced update using mongodb [duplicate]

In MongoDB, is it possible to update the value of a field using the value from another field? The equivalent SQL would be something like:
UPDATE Person SET Name = FirstName + ' ' + LastName
And the MongoDB pseudo-code would be:
db.person.update( {}, { $set : { name : firstName + ' ' + lastName } } );
The best way to do this is with version 4.2+, which allows using an aggregation pipeline in the update document with the updateOne, updateMany, or update (deprecated in most if not all language drivers) collection methods.
MongoDB 4.2+
Version 4.2 also introduced the $set pipeline stage operator, which is an alias for $addFields. I will use $set here as it maps with what we are trying to achieve.
db.collection.<update method>(
  {},
  [
    { "$set": { "name": { "$concat": ["$firstName", " ", "$lastName"] } } }
  ]
)
Note that the square brackets in the method's second argument specify an aggregation pipeline rather than a plain update document; a plain document will not work correctly here.
MongoDB 3.4+
In 3.4+, you can use $addFields and the $out aggregation pipeline operators.
db.collection.aggregate(
  [
    { "$addFields": {
      "name": { "$concat": [ "$firstName", " ", "$lastName" ] }
    }},
    { "$out": <output collection name> }
  ]
)
Note that this does not update your collection but instead replaces the existing collection or creates a new one. Also, for update operations that require "typecasting", you will need client-side processing, and depending on the operation, you may need to use the find() method instead of the .aggregate() method.
MongoDB 3.2 and 3.0
The way we do this is by $projecting our documents and using the $concat string aggregation operator to return the concatenated string.
You then iterate the cursor and use the $set update operator to add the new field to your documents using bulk operations for maximum efficiency.
Aggregation query:
var cursor = db.collection.aggregate([
  { "$project": {
    "name": { "$concat": [ "$firstName", " ", "$lastName" ] }
  }}
])
MongoDB 3.2 or newer
You need to use the bulkWrite method.
var requests = [];
cursor.forEach(document => {
  requests.push({
    'updateOne': {
      'filter': { '_id': document._id },
      'update': { '$set': { 'name': document.name } }
    }
  });
  if (requests.length === 500) {
    // Execute per 500 operations and re-init
    db.collection.bulkWrite(requests);
    requests = [];
  }
});

if (requests.length > 0) {
  db.collection.bulkWrite(requests);
}
MongoDB 2.6 and 3.0
From this version, you need to use the now deprecated Bulk API and its associated methods.
var bulk = db.collection.initializeUnorderedBulkOp();
var count = 0;

cursor.snapshot().forEach(function(document) {
  bulk.find({ '_id': document._id }).updateOne({
    '$set': { 'name': document.name }
  });
  count++;
  if (count % 500 === 0) {
    // Execute per 500 operations and re-init
    bulk.execute();
    bulk = db.collection.initializeUnorderedBulkOp();
  }
})

// clean up queues
if (count > 0) {
  bulk.execute();
}
MongoDB 2.4
cursor["result"].forEach(function(document) {
db.collection.update(
{ "_id": document._id },
{ "$set": { "name": document.name } }
);
})
You should iterate through the documents. For your specific case:
db.person.find().snapshot().forEach(
  function (elem) {
    db.person.update(
      { _id: elem._id },
      { $set: { name: elem.firstname + ' ' + elem.lastname } }
    );
  }
);
Apparently there is a way to do this efficiently since MongoDB 3.4, see styvane's answer.
Obsolete answer below
You cannot refer to the document itself in an update (yet). You'll need to iterate through the documents and update each document using a function. See this answer for an example, or this one for server-side eval().
For a database with high activity, you may run into issues where your updates affect actively changing records; for this reason I recommend using snapshot():
db.person.find().snapshot().forEach( function (hombre) {
hombre.name = hombre.firstName + ' ' + hombre.lastName;
db.person.save(hombre);
});
http://docs.mongodb.org/manual/reference/method/cursor.snapshot/
Starting with Mongo 4.2, db.collection.update() can accept an aggregation pipeline, finally allowing the update/creation of a field based on another field:
// { firstName: "Hello", lastName: "World" }
db.collection.updateMany(
  {},
  [{ $set: { name: { $concat: [ "$firstName", " ", "$lastName" ] } } }]
)
// { "firstName" : "Hello", "lastName" : "World", "name" : "Hello World" }
The first part {} is the match query, filtering which documents to update (in our case all documents).
The second part [{ $set: { name: { ... } } }] is the update aggregation pipeline (note the square brackets signifying the use of an aggregation pipeline). $set is a new aggregation operator and an alias of $addFields.
Regarding this answer, the snapshot function is deprecated in version 3.6, according to this update. So, on version 3.6 and above, it is possible to perform the operation this way:
db.person.find().forEach(
  function (elem) {
    db.person.update(
      { _id: elem._id },
      { $set: { name: elem.firstname + ' ' + elem.lastname } }
    );
  }
);
I tried the above solution but I found it unsuitable for large amounts of data. I then discovered the stream feature:
MongoClient.connect("...", function(err, db) {
  var c = db.collection('yourCollection');
  var s = c.find({ /* your query */ }).stream();
  s.on('data', function(doc) {
    c.update({ _id: doc._id },
      { $set: { name: doc.firstName + ' ' + doc.lastName } },
      function(err, result) { /* result == true? */ });
  });
  s.on('end', function() {
    // stream can end before all your updates do if you have a lot
  });
})
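To address the caveat in the 'end' handler, one could count in-flight updates and only close once both the stream and the callbacks have finished; a hedged sketch reusing the s, c, and db variables from above (pending and streamEnded are names I introduced):
var pending = 0, streamEnded = false;
s.on('data', function(doc) {
  pending++;
  c.update({ _id: doc._id },
    { $set: { name: doc.firstName + ' ' + doc.lastName } },
    function() {
      // close only after the stream ended AND every callback fired
      if (--pending === 0 && streamEnded) db.close();
    });
});
s.on('end', function() {
  streamEnded = true;
  if (pending === 0) db.close();
});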
The update() method takes an aggregation pipeline as a parameter, like:
db.collection_name.update(
  {
    // Query
  },
  [
    // Aggregation pipeline
    { "$set": { "id": "$_id" } }
  ],
  {
    // Options
    "multi": true // false when a single doc has to be updated
  }
)
The field can be set or unset with existing values using the aggregation pipeline.
Note: use $ with field name to specify the field which has to be read.
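For instance, a single pipeline update can both read another field with $ and drop a field with the $unset stage (available in 4.2+); a sketch where obsoleteField is a hypothetical field name:
db.collection_name.update(
  {},
  [
    { "$set": { "id": "$_id" } },    // copy _id into id
    { "$unset": ["obsoleteField"] }  // hypothetical field to drop
  ],
  { "multi": true }
)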
Here's what we came up with for copying one field to another for ~150,000 records. It took about 6 minutes, but is still significantly less resource-intensive than it would have been to instantiate and iterate over the same number of Ruby objects.
js_query = %({
  $or: [
    {
      'settings.mobile_notifications': { $exists: false },
      'settings.mobile_admin_notifications': { $exists: false }
    }
  ]
})

js_for_each = %(function(user) {
  if (!user.settings.hasOwnProperty('mobile_notifications')) {
    user.settings.mobile_notifications = user.settings.email_notifications;
  }
  if (!user.settings.hasOwnProperty('mobile_admin_notifications')) {
    user.settings.mobile_admin_notifications = user.settings.email_admin_notifications;
  }
  db.users.save(user);
})

js = "db.users.find(#{js_query}).forEach(#{js_for_each});"
Mongoid::Sessions.default.command('$eval' => js)
With MongoDB version 4.2+, updates are more flexible, as they allow the use of an aggregation pipeline in update, updateOne, and updateMany. You can now transform your documents using aggregation operators and then update them without the need to explicitly state a $set command (instead we use $replaceRoot: {newRoot: "$$ROOT"}).
Here we use an aggregation query to extract the timestamp from MongoDB's ObjectID "_id" field and update the documents. (I am not an expert in SQL, but I think SQL does not provide an auto-generated ObjectID that carries a timestamp; you would have to create that date yourself.)
var collection = "person"

agg_query = [
  {
    "$addFields": {
      "_last_updated": {
        "$toDate": "$_id"
      }
    }
  },
  {
    $replaceRoot: {
      newRoot: "$$ROOT"
    }
  }
]

db.getCollection(collection).updateMany({}, agg_query, {upsert: true})
(I would have posted this as a comment, but couldn't)
For anyone who lands here trying to update one field using another in the document with the C# driver...
I could not figure out how to use any of the UpdateXXX methods and their associated overloads since they take an UpdateDefinition as an argument.
// we want to set Prop1 to Prop2
class Foo
{
    public string Prop1 { get; set; }
    public string Prop2 { get; set; }
}

void Test()
{
    var update = new UpdateDefinitionBuilder<Foo>();
    update.Set(x => x.Prop1, <new value; no way to get a hold of the object that I can find>);
}
As a workaround, I found that you can use the RunCommand method on an IMongoDatabase (https://docs.mongodb.com/manual/reference/command/update/#dbcmd.update).
var command = new BsonDocument
{
    { "update", "CollectionToUpdate" },
    { "updates", new BsonArray
        {
            new BsonDocument
            {
                // Any filter; here the check is if Prop1 does not exist
                { "q", new BsonDocument { ["Prop1"] = new BsonDocument("$exists", false) } },
                // set it to the value of Prop2
                { "u", new BsonArray { new BsonDocument { ["$set"] = new BsonDocument("Prop1", "$Prop2") } } },
                { "multi", true }
            }
        }
    }
};
database.RunCommand<BsonDocument>(command);
MongoDB 4.2+ Golang
result, err := collection.UpdateMany(ctx, bson.M{},
    mongo.Pipeline{
        bson.D{{"$set",
            bson.M{"name": bson.M{"$concat": []string{"$lastName", " ", "$firstName"}}},
        }},
    },
)