Loopback 3 Queries not returning expected results for MySQL data field

I have a content model that looks like:
{
  "name": "content",
  "base": "PersistedModel",
  "idInjection": true,
  "options": {
    "validateUpsert": true
  },
  "properties": {
    "id": {
      "type": "number"
    },
    "url": {
      "type": "string",
      "required": false
    },
    "data": {
      "type": "Object",
      "required": true
    }
  },
  "validations": [],
  "relations": {},
  "acls": [
    {
      "accessType": "WRITE",
      "principalType": "ROLE",
      "principalId": "$authenticated",
      "permission": "ALLOW"
    },
    {
      "accessType": "READ",
      "principalType": "ROLE",
      "principalId": "$everyone",
      "permission": "ALLOW"
    }
  ],
  "methods": {}
}
An example of this data in MySQL looks like this:
I've tried querying this data with both REST and the Node API, and I haven't been able to filter by any of the nested object fields inside the "data" column. A few examples that don't work:
contentModel.findOne({ where: { "data.url": url } }, function (err, content) {
  if (err) {
    console.log(err);
  }
  console.log("find:", content);
});
const filter = encodeURI(`{"where":{"systemid":"${contentType}"}}`);
let url = `${apiUrl}contentTypes?filter=${filter}`;
let contentTypeRecord = await this.getAxios().get(url);
I've also tried many queries in the LoopBack Swagger UI; I usually get either no results or all of the content records back.
The same data access attempts do, however, work if I have the same data set up in MongoDB.
What am I doing wrong? LoopBack should presumably be parsing the object in the data field and allowing me to filter on it.

AFAIK, LoopBack does not support query filters on nested properties. According to this comment, there is limited support in the PostgreSQL and MongoDB connectors:
@michelgokan as you'll see from the thread:
There is some support in the postgres connector
Mongo also has some basic support (only for dot-nested properties)
Further support isn't going to be added to other RDB connectors, as LoopBack 3 is in LTS and no longer accepting new feature PRs
LoopBack 3 ends LTS at the end of 2019, so it's even less likely to be added.
Options are:
Write the custom SQL yourself as a custom endpoint (a sketch follows the MySQL note below)
Create an alternate connector for something like Knex or Bookshelf that has support (big job)
Do the filtering in memory (sketched right after this list)
Use JSON for the nested data so you can use dot queries in something like Mongo or Postgres
Look at migrating to loopback-next, which may add support in the future
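For the in-memory option, here is a minimal sketch, assuming the table is small enough to fetch wholesale (contentModel and url are the names from the question):
contentModel.find({}, function (err, contents) {
  if (err) {
    return console.log(err);
  }
  // Filter on the nested property in Node rather than in the database.
  var match = contents.find(function (c) {
    return c.data && c.data.url === url;
  });
  console.log("find:", match);
});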
It seems that MySQL supports querying JSON data (see the docs for JSON_CONTAINS), so it should be possible to implement this feature in loopback-connector-mysql. Feel free to open an issue in https://github.com/strongloop/loopback-connector-mysql/issues. A pull request would be even better! ❤️
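In the meantime, the custom-SQL option could look like the sketch below. It assumes the data column holds valid JSON, the table is named content, and MySQL 5.7+ (for the JSON functions); the remote method name is made up for illustration:
// common/models/content.js
module.exports = function (Content) {
  // Hypothetical remote method that filters on a nested JSON property with raw SQL.
  Content.findByDataUrl = function (url, cb) {
    var sql =
      "SELECT * FROM content " +
      "WHERE JSON_UNQUOTE(JSON_EXTRACT(data, '$.url')) = ?";
    Content.dataSource.connector.execute(sql, [url], cb);
  };

  Content.remoteMethod('findByDataUrl', {
    accepts: { arg: 'url', type: 'string', required: true },
    returns: { arg: 'results', type: 'array', root: true },
    http: { path: '/find-by-data-url', verb: 'get' }
  });
};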

Related

AWS DMS CDC - Only capture changed values not entire record? (Source RDS MySQL)

I have a DMS CDC (change data capture) task set up from a MySQL database, streaming to a Kinesis stream that a Lambda is connected to.
I was hoping to ultimately receive only the values that have changed, not an entire dump of the row; that way I would know which column is being changed (at the moment it's impossible to decipher this without setting up another system to track changes myself).
For example, with the following mapping rule:
{
  "rule-type": "selection",
  "rule-id": "1",
  "rule-name": "1",
  "object-locator": {
    "schema-name": "my-schema",
    "table-name": "product"
  },
  "rule-action": "include",
  "filters": []
},
and if I changed the name property of a record on the product table, I would hope to receive a record like this:
{
  "data": {
    "name": "newValue"
  },
  "metadata": {
    "timestamp": "2021-07-26T06:47:15.762584Z",
    "record-type": "data",
    "operation": "update",
    "partition-key-type": "schema-table",
    "schema-name": "my-schema",
    "table-name": "product",
    "transaction-id": 8633730840
  }
}
However, what I actually receive is something like this:
{
  "data": {
    "name": "newValue",
    "id": "unchangedId",
    "quantity": "unchangedQuantity",
    "otherProperty": "unchangedValue"
  },
  "metadata": {
    "timestamp": "2021-07-26T06:47:15.762584Z",
    "record-type": "data",
    "operation": "update",
    "partition-key-type": "schema-table",
    "schema-name": "my-schema",
    "table-name": "product",
    "transaction-id": 8633730840
  }
}
As you can see, when receiving this it's impossible to decipher which property has changed without setting up additional systems to track it.
I've found another Stack Overflow thread where someone posted an issue because their CDC was doing exactly what I want mine to do. Can anyone point me in the right direction to achieve this?
I found the answer after digging into AWS documentation some more.
https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Target.Kinesis.html#CHAP_Target.Kinesis.BeforeImage
Different source database engines provide different amounts of information for a before image:
Oracle provides updates to columns only if they change.
PostgreSQL provides only data for columns that are part of the primary key (changed or not).
MySQL generally provides data for all columns (changed or not).
I used BeforeImageSettings in the task settings to include the original data with the payloads.
"BeforeImageSettings": {
"EnableBeforeImage": true,
"FieldName": "before-image",
"ColumnFilter": "all"
}
While this still gives me the whole record, it gives me enough data to work out what's changed without additional systems.
{
  "data": {
    "name": "newValue",
    "id": "unchangedId",
    "quantity": "unchangedQuantity",
    "otherProperty": "unchangedValue"
  },
  "before-image": {
    "name": "oldValue",
    "id": "unchangedId",
    "quantity": "unchangedQuantity",
    "otherProperty": "unchangedValue"
  },
  "metadata": {
    "timestamp": "2021-07-26T06:47:15.762584Z",
    "record-type": "data",
    "operation": "update",
    "partition-key-type": "schema-table",
    "schema-name": "my-schema",
    "table-name": "product",
    "transaction-id": 8633730840
  }
}
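With both images in hand, the consumer can work out the changed columns itself. A minimal sketch in plain JavaScript, using the field names from the sample payload above:
// Returns the names of columns whose value differs between the after
// image ("data") and the before image ("before-image").
function changedColumns(record) {
  var before = record["before-image"] || {};
  return Object.keys(record.data).filter(function (key) {
    return record.data[key] !== before[key];
  });
}

// For the sample update above this returns ["name"].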

List users as non admin with custom fields

As per the documentation, I should be able to get a list of users with a custom schema as long as the field in the schema has readAccessType set to ALL_DOMAIN_USERS. That is the exact setup I have in the admin console; moreover, when I perform a GET request to the schema endpoint for the schema in question, I get confirmation that the schema fields have readAccessType set to ALL_DOMAIN_USERS.
The problem is that when I perform a users list request, I don't get the custom schema in the response. The request is the following:
GET /admin/directory/v1/users?customer=my_customer&projection=full&query=franc&viewType=domain_public HTTP/1.1
Host: www.googleapis.com
Content-length: 0
Authorization: Bearer fakeTokena0AfH6SMD6jF2DwJbgiDZ
The response I get back is the following:
{
  "nextPageToken": "tokenData",
  "kind": "admin#directory#users",
  "etag": "etagData",
  "users": [
    {
      "externalIds": [
        {
          "type": "organization",
          "value": "value"
        }
      ],
      "organizations": [
        {
          "department": "department",
          "customType": "",
          "name": "Name",
          "title": "Title"
        }
      ],
      "kind": "admin#directory#user",
      "name": {
        "fullName": "Full Name",
        "givenName": "Full",
        "familyName": "Name"
      },
      "phones": [
        {
          "type": "work",
          "value": "(999)999-9999"
        }
      ],
      "thumbnailPhotoUrl": "https://photolinkurl",
      "primaryEmail": "user@domain.com",
      "relations": [
        {
          "type": "manager",
          "value": "user@domain.com"
        }
      ],
      "emails": [
        {
          "primary": true,
          "address": "user@domain.com"
        }
      ],
      "etag": "etagData",
      "thumbnailPhotoEtag": "photoEtagData",
      "id": "xxxxxxxxxxxxxxxxxx",
      "addresses": [
        {
          "locality": "Locality",
          "region": "XX",
          "formatted": "999 Some St Some State 99999",
          "primary": true,
          "streetAddress": "999 Some St",
          "postalCode": "99999",
          "type": "work"
        }
      ]
    }
  ]
}
However, if I perform the same request with a super admin user, I get an extra property in the response:
"customSchemas": {
"Dir": {
"fieldOne": false,
"fieldTwo": "value",
"fieldThree": value
}
}
My understanding is that I should get the custom schemas with a non-admin user as long as the custom schema fields are set to be visible to all domain users. This is not happening. I opened a support ticket with G Suite, but the person providing "support" sent me in this direction. I believe this is a bug, or maybe I overlooked something.
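For reference, the request above maps onto the googleapis Node.js client roughly as follows (a sketch; the auth client setup is omitted, and the scope it needs is an assumption on my part):
const { google } = require("googleapis");

// authClient must be an already-authorized OAuth2/JWT client with a
// Directory API read scope (setup omitted).
async function listUsers(authClient) {
  const directory = google.admin({ version: "directory_v1", auth: authClient });
  const res = await directory.users.list({
    customer: "my_customer",
    projection: "full",        // "full" is what should include customSchemas
    viewType: "domain_public", // the non-admin view from the question
    query: "franc",
  });
  return res.data.users;
}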
I contacted G Suite support and, in fact, this issue was a domain-specific problem.
It took several weeks for the support engineers at Google to address it, but it was finally resolved, and the behaviour is now the intended one.

List PowerBI workspace collection keys from arm template

When using ARM templates to deploy various Azure components, you can use certain template functions. One of them, listKeys, lets you return the keys created during the deployment through the outputs, for example when deploying a storage account.
Is there a way to get the keys when deploying a Power BI workspace collection?
According to the link you mentioned, to use the listKeys function we need to know the resource name and API version.
From the Azure Power BI workspace collection Get Access Keys API, we can get the resource name,
Microsoft.PowerBI/workspaceCollections/{workspaceCollectionName}, and the API version, "2016-01-29".
So try the following; it works correctly for me.
"outputs": {
"exampleOutput": {
"value": "[listKeys(resourceId('Microsoft.PowerBI/workspaceCollections', parameters('workspaceCollections_tompowerBItest')), '2016-01-29')]",
"type": "object"
}
You can check the created Power BI service in the Azure portal.
The whole ARM template I used:
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "workspaceCollections_tompowerBItest": {
      "defaultValue": "tomjustforbitest",
      "type": "string"
    }
  },
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.PowerBI/workspaceCollections",
      "sku": {
        "name": "S1",
        "tier": "Standard"
      },
      "tags": {},
      "name": "[parameters('workspaceCollections_tompowerBItest')]",
      "apiVersion": "2016-01-29",
      "location": "South Central US"
    }
  ],
  "outputs": {
    "exampleOutput": {
      "value": "[listKeys(resourceId('Microsoft.PowerBI/workspaceCollections', parameters('workspaceCollections_tompowerBItest')), '2016-01-29')]",
      "type": "object"
    }
  }
}
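If you need the keys programmatically after the deployment, you can read the output back from the deployment object. A sketch using the @azure/arm-resources client (the package choice and all placeholder names are my assumption, not from the answer above):
const { DefaultAzureCredential } = require("@azure/identity");
const { ResourceManagementClient } = require("@azure/arm-resources");

// Placeholders: supply your own subscription, resource group, and deployment name.
async function getWorkspaceCollectionKeys() {
  const client = new ResourceManagementClient(
    new DefaultAzureCredential(),
    "<subscription-id>"
  );
  const deployment = await client.deployments.get("<resource-group>", "<deployment-name>");
  // exampleOutput is the output defined in the template above.
  return deployment.properties.outputs.exampleOutput.value;
}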

Delete record Libcloud (GoDaddy api)

I'm trying to implement the delete method for a record (delete-record), but it's my first time using Python and this API.
The GoDaddy API doesn't have a delete record method, so this functionality is not exposed in the driver.
https://developer.godaddy.com/doc#!/_v1_domains/recordReplace
The driver could offer the 'replace records in zone' method, which would allow you to fetch the current list of records and then set the new list minus the record you want to remove. But that feature is not implemented, and it would be quite risky.
First,
Send a GET request to https://api.godaddy.com/v1/domains/{DOMAIN}/records
Then, enumerate over all records in the API response (a JSON array) and prepare the new data by removing the one that needs to be deleted.
API response (sample):
[
  {
    "data": "192.168.1.1",
    "name": "@",
    "ttl": 600,
    "type": "A"
  },
  {
    "data": "ns1.example.com",
    "name": "@",
    "ttl": 3600,
    "type": "NS"
  },
  {
    "data": "@",
    "name": "www",
    "ttl": 3600,
    "type": "CNAME"
  },
  {
    "data": "mail.example.com",
    "name": "@",
    "ttl": 3600,
    "priority": 1,
    "type": "MX"
  }
]
New data, after deleting the MX record (sample):
[
  {
    "data": "192.168.1.1",
    "name": "@",
    "ttl": 600,
    "type": "A"
  },
  {
    "data": "ns1.example.com",
    "name": "@",
    "ttl": 3600,
    "type": "NS"
  },
  {
    "data": "@",
    "name": "www",
    "ttl": 3600,
    "type": "CNAME"
  }
]
Now,
Send a PUT request to https://api.godaddy.com/v1/domains/{DOMAIN}/records with the new data.
The most important thing is how you identify which records in the above array need to be deleted; that should not be a difficult task (a sketch of the whole flow follows).
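A minimal sketch of that GET-filter-PUT flow in Node.js (assuming axios; the sso-key Authorization format comes from GoDaddy's API docs, while the deleteRecord helper and the credential placeholders are mine):
const axios = require("axios");

// Placeholders: supply your own GoDaddy API credentials.
const apiKey = "YOUR_KEY";
const apiSecret = "YOUR_SECRET";

// Hypothetical helper: removes every record matching type + name + data,
// then replaces the domain's full record set with what remains.
async function deleteRecord(domain, match) {
  const url = `https://api.godaddy.com/v1/domains/${domain}/records`;
  const headers = { Authorization: `sso-key ${apiKey}:${apiSecret}` };

  const { data: records } = await axios.get(url, { headers });
  const remaining = records.filter(
    (r) => !(r.type === match.type && r.name === match.name && r.data === match.data)
  );
  await axios.put(url, remaining, { headers });
}

// e.g. drop the MX record from the sample above:
// deleteRecord("example.com", { type: "MX", name: "@", data: "mail.example.com" });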
I managed to work around it in kind of a hacky way. We had a bunch of records we wanted to delete, and doing it manually seemed tedious, so I ran some JavaScript in the Chrome developer console on an authenticated session of the DNS management page:
function deleteGoDaddyRecords(recordId) {
  $.ajax({
    url: 'https://dcc.godaddy.com/api/v3/domains/<YOUR-DOMAIN.com>/records?recordId=' + recordId,
    type: 'DELETE',
    success: function (result) {
      console.log(result);
    }
  });
}
This let me use the same call the UI makes when you ask to delete a record.
The only thing you need to provide is the AttributeUid, which is not available in the public API, but it is in the front-end API:
https://dcc.godaddy.com/api/v2/domains/runahr.com/records
So I managed to create a script that generates a bunch of calls like:
deleteGoDaddyRecords('<RECORD-UUID>');
deleteGoDaddyRecords('<RECORD-UUID>');
copy & paste the generated script into the Developers Console and that solved it for now.
I hope GoDaddy will add a public DELETE endpoint to their API in the future :)

Where/when should I remove the relationship type to avoid an unknown keys warning?

I'm using Ember Data with a server application that follows the json:api standard. When I normalize the response from the server, I'm adding a relationshipType attribute from the links so that Ember Data knows what type of model to build when the relationship is polymorphic.
For example, here's the response from the server:
{
  "members": {
    "id": "1",
    "created_at": "2014-10-15T18:35:00.000Z",
    "updated_at": "2014-10-15T18:35:00.000Z",
    "links": {
      "user": {
        "id": "1",
        "type": "users",
        "href": "http://test.host/api/v1/users/1"
      },
      "organization": {
        "id": "2",
        "type": "customers",
        "href": "http://test.host/api/v1/customers/2"
      }
    }
  }
}
The organization relationship is polymorphic, and the type in this instance is customers.
In the Ember application, I'm normalizing the response into the following (which follows the RESTSerializer convention):
{
  "members": {
    "id": "1",
    "created_at": "2014-10-15T18:35:00.000Z",
    "updated_at": "2014-10-15T18:35:00.000Z",
    "user": "1",
    "userType": "users",
    "organization": "2",
    "organizationType": "customers"
  }
}
This works, and Ember Data builds the correct user relationship and organization relationship (using the Customer model).
But I'm receiving the following warning:
WARNING: The payload for '(subclass of DS.Model)' contains these unknown keys: [userType,organizationType]. Make sure they've been defined in your model.
I'd like to remove these relationshipType keys and their values after they've been used.
Where should I do this?
Have you tried using https://github.com/kurko/ember-json-api? I may have some other overrides in my app to handle polymorphism but hopefully the package will get you closer.
Also see my pending PR.