What are the properties of the StrongLoop Loopback 'Datasource' Object? - loopbackjs

I am working on a custom boot script that will run autoupdate (syncing the database schema to the LoopBack models) for only SOME of my LoopBack models, so as to not inadvertently override schema changes made by other apps/devs on the databases I connect to.
Right now I am using the following in my bootscript to get the datasource object for each of my connected datasources:
var postgres = app.dataSources.AWSPostgres;
The two pieces of data I would love to be able to pull from the above object are:
The name of the datasource (String)
An array of Models that are being stored in that DataSource (Array)
The problem is that I can't find any documentation reference to the properties of the individual dataSource objects.
If I can get the name of the dataSource from the object above (in this example the name would be 'AWSPostgres'), then I can create an array of the datasources I want to make available for autoupdate, thereby exempting the datasources where I do NOT want to overwrite the existing schema.
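A minimal sketch of that allowlist idea, assuming app.dataSources behaves like a plain name-to-datasource map (the names and objects below are invented for illustration):

```javascript
// Given a map shaped like app.dataSources and an allowlist of datasource
// names, return only the datasources that should be autoupdated.
function pickDataSources(dataSources, allowlist) {
  return Object.keys(dataSources)
    .filter(function (name) { return allowlist.indexOf(name) !== -1; })
    .map(function (name) { return dataSources[name]; });
}

// Example with plain objects standing in for real datasources:
var fakeSources = { AWSPostgres: { id: 1 }, LocalPostgres: { id: 2 }, mysql: { id: 3 } };
var targets = pickDataSources(fakeSources, ['AWSPostgres', 'mysql']);
console.log(targets.length); // 2
```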

So I was able to list the properties of the individual datasource connector object using the following bit of code:
console.log("Show properties... ", Object.keys(datasource.connector));
Properties for a Loopback Datasource Connector
[ '_models',
'name',
'settings',
'dataSource',
'client',
'log',
'logger',
'_mixins',
'observe',
'removeObserver',
'clearObservers',
'notifyObserversOf',
'_notifyBaseObservers',
'notifyObserversAround' ]
For my purposes I was able to pull the name out of... wait for it... the datasource.connector.name property.
For those wondering, datasource.connector._models is an array of the models attached to a given datasource. To get the names of each model attached to that datasource you would do something like this (I am using Lodash, which I recommend):
_.forEach(datasource.connector._models, function(datasourceModel) {
  console.log("Model name is... " + datasourceModel.model.modelName);
});
The above loops through each model in the datasource.connector._models array and logs the modelName to the console. That will give you a list of the models attached to a given datasource, which is usable inside of a Loopback boot script.
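Without Lodash, the same extraction can be written as a plain function. This is a sketch that assumes each _models entry has the { model: { modelName } } shape used in the loop above (the model names here are made up):

```javascript
// Collect the modelName of every entry in a _models-shaped collection.
function attachedModelNames(models) {
  return Object.keys(models).map(function (key) {
    return models[key].model.modelName;
  });
}

// Stand-in for datasource.connector._models:
var fakeModels = {
  Widget: { model: { modelName: 'Widget' } },
  Gadget: { model: { modelName: 'Gadget' } }
};
console.log(attachedModelNames(fakeModels)); // [ 'Widget', 'Gadget' ]
```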

For anyone wondering what my final boot script looked like (which sets up to autoupdate ALL models on only SOME datasources) here is my full code:
var _ = require('lodash');

module.exports = function (app) {
  'use strict';
  // Define Postgres_Local Datasource and Models
  var postgres_local = app.dataSources.LocalPostgres;
  // Define Postgres_aws Datasource and Models to AutoUpdate
  var postgres_aws = app.dataSources.AWSPostgres;
  // Define Mongo Datasource and Models
  var mlab_Mongo = app.dataSources.mlab_Mongo;
  // Define MySQL Datasource and Models
  var mysql = app.dataSources.mysql;

  // Only the Datasources listed in this array will have their Models AutoUpdated
  var datasources = [postgres_aws, mysql];
  console.log('-- Datasources to Autoupdate: ' + datasources.length);

  // Loop through all specified datasources and models, and autoupdate where necessary
  _.forEach(datasources, function (datasource) {
    console.log("Working on... " + datasource.connector.name);
    var currentDataSource = datasource;
    _.forEach(datasource.connector._models, function (datasourceModel) {
      console.log("Checking if table for model " + datasourceModel.model.modelName + " is created and up-to-date in " + datasource.connector.name + " DB.");
      currentDataSource.isActual(datasourceModel.model.modelName, function (err, actual) {
        if (actual) {
          console.log("Model " + datasourceModel.model.modelName + " is up-to-date. No auto-update necessary.");
        } else {
          console.log('Difference found! Auto-updating model ' + datasourceModel.model.modelName + '...');
          currentDataSource.autoupdate(datasourceModel.model.modelName, function () {
            console.log("Auto-updated model " + datasourceModel.model.modelName + " in " + datasource.connector.name + " datasource successfully.");
          });
        }
      });
    });
  });
}; // End Exports

Related

Pulling Drive activity report through GCP, is there a way to see folder path?

I am supposed to generate a Drive activity report so we can track what type of file users are using and where each file is being created (My Drive/shared drive).
I used the GAM command to pull the Drive activity report, which has various fields except for the root path.
Does anyone know a way I can manipulate that so I can get a field that shows the folder path as well?
Thanks!
You can try these particular GAM commands, and edit them later to gather information about folders and root folders:
gam user <User Email Address> print filetree depth 0 showmimetype gfolder excludetrashed todrive
You can edit the depth; for example, -1 includes orphaned folders. I am not familiar with which command you use, but you might need to mix or add some fields so it shows the root folder or path.
gam user <User Email Address> print filelist todrive select 1Yvxxxxxxxxxxxxxxxxxxxxxxjif9 showmimetype gfolder fields id
You might need to add something like "print filetree" or "show filepath" to your command.
Reference:
https://github.com/taers232c/GAMADV-XTD3/wiki/Users-Drive-Files-Display
I have created a custom menu that iterates through a table of data. The data must have a column with the file IDs of interest and two additional columns for owner and path, since a file can be owned by either a user or a shared drive. The user running the function must have Super Admin rights to access files owned by other users, and must be a member of a shared drive for files on it to be located. My previous implementation as a custom function failed because of a limitation of that feature: advanced services are inaccessible from custom functions.
The custom menu is created as explained in this documentation article https://developers.google.com/apps-script/guides/menus. There must be a trigger that creates the menu when the sheet opens.
In addition, the code requires the use of Advanced Services; Google Drive must be added following the steps of this other article https://developers.google.com/apps-script/guides/services/advanced#enable_advanced_services. The advanced service will ask for authorization the first time the code is executed. You may expedite the process by creating an empty function and running it.
function onOpen() {
  var ui = SpreadsheetApp.getUi();
  ui.createMenu('File ownership').addItem('Read data', 'readData').addToUi();
}

function readData() {
  var sheetData = SpreadsheetApp.getActiveSheet().getDataRange().getValues();
  var i = 0;
  // Count rows that have a file ID in the first column
  for (; i < sheetData.length; i++) {
    if (sheetData[i][0] == '')
      break;
  }
  SpreadsheetApp.getUi().alert('There are ' + i + ' cells with data.');
  for (i = 1; i < sheetData.length; i++) {
    var fileID = sheetData[i][0];
    var owner = getFileOwner(fileID);
    var path = getFilePath(fileID);
    SpreadsheetApp.getActiveSheet().getRange(i + 1, 2).setValue(owner);
    SpreadsheetApp.getActiveSheet().getRange(i + 1, 3).setValue(path);
  }
  SpreadsheetApp.getUi().alert('The owner and file path have been populated');
}

function getFilePath(fileID, filePath = "") {
  try {
    var file = Drive.Files.get(fileID, {
      supportsAllDrives: true
    });
    if (!file.parents[0])
      return "/" + filePath;
    var parent = file.parents[0];
    var parentFile = Drive.Files.get(parent.id, { supportsAllDrives: true });
    var parentPath = parentFile.title;
    if (parent.isRoot || parentFile.parents.length == 0)
      return "/" + filePath;
    else {
      return getFilePath(
        parentFile.id,
        parentPath + "/" + filePath);
    }
  }
  catch (GoogleJsonResponseException) {
    return "File inaccessible";
  }
}

function getFileOwner(fileID) {
  try {
    var file = Drive.Files.get(
      fileID,
      {
        supportsAllDrives: true
      });
    var driveId = file.driveId;
    if (driveId) {
      var driveName = Drive.Drives.get(driveId).name;
      return driveName + "(" + driveId + ")";
    }
    var ownerEmailAddress = file.owners[0].emailAddress;
    return ownerEmailAddress;
  }
  catch (GoogleJsonResponseException) {
    return "File inaccessible";
  }
}
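The recursion in getFilePath can be sanity-checked outside Apps Script by stubbing Drive.Files.get with an in-memory map. This is just a sketch with invented file IDs and titles, not the real Drive API; note the function builds the folder path and does not append the file's own name:

```javascript
// In-memory stand-in for the Drive advanced service used above.
var filesById = {
  root: { id: 'root', title: 'My Drive', parents: [] },
  docs: { id: 'docs', title: 'Docs', parents: [{ id: 'root', isRoot: true }] },
  memo: { id: 'memo', title: 'memo.txt', parents: [{ id: 'docs', isRoot: false }] }
};
var Drive = { Files: { get: function (id) { return filesById[id]; } } };

// Same logic as getFilePath above, minus the try/catch.
function getFilePath(fileID, filePath) {
  filePath = filePath || "";
  var file = Drive.Files.get(fileID);
  if (!file.parents[0]) return "/" + filePath;       // no parent: at the root
  var parent = file.parents[0];
  var parentFile = Drive.Files.get(parent.id);
  var parentPath = parentFile.title;
  if (parent.isRoot || parentFile.parents.length == 0) return "/" + filePath;
  return getFilePath(parentFile.id, parentPath + "/" + filePath);
}

console.log(getFilePath('memo')); // "/Docs/"
```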
Execution takes significantly longer the more file IDs there are; when it finishes, the cells will be updated with their respective owner and path.
Note: with a Super Admin account you can programmatically create a view permission for shared drives you don't have access to using APIs or Apps Script, you may submit a separate question for more details or read the documentation in the developer page at https://developers.google.com/drive/api/v2/reference/permissions.

Mocking EF core computed field in InMemoryDatabase for unit tests

In my database I have a computed field "FullName" created by EF Core using the HasComputedColumnSql fluent API, and everything works fine in the app.
However, I have some unit tests using the InMemory database which test logic that uses the "FullName" column, and that column is null in the InMemory database.
Any idea how I can mock the computed-column behavior in the InMemory database?
I’m using the following solution:
Add an event handler for SavingChanges
Inside the handler, loop over entities of the type you have a computed column on
Initialize the computed value
Example:
Add the event handler:
dbContext.SavingChanges += OnSaveChanges;
In my case, the class name is Ressource and the computed property is “FullName”:
private void OnSaveChanges(object? sender, SavingChangesEventArgs e)
{
    var dbContext = sender as <your context class>;
    var ressources = dbContext.ChangeTracker.Entries().Where(x => x.Entity is Ressource).Select(x => x.Entity as Ressource).ToList();
    foreach (var item in ressources)
        item.FullName = item.FirstName + " " + item.LastName;
}
Hope it helps.

Unable to delete entities from command line

I am trying to run an import on a non-prod env using the prod exported data, but do not know how to delete all the kinds from the command line before starting the import. Since we are creating fake data for load testing, it is essential to delete all the kinds and perform a fresh import. I can delete the kinds manually from the Datastore admin, but would like to do it programmatically.
I hope the code below is helpful. You can follow this approach to delete all the entities of a kind programmatically. First we fetch all the entities of the kind and populate a list of keys, then select each entity by key and delete it from Datastore.
Step 1: Fetch all keys from datastore
List<String> googleDSKeyList = new ArrayList<String>();
StructuredQuery<Entity> query = Query.newEntityQueryBuilder().setKind(kind).build();
QueryResults<Entity> results = datastore.run(query);
results.forEachRemaining(entity -> googleDSKeyList.add(String.valueOf(entity.getKey().getNameOrId())));
Step 2: For each entity, pass the key to the delete method.
googleDSKeyList.forEach(keyId-> deleteEntity(keyId));
Step 3: To delete the entity, perform select and then delete.
public void deleteEntity(String KEY_VALUE) {
    // Look up the entity by the key passed in, then delete it
    Query<Entity> query = Query
            .newGqlQueryBuilder(Query.ResultType.ENTITY,
                    "SELECT * WHERE __key__ HAS ANCESTOR KEY (" + KIND_NAME + ", '" + KEY_VALUE + "')")
            .setAllowLiteral(true).build();
    QueryResults<Entity> results = datastore.run(query);
    if (results.hasNext()) {
        Entity rs = results.next();
        datastore.delete(rs.getKey());
        LOGGER.info("Successfully Deleted KEY # : " + KEY_VALUE);
    }
}
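One caveat with the approach above: deleting entities one key at a time is slow. Datastore accepts batched mutations (the documented limit is 500 per commit), so the key list can be split into batches first and each batch deleted in a single call. The chunking step, sketched here in JavaScript for illustration (the batch size and key names are just examples):

```javascript
// Split a list of keys into batches no larger than Datastore's per-commit
// mutation limit, so each batch can be passed to one delete call.
function chunkKeys(keys, batchSize) {
  var batches = [];
  for (var i = 0; i < keys.length; i += batchSize) {
    batches.push(keys.slice(i, i + batchSize));
  }
  return batches;
}

// e.g. 1200 keys split into batches of 500, 500, and 200:
var keys = [];
for (var k = 0; k < 1200; k++) keys.push('key-' + k);
var batches = chunkKeys(keys, 500);
console.log(batches.length); // 3
```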

Salesforce APEX Unit Test Error with REST Services / Error at SELECT of Case Object

I've successfully constructed a REST API using the Apex language, defined with the @RestResource annotation.
I also wrote a matching unit test with the @isTest annotation. The execution of the REST API triggered by an HTTP GET with two input parameters works well, while the unit test execution returns a "null" value list from the SOQL query shown below:
String mycase = inputs_case_number; // for ex. '00001026'
sObject[] sl2 = [SELECT Id, CaseNumber FROM Case WHERE CaseNumber = :mycase LIMIT 1];
The query returns:
VARIABLE_ASSIGNMENT [22]|sl2|[]|0x1ffefea6
I've also tried to execute it with a RunAs() method (see code below), using a dynamically created Salesforce test user, not anonymous, connected to a more powerful profile, but still receiving a "null" answer at the SOQL query. The new profile defines "View All" permission for Cases. Other SOQL queries to objects like: "User" and "UserRecordAccess" with very similar construction are working fine, both for REST APEX and Test APEX.
Is there a way to configure access permissions so that a unit test (@isTest) can read the Case object and a few fields like Id and CaseNumber? Is this error related to the "Tooling API" function, and how can we fix this issue in the test procedure?
Code attachment: Unit Test Code
@isTest
private class MyRestResource1Test {
    static testMethod void MyRestRequest() {
        // generate temporary test user object and assign to running process
        String uniqueUserName = 'standarduser' + DateTime.now().getTime() + '@testorg.com';
        Profile p = [SELECT Id FROM Profile WHERE Name='StandardTestUser'];
        User pu = new User(Alias='standt', Email='standarduser@testorg.com', LastName='testing', EmailEncodingKey='UTF-8', LanguageLocaleKey='en_US', LocaleSidKey='en_US', ProfileId=p.Id, TimeZoneSidKey='America/New_York', UserName=uniqueUserName);
        System.RunAs(pu) {
            RestRequest req = new RestRequest();
            RestResponse res = new RestResponse();
            req.requestURI = '/services/apexrest/sfcheckap/';
            req.addParameter('useremail', 'testuserid@red.com');
            req.addParameter('casenumber', '00001026');
            req.httpMethod = 'GET';
            RestContext.request = req;
            RestContext.response = res;
            System.debug('Current User assigned is: ' + UserInfo.getUserName());
            System.debug('Current Profile assigned is: ' + UserInfo.getProfileId());
            Test.startTest();
            Map<String, Boolean> resultMap = MyRestResource1.doGet();
            Test.stopTest();
            Boolean debugflag = resultMap.get('accessPermission');
            String debugflagstr = String.valueOf(debugflag);
            System.assert(debugflagstr.contains('true'));
        }
    }
}
Found a solution path by using: @isTest(SeeAllData=true)
See article: "Using the isTest(SeeAllData=true) Annotation"
https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_testing_seealldata_using.htm

Refresh all data other than session data when context changes

I'm writing a small sample application which will display some information to a user in dashboards and tables.
The application is being written using ember-cli version 0.0.37. and ember.js version 1.5.1.
I am using ember-simple-auth and ember-simple-auth-oauth2 for authentication, with a custom authenticator and authorizer to inject the client_id, client_secret and access_token into the requests where appropriate (I am fully aware that the client secret shouldn't be in a front-end application, but this is a sample application for internal consumption only and not the subject of my question).
The custom authorizer also injects an o parameter into the requests, the value of which is an organisation id. The API returning data uses both the access_token and the o parameter to return data pertaining to a particular organisation. The organisation id is stored in the session.
So once I've browsed around for one organisation, I've got a dropdown component which allows me to choose another organisation. At present, this calls an action on the ApplicationRoute which updates the value in the session, and clears the store of all models that are not account or organisation, as these are used at the application level.
Code:
setSessionOrganisation: function(organisation_id) {
  var self = this;
  /**
   * Check if it's the same organisation
   */
  if (this.get('session.organisation_id') == organisation_id) {
    return;
  }
  /**
   * Get the organisation for the id provided
   */
  this.store.find('organisation', organisation_id).then(
    function(org) {
      /**
       * Update the session details
       */
      self.set('session.organisation_id', org.get('id'));
      self.set('session.organisation_name', org.get('name'));
      /**
       * Get all types
       */
      var store = self.container.lookup('store:main');
      var types = [];
      var typeMaps = store.typeMaps;
      $.each(typeMaps, function(key) {
        if (typeMaps.hasOwnProperty(key)) {
          var type = typeMaps[key].type.typeKey;
          if (type != 'account' && type != 'organisation') {
            types.push(typeMaps[key].type.typeKey);
          }
        }
      });
      /**
       * Clear data for types
       */
      for (var i = 0; i < types.length; i++) {
        store.unloadAll(types[i]);
      }
    });
}
I feel the code above is a bit hackish, but I've not found another way to return the model types currently in the store.
When this action is called and the data has been flushed, I would like to refresh/reload the current route. Some of the routes are dynamic, using object ids as the dynamic segment. These routes will need to redirect to their list routes. Other routes will carry on lazily loading data when they are navigated to.
So my questions are:
Is there a better way to clear the store data than I have done above?
How can I trigger a route reload?
How can I redirect to the parent route if in a route with a dynamic segment?
As a bonus question, are there any dangers in unloading all the data for views/routes that are presently not being displayed?
Thanks!
I've come to a resolution that satisfies my criteria, with great help from @marcoow.
As per the comments on the question, we discussed reloading the entire application to refresh the data store cache. While this approach does work, there was a flicker while the browser completely reloaded the page, and I would prefer to only reload the data and have the application sort itself out. This flicker was also present if App.reset() was called.
I explored other options and found the Ember.Route.refresh() method. This appears to cause the nested routes to reload their models when called from the application route.
In my application I have extended the store in an initializer, so that I can call a function which unloads all records from the store, for every type of model in the store, but also provide a list of model names to exclude:
app/initializers/custom-store.js:

import DS from 'ember-data';

var CustomStore = DS.Store.extend({
  /**
   * Extend the functionality in the store to unload all instances of each type
   * of data except those passed
   *
   * @param {Array} exclusions List of model type names to exclude from the data
   *        unload
   */
  unloadAllExcept: function(exclusions) {
    var typeMaps = this.get('typeMaps');
    for (var type in typeMaps) {
      if (typeMaps.hasOwnProperty(type)) {
        var typeKey = (typeMaps[type].type && typeMaps[type].type.typeKey) ? typeMaps[type].type.typeKey : false;
        if (typeKey && exclusions.indexOf(typeKey) === -1) {
          this.unloadAll(typeKey);
        }
      }
    }
  }
});

export default {
  after: 'store',
  name: 'custom-store',
  initialize: function(container /*, app*/) {
    container.unregister('store:main');
    container.register('store:main', CustomStore);
  }
};
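The exclusion logic inside unloadAllExcept can be checked in isolation by extracting it into a pure function, with a plain object standing in for the store's typeMaps (the type names below are invented):

```javascript
// Mirror of the filtering in unloadAllExcept: collect the typeKeys that
// would be unloaded, given a typeMaps-shaped object and an exclusion list.
function typesToUnload(typeMaps, exclusions) {
  var result = [];
  for (var type in typeMaps) {
    if (typeMaps.hasOwnProperty(type)) {
      var typeKey = (typeMaps[type].type && typeMaps[type].type.typeKey) ?
        typeMaps[type].type.typeKey : false;
      if (typeKey && exclusions.indexOf(typeKey) === -1) {
        result.push(typeKey);
      }
    }
  }
  return result;
}

var fakeTypeMaps = {
  m1: { type: { typeKey: 'account' } },
  m2: { type: { typeKey: 'organisation' } },
  m3: { type: { typeKey: 'invoice' } }
};
console.log(typesToUnload(fakeTypeMaps, ['account', 'organisation'])); // [ 'invoice' ]
```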
This is called from app/routes/application.js in the setSessionOrganisation Action:
setSessionOrganisation: function(organisation_id) {
  var _this = this;
  /**
   * Check if it's the same organisation
   */
  if (this.get('session.organisation_id') === organisation_id) {
    return;
  }
  /**
   * Get the organisation for the id provided
   */
  this.store.find('organisation', organisation_id).then(function(org) {
    /**
     * Update the session details
     */
    _this.get('session').set('organisation_id', org.get('id'));
    _this.get('session').set('organisation_name', org.get('name'));
    /**
     * Clean the local data cache of all data pertaining to
     * the old organisation
     */
    _this.store.unloadAllExcept(['account', 'organisation']);
    /**
     * Refresh the application route (and subsequently all
     * child routes)
     */
    _this.refresh();
  });
}
As a footnote, this solution does not satisfy the third question I asked, but I have realised that question is a separate issue to do with handling the response from the API, and must be dealt with during the model hook.