Modify ChangeStream Response In Loopback 3 - loopbackjs

First off, if you're not familiar with change streams, please read this.
It seems that, when using lb to scaffold applications, a change stream endpoint is automatically created for each model. I have already successfully implemented a change stream where, on submitting a new model instance to my Statement model, the changes are sent to all connected clients in real time. This works great.
Except it only sends the model instance of the Statement model. I need to know a bit about the user that submitted the statement as well. Since Statement has a hasOne relationship with my user model, I would normally make my query with an include filter. But I'm not making a query here... that's not how change streams work. The Node server sends the information to the client without any query for that information being sent first.
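For reference, the kind of include-filter query I mean would look roughly like this (the relation name user and someStatementId are just placeholders):
// Ordinary query-based approach: LoopBack 3's include filter pulls the
// related user in with each statement ('user' is an assumed relation name).
Statement.find({
  include: 'user',
  where: { id: someStatementId } // someStatementId is a placeholder
}, function(err, statements) {
  if (err) return console.error(err);
  // toJSON() resolves the included relation onto the plain object
  console.log(statements[0].toJSON().user);
});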
My question is: how can I hook the outgoing change stream in the Statement model so that I can pull in the needed data from the user model? Something like:
module.exports = function(Statement) {
  Statement.hookChangeStream(function(ctx, statementInstance, cb) {
    const myUser = Statement.app.models.myUser;
    myUser.findOne({ where: { id: statementInstance.userId } }, function(err, userInstance) {
      if (err) return cb(err);
      // strip sensitive data from user model
      const cleanUserInstance = someCleanerFunc(userInstance);
      // add cleaned myUser modelInstance to Statement modelInstance
      statementInstance.user = cleanUserInstance;
      cb(null, true);
    });
  });
};
Can this be done? If so, how?
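There is no hookChangeStream in LoopBack 3, so the closest thing is a hand-rolled wrapper. A hedged sketch, assuming the stream returned by PersistedModel.createChangeStream emits change objects that carry the instance on a data property, and that someCleanerFunc is your own helper (verify both against your LoopBack version):
const { Transform } = require('stream');

module.exports = function(Statement) {
  // Hedged sketch: wrap the built-in createChangeStream so every change
  // passes through a Transform that attaches a cleaned user instance.
  const originalCreateChangeStream = Statement.createChangeStream.bind(Statement);

  Statement.createChangeStream = function(options, cb) {
    if (typeof options === 'function') {
      cb = options;
      options = {};
    }

    originalCreateChangeStream(options, function(err, changes) {
      if (err) return cb(err);

      const enrich = new Transform({
        objectMode: true,
        transform(change, encoding, done) {
          // Assumption: the emitted change exposes the instance as change.data.
          const data = change && change.data;
          if (!data || !data.userId) return done(null, change);

          Statement.app.models.myUser.findOne(
            { where: { id: data.userId } },
            function(findErr, userInstance) {
              if (!findErr && userInstance) {
                // someCleanerFunc is the asker's hypothetical helper that
                // strips sensitive fields before sending to clients.
                data.user = someCleanerFunc(userInstance);
              }
              done(null, change);
            }
          );
        }
      });

      cb(null, changes.pipe(enrich));
    });
  };
};
If the built-in /change-stream endpoint picks up the overridden method, each change should reach connected clients with the extra user field attached.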

Related

How can I use Apollo/GraphQL to incrementally/progressively query a datasource?

I have a query like this in my React/Apollo application:
const APPLICATIONS_QUERY = gql`
  {
    applications {
      id
      applicationType {
        name
      }
      customer {
        id
        isActive
        name
        shortName
        displayTimezone
      }
      deployments {
        id
        created
        user {
          id
          username
        }
      }
      baseUrl
      customerIdentifier
      hostInformation
      kibanaUrl
      sentryIssues
      sentryShortName
      serviceClass
      updown
      updownToken
    }
  }
`;
The majority of the items in the query come from a database, so the query is quick. But a couple of the items, like sentryIssues and updown, rely on external API calls, which makes the query take very long.
I'd like to split the query into the database portion and the external API portion so I can show the applications table immediately and add loading spinners for the two columns that hit an external API... But I can't find a good example of incremental/progressive querying or merging the results of two queries with Apollo.
This is a good example of where the @defer directive would be helpful. You can indicate which fields you want to defer for a given query like this:
const APPLICATIONS_QUERY = gql`
  {
    applications {
      id
      applicationType {
        name
      }
      customer @defer {
        id
        isActive
        name
        shortName
        displayTimezone
      }
    }
  }
`;
In this case, the client will make one request but receive 2 responses -- the initial response with all the requested fields sans customer and a second "patch" response with just the customer field that's fired once that resolver is finished. The client does the heavy lifting and pieces these two responses together for you -- there's no additional code necessary.
Please be aware that only nullable fields can be deferred, since the initial value sent with the first response will always be null. As a bonus, react-apollo exposes a loadingState property that you can use to check the loading state for your deferred fields:
<Query query={APPLICATIONS_QUERY}>
  {({ loading, error, data, loadingState }) => {
    const customerComponent = loadingState.applications.customer
      ? <CustomerInfo customer={data.applications.customer} />
      : <LoadingIndicator />;
    // ...
  }}
</Query>
The only downside is this is an experimental feature, so at the moment you have to install the alpha preview version of both apollo-server and the client libraries to use it.
See the docs for full details.
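If the experimental @defer route isn't workable, a hedged sketch of the split the asker describes is to run two queries, one for the fast database fields and one for the slow external-API fields, and merge them by id in the component. The field names come from the question; Table and ErrorMessage are placeholder components:
// Fast, database-backed fields: drives the table immediately.
const APPLICATIONS_FAST_QUERY = gql`
  {
    applications {
      id
      baseUrl
      customerIdentifier
    }
  }
`;

// Slow fields that hit external APIs, fetched separately.
const APPLICATIONS_SLOW_QUERY = gql`
  {
    applications {
      id
      sentryIssues
      updown
    }
  }
`;

const ApplicationsTable = () => (
  <Query query={APPLICATIONS_FAST_QUERY}>
    {({ loading, error, data }) => {
      if (loading) return <LoadingIndicator />;
      if (error) return <ErrorMessage error={error} />;
      return (
        <Query query={APPLICATIONS_SLOW_QUERY}>
          {({ loading: slowLoading, data: slowData }) => (
            // Merge the two result sets by application id inside Table;
            // render spinners in the slow columns while slowLoading is true.
            <Table
              rows={data.applications}
              slowRows={slowLoading || !slowData ? null : slowData.applications}
            />
          )}
        </Query>
      );
    }}
  </Query>
);
Because the second Query is nested inside the first, it only starts once the fast data has arrived; lifting it into a sibling component would let both requests run in parallel.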

Ember.js - Updating a model via Websocket Stream

I want to keep my market model updated via the websocket stream.
I have a platform model that has many markets.
When the user first requests the model, it is retrieved from the backend database. I then want to update with the websocket data.
How do I update different values in the model? I can't figure out how to filter the hasMany relationship by market name and then set the values. Maybe there's an easier way to go about it that I'm not seeing.
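For reference, the manual update being described (finding one market in the hasMany by name and setting its values) could look roughly like this, where the message field and attribute names are assumptions; the answer below sidesteps this entirely with store.pushPayload:
// Hedged sketch: look up a single market on the platform's hasMany
// relationship by name and update it from a websocket message.
platform.get('markets').then(markets => {
  const market = markets.findBy('name', message.marketName);
  if (market) {
    market.setProperties({
      lastPrice: message.lastPrice, // attribute names are assumptions
      volume: message.volume
    });
  }
});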
It's actually pretty simple - just make sure you have these things set up:
you'll want your websocket to send JSON data to Ember, using the same format of JSON your adapter/serializer expects (JSON:API, for example)
when you establish your websocket connection on the Ember side of things, you'll want an event handler for handling received messages.
that event handler will use store.pushPayload to add/update the model in the store (which means your websocket code needs access to the store).
an example:
// some controller.js
import Controller from '@ember/controller';
import { action } from 'ember-decorators/object';
import myAwesomeWebSocketStuff from 'lib/websocket';

export default class extends Controller {
  init() {
    super.init(...arguments);
    const socket = myAwesomeWebSocketStuff(this.store);
    this.set('socket', socket);
  }

  willDestroy() {
    super.willDestroy(...arguments);
    this.get('socket').disconnect();
  }
}
and then in lib/websocket.js
import SomeWebSocketLibrary from 'some-library';

export default function(store) {
  const socket = new SomeWebSocketLibrary(url); // url: your websocket endpoint
  socket.connect();
  socket.on('receive', data => store.pushPayload(data));
  return socket;
}
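For completeness, the kind of JSON:API message the socket could push, and that store.pushPayload accepts with the default serializer, might look like this; the market type and attribute names are assumptions:
// Example websocket message; pushPayload creates or updates the record.
const examplePayload = {
  data: {
    id: '42',
    type: 'markets',
    attributes: {
      name: 'BTC-USD',      // assumed attribute
      'last-price': 65.2    // assumed attribute
    }
  }
};
store.pushPayload(examplePayload);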

validatesLengthOf not working on password field due to hashing

I am trying to validate the password when a user is registered, but the validation is done on the hashed value rather than the plain text. How do I fix this?
My user model is called client:
module.exports = function(client) {
  client.validatesLengthOf('password', { min: 20 });
};
Validations apply to the model itself; they run as part of operation hooks, not remote hooks, so by that point the password has already been hashed.
You need to create a remote hook like this:
client.beforeRemote('create', function(ctx, instance, next) {
  if (ctx.args.data.password.length < 20) {
    /* assuming you have this error object,
       or return any validation error you want */
    return next(PasswordValidationError);
  }
  next();
});
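PasswordValidationError above is just a placeholder; a hedged way to build it so that LoopBack's error handler returns a 422 is something like:
// Hedged sketch of the placeholder error used in the remote hook.
function passwordValidationError() {
  var err = new Error('Password must be at least 20 characters long.');
  err.statusCode = 422; // LoopBack's error handler uses statusCode for the HTTP status
  err.code = 'PASSWORD_TOO_SHORT';
  return err;
}
// inside the hook: return next(passwordValidationError());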

Meteor - share session data between client and server

I'm building a restricted signup. I want users with a specific code passed in a URL to be able to sign up, and not others. I'm using the accounts package.
I can prevent account creation in the Accounts.onCreateUser method. I'm looking for a way to tell the server if the client had an authorised signup code. With a classic form (email + password) I can just add an extra hidden field. How can I achieve the same result if the user signs up with, let's say, Facebook?
Since Meteor doesn't use cookies, I can't store this info in a cookie that the server would access. Session variables are not accessible server side. And since I'm not controlling what gets sent with the accounts-facebook creation, I can't use a Session variable on the client side that I'd pass along when the user presses sign up.
Any ideas?
Just add the special token to the user object being passed to Accounts.createUser():
var user = {
  email: email,
  password: password,
  profile: {
    token: token
  }
};

Accounts.createUser(user, function(error, result) {
  if (error) {
    console.log(error);
  }
});
On the server side you can access this in the Accounts.onCreateUser():
Accounts.onCreateUser(function(options, user) {
  console.log(options);
  console.log(user);
});
I think it's in the options variable that you will find your token, so it would be options.profile.token.
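Building on that, a hedged sketch of the server-side check (isValidSignupCode is a hypothetical helper you would implement):
// Reject account creation unless a valid signup code arrived in the profile.
Accounts.onCreateUser(function(options, user) {
  var token = options.profile && options.profile.token;
  if (!isValidSignupCode(token)) { // hypothetical validation helper
    throw new Meteor.Error(403, 'Invalid signup code');
  }
  // keep whatever profile data the client sent, as the default behaviour does
  user.profile = options.profile;
  return user;
});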
For me, the best option here was passing custom parameters to loginButtons.
See the package docs:
https://github.com/ianmartorell/meteor-accounts-ui-bootstrap-3
where the following is outlined:
accountsUIBootstrap3.setCustomSignupOptions = function() {
  return {
    mxpDistinctId: Session.get('mxpdid'),
    leadSource: Session.get('leadSource')
  };
};

Django Rest Framework + React + Reflux: Can't GET new objects

I'm trying to POST a new object via a React form using a Reflux action. That part is working fine. The object is posted as expected; however, when I try to GET that object programmatically, it doesn't seem to be accessible unless I log out, restart my local server, or sometimes even simply visit the API page manually.
I can't seem to get consistent behavior as to when I can get the object and when I can't. It does seem that viewing the API page and then returning to my app page has some kind of effect, but I'm at a loss as to why. Perhaps someone can shed a little light on this for me.
One thing that's for sure is that the POST request is working properly, as the object is always there when I check for it manually.
Also, if you notice in the code below, I check to see what the last object on the API page is, and the console responds with what was previously the last item. So I'm able to access the API page programmatically, but the object I created is not there (though it is if I visit the page manually). Note, too, that refreshing produces the same results.
Any ideas where the issue could be or why this might happen?
Here's the action:
MainActions.getUserProfile.listen(function(user) {
  request.get('/api/page/').accept('application/json').end((err, res) => {
    if (res.ok) {
      var profiles = res.body;
      var filteredData = profiles.filter(function(profile) {
        if (profile) {
          return profile.user === user;
        } else {
          console.log('No Profile yet.');
        }
      });
      if (filteredData[0]) {
        var data = {
          user: filteredData[0].user,
          ...
        };
        ... // other actions
      } else {
        console.log(profiles[profiles.length - 1].user);
      }
    } else {
      console.log(res.text);
    }
  });
});
The problem ended up being the Cache-Control header of the response having max-age=600. Changing that to max-age=0 solved the issue. In this situation, it doesn't make much sense to provide a cached response, so I added this to my ViewSet:
def finalize_response(self, request, *args, **kwargs):
    response = super(MyApiViewSet, self).finalize_response(request, *args, **kwargs)
    response['Cache-Control'] = 'max-age=0'
    return response
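If changing the server's caching behaviour isn't convenient, a hedged client-side workaround is to make the superagent request uncacheable, for example with a throwaway query parameter:
// Cache-busting variant of the GET from the action above; the unique
// query value keeps the browser from reusing a stale cached response.
request
  .get('/api/page/')
  .query({ _: Date.now() })
  .accept('application/json')
  .end((err, res) => {
    // ... same handling as above
  });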