How to use the bluebird concurrency option for the map function

I am trying to use bluebird's map function with the built-in concurrency control.
I want to retrieve a list of names, then make a number of POST requests for each name. For example, I want to make a request for each name for each day of the week. However, I need to throttle the number of concurrent POST requests because the intended server has rate limits.
function getNames() {
  // Open mongodb connection
  // Get collection and array of names
  // Return array of names in a promise
}

function createDatesArray() {
  // Create an array of dates
  // Return array of dates in a promise
  // Ex. return Promise.resolve(datesArray);
}

getNames().map(function (name) {
  return createDatesArray().map(function (date) {
    return requestData(date, name);
  }, {concurrency: 5});
}).then(function () {
  // do something
});
Is this the correct way to use bluebird's concurrency?
The documentation link is here bluebird documentation.

Short answer: yes, this will limit the number of concurrent requests to 5 per name, since the concurrency option is applied to the inner map.
Caveat: keep in mind you might still be susceptible to more limits, such as the HTTP client's, or any other pools, modules and services you might be using.
Also, a Mongo connection is meant to be used as a persistent one, so you should probably only open one and then use it rather than open and close one every time.
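For example, a minimal sketch of reusing a single connection might look like this (this assumes the official mongodb driver with a promise-returning connect, and placeholder URL/collection/field names):
var MongoClient = require('mongodb').MongoClient;

// Connect once and cache the promise; every caller reuses the same connection pool.
var dbPromise = MongoClient.connect('mongodb://localhost:27017')
  .then(function (client) { return client.db('mydb'); });

function getNames() {
  return dbPromise.then(function (db) {
    return db.collection('people').find().toArray(); // placeholder collection name
  }).then(function (docs) {
    return docs.map(function (doc) { return doc.name; });
  });
}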
If createDatesArray does not do anything asynchronous, you do not have to wrap the array in Promise.resolve; you can instead use the static variant of map, as in Promise.map(datesArray, function (date) { ... }), etc. I would also avoid nesting the maps. Assuming createDatesArray is indeed async:
Promise.join(getNames(), createDatesArray(), function (names, dates) {
  var tasks = [];
  names.forEach(function (name) { // create Cartesian product of names * dates
    dates.forEach(function (date) {
      tasks.push(function () { return requestData(name, date); });
    });
  });
  return Promise.map(tasks, function (job) { return job(); }, { concurrency: 5 });
}).then(function (results) {
  // do whatever
});
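For what it's worth, here is a slightly flatter sketch of the same approach. It relies on the fact that Bluebird's Promise.map only invokes the mapper when a concurrency slot frees up, so the thunk wrappers are optional (this assumes the same getNames, createDatesArray and requestData as above):
Promise.join(getNames(), createDatesArray(), function (names, dates) {
  // Build the Cartesian product of names * dates as plain objects
  var pairs = [];
  names.forEach(function (name) {
    dates.forEach(function (date) {
      pairs.push({ name: name, date: date });
    });
  });
  // Promise.map defers calling the mapper, so at most 5 requests run at once
  return Promise.map(pairs, function (pair) {
    return requestData(pair.name, pair.date);
  }, { concurrency: 5 });
}).then(function (results) {
  // do whatever
});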

Apollo duplicates first result to every node in array of edges

I am working on a React app with react-apollo, fetching data through GraphQL. When I check the response in the browser's network tab, all elements of the array are different,
but what I get (or console.log()) in my app shows every element of the array being the same as the first element.
I don't know how to fix this, please help.
This happens because the items in your array get "normalized" to the same value in the Apollo cache. AKA, they look the same to Apollo. This usually happens because they share the same Symbol(id).
If you print out your Apollo response object, you'll notice that each of the objects has a Symbol(id), which is used by the Apollo cache. Your array items probably have the same Symbol(id), which causes them to repeat. Why does this happen?
By default, Apollo cache runs this function for normalization.
export function defaultDataIdFromObject(result: any): string | null {
  if (result.__typename) {
    if (result.id !== undefined) {
      return `${result.__typename}:${result.id}`;
    }
    if (result._id !== undefined) {
      return `${result.__typename}:${result._id}`;
    }
  }
  return null;
}
Your array item properties cause multiple items to return the same data id. In my case, multiple items had _id = null, which caused all of these items to be repeated. When this function returns null, the docs say:
InMemoryCache will fall back to the path to the object in the query,
such as ROOT_QUERY.allPeople.0 for the first record returned on the
allPeople root query.
This is the behavior we actually want when our array items don't work well with defaultDataIdFromObject.
Therefore the solution is to manually configure these unique identifiers with the dataIdFromObject option passed to the InMemoryCache constructor within your ApolloClient. The following worked for me as all my objects use _id and had __typename.
const client = new ApolloClient({
  link: authLink.concat(httpLink),
  cache: new InMemoryCache({
    dataIdFromObject: o => (o._id ? `${o.__typename}:${o._id}` : null),
  })
});
Put this in your App.js
cache: new InMemoryCache({
  dataIdFromObject: o => o.id ? `${o.__typename}-${o.id}` : `${o.__typename}-${o.cursor}`,
})
I believe the approach in the other two answers should be avoided in favor of the following approach.
It is actually quite simple. To understand how it works, simply log obj as follows:
dataIdFromObject: (obj) => {
  let id = defaultDataIdFromObject(obj);
  console.log('defaultDataIdFromObject OBJ ID', obj, id);
  return id; // keep the default behaviour while logging
}
You will see that id will be null in your logs if you have this problem.
Pay attention to the logged 'obj'. It will be printed for every object returned.
These objects are the ones from which Apollo tries to get a unique id (you have to tell Apollo which field in your object is unique for each object in your array of 'items' returned from GraphQL, the same way you pass a unique value for 'key' in React when you use 'map' or other iterations to render DOM elements).
From the Apollo docs:
By default, InMemoryCache will attempt to use the commonly found
primary keys of id and _id for the unique identifier if they exist
along with __typename on an object.
So look at the logged 'obj' used by 'defaultDataIdFromObject' - if you don't see 'id' or '_id' in it, then you should provide whichever field in your object is unique for each object.
I changed the example from the Apollo docs to cover three cases where you may have provided incorrect identifiers - it is for cases where you have more than one GraphQL type:
dataIdFromObject: (obj) => {
  let id = defaultDataIdFromObject(obj);
  console.log('defaultDataIdFromObject OBJ ID', obj, id);
  if (!id) {
    const { __typename: typename } = obj;
    switch (typename) {
      case 'Blog': {
        // if you are using something other than 'id' and '_id' - 'blogId' in this case
        const undef = `${typename}:${obj.id}`;
        const defined = `${typename}:${obj.blogId}`;
        console.log('in Blogs -', undef, defined);
        return `${typename}:${obj.blogId}`; // return 'blogId' as it is a unique
        // identifier. Using any other identifier will lead to the problem defined above.
      }
      case 'Post': {
        // if you are using a hash key and sort key, then the hash key alone is not unique.
        // If you do a query in the DB it will always be the same.
        // If you do a scan in the DB, quite often it will be the same value.
        // So use both the hash key and sort key instead to avoid the problem.
        // Using both ensures the ID used by Apollo is always unique.
        // Here the post uses a hashKey of blogId and a sortKey of postId.
        const notUniq = `${typename}:${obj.blogId}`;
        const notUniq2 = `${typename}:${obj.postId}`;
        const uniq = `${typename}:${obj.blogId}${obj.postId}`;
        console.log('in Post -', notUniq, notUniq2, uniq);
        return `${typename}:${obj.blogId}${obj.postId}`;
      }
      case 'Comment': {
        // let's assume the comment's identifier is 'id',
        // but you don't use it in your app and do not fetch it from GraphQL, that is,
        // you omitted 'id' in your GraphQL query definition.
        const undefnd = `${typename}:${obj.id}`;
        console.log('in Comment -', undefnd);
        // log result - null
        // to fix it, simply add 'id' to your GraphQL query definition.
        return `${typename}:${obj.id}`;
      }
      default: {
        console.log('one falling through to default - not good - handle this in a separate case', typename);
        return id;
      }
    }
  }
  return id;
}
I hope you now see that the approach in the other two answers is risky.
YOU ALWAYS HAVE A UNIQUE IDENTIFIER. SIMPLY HELP APOLLO BY LETTING IT KNOW WHICH FIELD IN THE OBJECT IT IS. If it is not fetched, add it to your GraphQL query definition.
An alternative to the accepted answer: instead of dataIdFromObject, which appears to apply to everything in the query, I was able to provide a keyFields function per type that required it.
const client = new ApolloClient({
  cache: new InMemoryCache({
    typePolicies: {
      ItemType: {
        keyFields: (obj) => obj.id + "-" + obj.language.id,
      },
    },
  }),
});
In the above example ItemType can be whichever Type is specified in your schema. I happened to be joining a non-unique ID with a language to make a unique key but you can do it however you wish.
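If I remember the InMemoryCache API correctly, keyFields can also be given as an array of field names (with nested fields in a sub-array) instead of a function, which for the ItemType example above would look roughly like this:
cache: new InMemoryCache({
  typePolicies: {
    ItemType: {
      // key the type by id plus the nested language.id (array form of keyFields)
      keyFields: ["id", "language", ["id"]],
    },
  },
}),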

Loopback include unrelated lists

In Loopback it is easy to include relational objects when querying for data. For example, one can include all the comments that belong to a blog post in a single call using the include filter.
But in my case I want to get data that doesn't have a relation.
I have a User Detail page. On that page a user can choose a username and there's also a dropdown list where a user can choose from what country he is.
So from the client side I do something like:
Country.find().$promise.then(function(countryData) {
  $scope.countries = countryData;
});

Player.find().$promise.then(function(playerData) {
  $scope.player = playerData;
});
But what if I get more lists that I want to fill? Like, city, state, colors etc.
Then I'd have to make a lot of separate calls.
Is there a way to include all this data in one call, even though they have no relation? Something like this:
Player.find({ filter: { include: ["countries", "colors"] } }).$promise.then(function(data) {
  // some stuff
});
You may want to try using the Where filter as documented here
An example of this for querying two specific things would be:
Post.find({where: {and: [{title: 'My Post'}, {content: 'Hello'}]}},
function (err, posts) {
...
});
You could create a remote method on one of your models that makes the calls internally and packages them back up for you.
Use a promise library (if you're not using ES6) to wait for them all and then return:
Model.getAll = function(next) {
  var promises = [];
  promises.push(Model.app.models.Country.find());
  promises.push(Model.app.models.Player.find());
  promises.push(Model.app.models.Color.find());
  Promise.all(promises)
    .then(function(results) {
      next(null, results); // node-style callback: (err, result)
    })
    .catch(next);
};
/**
Register your remote method here
*/
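For completeness, registering the remote method might look something like this in a LoopBack 3.x model script (the HTTP path and return shape here are just placeholders):
// Expose Model.getAll over REST, e.g. GET /api/models/all
Model.remoteMethod('getAll', {
  http: { path: '/all', verb: 'get' },
  returns: { arg: 'data', type: 'array', root: true }
});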
I had this problem and tried this solution, but I get the error "Failed with multiple errors, see details for more information.". It seems like there is a bug in Loopback when using Promise.all.

How can I do a "where in" type query using ember-data

How can I perform a where-in type query using ember-data?
Say I have a list of tags - how can I use the store to query the API to get all relevant records where they have one of the tags present?
Something like this:
return this.store.find('tags', {
  name: {
    "in": ['tag1', 'tag2', 'tag3']
  }
})
There isn't built-in support for something like that. And, I don't think it's needed.
The result that you are after can be obtained in two steps.
return this.store.find('posts'); // I guess it's a blog
and then in your controller you use a computed property
filteredPosts: Ember.computed('model', function() {
  var tags = ['tag1', 'tag2', 'tag3'];
  return this.get('model').filter(function(post) {
    // return true if the post has one of the tags
    return false;
  });
})
Update: What if there are tens of thousands of tags?!
Another option is to send the list of tags as a single argument to the back end. You'll have to do a bit of data processing before sending the request and before querying.
return this.store.find('tags', {
  tags: ['tag1', 'tag2', 'tag3'].join(', ')
})
In your API you'll know that the tags argument needs to be converted into an array before querying the DB.
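For instance, if the back end happened to be an Express-style handler (purely illustrative; your API may look different), the conversion could be as simple as:
// Hypothetical Express route; 'tags' matches the query parameter sent by the Ember app above
app.get('/tags', function (req, res) {
  var tags = (req.query.tags || '').split(',').map(function (t) { return t.trim(); });
  // ...use the `tags` array in the DB query, e.g. a WHERE name IN (...) clause
});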
So, this is better because you avoid the very expensive nested loop caused by the use of filter. (expensive !== bad, it has its benefits)
It is a concern to think that there will be tens of thousands of tags; if those are going to be available in your Ember app they'll have a big memory footprint, and maybe something much more advanced is needed in terms of app design.

How would I modify this ember.js function to return an Enumerable or Array

I have just begun learning ember.js. I have followed some tutorials and created a working example here:
App.Track.reopenClass({
  find: function() {
    var tracks = [];
    $.ajax({
      url: 'http://ws.spotify.com/lookup/1/.json?uri=spotify:album:6J6nlVu4JMveJz0YM9zDgL&extras=track',
      dataType: 'json',
      context: this,
      success: function(data, textStatus, jqXHR) {
        $.each(data.album.tracks, function(index, value) {
          var track_id = value.href.replace("spotify:track:", "");
          tracks.addObject(App.Track.create(value));
          // I would rather do something like:
          // tracks[track_id] = App.Track.create(value)
        });
      }
    });
    return tracks;
  }
});
This function hits an API and loops through the returned data to populate the tracks object (tracks.addObject(App.Track.create(value));) and return it.
Rather than getting an ordinary object back from this function, I would like to get an Enumerable / Array so I can manipulate it with filterProperty or pull out tracks by id (There is a track_id which I would like to use as the array index).
All of my attempts to use an array have broken ember's magical ability to update the view when the ajax call populates the tracks.
Can anyone modify http://jsfiddle.net/ZEzwn/ to return an Enumerable (preferably an Array) but still update the view automatically?
As your method already returns an Array (because you have Ember prototype extension enabled), doing:
var tracks = [];
is equivalent to
var tracks = Ember.A();
On ajax request success, you're just populating the array, so you could use Ember.Array methods like filterProperty.
Just one thing about using the id as an array key: you really SHOULD NOT, as Ryan Bigg says in his blog:
However, if the variant’s id is [something a little higher, like] 1,013,589,413, then you start to run into problems.
In that case, JavaScript would create a one billion, thirteen million, five hundred and eighty-nine thousand, four hundred and fourteen element array. All to store one value in, right at the end.
OK, this is now working. As louiscoquio pointed out, tracks IS an enumerable object and I can do stuff like:
tracks.filterProperty('href', 'spotify:track:7x7F7xBqXqr0L9wqJ3tuQW')
tracks.getEach('name')
tracks.get('firstObject')

Adding item to filtered result from ember-data

I have a DS.Store which uses the DS.RESTAdapter and a ChatMessage object defined as such:
App.ChatMessage = DS.Model.extend({
  contents: DS.attr('string'),
  roomId: DS.attr('string')
});
Note that a chat message exists in a room (not shown for simplicity), so in my chat messages controller (which extends Ember.ArrayController) I only want to load messages for the room the user is currently in:
loadMessages: function() {
  var room_id = App.getPath("current_room.id");
  this.set("content", App.store.find(App.ChatMessage, {room_id: room_id}));
}
This sets the content to a DS.AdapterPopulatedModelArray and my view happily displays all the returned chat messages in an {{#each}} block.
Now it comes to adding a new message, I have the following in the same controller:
postMessage: function(contents) {
  var room_id = App.getPath("current_room.id");
  App.store.createRecord(App.ChatMessage, {
    contents: contents,
    room_id: room_id
  });
  App.store.commit();
}
This initiates an ajax request to save the message on the server, all good so far, but it doesn't update the view. This pretty much makes sense as it's a filtered result and if I remove the room_id filter on App.store.find then it updates as expected.
Trying this.pushObject(message) with the message record returned from App.store.createRecord raises an error.
How do I manually add the item to the results? There doesn't seem to be a way as far as I can tell as both DS.AdapterPopulatedModelArray and DS.FilteredModelArray are immutable.
So, a couple of thoughts:
(reference: https://github.com/emberjs/data/issues/190)
how to listen for new records in the datastore
a normal Model.find()/findQuery() will return you an AdapterPopulatedModelArray, but that array will stand on its own... it won't know that anything new has been loaded into the database
a Model.find() with no params (or store.findAll()) will return you ALL records as a FilteredModelArray, and ember-data will "register" it into a list, so any new records loaded into the database will be added to this array.
calling Model.filter(func) will give you back a FilteredModelArray, which is also registered with the store... and any new records in the store will cause ember-data to "updateModelArrays", meaning it will call your filter function with the new record, and if you return true, then it will stick it into your existing array.
SO WHAT I ENDED UP DOING: immediately after creating the store, I call store.findAll(), which gives me back an array of all models for a type... and I attach that to the store... then anywhere else in the code, I can add array observers to those lists... something like:
App.MyModel = DS.Model.extend()
App.store = DS.Store.create()
App.store.allMyModels = App.store.findAll(App.MyModel)

// some other place in the app... a list controller perhaps
App.store.allMyModels.addArrayObserver({
  arrayWillChange: function(arr, start, removeCount, addCount) {},
  arrayDidChange: function(arr, start, removeCount, addCount) {}
})
how to push a model into one of those "immutable" arrays:
First to note: all Ember-Data Model instances (records) have a clientId property... which is a unique integer that identifies the model in the datastore cache whether or not it has a real server-id yet (example: right after doing a Model.createRecord).
so the AdapterPopulatedModelArray itself has a "content" property... which is an array of these clientId's... and when you iterate over the AdapterPopulatedModelArray, the iterator loops over these clientId's and hands you back the full model instances (records) that map to each clientId.
SO WHAT I HAVE DONE
(this doesn't mean it's "right"!) is to watch those findAll arrays, and push new clientId's into the content property of the AdapterPopulatedModelArray... SOMETHING LIKE:
arrayDidChange: function(arr, start, removeCount, addCount) {
  if (addCount == 0) { return; } // only care about adds right now... not removes...
  arr.slice(start, start + addCount).forEach(function(item) {
    // push clientId of this item into AdapterPopulatedModelArray content list
    self.getPath('list.content').pushObject(item.get('clientId'));
  });
}
What I can say is: "it's working for me" :) Will it break on the next ember-data update? Totally possible.
For those still struggling with this, you can get yourself a dynamic DS.FilteredArray instead of a static DS.AdapterPopulatedRecordArray by using the store.filter method. It takes 3 parameters: type, query and finally a filter callback.
loadMessages: function() {
  var self = this,
      room_id = App.getPath('current_room.id');

  this.store.filter(App.ChatMessage, {room_id: room_id}, function (msg) {
    return msg.get('roomId') === room_id;
  })
  // set content only after promise has resolved
  .then(function (messages) {
    self.set('content', messages);
  });
}
You could also do this in the model hook without the extra clutter, because the model hook will accept a promise directly:
model: function() {
  var self = this,
      room_id = App.getPath("current_room.id");

  return this.store.filter(App.ChatMessage, {room_id: room_id}, function (msg) {
    return msg.get('roomId') === room_id;
  });
}
My reading of the source (DS.Store.find) shows that what you'd actually be receiving in this instance is an AdapterPopulatedModelArray. A FilteredModelArray would auto-update as you create records. There are passing tests for this behaviour.
As of ember-data 1.13, store.filter was marked for removal; see the following ember blog post.
The feature was made available as a mixin. The GitHub page contains the following note:
We recommend that you refactor away from using this addon. Below is a short guide for the three filter use scenarios and how to best refactor each.
Why? Simply put, it's far more performant (and not a memory leak) for you to manage filtering yourself via a specialized computed property tailored specifically for your needs.
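Applied to the chat-message example above, such a computed property might look roughly like this (a sketch only; the controller setup, model name and roomId property are assumptions, not something from the original question):
import Controller from '@ember/controller';
import { computed } from '@ember/object';

export default Controller.extend({
  // `model` is assumed to be the full list of messages, e.g. the route returns store.findAll('chat-message')
  // `roomId` is assumed to be set on the controller for the current room
  filteredMessages: computed('model.@each.roomId', 'roomId', function () {
    var roomId = this.get('roomId');
    return this.get('model').filterBy('roomId', roomId);
  })
});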