Deleting hasMany record and saving with ember-data - ember.js

I have a jsbin setup with my issue:
http://emberjs.jsbin.com/sovub/3/edit
From my example, in certain situations when I try to delete a subtitle and then save, I get the error:
Attempted to handle event `pushedData` on <App.Subtitle:ember563:7> while in state root.deleted.uncommitted.
If I delete the last subtitle then save, it's fine. But deleting the first subtitle, or deleting and adding new records then saving gives me the error message.
Is it because I'm manually setting the IDs for each subtitle during extractSingle, like so:
extractSingle: function(store, type, payload, id){
  var list = payload.list
  var nid = 6
  // extract subtitles
  var subs = list.subtitles
  list.subtitles = []
  subs.forEach(function(item){
    item.id = nid++
    list.subtitles.push(item.id)
    item.list = list.id
  })
  // do the same for links (elided here; produces the `li` array used below)
  payload = { list: list, subtitle: subs, link: li}
  return this._super(store, type, payload, id)
},
I've also noticed that the subtitles attribute of the payload in extractSingle doesn't contain the correct model whenever the error is thrown. Instead, it contains only the ID of the subtitle record.
// normally
id: "532",
subtitles: Array[2]
  0: Class
    __ember1403240151252: "ember562"
    __ember1403240151252_meta: Object
    // rest of data
  1: Class
    __ember1403240151252: "ember563"
    __ember1403240151252_meta: Object
    __nextSuper: undefined
    // rest of data

// when error is thrown
id: "532",
subtitles: Array[1]
  0: 6
  length: 1
  __proto__: Array[0]
I'm not really sure how I should approach this, let alone resolve it. Any help would be appreciated.

I did some more research and found out about DS.RootState (http://emberjs.com/api/data/classes/DS.RootState.html), which relates to the root.deleted.uncommitted state error I was getting.
To solve it, all I did was dematerialize the record after deleting it:
var model = this.get('model')
model.deleteRecord()
this.store.dematerializeRecord(model)
This removed the record's indexes (and relationships?) so it was properly deleted.
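For reference, dematerializeRecord was a private store method; later ember-data versions expose the same behaviour publicly as unloadRecord. A minimal sketch of the delete action using that API (the action name here is hypothetical):
actions: {
  removeSubtitle: function(subtitle) {
    subtitle.deleteRecord();           // moves the record to root.deleted.uncommitted
    this.store.unloadRecord(subtitle); // drops it from the store so later pushes can't hit it
  }
}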

Apollo duplicates first result to every node in array of edges

I am working on a React app with react-apollo, fetching data through GraphQL. When I check the response in the browser's network tab, all elements of the array are different, but what I get (or console.log()) in my app is every element of the array being the same as the first element. I don't know how to fix this; please help.
The reason this happens is because the items in your array get "normalized" to the same values in the Apollo cache. AKA, they look the same to Apollo. This usually happens because they share the same Symbol(id).
If you print out your Apollo response object, you'll notice that each of the objects has a Symbol(id), which is used by the Apollo cache. Your array items probably share the same Symbol(id), which causes them to repeat. Why does this happen?
By default, Apollo cache runs this function for normalization.
export function defaultDataIdFromObject(result: any): string | null {
  if (result.__typename) {
    if (result.id !== undefined) {
      return `${result.__typename}:${result.id}`;
    }
    if (result._id !== undefined) {
      return `${result.__typename}:${result._id}`;
    }
  }
  return null;
}
Your array item properties cause multiple items to return the same data id. In my case, multiple items had _id = null which caused all of these items to be repeated. When this function returns null the docs say
InMemoryCache will fall back to the path to the object in the query,
such as ROOT_QUERY.allPeople.0 for the first record returned on the
allPeople root query.
This is the behavior we actually want when our array items don't work well with defaultDataIdFromObject.
Therefore the solution is to manually configure these unique identifiers with the dataIdFromObject option passed to the InMemoryCache constructor within your ApolloClient. The following worked for me, as all my objects use _id and have __typename.
const client = new ApolloClient({
  link: authLink.concat(httpLink),
  cache: new InMemoryCache({
    dataIdFromObject: o => (o._id ? `${o.__typename}:${o._id}` : null),
  })
});
Put this in your App.js
cache: new InMemoryCache({
  dataIdFromObject: o => o.id ? `${o.__typename}-${o.id}` : `${o.__typename}-${o.cursor}`,
})
I believe the approach in the other two answers should be avoided in favor of the following approach:
Actually it is quite simple. To understand how it works, simply log obj as follows:
// defaultDataIdFromObject is exported by apollo-cache-inmemory
// (or by @apollo/client in v3)
dataIdFromObject: (obj) => {
  let id = defaultDataIdFromObject(obj);
  console.log('defaultDataIdFromObject OBJ ID', obj, id);
  return id; // keep the default behaviour while logging
}
You will see that id will be null in your logs if you have this problem.
Pay attention to logged 'obj'. It will be printed for every object returned.
These objects are the ones from which Apollo tries to get a unique id (you have to tell Apollo which field in your object is unique for each object in your array of 'items' returned from GraphQL, the same way you pass a unique value for 'key' in React when you use 'map' or other iterations to render DOM elements).
From the Apollo docs:
By default, InMemoryCache will attempt to use the commonly found
primary keys of id and _id for the unique identifier if they exist
along with __typename on an object.
So look at the logged 'obj' used by 'defaultDataIdFromObject': if you don't see 'id' or '_id', then you should provide the field in your object that is unique for each object.
I changed the example from the Apollo docs to cover three cases where you may have provided incorrect identifiers; it applies when you have more than one GraphQL type:
dataIdFromObject: (obj) => {
  let id = defaultDataIdFromObject(obj);
  console.log('defaultDataIdFromObject OBJ ID', obj, id);
  if (!id) {
    const { __typename: typename } = obj;
    switch (typename) {
      case 'Blog': {
        // if you are using other than 'id' and '_id' - 'blogId' in this case
        const undef = `${typename}:${obj.id}`;
        const defined = `${typename}:${obj.blogId}`;
        console.log('in Blogs -', undef, defined);
        return `${typename}:${obj.blogId}`; // return 'blogId' as it is a unique
        // identifier. Using any other identifier will lead to the problem described above.
      }
      case 'Post': {
        // if you are using a hash key and sort key then the hash key is not unique.
        // If you do a query in the DB it will always be the same.
        // If you do a scan in the DB quite often it will be the same value.
        // So use both hash key and sort key instead to avoid the problem.
        // Using both ensures the ID used by Apollo is always unique.
        // If for a post you are using a hashKey of blogId and a sortKey of postId:
        const notUniq = `${typename}:${obj.blogId}`;
        const notUniq2 = `${typename}:${obj.postId}`;
        const uniq = `${typename}:${obj.blogId}${obj.postId}`;
        console.log('in Post -', notUniq, notUniq2, uniq);
        return `${typename}:${obj.blogId}${obj.postId}`;
      }
      case 'Comment': {
        // let's assume the comment's identifier is 'id',
        // but you don't use it in your app and do not fetch it from GraphQL, i.e.
        // you omitted 'id' in your GraphQL query definition.
        const undefnd = `${typename}:${obj.id}`;
        console.log('in Comment -', undefnd);
        // log result - null
        // to fix it simply add 'id' to your GraphQL query definition.
        return `${typename}:${obj.id}`;
      }
      default: {
        console.log('one falling to default - not good - define this in a separate case', typename);
        return id;
      }
    }
  }
  return id;
}
I hope you now see that the approach in the other two answers is risky.
YOU ALWAYS HAVE A UNIQUE IDENTIFIER. SIMPLY HELP APOLLO BY LETTING IT KNOW WHICH FIELD IN THE OBJECT IT IS. If it is not fetched, add it to your GraphQL query definition.
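For example, a sketch of what "add it to your query definition" means in practice (the query and field names here are hypothetical; gql comes from graphql-tag, or from @apollo/client in v3):
import gql from 'graphql-tag';

// Hypothetical query: the point is that the unique field ('id' here) is
// requested explicitly so Apollo can build a cache identifier for each item.
const GET_COMMENTS = gql`
  query GetComments($postId: ID!) {
    comments(postId: $postId) {
      id    # include the identifier used for normalization
      body
    }
  }
`;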
An alternative option to the accepted answer: instead of dataIdFromObject, which appears to apply to everything in the query, I was able to provide a keyFields function per type that required it.
const client = new ApolloClient({
  cache: new InMemoryCache({
    typePolicies: {
      ItemType: {
        keyFields: (obj) => obj.id + "-" + obj.language.id,
      },
    },
  }),
});
In the above example ItemType can be whichever Type is specified in your schema. I happened to be joining a non-unique ID with a language to make a unique key but you can do it however you wish.
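As a side note, keyFields also accepts a declarative array form, with nested fields given as sub-arrays, which may be enough if you don't need custom string building. A sketch under the same assumptions (ItemType and its fields are placeholders, and the uri is hypothetical):
import { ApolloClient, InMemoryCache } from '@apollo/client';

// Declarative equivalent of the keyFields function above: the cache key is
// built from the item's id plus the nested language.id.
const client = new ApolloClient({
  uri: '/graphql', // hypothetical endpoint
  cache: new InMemoryCache({
    typePolicies: {
      ItemType: {
        keyFields: ['id', 'language', ['id']],
      },
    },
  }),
});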

Ember: Create a new record if none are found

I have a route that pulls data from a REST API. The first time a user enters, there won't be any data saved, so I need to create the record with some default values. I figure I need to do this here in the route so the user can see my default values (which in my actual app aren't just null), rather than creating during the save action.
If, however, there is data in the database, I need to return that record. Right now I'm stuck and must not be doing something right (probably has to do with promises, but I'm not sure).
The error I get is:
Error while processing route: project.dates Assertion Failed:
Expected an object as `data` in a call to `push` for star#model:project-date: ,
but was undefined
Here's my route code:
var project = this.modelFor('project');
var projectDates = this.store.find('project-date', project.id);
if (projectDates) {
  return projectDates;
} else {
  return this.store.createRecord('project-date', {
    project: project.id,
    start: null,
    checkpointA: null,
    finish: null
  });
}
The puzzling thing is that if I just negate my if statement to get the other return value (like so: if (!projectDates)) then I still get the error above, but it also loads up the model! I have confirmed that the API (or my mock API) is returning data in the right format, as an object.
Okay, as I suspected, I could use promises to solve this; I just had to keep googling until I found something that jogged the right idea.
Here is how to set the model to the found record, or else create a new record:
model: function() {
  var self = this;
  var project = this.modelFor('project');
  var projectDates = this.store.find('project-date', project.id).catch(function() {
    // find() rejects when no record comes back, so create one with defaults
    return self.store.createRecord('project-date', {
      project: project.id,
      start: null,
      checkpointA: null,
      finish: null
    });
  });
  return projectDates;
}
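For newer ember-data (1.13+), where find was split into findRecord and query, the same find-or-create pattern would look roughly like this (a sketch; model and attribute names follow the question):
model: function() {
  var self = this;
  var project = this.modelFor('project');
  return this.store.findRecord('project-date', project.id).catch(function() {
    return self.store.createRecord('project-date', {
      project: project.id,
      start: null,
      checkpointA: null,
      finish: null
    });
  });
}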

Ember.js: How to get an array of model IDs from a corresponding array of model attributes

For a Tag model that I have in Ember-Data, I have 4 records in my store:
Tags:
id tag_name
1 Writing
2 Reading-Comprehension
3 Biology
4 Chemistry
In my code I have an array of tag_names, and I want to get a corresponding array of tag IDs. I'm having 2 problems:
My server is being queried even though I have these tags in my store. When I call store.find('tag', {tag_name: tag_name}), I didn't expect to need a call to the server. Here is all the code I'm using to attempt to create an array of IDs.
var self = this;
var tagsArray = ["Writing", "Reading-Comprehension", "Chemistry"];
var tagIdArr = [];
tagsArray.forEach(function(tag_name) {
  return self.store.find('tag', { tag_name: tag_name }).then(function(tag) {
    tagIdArr.pushObject(tag.get('content').get('0').get('id'));
  });
});
return tagIdArr;
When I console.log the output of the above code, I get an empty array object with length 0. Clicking on the caret next to the empty array shows three key-value pairs with the correct data, but the array itself is empty. I'm sure there is a simple explanation for this behavior, but I'm not sure what it is. I've used code similar to the above in other places successfully.
Find hits the server, but peek does not.
var tagsArray = ["Writing", "Reading-Comprehension", "Chemistry"];
return this.store.peekAll('tag').filter(function(tag){
  // compare against the record's tag_name attribute, not the record itself
  return tagsArray.indexOf(tag.get('tag_name')) !== -1;
}).mapBy('id');
See: http://emberjs.com/blog/2015/06/18/ember-data-1-13-released.html#toc_reorganized-find-methods
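As for the empty array in the question: find() is asynchronous, so tagIdArr is logged before any of the promises have resolved (the devtools console just shows the later state when you expand it). If you do need find(), here is a sketch that waits for all lookups using Ember.RSVP.all (attribute names follow the question):
var self = this;
var tagsArray = ["Writing", "Reading-Comprehension", "Chemistry"];

// Resolve every find() first, then map the results to IDs.
var lookups = tagsArray.map(function(tag_name) {
  return self.store.find('tag', { tag_name: tag_name });
});

return Ember.RSVP.all(lookups).then(function(results) {
  // each result is a record array; take the first match's id
  return results.map(function(tags) {
    return tags.get('firstObject.id');
  });
});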

EmberJS - Adding an item to the store does not update view

I have a one-to-many relationship to relevantUsers in the application's model. Now I want to iterate over those values via the {{#each}} helper, which works.
content: function() {
  return this.get('controllers.application.model.relevantUsers');
}.property('controllers.application.model.relevantUsers'),
And when removing an item from the relevantUsers the view updates. But when adding a new relevantUser nothing happens. The user gets added to the data store, but the view does not update. Am I missing something?
This is how I create a new user
// Create new user
var relevantUser = this.store.createRecord('relevantUser', relevantUserData);
// And push it to remote
relevantUser.save();
In my application, I've done it like this [CoffeeScript]:
video = self.store.createRecord 'video', id: #Id, title: #Title, thumbnailUrl: #ThumbnailUrl
#Formats.map (i) ->   # Formats is a plain JavaScript array
  format = self.store.createRecord 'format', itag: i.itag, quality: i.quality, resolution: i.resolution, type: i.type, url: i.url
  video.get('formats').then (f) ->
    f.pushObject format
The difference is that I use the .pushObject() method here. As #fanta wrote in his comment, you have to get the array that holds the relationship on the model, so it will notify all observers.
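In plain JavaScript, the same idea might look roughly like this (a sketch; it assumes the application model's relevantUsers is an async hasMany, so get() returns a promise-like array, and it reuses the names from the question):
// Push the new record into the hasMany array so observers (and the template)
// are notified, then save it.
var relevantUser = this.store.createRecord('relevantUser', relevantUserData);

this.get('controllers.application.model.relevantUsers').then(function(users) {
  users.pushObject(relevantUser);
  return relevantUser.save();
});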

Adding item to filtered result from ember-data

I have a DS.Store which uses the DS.RESTAdapter and a ChatMessage object defined as such:
App.ChatMessage = DS.Model.extend({
contents: DS.attr('string'),
roomId: DS.attr('string')
});
Note that a chat message exists in a room (not shown for simplicity), so in my chat messages controller (which extends Ember.ArrayController) I only want to load messages for the room the user is currently in:
loadMessages: function(){
  var room_id = App.getPath("current_room.id");
  this.set("content", App.store.find(App.ChatMessage, {room_id: room_id}));
}
This sets the content to a DS.AdapterPopulatedModelArray and my view happily displays all the returned chat messages in an {{#each}} block.
Now it comes to adding a new message, I have the following in the same controller:
postMessage: function(contents) {
  var room_id = App.getPath("current_room.id");
  App.store.createRecord(App.ChatMessage, {
    contents: contents,
    room_id: room_id
  });
  App.store.commit();
}
This initiates an ajax request to save the message on the server, all good so far, but it doesn't update the view. This pretty much makes sense as it's a filtered result and if I remove the room_id filter on App.store.find then it updates as expected.
Trying this.pushObject(message) with the message record returned from App.store.createRecord raises an error.
How do I manually add the item to the results? There doesn't seem to be a way as far as I can tell as both DS.AdapterPopulatedModelArray and DS.FilteredModelArray are immutable.
So, a couple of thoughts:
(reference: https://github.com/emberjs/data/issues/190)
how to listen for new records in the datastore
a normal Model.find()/findQuery() will return you an AdapterPopulatedModelArray, but that array will stand on its own... it won't know that anything new has been loaded into the store
a Model.find() with no params (or store.findAll()) will return you ALL records as a FilteredModelArray, and ember-data will "register" it into a list, so any new records loaded into the store will be added to this array.
calling Model.filter(func) will give you back a FilteredModelArray, which is also registered with the store... and any new records in the store will cause ember-data to "updateModelArrays", meaning it will call your filter function with the new record, and if you return true, then it will stick it into your existing array.
SO WHAT I ENDED UP DOING: immediately after creating the store, I call store.findAll(), which gives me back an array of all models for a type... and I attach that to the store... then anywhere else in the code, I can add array observers to those lists... something like:
App.MyModel = DS.Model.extend()
App.store = DS.Store.create()
App.store.allMyModels = App.store.findAll(App.MyModel)

// some other place in the app... a list controller perhaps
App.store.allMyModels.addArrayObserver({
  arrayWillChange: function(arr, start, removeCount, addCount) {},
  arrayDidChange: function(arr, start, removeCount, addCount) {}
})
how to push a model into one of those "immutable" arrays:
First to note: all Ember-Data Model instances (records) have a clientId property... which is a unique integer that identifies the model in the datastore cache whether or not it has a real server-id yet (example: right after doing a Model.createRecord).
so the AdapterPopulatedModelArray itself has a "content" property... which is an array of these clientId's... and when you iterate over the AdapterPopulatedModelArray, the iterator loops over these clientId's and hands you back the full model instances (records) that map to each clientId.
SO WHAT I HAVE DONE
(this doesn't mean it's "right"!) is to watch those findAll arrays, and push new clientId's into the content property of the AdapterPopulatedModelArray... SOMETHING LIKE:
arrayDidChange: function(arr, start, removeCount, addCount) {
  if (addCount == 0) { return; } // only care about adds right now... not removes...
  arr.slice(start, start + addCount).forEach(function(item) {
    // push the clientId of this item into the AdapterPopulatedModelArray's content list
    self.getPath('list.content').pushObject(item.get('clientId'));
  });
}
What I can say is: "it's working for me" :) Will it break on the next ember-data update? Totally possible.
For those still struggling with this, you can get yourself a dynamic DS.FilteredArray instead of a static DS.AdapterPopulatedRecordArray by using the store.filter method. It takes 3 parameters: type, query and finally a filter callback.
loadMessages: function() {
  var self = this,
      room_id = App.getPath('current_room.id');
  this.store.filter(App.ChatMessage, {room_id: room_id}, function (msg) {
    return msg.get('roomId') === room_id;
  })
  // set content only after the promise has resolved
  .then(function (messages) {
    self.set('content', messages);
  });
}
You could also do this in the model hook without the extra clutter, because the model hook will accept a promise directly:
model: function() {
  var self = this,
      room_id = App.getPath("current_room.id");
  return this.store.filter(App.ChatMessage, {room_id: room_id}, function (msg) {
    return msg.get('roomId') === room_id;
  });
}
My reading of the source (DS.Store.find) shows that what you'd actually be receiving in this instance is an AdapterPopulatedModelArray. A FilteredModelArray would auto-update as you create records. There are passing tests for this behaviour.
As of ember-data 1.13, store.filter was marked for removal; see the Ember blog post on the 1.13 release.
The feature was made available as a mixin. The GitHub page contains the following note:
We recommend that you refactor away from using this addon. Below is a short guide for the three filter use scenarios and how to best refactor each.
Why? Simply put, it's far more performant (and not a memory leak) for you to manage filtering yourself via a specialized computed property tailored specifically for your needs.
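A minimal sketch of such a computed property (assuming the controller has access to the injected store, and using hypothetical 'chat-message' and roomId names in the spirit of the question):
// Keep a live record array around and filter it in a computed property
// instead of using store.filter. Property names here are assumptions.
messages: Ember.computed(function() {
  return this.store.peekAll('chat-message');
}),

filteredMessages: Ember.computed('messages.@each.roomId', 'roomId', function() {
  var roomId = this.get('roomId');
  return this.get('messages').filterBy('roomId', roomId);
})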