I'm trying to do an optimisticResponse in a mutation with ApolloClient with vue-apollo.
The mutation itself works fine - the server gets updated, and when I refresh the page, the insert has successfully appeared. However, the optimisticResponse that gets written to the query becomes "undefined" once the update() function runs for the second time, with the "real" response from the server.
My mutation code looks like this:
this.$apollo.mutate({
  mutation: UPSERT_ITEMS,
  variables: {
    todoInput: [todoInput]
  },
  update: (store, result) => {
    const upsertTodos = result.data!.upsertTodos
    const newTodo = upsertTodos.todo
    const data = store.readQuery<QueryResult>({ query: GET_PROJECTS })
    console.log("Results from the cache:", data)
    data!.rc_todos.push(newTodo)
    console.log("after that push, data looks like...", data)
    store.writeQuery({ query: GET_PROJECTS, data })
    // re-reading the query for debugging purposes:
    const sameData = store.readQuery<QueryResult>({ query: GET_PROJECTS })
    console.log("New results from the cache:", sameData)
  },
  optimisticResponse: {
    __typename: "Mutation",
    upsertTodos: {
      __typename: "InsertedTodo",
      id: 1,
      todo_id: 1,
      todo: {
        id: 1,
        label: nodeTitle,
        children: [],
        __typename: "rc_todos"
      }
    },
  },
})
Note that right now I'm calling readQuery a second time in update() for debugging purposes.
In that second call of readQuery during the first run of update() (i.e. right after inserting the optimisticResponse), it looks like this:
...so everything looks fine, the fields there match the data that was already in the cache, etc.
But when update() runs the second time (to process the results from the server), our optimisticResponse is "there", but as undefined:
This yields an error TypeError: Cannot read property '__typename' of undefined.
To be honest, I'm not sure what's supposed to happen at this point in the lifecycle... should the optimisticResponse still be there? Or should it have already been removed? Either way, I'm sure it shouldn't be there as undefined, since that's what causes the error.
It turned out to be a silly mistake: the real response the server was returning for the mutation was a different shape than what the client was set up to expect.
The part I screenshotted in my question is actually the intended behavior: Apollo sets the previously added optimisticResponse to undefined in order to remove it once the "real" data is available.
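A cheap way to catch this kind of mismatch while debugging is to compare the server payload against the shape of the optimisticResponse before writing to the cache. This is just an illustrative sketch, not an Apollo API; the hasSameShape helper is made up for the example:

```javascript
// Illustrative helper (not part of Apollo): checks that every key in the
// optimistic "template" also exists, recursively, in the real payload.
function hasSameShape(template, payload) {
  if (template === null || typeof template !== 'object') {
    return payload !== undefined
  }
  if (payload === null || typeof payload !== 'object') return false
  return Object.keys(template).every((key) =>
    hasSameShape(template[key], payload[key])
  )
}

const optimistic = {
  upsertTodos: { id: 1, todo_id: 1, todo: { id: 1, label: 'x', children: [] } }
}
// Server payload missing the nested `todo` object, as in my bug:
const fromServer = { upsertTodos: { id: 42, todo_id: 42 } }

console.log(hasSameShape(optimistic, fromServer)) // false
```

Logging a warning when this returns false inside update() would have pointed straight at the shape mismatch.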
Related
I have already looked at: Writing nested object to Apollo client cache
And I guess everywhere else on the internet.
So what I want to do is to write an object to the Apollo cache, to store it locally. This is my setup:
const cache = new InMemoryCache()

const client = new ApolloClient({
  uri: '/hasura/v1/graphql',
  cache,
  credentials: 'include',
})
Then I write this to initialize the cache, taken from: https://www.apollographql.com/docs/react/data/local-state/#querying-local-state
cache.writeData({
  data: {
    todos: [],
    visibilityFilter: 'SHOW_ALL',
    networkStatus: {
      __typename: 'NetworkStatus',
      id: 1,
      isConnected: false,
    },
  },
})
Now, when I want to query the cache using this query:
const lolquery = gql`{
  visibilityFilter @client
}`

const result = cache.readQuery({ query: lolquery })
console.log(result)
I can get the visibilityFilter and todos, but when I try to query for networkStatus i.e.,
const lolquery = gql`{
  networkStatus @client
}`
I get the following error:
Uncaught Invariant Violation: Invariant Violation: 10
Googling this error doesn't give me much of an answer.
Am I the only one experiencing this? Am I missing something really obvious?
networkStatus was initialized as an object, so you have to query it with a subselection of one or more fields.
{
  networkStatus @client {
    isConnected
  }
}
The question is about the interaction of a mutation, optimistic response, and a watchQuery.
I have a mutation "myMutation" which has an "optimisticResponse" and an implemented "update" function.
Every time I run the mutation, the "update" function is called twice: the first time with the optimistic response data, and the second time with the real data. All is OK, and all as described in the documentation.
Inside my "update" function I modify the "myQuery" cache data using the readQuery/writeQuery methods.
Every time I modify the "myQuery" cache data, a watchQuery (based on "myQuery") subscription fires. All is OK, and all as described in the documentation.
But the problem is that I cannot distinguish in my watchQuery whether I am receiving optimistic response data or real response data. This is crucial for me because the reaction must be different, since a valuable part of the data can only be provided by the server.
I should show a GUI element with a special style when I receive an optimistic response and I should prohibit any interactions with it until I receive a real response.
Unfortunately, I can't solve this. At a glance, there is no difference between the optimistic and real responses. I've googled a lot and haven't found a solution. The only idea I have is adding a special field to my GraphQL data that shows whether the response came from the server or not, but that looks ugly and smells bad. I'm sure there must be a simple, correct way to overcome the problem.
Maybe there is an easier way, or there will be one in the future, but here is what I know.
The data in optimisticResponse is only provided during the first call to update. That is where you can flag to your update function that it is dealing with optimistic data; you can put any data you want there. I put isOptimistic: true.
To deal with the watchQuery issue, I recommend using apollo-link-state to add a client-only field (or fields) to the areas of your data model where optimistic upserts should be visible to the display. Don't include isOptimistic in your mutation query, so you know the data is from the server and not the optimistic response, and force it to false if it's not true. See this example:
const SUBMIT_COMMENT_MUTATION = gql`
  mutation submitComment($repoFullName: String!, $commentContent: String!) {
    submitComment(repoFullName: $repoFullName, commentContent: $commentContent) {
      postedBy {
        login
        html_url
      }
      createdAt
      content
    }
  }
`;
const CommentsPageWithMutations = ({ currentUser }) => (
  <Mutation mutation={SUBMIT_COMMENT_MUTATION}>
    {(mutate) => (
      <CommentsPage
        submit={(repoFullName, commentContent) =>
          mutate({
            variables: { repoFullName, commentContent },
            optimisticResponse: {
              __typename: 'Mutation',
              submitComment: {
                __typename: 'Comment',
                postedBy: currentUser,
                createdAt: new Date(),
                content: commentContent,
                isOptimistic: true, // Only provided to update on the optimistic call
              },
            },
            update: (proxy, { data: { submitComment } }) => {
              // Make sure CommentAppQuery includes isOptimistic for each comment added by apollo-link-state
              // submitComment.isOptimistic will be undefined here if it's from the server
              const newComment = {
                ...submitComment,
                isOptimistic: submitComment.isOptimistic ? true : false,
              };
              // Read the data from our cache for this query.
              const data = proxy.readQuery({ query: CommentAppQuery });
              // Add our comment from the mutation to the end.
              data.comments.push(newComment);
              // Write our data back to the cache.
              proxy.writeQuery({ query: CommentAppQuery, data });
            },
          })
        }
      />
    )}
  </Mutation>
);
See https://www.apollographql.com/docs/link/links/state.html.
I couldn't get this to work on Apollo 3.x by only adding a property to the optimistic response; the property was getting stripped away. To get it to work, I had to add a local-only field to the query:
fragment object on Object {
  id
  isOptimistic @client
  ...
Once that is done, I was able to add the local-only flag to my optimistic response.
const optimisticResponse = {
  object: {
    id: "temp-id",
    isOptimistic: true,
    ...
  }
}
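In Apollo 3 you can also give the local-only field a default via a read function in the cache's typePolicies, so real server objects (which never include the field) read back as false instead of undefined. A sketch, reusing the Object typename from the fragment above; the read function is plain JavaScript, so it can be exercised without the library:

```javascript
// Cache configuration sketch for Apollo 3 local-only fields.
const typePolicies = {
  Object: { // the __typename from the fragment above
    fields: {
      isOptimistic: {
        // Server data never contains this field, so default it to false.
        read(existing = false) {
          return existing;
        },
      },
    },
  },
};

// Passed to the cache like: new InMemoryCache({ typePolicies })
const read = typePolicies.Object.fields.isOptimistic.read;
console.log(read());     // false (server-provided object)
console.log(read(true)); // true  (optimistic object)
```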
Let's say I have a table that lists a bunch of Posts using a query like:
const PostsQuery = gql`
  query posts($name: String) {
    posts(name: $name) {
      id
      name
      status
    }
  }
`;

const query = apolloClient.watchQuery({query: PostsQuery});
query.subscribe({
  next: (posts) => console.log(posts) // [ {name: "Post 1", id: '1', status: 'pending' }, { name: "Paul's Post", id: '2', status: 'pending'} ]
});
Then later my user comes along and enters a value in a search field and calls this code:
query.setVariables({name: 'Paul'})
It fetches the filtered posts and logs it out fine.
// [ { name: "Paul's Post", id: '2', status: 'pending'} ]
Now, in my table there is a button that changes the status of a post from 'Pending' to 'Active'. The user clicks that and it calls code like:
const PostsMutation = gql`
  mutation activatePost($id: ID!) {
    activatePost(id: $id) {
      ok
      object {
        id
        name
        status
      }
    }
  }
`;

apolloClient.mutate({mutation: PostsMutation});
All is well with the mutation, but now I want to refetch the table data so it has the latest, so I make a change:
apolloClient.mutate({
  mutation: PostsMutation,
  refetchQueries: [{query: PostsQuery, variables: {name: 'Paul'}}]
});
Hurray, it works!
// [ { name: "Paul's Post", id: '2', status: 'active'} ]
But... now my user clears the search query, expecting the results to update.
query.setVariables({});
// [ {name: "Post 1", id: '1', status: 'pending' }, { name: "Paul's Post", id: '2', status: 'pending'} ]
Oh no! Because the data was not refetched in our mutation with our "original" variables (meaning none), we are getting stale data!
So how do you handle a situation where you have a mutation that may affect a query that could have many permutations of variables?
I had a similar issue. I am using Apollo with Angular, so I am not sure if this method will work with the React client, but it should.
If you look closely at the refetchQueries property of the mutate method, you will see that the function can also return a string array of query names to refetch. By returning just the query name as a string, you do not need to worry about the variables. Be advised that this will refetch all the queries matching that name, so if you had a lot of queries with different variables it could end up being a large request; in my case it is worth the trade-off. If this is a problem, you could also get access to the query manager through apolloClient.queryManager, which you could use for more fine-grained control over what to refetch. I didn't implement that, but it looks very possible. I found the solution below fits my needs fine.
In your code, what you need to do is:
apolloClient.mutate({
  mutation: PostsMutation,
  refetchQueries: (mutationResult) => ['posts']
});
This will refetch any query with the operation name 'posts' (the name given in PostsQuery above). Again, it is possible to refetch only a subset of them if you dig into the queryManager and do some filtering on the active watch queries, but that is another exercise.
I've been trying to sort data from StrongLoop using a call from an Angular controller.
angular.module('fsbs')
  .controller('LocationCtrl', ['$scope', '$state', '$window', 'Location',
    function ($scope, $state, $window, Location) {
      $scope.dataset = [];

      function get() {
        Location
          .find({
            order: "name ASC"
          })
          .$promise
          .then(function (data) {
            $scope.dataset = data;
          })
      }

      get();
    }])
The request sent using the lb-service is:
http://localhost:3001/api/locations?order=name+ASC
The request sent using the explorer is:
http://0.0.0.0:3001/api/locations?filter=%7B%22order%22%3A%22name%20asc%22%7D
Even though I'm getting the data, it's not sorted. My questions are:
Why is lb-service generating such a request?
Is there anything wrong in my controller?
Okay, sorry for posting the question. I'm not deleting it, since it may be useful for others who might make the same mistake.
I got the sorted data when I changed the code to the following:
.find({
  filter: {
    order: "name asc"
  }
})
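For reference, the explorer URL in the question is simply this filter object JSON-serialized and URL-encoded into a single filter query parameter, which is the form the REST API expects. A quick sketch:

```javascript
// LoopBack's REST API expects the whole filter as one JSON-encoded
// `filter` query parameter, not as top-level parameters like `order=...`.
const filter = { order: "name asc" };
const qs = "filter=" + encodeURIComponent(JSON.stringify(filter));

console.log("/api/locations?" + qs);
// /api/locations?filter=%7B%22order%22%3A%22name%20asc%22%7D
```

That reproduces exactly the query string the explorer sent, which is why wrapping the order clause in a filter object fixes the controller.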
I want to make an API call for searching that looks like this:
https://myapi.com/search/<query>/<token>
where query is the search term and token (optional) is an alphanumeric set of characters identifying the position of my latest batch of results, used for infinite scrolling.
This call returns the following JSON response:
{
  "meta": { ... },
  "results": {
    "token": "125fwegg3t32",
    "content": [
      {
        "id": "125125122778",
        "text": "Lorem ipsum...",
        ...
      },
      {
        "id": "125125122778",
        "text": "Dolor sit amet...",
        ...
      },
      ...
    ]
  }
}
content is an array of (embedded) items that I'm displaying as search results. My models look like this:
App.Content = Em.Model.extend({
  id: Em.attr(),
  text: Em.attr(),
  ...
});

App.Results = Em.Model.extend({
  token: Em.attr(),
  content: Em.hasMany('App.Content', {
    key: 'content',
    embedded: true
  })
});
In order to make that API call, I figured I have to do something like this:
App.Results.reopenClass({
  adapter: Em.RESTAdapter.create({
    findQuery: function(klass, records, params) {
      var self = this,
          url = this.buildURL(klass) + '/' + params.query;

      if (params.token) {
        url += '/' + params.token;
      }

      return this.ajax(url).then(function(data) {
        self.didFindQuery(klass, records, params, data);
        return records;
      });
    }
  }),
  url: 'https://myapi.com/search',
});
then somewhere in my routes do this:
App.Results.fetch({query: 'query', token: '12kgkj398512j'}).then(function(data) {
  // do something
  return data;
})
but because the API returns a single object and Em.RESTAdapter.findQuery expects an array, an error occurs when Ember Model tries to materialize the data. So how do I do this properly? I'm using the latest build of Ember Model.
By the way, I'm aware that it would be much more convenient if the API was designed in a way so I can just call App.Content.fetch(<object>), which would return a similar JSON response, but I would then be able to set the collectionKey option to content and my data would be properly materialized.
You simply need to override your model's load() method to adjust the payload hash into what Ember.Model wants; there are no serializers in Ember.Model. There is both a class-level load for handling collections and an instance-level load for loading the JSON specific to a single model. You want to override the instance-level load method to wrap the content key's value in an array if it's not one already.
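A sketch of that normalization, under the assumption that the hash handed to the instance-level load is the single { meta, results } object from the API. The normalizeResults helper and the wiring comment are illustrative, not part of Ember.Model; only the plain-JS normalization is shown runnable here:

```javascript
// Illustrative normalization for the instance-level load override.
function normalizeResults(hash) {
  // Unwrap the { meta, results } envelope and make sure `content`
  // is an array before Ember.Model materializes the hasMany.
  var results = hash.results || hash;
  if (results.content && !Array.isArray(results.content)) {
    results.content = [results.content];
  }
  return results;
}

// In the app it would be wired up something like (hypothetical sketch):
// App.Results.reopen({
//   load: function(id, hash) {
//     this._super(id, normalizeResults(hash));
//   }
// });

var payload = { meta: {}, results: { token: "abc", content: { id: "1", text: "hi" } } };
console.log(normalizeResults(payload).content.length); // 1
```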
I have been using Ember.Model quite heavily, enhanced it for a number of my use cases, and submitted PRs for both fixes and enhancements. Those PRs have been sitting there for a while with no response from the maintainers. I have now moved to Ember Data, which has been 'rebooted' so to speak, and I'm having much better results with it now.
I would strongly suggest walking away from Ember.Model: it appears dead given the new, pragmatic direction Ember Data has taken, and the project maintainer doesn't appear to have any interest in it anymore.