OS: Windows 10 Pro
apollo-client: 2.6.3
apollo-boost: 0.1.16
Can anyone explain why I'm getting the following error message?:
Found @client directives in a query but no ApolloClient resolvers were
specified. This means ApolloClient local resolver handling has been
disabled, and @client directives will be passed through to your link
chain.
when I've defined my ApolloClient as follows:
return new ApolloClient({
  uri: process.env.NODE_ENV === 'development' ? endpoint : prodEndpoint,
  request: operation => {
    operation.setContext({
      fetchOptions: {
        credentials: 'include',
      },
      headers: { cookie: headers && headers.cookie },
    });
  },
  // local data
  clientState: {
    resolvers: {
      Mutation: {
        toggleCart(_, variables, { cache }) {
          // Read the cartOpen value from the cache
          const { cartOpen } = cache.readQuery({
            query: LOCAL_STATE_QUERY,
          });
          // Write the cart state back as the opposite value
          const data = {
            data: { cartOpen: !cartOpen },
          };
          cache.writeData(data);
          return data;
        },
      },
    },
    defaults: {
      cartOpen: false,
    },
  },
});
From the docs:
If you're interested in integrating local state handling capabilities with Apollo Client < 2.5, please refer to our (now deprecated) apollo-link-state project. As of Apollo Client 2.5, local state handling is baked into the core, which means it is no longer necessary to use apollo-link-state
The clientState config option was only used with apollo-link-state. You need to add the resolvers directly to the config as shown in the docs:
new ApolloClient({
  uri: '/graphql',
  resolvers: { ... },
})
Also note that there is no defaults option anymore -- the cache should be initialized by calling writeData directly on the cache instance (see here).
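Putting that together, a minimal sketch of what the corrected 2.6-era setup could look like (assuming an apollo-boost release that accepts a top-level resolvers option; endpoint, prodEndpoint, headers and LOCAL_STATE_QUERY are the names used in the question):
import ApolloClient from 'apollo-boost';

const client = new ApolloClient({
  uri: process.env.NODE_ENV === 'development' ? endpoint : prodEndpoint,
  request: operation => {
    operation.setContext({
      fetchOptions: { credentials: 'include' },
      headers: { cookie: headers && headers.cookie },
    });
  },
  // Local resolvers now live at the top level instead of under clientState
  resolvers: {
    Mutation: {
      toggleCart(_, variables, { cache }) {
        const { cartOpen } = cache.readQuery({ query: LOCAL_STATE_QUERY });
        const data = { data: { cartOpen: !cartOpen } };
        cache.writeData(data);
        return data;
      },
    },
  },
});

// Replaces the removed `defaults` option: seed the local state once, up front.
client.writeData({ data: { cartOpen: false } });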
I would suggest going through the latest docs and avoiding any examples from external sources (like existing repos or tutorials) since these may be outdated.
Note: As of version 3.0, writeData was removed in favor of writeFragment and writeQuery.
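For example, under Apollo Client 3 the same cache seeding could be expressed roughly as follows (a sketch; LOCAL_STATE_QUERY is the query from the question):
// Apollo Client 3: writeData is gone, so seed the local state with writeQuery instead.
cache.writeQuery({
  query: LOCAL_STATE_QUERY,
  data: { cartOpen: false },
});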
Related
I tried this but it didn't work. I got an error: Error when evaluating SSR module /node_modules/cross-fetch/dist/browser-ponyfill.js:
<script lang="ts">
  import fetch from 'cross-fetch';
  import { ApolloClient, InMemoryCache, HttpLink } from "@apollo/client";

  const client = new ApolloClient({
    ssrMode: true,
    link: new HttpLink({ uri: '/graphql', fetch }),
    uri: 'http://localhost:4000/graphql',
    cache: new InMemoryCache()
  });
</script>
With SvelteKit the subject of CSR vs. SSR, and where data fetching should happen, is a bit deeper than with other somewhat "similar" solutions. The guide below should help you connect some of the dots, but a couple of things need to be stated first.
To define a server-side route, create a file with the .js extension anywhere in the src/routes directory tree. This .js file can contain all the import statements it needs without the JS bundles they reference being sent to the web browser.
The @apollo/client package is quite large, as it pulls in the react dependency. Instead, you might want to consider importing just @apollo/client/core, even if you're setting up the Apollo Client to be used only on the server side, as the demo below shows. @apollo/client is not an ESM package; notice how it's imported below so that the project builds successfully with the node adapter.
Try going through the following steps.
Create a new SvelteKit app and choose the 'SvelteKit demo app' in the first step of the SvelteKit setup wizard. Answer the "Use TypeScript?" question with N, as well as all of the questions that follow.
npm init svelte@next demo-app
cd demo-app
Modify the package.json accordingly. Optionally, check for package updates with npx npm-check-updates -u.
{
  "name": "demo-app",
  "version": "0.0.1",
  "scripts": {
    "dev": "svelte-kit dev",
    "build": "svelte-kit build --verbose",
    "preview": "svelte-kit preview"
  },
  "devDependencies": {
    "@apollo/client": "^3.3.15",
    "@sveltejs/adapter-node": "next",
    "@sveltejs/kit": "next",
    "graphql": "^15.5.0",
    "node-fetch": "^2.6.1",
    "svelte": "^3.37.0"
  },
  "type": "module",
  "dependencies": {
    "@fontsource/fira-mono": "^4.2.2",
    "@lukeed/uuid": "^2.0.0",
    "cookie": "^0.4.1"
  }
}
Modify the svelte.config.js accordingly.
import node from '@sveltejs/adapter-node';

export default {
  kit: {
    // By default, `npm run build` will create a standard Node app.
    // You can create optimized builds for different platforms by
    // specifying a different adapter
    adapter: node(),

    // hydrate the <div id="svelte"> element in src/app.html
    target: '#svelte'
  }
};
Create the src/lib/Client.js file with the following contents. This is the Apollo Client setup file.
import fetch from 'node-fetch';
import { ApolloClient, HttpLink } from '@apollo/client/core/core.cjs.js';
import { InMemoryCache } from '@apollo/client/cache/cache.cjs.js';

class Client {
  constructor() {
    if (Client._instance) {
      return Client._instance;
    }
    Client._instance = this;

    this.client = this.setupClient();
  }

  setupClient() {
    const link = new HttpLink({
      uri: 'http://localhost:4000/graphql',
      fetch
    });

    const client = new ApolloClient({
      link,
      cache: new InMemoryCache()
    });

    return client;
  }
}

export const client = (new Client()).client;
Create the src/routes/qry/test.js file with the following contents. This is the server-side route. If your GraphQL schema doesn't have the double function, specify a different query, input(s) and output.
import { client } from '$lib/Client.js';
import { gql } from '@apollo/client/core/core.cjs.js';

export const post = async request => {
  const { num } = request.body;

  try {
    const query = gql`
      query Doubled($x: Int) {
        double(number: $x)
      }
    `;

    const result = await client.query({
      query,
      variables: { x: num }
    });

    return {
      status: 200,
      body: {
        nodes: result.data.double
      }
    };
  } catch (err) {
    return {
      status: 500,
      error: 'Error retrieving data'
    };
  }
};
Add the following to the load function of the routes/todos/index.svelte file, within the <script context="module">...</script> tag.
try {
  const res = await fetch('/qry/test', {
    method: 'POST',
    credentials: 'same-origin',
    headers: {
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      num: 19
    })
  });

  const data = await res.json();
  console.log(data);
} catch (err) {
  console.error(err);
}
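For context, here is a sketch of the complete load function this snippet could live in, using the pre-1.0 SvelteKit API these 'next' packages shipped with (the doubled prop name is just an example):
<script context="module">
  // Use the fetch passed to load so the request also works during SSR.
  export async function load({ fetch }) {
    try {
      const res = await fetch('/qry/test', {
        method: 'POST',
        credentials: 'same-origin',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ num: 19 })
      });
      const data = await res.json();
      console.log(data);
      // Hypothetical prop name; pair it with `export let doubled;` in the page script.
      return { props: { doubled: data.nodes } };
    } catch (err) {
      console.error(err);
      return { status: 500, error: new Error('Failed to query /qry/test') };
    }
  }
</script>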
Finally, execute the npm install and npm run dev commands. Load the site in your web browser and watch the server-side route being queried from the client whenever you hover over the TODOS link in the navbar. In the browser's network tab, notice how much quicker the response from the test route is on every second and subsequent request, thanks to the Apollo Client instance being a singleton.
Two things to keep in mind when using phaleth's solution above: caching and authenticated requests.
Since the client is used in the endpoint /qry/test.js, the singleton pattern combined with the caching behavior makes your server stateful. So if A and then B make the same query, B could end up seeing some of A's data.
The same problem arises if you need authorization headers in your query. You would need to set this up in the setupClient method like so:
setupClient(sometoken) {
  ...
  // setContext comes from '@apollo/client/link/context'
  const authLink = setContext((_, { headers }) => {
    return {
      headers: {
        ...headers,
        authorization: `Bearer ${sometoken}`
      }
    };
  });

  const client = new ApolloClient({
    credentials: 'include',
    link: authLink.concat(link),
    cache: new InMemoryCache()
  });
}
But then with the singleton pattern this becomes problematic if you have multiple users.
To keep your server stateless, a workaround is to avoid the singleton pattern and create a new Client(sometoken) in the endpoint.
This is not an optimal solution: it recreates the client on each request and basically just erases the cache. But this solves the caching and authorization concerns when you have multiple users.
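A rough sketch of that per-request approach in the endpoint (assuming the Client class is exported from $lib/Client.js, accepts the token as a constructor argument, and no longer caches a _instance):
import { Client } from '$lib/Client.js';
import { gql } from '@apollo/client/core/core.cjs.js';

export const post = async request => {
  // Hypothetical: read the caller's token from the incoming request headers.
  const token = request.headers.authorization;

  // A fresh client (and a fresh cache) per request: nothing leaks between users.
  const client = new Client(token).client;

  const result = await client.query({
    query: gql`query Doubled($x: Int) { double(number: $x) }`,
    variables: { x: request.body.num }
  });

  return { status: 200, body: { nodes: result.data.double } };
};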
How to pass Cookie headers from gatsby-source-graphql?
I'm using gatsby-source-graphql (https://github.com/gatsbyjs/gatsby/tree/master/packages/gatsby-source-graphql) and recently had to implement AWS CloudFront Signed Cookies to authorise users to access a private staging environment. For this reason the requests to the GraphQL endpoint, handled by the plugin, need to have the cookie in the request header, which I do by:
{
  resolve: 'gatsby-source-graphql',
  options: {
    cookie: 'var1=val1; var2=val2; '
  }
}
The above fails,
ServerParseError: Unexpected token < in JSON at position 0
If I disable Signed Cookies and make the endpoint public, it works.
And if I keep it private and test with curl, it works:
curl --cookie 'var1=val1; var2=val2; ' graphql_endpoint.com
I tried to figure out why the Cookie header is not passed, but it seems that the problem is in a different package that the plugin above uses, called 'apollo-link-http' (https://github.com/gatsbyjs/gatsby/blob/master/packages/gatsby-source-graphql/src/gatsby-node.js)
Meanwhile, looking at apollo-link-http (https://www.apollographql.com/docs/link/links/http/) and an issue reported here (https://github.com/apollographql/apollo-client/issues/4455), I tried:
{
  resolve: 'gatsby-source-graphql',
  options: {
    typeName: 'FOOBAR',
    fieldName: 'foobar',
    createLink: (pluginOptions) => {
      return createHttpLink({
        uri: process.env.GATSBY_GRAPHQL_API_URL,
        credentials: 'include',
        headers: {
          cookie: "CloudFront-Policy=xxxxx_; CloudFront-Key-Pair-Id=xxxxx; CloudFront-Signature=xxxxxxxxxx; path=/;",
        },
        fetch,
      })
    },
  }
},
Without success, the same error as before.
I also tried to use the fetch options for node-fetch:
{
  resolve: 'gatsby-source-graphql',
  options: {
    typeName: 'FOOBAR',
    fieldName: 'foobar',
    url: process.env.GATSBY_GRAPHQL_API_URL,
    fetchOptions: {
      credentials: 'include',
      headers: {
        cookie: "CloudFront-Policy=xxxxx_; CloudFront-Key-Pair-Id=xxxxx; CloudFront-Signature=xxxxxxxxxx; path=/;",
      },
    },
  }
},
You can see fetchOptions used here (https://github.com/gatsbyjs/gatsby/blob/master/packages/gatsby-source-graphql/src/gatsby-node.js).
No success! This is probably a bug.
After spending a lot of time looking at the docs and other reports, I found a solution based on the attempts I've originally posted.
I started by looking at the browser version and checking the cookie header property name to avoid any typos. I determined it should be "Cookie", whereas most examples I found mention 'cookie', etc.
With that said, I've checked the documentation for all the related packages and source code:
https://github.com/gatsbyjs/gatsby/tree/master/packages/gatsby-source-graphql
https://github.com/gatsbyjs/gatsby/blob/master/packages/gatsby-source-graphql/src/gatsby-node.js
https://www.apollographql.com/docs/link/links/http/
https://github.com/apollographql/apollo-client/issues/4455
Finally, I declared the cookie in the headers parameter and, in a separate property, the options passed to the node-fetch package:
https://github.com/bitinn/node-fetch
The result:
{
  resolve: 'gatsby-source-graphql',
  options: {
    typeName: 'FOOBAR',
    fieldName: 'foobar',
    url: process.env.GATSBY_GRAPHQL_API_URL,
    headers: {
      Cookie: 'CloudFront-Policy=xxxxx_; CloudFront-Key-Pair-Id=xxxxx; CloudFront-Signature=xxxxxxxxxx; path=/;'
    },
    credentials: 'include',
  }
},
What happens above is that "credentials: 'include'" allows cross-origin requests and enables cookies (https://www.apollographql.com/docs/react/networking/authentication/#cookie).
Hope that this helps someone else in the future, as it's not trivial.
I've got express js server code:
...
const server = new GraphQLServer({
  typeDefs: `schema.graphql`,
  resolvers,
  context: context => {
    let cookie = get(context, 'request.headers.cookie');
    return { ...context, cookie, pubsub };
  },
});
such that I can attach cookie to resolvers' requests:
...
method: 'GET',
headers: {
cookie: context.cookie,
},
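For illustration, a resolver along those lines might look roughly like this (a sketch only; the downstream URL, field name and fetch implementation are assumptions, not the actual code):
import fetch from 'node-fetch';

const resolvers = {
  Query: {
    profile: async (_parent, _args, context) => {
      // Forward the caller's cookie to a downstream service.
      const res = await fetch('https://internal.example.com/profile', {
        method: 'GET',
        headers: { cookie: context.cookie },
      });
      return res.json();
    },
  },
};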
Now I want to be able to use Relay (as a GraphQL client) and I want to be able to attach a cookie to Relay's requests as well.
I've found a similar question, but it's not clear to me where I can insert that code:
Relay.injectNetworkLayer(
  new Relay.DefaultNetworkLayer('/graphql', {
    credentials: 'same-origin',
  })
);
since I don't import Relay in Environment.js.
Update: I tried to add
import { Relay, graphql, QueryRenderer } from 'react-relay';

Relay.injectNetworkLayer(
  new Relay.DefaultNetworkLayer('http://example.com/graphql', {
    credentials: 'same-origin',
  })
);
to a file where I send GraphQL queries (e.g., client.js), but it says that Relay is undefined.
Update #2: this repo looks interesting.
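One thing worth noting: Relay.injectNetworkLayer belongs to Relay Classic and isn't exported by react-relay in Relay Modern, which is why Relay comes back undefined. In Relay Modern the equivalent place for this is the fetch function the Environment's network is built from. A rough sketch of what that could look like in Environment.js (the endpoint URL is the example one from the question):
import { Environment, Network, RecordSource, Store } from 'relay-runtime';

function fetchQuery(operation, variables) {
  return fetch('http://example.com/graphql', {
    method: 'POST',
    // Attach cookies to Relay's requests; use 'include' for a cross-origin endpoint.
    credentials: 'same-origin',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: operation.text, variables }),
  }).then(response => response.json());
}

export default new Environment({
  network: Network.create(fetchQuery),
  store: new Store(new RecordSource()),
});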
I have a fairly simple node app using AWS AppSync. I am able to run queries and mutations successfully but I've recently found that if I run a query twice I get the same response - even when I know that the back-end data has changed. In this particular case the query is backed by a lambda and in digging into it I've discovered that the query doesn't seem to be sent out on the network because the lambda is not triggered each time the query runs - just the first time. If I use the console to simulate my query then everything runs fine. If I restart my app then the first time a query runs it works fine but successive queries again just return the same value each time.
Here are some part of my code:
client.query({
  query: gql`
    query GetAbc($cId: String!) {
      getAbc(cId: $cId) {
        id
        name
        cs
      }
    }`,
  options: {
    fetchPolicy: 'no-cache'
  },
  variables: {
    cid: event.cid
  }
})
.then((data) => {
  // same data every time
})
Edit: trying other fetch policies like network-only makes no visible difference.
Here is how I set up the client, not super clean but it seems to work:
const makeAWSAppSyncClient = (credentials) => {
  return Promise.resolve(
    new AWSAppSyncClient({
      url: 'lalala',
      region: 'us-west-2',
      auth: {
        type: 'AWS_IAM',
        credentials: () => {
          return credentials
        }
      },
      disableOffline: true
    })
  )
}
getRemoteCredentials()
  .then((credentials) => {
    return makeAWSAppSyncClient(credentials)
  })
  .then((client) => {
    return client.hydrated()
  })
  .then((client) => {
    // client is good to use
  })
getRemoteCredentials is a method to turn an IoT authentication into normal IAM credentials which can be used with other AWS SDKs. This is working (because I wouldn't get as far as I do if not).
My issue seems very similar to this one GraphQL Query Runs Sucessfully One Time and Fails To Run Again using Apollo and AWS AppSync; I'm running in a node environment (rather than react) but it is essentially the same issue.
I don't think this is relevant but for completeness I should mention I have tried both with and without the setup code from the docs. This appears to make no difference (except annoying logging, see below) but here it is:
global.WebSocket = require('ws')
global.window = global.window || {
  setTimeout: setTimeout,
  clearTimeout: clearTimeout,
  WebSocket: global.WebSocket,
  ArrayBuffer: global.ArrayBuffer,
  addEventListener: function () { },
  navigator: { onLine: true }
}
global.localStorage = {
  store: {},
  getItem: function (key) {
    return this.store[key]
  },
  setItem: function (key, value) {
    this.store[key] = value
  },
  removeItem: function (key) {
    delete this.store[key]
  }
};
require('es6-promise').polyfill()
require('isomorphic-fetch')
This is taken from: https://docs.aws.amazon.com/appsync/latest/devguide/building-a-client-app-javascript.html
With this code, and without disableOffline: true in the client setup, I see this line spewed continuously on the console:
redux-persist asyncLocalStorage requires a global localStorage object.
Either use a different storage backend or if this is a universal redux
application you probably should conditionally persist like so:
https://gist.github.com/rt2zz/ac9eb396793f95ff3c3b
This makes no apparent difference to this issue however.
Update: here are my dependencies from package.json. I have upgraded these during testing, so my yarn.lock contains more recent revisions than listed here. Nevertheless: https://gist.github.com/macbutch/a319a2a7059adc3f68b9f9627598a8ca
Update #2: I have also confirmed from CloudWatch logs that the query is only being run once; I have a mutation running regularly on a timer that is successfully invoked and visible in CloudWatch. That is working as I'd expect but the query is not.
Update #3: I have debugged in to the AppSync/Apollo code and can see that my fetchPolicy is being changed to 'cache-first' in this code in apollo-client/core/QueryManager.js (comments mine):
QueryManager.prototype.fetchQuery = function (queryId, options, fetchType, fetchMoreForQueryId) {
    var _this = this;
    // Next line changes options.fetchPolicy to 'cache-first'
    var _a = options.variables, variables = _a === void 0 ? {} : _a, _b = options.metadata, metadata = _b === void 0 ? null : _b, _c = options.fetchPolicy, fetchPolicy = _c === void 0 ? 'cache-first' : _c;
    var cache = this.dataStore.getCache();
    var query = cache.transformDocument(options.query);
    var storeResult;
    var needToFetch = fetchPolicy === 'network-only' || fetchPolicy === 'no-cache';
    // needToFetch is false (because fetchPolicy is 'cache-first')
    if (fetchType !== FetchType.refetch &&
        fetchPolicy !== 'network-only' &&
        fetchPolicy !== 'no-cache') {
        // so we come through this branch
        var _d = this.dataStore.getCache().diff({
            query: query,
            variables: variables,
            returnPartialData: true,
            optimistic: false,
        }), complete = _d.complete, result = _d.result;
        // here complete is true, result is from the cache
        needToFetch = !complete || fetchPolicy === 'cache-and-network';
        // needToFetch is still false
        storeResult = result;
    }
    // skipping some stuff
    ...
    if (shouldFetch) { // shouldFetch is still false so this doesn't execute
        var networkResult = this.fetchRequest({
            requestId: requestId,
            queryId: queryId,
            document: query,
            options: options,
            fetchMoreForQueryId: fetchMoreForQueryId,
        });
    }
    // resolve with data from cache
    return Promise.resolve({ data: storeResult });
If I use my debugger to change the value of shouldFetch to true, then at least I see a network request go out and my lambda executes. I guess I need to unpack what that line which changes my fetchPolicy is doing.
OK I found the issue. Here's an abbreviated version of the code from my question:
client.query({
  query: gql`...`,
  options: {
    fetchPolicy: 'no-cache'
  },
  variables: { ... }
})
It's a little bit easier to see what is wrong here. This is what it should be:
client.query({
  query: gql`...`,
  fetchPolicy: 'network-only',
  variables: { ... }
})
Two issues in my original:
fetchPolicy: 'no-cache' does not seem to work here (I get an empty response)
putting the fetchPolicy in an options object is unnecessary
The graphql() client API specifies options differently, and we were switching between the two forms.
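To make the difference concrete, here is a small sketch of the two forms side by side (the query and MyComponent are placeholders):
// graphql() HOC: fetchPolicy lives inside an `options` object.
graphql(gql`query { ... }`, {
  options: { fetchPolicy: 'network-only' },
})(MyComponent);

// client.query(): fetchPolicy is a top-level field, with no `options` wrapper.
client.query({
  query: gql`query { ... }`,
  fetchPolicy: 'network-only',
  variables: { /* ... */ },
});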
Set the query fetch-policy to 'network-only' when running in an AWS Lambda function.
I recommend using the overrides for WebSocket, window, and localStorage, since these objects don't really apply within a Lambda function. The setup I typically use for Node.js apps in Lambda looks like the following.
'use strict';

// CONFIG
const AppSync = {
  "graphqlEndpoint": "...",
  "region": "...",
  "authenticationType": "...",
  // auth-specific keys
};

// POLYFILLS
global.WebSocket = require('ws');
global.window = global.window || {
  setTimeout: setTimeout,
  clearTimeout: clearTimeout,
  WebSocket: global.WebSocket,
  ArrayBuffer: global.ArrayBuffer,
  addEventListener: function () { },
  navigator: { onLine: true }
};
global.localStorage = {
  store: {},
  getItem: function (key) {
    return this.store[key]
  },
  setItem: function (key, value) {
    this.store[key] = value
  },
  removeItem: function (key) {
    delete this.store[key]
  }
};
require('es6-promise').polyfill();
require('isomorphic-fetch');

// Require AppSync module
const AUTH_TYPE = require('aws-appsync/lib/link/auth-link').AUTH_TYPE;
const AWSAppSyncClient = require('aws-appsync').default;

// INIT
// Set up AppSync client
const client = new AWSAppSyncClient({
  url: AppSync.graphqlEndpoint,
  region: AppSync.region,
  auth: {
    type: AppSync.authenticationType,
    apiKey: AppSync.apiKey
  }
});
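With that client in place, a query from the Lambda handler might then look roughly like this (a sketch; gql would come from graphql-tag, and the query and variables are placeholders):
const gql = require('graphql-tag');

exports.handler = async (event) => {
  // Wait for the AppSync client to finish hydrating before querying.
  const appsync = await client.hydrated();

  const result = await appsync.query({
    query: gql`query GetAbc($cId: String!) { getAbc(cId: $cId) { id name cs } }`,
    variables: { cId: event.cId },
    // Bypass the in-memory cache so every invocation hits the network.
    fetchPolicy: 'network-only',
  });

  return result.data;
};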
There are two ways to enable/disable caching with AWSAppSyncClient/ApolloClient: per query and/or when initializing the client.
Client Config:
client = new AWSAppSyncClient(
  {
    url: 'https://myurl/graphql',
    region: 'my-aws-region',
    auth: {
      type: AUTH_TYPE.AWS_MY_AUTH_TYPE,
      credentials: await getMyAWSCredentialsOrToken()
    },
    disableOffline: true
  },
  {
    cache: new InMemoryCache(),
    defaultOptions: {
      watchQuery: {
        fetchPolicy: 'no-cache', // <-- HERE: check the apollo fetch policy options
        errorPolicy: 'ignore'
      },
      query: {
        fetchPolicy: 'no-cache',
        errorPolicy: 'all'
      }
    }
  }
);
Alternative: Query Option:
export default graphql(gql`query { ... }`, {
  options: { fetchPolicy: 'cache-and-network' },
})(MyComponent);
Valid fetchPolicy values are:
cache-first: This is the default value where we always try reading data from your cache first. If all the data needed to fulfill your query is in the cache then that data will be returned. Apollo will only fetch from the network if a cached result is not available. This fetch policy aims to minimize the number of network requests sent when rendering your component.
cache-and-network: This fetch policy will have Apollo first trying to read data from your cache. If all the data needed to fulfill your query is in the cache then that data will be returned. However, regardless of whether or not the full data is in your cache this fetchPolicy will always execute query with the network interface unlike cache-first which will only execute your query if the query data is not in your cache. This fetch policy optimizes for users getting a quick response while also trying to keep cached data consistent with your server data at the cost of extra network requests.
network-only: This fetch policy will never return you initial data from the cache. Instead it will always make a request using your network interface to the server. This fetch policy optimizes for data consistency with the server, but at the cost of an instant response to the user when one is available.
cache-only: This fetch policy will never execute a query using your network interface. Instead it will always try reading from the cache. If the data for your query does not exist in the cache then an error will be thrown. This fetch policy allows you to only interact with data in your local client cache without making any network requests which keeps your component fast, but means your local data might not be consistent with what is on the server. If you are interested in only interacting with data in your Apollo Client cache also be sure to look at the readQuery() and readFragment() methods available to you on your ApolloClient instance.
no-cache: This fetch policy will never return your initial data from the cache. Instead it will always make a request using your network interface to the server. Unlike the network-only policy, it also will not write any data to the cache after the query completes.
Copied from: https://www.apollographql.com/docs/react/api/react-hoc/#graphql-options-for-queries
Following the example provided in the docs I find the following message repeated many times in the logs:
redux-persist asyncLocalStorage requires a global localStorage object. Either use a different storage backend or if this is a universal redux application you probably should conditionally persist like so: https://gist.github.com/rt2zz/ac9eb396793f95ff3c3b
I can work around it by turning off offline support when creating the AppSync client, like this:
new AWSAppSyncClient({
  url: 'https://...appsync-api.us-west-2.amazonaws.com/graphql',
  region: 'us-west-2',
  auth: {
    type: 'AWS_IAM',
    credentials: ...
  },
  disableOffline: true
})
... however I do want to use the offline store. I am using the setup config from the documentation like so:
global.WebSocket = require('ws');
global.window = global.window || {
  setTimeout: setTimeout,
  clearTimeout: clearTimeout,
  WebSocket: global.WebSocket,
  ArrayBuffer: global.ArrayBuffer,
  addEventListener: function () { },
  navigator: { onLine: true }
};
global.localStorage = {
  store: {},
  getItem: function (key) {
    return this.store[key]
  },
  setItem: function (key, value) {
    this.store[key] = value
  },
  removeItem: function (key) {
    delete this.store[key]
  }
};
require('es6-promise').polyfill();
require('isomorphic-fetch');
But it doesn't seem to work with redux-persist which is used a few layers deep in the AppSync client.
I have found a very simple way to resolve this issue. While this section is taken directly from the AWS docs, it is not quite right:
global.localStorage = {
  store: {},
  ...
};
By setting global.window.localStorage instead I am able to work around the issues:
global.window.localStorage = {
  store: {},
  ...
};
Anyone else trying to use AppSync like this may want to know that node-localstorage also seems to work with this usage (after yarn add node-localstorage):
var LocalStorage = require('node-localstorage').LocalStorage
global.window.localStorage = new LocalStorage(<path for storage>)
Importantly, in this case, your queries are persisted to the file system and will be read if connectivity is lost. This could potentially work after restarting your application (but I've not tested this yet because you need an AWS credentials object to create the AppSync client).