I'm trying to use PubSub with Apollo Server and Apollo Client, but the subscribed data is always null.
Client dependencies:
"#apollo/react-hooks": "^3.1.5",
"apollo-boost": "^0.4.9",
"apollo-link-ws": "^1.0.20",
"graphql": "^15.0.0",
"react": "^16.13.1",
"react-dom": "^16.13.1",
"react-router-dom": "^5.2.0",
"react-scripts": "3.4.1",
"styled-components": "^5.1.1",
"subscriptions-transport-ws": "^0.9.16",
"typescript": "~3.7.2"
Server dependencies:
"apollo-server": "^2.14.1",
"graphql": "^15.0.0",
"merge-graphql-schemas": "^1.7.8",
"ts-node": "^8.10.2",
"tsconfig-paths": "^3.9.0",
"typescript": "^3.9.3"
// apolloClient.ts
import { ApolloClient, HttpLink, InMemoryCache, split } from 'apollo-boost'
import { WebSocketLink } from 'apollo-link-ws'
import { getMainDefinition } from 'apollo-utilities'
const wsLink = new WebSocketLink({
uri: 'ws://localhost:4000/graphql',
options: {
reconnect: true
}
})
const httpLink = new HttpLink({
uri: 'http://localhost:4000'
})
const link = split(
// split based on operation type
({ query }) => {
const definition = getMainDefinition(query);
return (
definition.kind === 'OperationDefinition' &&
definition.operation === 'subscription'
);
},
wsLink,
httpLink,
)
const cache = new InMemoryCache()
const client = new ApolloClient({
cache: cache,
link: link,
})
export default client
// subscribe.ts
import { gql } from 'apollo-boost'
import { useSubscription } from '@apollo/react-hooks'

const ON_PUT_UNIT = gql`
subscription onPutUnit($code: String!) {
onPutUnit(code: $code)
}
`
const onPutResult = useSubscription(
ON_PUT_UNIT,
{ variables: {
code: code,
}}
)
// data is only null!!
console.log('subscribe', onPutResult)
Server:
// onPutUnit.ts
type Subscription {
onPutUnit(code: String!): Room
}
import { pubsub } from '@src/index'
import { withFilter } from 'apollo-server'
export default {
Subscription: {
onPutUnit: {
subscribe: withFilter(
() => pubsub.asyncIterator(['PUT_UNIT']),
(payload: any, variables: any) => {
// no problem in payload & variable data
return payload.code === variables.code
}
)
}
},
}
// putUnit.ts
type Mutation {
putUnit(code: String!, x: Int!, y: Int!, userName: String!): Room!
}
export default {
Mutation: {
putUnit: async (_: any, args: args) => {
const { code, x, y, userName } = args
const room = findRoom(code)
console.log(room) // no problem. normal data.
pubsub.publish('PUT_UNIT', room)
return room
},
},
}
What is the problem? The subscribe event reaches the client normally when I publish, but the data is always null. I can't find the reason.
You only specified a subscribe function for onPutUnit, without specifying a resolve function. That means the field utilizes the default resolver.
The default resolver just looks for a property with the same name as the field on the parent object (the first parameter passed to the resolver) and returns that. If there is no property on the parent object with the same name as the field, then the field resolves to null. The parent object is the value the parent field resolved to. For example, if we have a query like this:
{
user {
name
}
}
whatever the resolver for user returns will be the parent value provided to the resolver for name (if user returns a Promise, it's whatever the Promise resolved to).
But what about user? It has no parent field because it's a root field. In this case, the resolver for user is passed the rootValue you set when initializing ApolloServer (or {} if you didn't).
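For reference, the default resolver is conceptually close to this simplified sketch (the real defaultFieldResolver in graphql-js also calls the property if it happens to be a function):

const defaultResolver = (parent: any, _args: any, _context: any, info: any) =>
  parent == null ? undefined : parent[info.fieldName]; // missing property => null in the response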
With subscriptions, this works a bit differently because whatever value you publish is actually passed to the resolver as the root value. That means you can take advantage of the default resolver by publishing an object with a property that matches the field name:
pubsub.publish('PUT_UNIT', { onPutUnit: ... })
If you don't do that, though, you'll need to provide a resolve function that transforms the payload you published. For example, if we do:
pubsub.publish('PUT_UNIT', 'FOOBAR')
Then our resolver map needs to look something like this:
const resolvers = {
Subscription: {
onPutUnit: {
subscribe: ...,
resolve: (root) => {
console.log(root) // 'FOOBAR'
// return whatever you want onPutUnit to resolve to
}
}
},
}
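Applied to the code in the question, a minimal sketch of the first option would be to publish under the subscription field name and read the code from that wrapper in the filter (assuming room.code exists, as the question indicates):

// putUnit.ts
pubsub.publish('PUT_UNIT', { onPutUnit: room })

// onPutUnit.ts
subscribe: withFilter(
  () => pubsub.asyncIterator(['PUT_UNIT']),
  (payload: any, variables: any) => payload.onPutUnit.code === variables.code
)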
Related
I have two databases that I need to interact with in my code. I have a simple function that takes an object and writes it to my PostgreSQL database using Prisma. I've tested the function with Postman, and it works perfectly, but when I try to execute it using a Jest mock (using the singleton pattern found in the Prisma unit testing guide), it returns undefined, indicating that it didn't interact with the database and create the new record. Here's my code:
/prisma/clinical-schema.prisma
generator client {
provider = "prisma-client-js"
output = "./generated/clinical"
}
datasource clinicalDatabase {
provider = "postgresql"
url = "postgresql://postgres:postgres#localhost:5432/clinical-data?schema=public"
}
model pcc_webhook_update {
id Int @id @default(autoincrement())
event_type String
organization_id Int
facility_id Int
patient_id Int
resource_id String?
webhook_date DateTime @default(now()) @clinicalDatabase.Timestamptz(6)
status pcc_webhook_update_status @default(pending)
status_changed_date DateTime? @clinicalDatabase.Timestamptz(6)
error_count Int @default(0)
@@unique([organization_id, facility_id, patient_id, resource_id, event_type, status])
}
enum pcc_webhook_update_status {
pending
processing
processed
error
}
/prisma/clinical-client.ts
import { PrismaClient } from './generated/clinical';
const prismaClinical = new PrismaClient();
export default prismaClinical;
/testing/prisma-clinical-mock.ts
import { PrismaClient } from '../prisma/generated/clinical';
import { mockDeep, mockReset, DeepMockProxy } from 'jest-mock-extended';
import prisma from '../prisma/clinical-client';
jest.mock('../prisma/clinical-client', () => ({
__esModule: true,
default: mockDeep<PrismaClient>()
}));
beforeEach(() => {
mockReset(prismaClinicalMock);
});
export const prismaClinicalMock = prisma as unknown as DeepMockProxy<PrismaClient>;
Everything up to this point follows the conventions outlined by the Prisma unit testing docs. The only modification I made was to make it database specific. Below is my function and tests. The request object in handle-pcc-webhooks.ts is a sample http request object, the body of which contains the webhook data I care about.
/functions/handle-pcc-webhooks/handler.ts
import prismaClinical from '../../../prisma/clinical-client';
import { pcc_webhook_update } from '../../../prisma/generated/clinical';
import { requestObject } from './handler.types';
export const handlePccWebhook = async (request: requestObject) => {
try {
const webhook = JSON.parse(request.body);
// if the webhook doesn't include a resource id array, set it to an array with an empty string to ensure processing and avoid violating
// the multi-column unique constraint on the table
const { resourceId: resourceIds = [''] } = webhook;
let records = [];
for (const resourceId of resourceIds) {
// update an existing record if one exists in the pending state, otherwise create a new entry
const record: pcc_webhook_update = await prismaClinical.pcc_webhook_update.upsert({
where: {
organization_id_facility_id_patient_id_resource_id_event_type_status: {
organization_id: webhook.orgId,
facility_id: webhook.facId,
patient_id: webhook.patientId,
resource_id: resourceId,
event_type: webhook.eventType,
status: 'pending'
}
},
update: {
webhook_date: new Date()
},
create: {
event_type: webhook.eventType,
organization_id: webhook.orgId,
facility_id: webhook.facId,
patient_id: webhook.patientId,
resource_id: resourceId,
status: 'pending' // not needed
}
});
records.push(record);
}
return records;
} catch (error) {
console.error(error);
}
};
/functions/handle-pcc-webhooks/handler.spec.ts
import fs from 'fs';
import path from 'path';
import MockDate from 'mockdate';
import { prismaClinicalMock } from '../../../testing/prisma-clinical-mock';
import { createAllergyAddRecord } from './__mocks__/allergy';
import { requestObject } from './handler.types';
import { handlePccWebhook } from './handler';
describe('allergy.add', () => {
let requestObject: requestObject;
let allergyAddRecord: any;
beforeAll(() => {
requestObject = getRequestObject('allergy.add');
});
beforeEach(() => {
MockDate.set(new Date('1/1/2022'));
allergyAddRecord = createAllergyAddRecord(new Date());
});
afterEach(() => {
MockDate.reset();
});
test('should create an allergy.add database entry', async() => {
prismaClinicalMock.pcc_webhook_update.create.mockResolvedValue(allergyAddRecord);
// this is where I would expect handlePccWebhook to return the newly created database
// record, but instead it returns undefined. If I run the function outside of this
// unit test, with the same input value, it functions perfectly
await expect(handlePccWebhook(requestObject)).resolves.toEqual([allergyAddRecord]);
});
});
// This just builds a request object with the current webhook being tested
function getRequestObject(webhookType: string) {
// read the contents of request object file as a buffer, then convert it to JSON
const rawRequestObject = fs.readFileSync(path.resolve(__dirname, '../../sample-data/handle-pcc-webhook-request.json'));
const requestObject: requestObject = JSON.parse(rawRequestObject.toString());
// read the contents of the webhook file as a buffer, then convert it to a string
const rawWebhook = fs.readFileSync(path.resolve(__dirname, `../../sample-data/${webhookType}.json`));
const webhookString = rawWebhook.toString();
// set the body of the request object to the contents of the target webhook
requestObject.body = webhookString;
return requestObject;
}
Finally, here is the result of running the unit test: the expectation fails because handlePccWebhook resolves to undefined instead of the expected record (screenshot omitted).
So after banging my head against the wall for a few hours, I figured out the issue. In my handler.spec.ts file, I had the following line:
prismaClinicalMock.pcc_webhook_update.create.mockResolvedValue(allergyAddRecord);
What that does is mock the value returned by any create calls made through Prisma. The issue is that my function uses upsert, which I wasn't explicitly mocking, so it returned undefined. I changed the above line to
prismaClinicalMock.pcc_webhook_update.upsert.mockResolvedValue(allergyAddRecord);
and it started working.
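For completeness, here is the relevant part of the spec with the corrected mock (same test as above, only the mocked method changed):

test('should create an allergy.add database entry', async () => {
  // Mock upsert (the method the handler actually calls), not create
  prismaClinicalMock.pcc_webhook_update.upsert.mockResolvedValue(allergyAddRecord);

  await expect(handlePccWebhook(requestObject)).resolves.toEqual([allergyAddRecord]);
});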
I am trying to update an item in AWS DynamoDB using AWS Amplify on top of Next.js.
My scenario is simple.
On page load, if a user exists and has not visited the page before, a new object is created with set values using SWR.
const fetchUserSite = async (owner, code) => {
try {
// Create site object if no site exists
if (userData == null) {
const siteInfo = {
id: uuidv4(),
code: parkCode,
owner: user?.username,
bookmarked: false,
visited: false,
}
await API.graphql({
query: createSite,
variables: {input: siteInfo},
authMode: 'AMAZON_COGNITO_USER_POOLS',
})
console.log(`${code} added for the first time`)
}
return userData || null
} catch (err) {
console.log('Site not added by user', data, err)
}
}
// Only call the fetchUserSite method if `user` exists
const {data} = useSWR(user ? [user?.username, parkCode] : null, fetchUserSite)
Currently, this works: the object is added to the database with the above attributes. HOWEVER, when I click a button to update this newly created object, I get an error: path: null, locations: [...], message: "Variable 'input' has coerced Null value for NonNull type 'ID!'"
This is my call to update the object when I click a button with the onClick handler "handleDBQuery".
const handleDBQuery = async () => {
await API.graphql({
query: updateSite,
variables: {
input: {
id: data?.id,
bookmarked: true,
owner: user?.username,
},
},
authMode: 'AMAZON_COGNITO_USER_POOLS',
})
console.log(`${name} Bookmarked`)
}
My hunch is that the updateSite query does not know about the createSite query on page load.
In short, how can I update an item after I just created it?
I looked into the code on the master branch and followed along as you described. I found that data?.id comes from a state variable that is set only before the call to createSite. I suggest you call setId again using the data returned from createSite.
Try this
const fetchUserSite = async (owner, code) => {
try {
// Create site object if no site exists
if (userData == null) {
const siteInfo = {
id: uuidv4(),
code: parkCode,
owner: user?.username,
bookmarked: false,
visited: false,
}
const { data: newData } = await API.graphql({
query: createSite,
variables: {input: siteInfo},
authMode: 'AMAZON_COGNITO_USER_POOLS',
});
setId(newData.id); // <====== here (or setId(siteInfo.id))
console.log(`${code} added for the first time`)
return newData; // <======= and this, maybe? (you may have to modify the graphql query to make it return the same item as in listSite)
}
return userData || null
} catch (err) {
console.log('Site not added by user', data, err)
}
}
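If the generated createSite document doesn't already return the fields you need, a hypothetical version that returns the created item might look like the following (the field names are taken from the siteInfo object above; the CreateSiteInput name is an assumption about the Amplify-generated schema):

export const createSite = /* GraphQL */ `
  mutation CreateSite($input: CreateSiteInput!) {
    createSite(input: $input) {
      id
      code
      owner
      bookmarked
      visited
    }
  }
`;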
I have a mutation to create a new card object, and I expect it to be added to the user interface after the update. The cache, the Apollo Chrome dev tools, and console logging all reflect the changes, but the UI does not update without a manual reload.
const [createCard, { loading, error }] = useMutation(CREATE_CARD, {
  update(cache, { data: { createCard } }) {
    let localData = cache.readQuery({
      query: CARDS_QUERY,
      variables: { id: deckId }
    });
    localData.deck.cards = [...localData.deck.cards, createCard];
    client.writeQuery({
      query: CARDS_QUERY,
      variables: { id: parseInt(localData.deck.id, 10) },
      data: { ...localData }
    });
  },
});
I have changed cache.writeQuery to client.writeQuery, but that didn't solve the problem.
For reference, here is the Query I am running...
const CARDS_QUERY = gql`
query CardsQuery($id: ID!) {
deck(id: $id) {
id
deckName
user {
id
}
cards {
id
front
back
pictureName
pictureUrl
createdAt
}
}
toggleDeleteSuccess @client
}
`;
I managed the same result without the cloneDeep method described in the answer below; just using the spread operator solved my problem.
const update = (cache, {data}) => {
const queryData = cache.readQuery({query: USER_QUERY})
const cartItemId = data.cartItem.id
queryData.me.cart = queryData.me.cart.filter(v => v.id !== cartItemId)
cache.writeQuery({query: USER_QUERY, data: {...queryData}})
}
Hope this helps someone else.
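Applied to the CARDS_QUERY from the question above, a sketch of this spread-based approach (building new objects at each level instead of mutating the cached result; deckId and createCard as in the question) could look like:

update(cache, { data: { createCard } }) {
  const localData = cache.readQuery({
    query: CARDS_QUERY,
    variables: { id: deckId }
  });
  // Build new object/array references instead of mutating the cached result,
  // so the cache broadcast sees a changed value and re-renders the UI.
  cache.writeQuery({
    query: CARDS_QUERY,
    variables: { id: deckId },
    data: {
      ...localData,
      deck: {
        ...localData.deck,
        cards: [...localData.deck.cards, createCard]
      }
    }
  });
}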
OK, I finally ran into a long GitHub thread discussing solutions for the same issue. The solution that ultimately worked for me was deep-cloning the data object (I used Lodash's cloneDeep); after passing the cloned, mutated data object to cache.writeQuery, the UI finally updated. It still seems like there ought to be a way to trigger the UI update without this, considering the cache already reflects the changes.
Here's the "after"; see my original question above for the "before":
const [createCard, { loading, error }] = useMutation(CREATE_CARD, {
update(cache, { data: { createCard } }) {
const localData = cloneDeep( // Lodash cloneDeep to make a fresh object
cache.readQuery({
query: CARDS_QUERY,
variables: { id: deckId }
})
);
localData.deck.cards = [...localData.deck.cards, createCard]; //Push the mutation to the object
cache.writeQuery({
query: CARDS_QUERY,
variables: { id: localData.deck.id },
data: { ...localData } // Cloning ultimately triggers the UI update since writeQuery now sees a new object.
});
},
});
Currently I've set up Apollo's web socket link like so:
const wsLink = new WebSocketLink({
uri: `ws://example.com/graphql?token=${getToken()}`,
options: {
reconnect: true,
connectionParams(): ConnectionParams {
return {
authToken: getToken(),
};
},
},
});
This works fine while the connection lasts, but fails when the connection needs to be re-established if the token in the query string has expired.
The way the infra I'm dealing with is set up requires this token to be set as a query param in the URI. How can I dynamically change the URI so that I may provide a new token when the connection needs to be re-established?
You can set the wsLink.subscriptionClient.url property manually (or create a new subscriptionClient instance?) in a setContext function (https://www.apollographql.com/docs/link/links/context/).
For example:
import { setContext } from 'apollo-link-context'
...
const wsLink = your code...
const authLink = setContext(() => {
wsLink.subscriptionClient.url = `ws://example.com/graphql?token=${getToken()}`
})
...
const config = {
link: ApolloLink.from([
authLink,
wsLink
]),
...
}
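A slightly fuller sketch of the same idea, assuming the getToken() helper from the question (subscriptionClient and its url may be typed as private, hence the cast):

import { ApolloLink } from 'apollo-link'
import { setContext } from 'apollo-link-context'
import { WebSocketLink } from 'apollo-link-ws'

const wsLink = new WebSocketLink({
  uri: `ws://example.com/graphql?token=${getToken()}`,
  options: { reconnect: true },
})

// Refresh the URL before each operation so the next reconnect uses a current token
const authLink = setContext(() => {
  (wsLink as any).subscriptionClient.url = `ws://example.com/graphql?token=${getToken()}`
  return {}
})

const link = ApolloLink.from([authLink, wsLink])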
I'm performing a query to get PowerMeter details, which contains another type called Project. I write the query this way:
query getPowerMeter($powerMeterId: ID!) {
powerMeter: powerMeter(powerMeterId: $powerMeterId) {
id
name
registry
project {
id
name
}
}
}
When I perform the query for the first time, project is successfully returned. The problem is that when I perform subsequent queries with the same parameters and default fetchPolicy (cache-first), project isn't returned anymore.
How may I solve this problem?
Also, I call readFragment to check how powerMeter is saved in the cache and the response shows that powerMeter has project saved.
const frag = client.readFragment({
fragment: gql`
fragment P on PowerMeter {
id
name
registry
project {
id
name
}
}
`,
id: 'PowerMeter:' + powerMeterId,
});
PowerMeter returned the first time:
{
"powerMeter":{
"id":"7168adb4-4198-443e-ab76-db0725be2b18",
"name":"asd123123",
"registry":"as23",
"project":{
"id":"41d8e71b-d1e9-41af-af96-5b4ae9e492c1",
"name":"ProjectName",
"__typename":"Project"
},
"__typename":"PowerMeter"
}
}
Fragment after calling the power meter query the first time:
{
"id":"7168adb4-4198-443e-ab76-db0725be2b18",
"name":"asd123123",
"registry":"as23",
"project":{
"id":"41d8e71b-d1e9-41af-af96-5b4ae9e492c1",
"name":"ProjectName",
"__typename":"Project"
},
"__typename":"PowerMeter"
}
PowerMeter returned the second time:
{
"powerMeter":{
"id":"7168adb4-4198-443e-ab76-db0725be2b18",
"name":"asd123123",
"registry":"as23",
"__typename":"PowerMeter"
}
}
Fragment after calling the power meter query the second time:
{
"id":"7168adb4-4198-443e-ab76-db0725be2b18",
"name":"asd123123",
"registry":"as23",
"project":{
"id":"41d8e71b-d1e9-41af-af96-5b4ae9e492c1",
"name":"ProjectName",
"__typename":"Project"
},
"__typename":"PowerMeter"
}
Edit 1: Fetching Query
The code below is how I'm fetching data. I'm using useApolloClient and not a query hook because I'm using AWS AppSync, and it doesn't support query hooks yet.
import { useApolloClient } from '@apollo/react-hooks';
import gql from 'graphql-tag';
import { useEffect, useState } from 'react';
export const getPowerMeterQuery = gql`
query getPowerMeter($powerMeterId: ID!) {
powerMeter: powerMeter(powerMeterId: $powerMeterId) {
id
name
registry
project {
id
name
}
}
}
`;
export const useGetPowerMeter = (powerMeterId?: string) => {
const client = useApolloClient();
const [state, setState] = useState<{
loading: boolean;
powerMeter?: PowerMeter;
error?: string;
}>({
loading: true,
});
useEffect(() => {
if (!powerMeterId) {
return setState({ loading: false });
}
client
.query<GetPowerMeterQueryResponse, GetPowerMeterQueryVariables>({
query: getPowerMeterQuery,
variables: {
powerMeterId,
},
})
.then(({ data, errors }) => {
if (errors) {
setState({ loading: false, error: errors[0].message });
}
console.log(JSON.stringify(data));
const frag = client.readFragment({
fragment: gql`
fragment P on PowerMeter {
id
name
registry
project {
id
name
}
}
`,
id: 'PowerMeter:' + powerMeterId,
});
console.log(JSON.stringify(frag));
setState({
loading: false,
powerMeter: data.powerMeter,
});
})
.catch(err => setState({ loading: false, error: err.message }));
}, [powerMeterId]);
return state;
};
Edit 2: Fetching Policy Details
When I use fetchPolicy equal to cache-first or network-only, the problem persists. When I use no-cache, the problem goes away.
I think this might have been the solution:
https://github.com/apollographql/apollo-client/issues/7050
Probably way too late, but it could help people coming to this issue in the future.
When using Apollo Client's InMemoryCache, it seems you need to provide a list of possible types so that fragment matching can be done correctly.
You can do that manually if you only have a few union types and a fairly stable API that doesn't change very often.
Or you can automatically generate these types into a JSON file, which you can use directly in the InMemoryCache's possibleTypes config.
See the official Apollo docs to find out how to do it.
Cheers.
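A minimal sketch of that configuration (Apollo Client 3 syntax; the Device/PowerMeter names below are placeholders, and in practice the map is usually generated from an introspection query into a possibleTypes.json file):

import { InMemoryCache } from '@apollo/client'

// Keys are interface/union names, values are the concrete types that implement them.
const cache = new InMemoryCache({
  possibleTypes: {
    Device: ['PowerMeter'], // placeholder names for illustration only
  },
})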