How to configure the Apollo cache to uniquely identify child elements based on their parent's primary key

What is the proper way to configure Apollo's cache normalization for child array fields that do not have an ID of their own but are unique within the structure of their parent?
Let's say we have the following schema:
type Query {
  clients: [Client!]!
}

type Client {
  clientId: ID
  name: String!
  events: [Events!]!
}

type Events {
  month: String!
  year: Int!
  day: Int!
  clients: [Client!]!
}
At first I thought I could use multiple keyFields to achieve a unique identifier, like this:
const createCache = () => new InMemoryCache({
  typePolicies: {
    Event: {
      keyFields: ['year', 'month', 'name'],
    },
  },
});
There will never be more than one event per day, so it's safe to say an event is unique for a given client based on its date. But the created cache entries lack a clientId in the cache key, so two events on the same date but for different clients cannot be distinguished.
Is there a proper way to configure typePolicies for this relationship?
For example, the key fields can be set to use a subfield:
const cache = new InMemoryCache({
  typePolicies: {
    Book: {
      keyFields: ["title", "author", ["name"]],
    },
  },
});
The Book type above uses a subfield as part of its primary key. The ["name"] item indicates that the name field of the previous field in the array (author) is part of the primary key. The Book's author field must be an object that includes a name field for this to be valid.
In my case, I'd like to use a parent field as part of the primary key.

If you can't add a unique event id, then the fallback is to disable normalization:
Objects that are not normalized are instead embedded within their parent object in the cache. You can't access these objects directly, but you can access them via their parent.
To do this you set keyFields to false:
const createCache = () => new InMemoryCache({
  typePolicies: {
    Event: {
      keyFields: false,
    },
  },
});
Essentially each Event object will be stored in the cache under its parent Client object.
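Because embedded events can only be reached through their parent, one way to read them back is via a fragment on the Client. A minimal sketch, assuming Client is normalized by its clientId (i.e. a Client: { keyFields: ['clientId'] } policy in the same typePolicies) and a hypothetical id 'client-1':

import { gql } from '@apollo/client';

// cache is the InMemoryCache instance created by createCache() above.
const clientWithEvents = cache.readFragment({
  // identify() builds the cache key from the Client's key fields
  id: cache.identify({ __typename: 'Client', clientId: 'client-1' }),
  fragment: gql`
    fragment ClientEvents on Client {
      clientId
      events {
        year
        month
        day
      }
    }
  `,
});
// clientWithEvents.events is the embedded array; since each Event lives only
// under its parent Client, same-date events for different clients no longer collide.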

Related

Trying to update DynamoDB item, what does this error mean? (I'm using Python/Boto3) [duplicate]

I'm trying to update an Item in my DynamoDB table Users. I have tried many different ways, but I always receive the same error message:
The provided key element does not match the schema
The creation of an Item works, as does a query, but not the update. When I check in DynamoDB, the user is created correctly:
{
  "email": "test@email.com",
  "password": "123",
  "registration": 1460136902241,
  "verified": false
}
Here is the table information:
Table name: Users
Primary partition key: email (String)
Primary sort key: registration (Number)
Here is the code (called from Lambda):
exports.handler = function(event, context)
{
    var AWS = require("aws-sdk");
    var docClient = new AWS.DynamoDB.DocumentClient();

    var params = {
        TableName: "Users",
        Item: {
            email: "test@email.com",
            password: "123",
            verified: false,
            registration: (new Date()).getTime(),
        }
    };

    // Create the user.
    docClient.put(params, function(err, data)
    {
        if (err)
        {
            context.fail("Put failed...");
            return;
        }

        var params = {
            TableName: "Users",
            Key: { email: "test@email.com" },
            AttributeUpdates: {
                verified: {
                    Action: "PUT",
                    Value: true
                }
            }
        };

        // Update the user.
        docClient.update(params, function(err, data)
        {
            if (err)
            {
                console.log(JSON.stringify(err));
                context.fail(JSON.stringify(err));
                return;
            }
            context.succeed("User successfully updated.");
        });
    });
};
Do you have any idea of what could be wrong in my code?
You are only providing half of your primary key. Your primary key is a combination of the partition key and range key. You need to include the range key in your Key attribute in the update parameters.
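As a sketch based on the question's own table (the registration value here is the one from the stored item shown above; in the Lambda it would be the timestamp generated during the put, kept in a variable):

var updateParams = {
    TableName: "Users",
    Key: {
        email: "test@email.com",        // partition key
        registration: 1460136902241     // sort key: must match the stored item exactly
    },
    AttributeUpdates: {
        verified: {
            Action: "PUT",
            Value: true
        }
    }
};
docClient.update(updateParams, function(err, data) {
    // handle err / data as in the original handler
});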
For others who have faced the same challenge and whose issue is not fixed by the answers above: it is always better to double-check the data type of the value being updated. In my case the primary key was expecting a Number and I was trying to update with a string. Silly me.
My issue was with the Node SDK for deletes, where the documentation says to provide the key in this format:
... {Key: {'id': {S: '123'}}} ...
which does not appear to work with aws-sdk ^2.1077.0. This seems to work:
... {Key: {'id': '123'}} ...
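A likely explanation (my assumption, not stated in the answer): the {S: '123'} shape is the low-level AWS.DynamoDB attribute-value format, while the DocumentClient marshals plain JavaScript values itself, so the typed form doesn't match there. Compare:

var AWS = require("aws-sdk");

// Low-level client: values are typed attribute objects.
new AWS.DynamoDB().deleteItem(
    { TableName: "MyTable", Key: { id: { S: "123" } } }, // "MyTable" is a placeholder
    function(err, data) { /* ... */ }
);

// DocumentClient: plain JavaScript values.
new AWS.DynamoDB.DocumentClient().delete(
    { TableName: "MyTable", Key: { id: "123" } },
    function(err, data) { /* ... */ }
);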
My checklist when facing this issue:
Check that the name and type of your key correspond to what you have in the database.
Use the corresponding attributes to make it explicit, e.g. use @DynamoDBHashKey(attributeName = "userId") in Java to indicate the partition key named userId.
Ensure that only one field or getter is marked as the partition key in your class.
Please add more in the comments if you know of others.
I was doing BatchGetItem, then streaming the result into BatchWriteItem (Delete). DeleteItem didn't like that it got all the attributes from the object instead of only the partition and sort key.
Gathering all the answers:
a mismatch in an attribute name
a mismatch in an attribute type
only half the key provided
unnecessary additional keys

How to handle related stores with Svelte

I have a store with a list of entities, and another store with an object that includes one of those entities.
I want changes in the first store to be reactively reflected on the second.
I'll provide a quick example with a list of items and a list of invoices:
export type Invoice = {
  id: string
  customer: string
  items: InvoiceItem[]
}

export type InvoiceItem = {
  id: string
  name: string
  price: number
}
Whenever the name or price of an invoice item is updated I'd like all the related Invoices to also be updated.
I created this very simple example (repl available here) but in order for the $invoices store to be updated I have to issue a $invoices = $invoices whenever the $items store changes.
Another more elegant way to do it is to subscribe to the items store and from there update the invoices store, like this:
items.subscribe(_ => invoices.update(data => data))
<script>
  import { writable } from 'svelte/store'

  let item1 = { id: 'item-01', name: 'Item number 01', price: 100 }
  let item2 = { id: 'item-02', name: 'Item number 02', price: 200 }
  let item3 = { id: 'item-03', name: 'Item number 03', price: 300 }

  let items = writable([item1, item2, item3])
  let invoices = writable([
    { id: 'invoice-0', customer: 'customer1', items: [item1, item3] }
  ])

  items.subscribe(_ => invoices.update(data => data)) // refresh invoices store whenever an item changes

  const updateItem1 = () => {
    $items[0].price = $items[0].price + 10
    // $invoices = $invoices // alternatively, manually tell the invoices store that something changed every time an item changes!!!
  }
</script>

<button on:click={updateItem1}>update item 1 price</button>
<hr />
<textarea rows="18">{JSON.stringify($invoices, null, 2)}</textarea>
<textarea rows="18">{JSON.stringify($items, null, 2)}</textarea>
Is this the best way to handle this kind of scenario?
Update: thanks to the great answers and comments I came up with this more complete example: see this repl.
I added some functionality that I hope will serve as a basis for similar common scenarios.
This is how my store api ended up:
// items.js
items.subscribe // read-only store
items.reset()
items.upsert(item) // updates the specified item, creates a new one if it doesn't exist

// invoices.js
invoices.subscribe // read-only store
invoices.add(invoiceId, customer, date) // adds a new invoice
invoices.addLine(invoiceId, itemId, quantity)
invoices.getInvoice(invoice) // get a derived store for that particular invoice
invoice.subscribe // read-only store
invoice.addLine(itemId, quantity)
A few highlights:
invoices now has a lines array, each line with an item and a quantity
invoices is a derived store that calculates the total for each line and for the whole invoice
items implements an upsert method
in order to update invoices whenever an item is modified, I run items.subscribe(() => set(_invoices))
I also created a derived store to get a specific invoice
The solution depends on whether you need the items independently (one item can be part of multiple invoices) or whether they can live inside the invoices. If they can be one big blob, I would create invoices as a store and provide methods to update specific invoices. The items store would then be derived from the invoices.
// invoices.ts
const _invoices = writable([]);

// public API of your invoices store
export const invoices = {
  subscribe: _invoices.subscribe,
  addItemToInvoice: (invoiceId, item) => {...},
  ...
};

// derived items:
const items = derived(invoices, $invoices => flattenAllInvoiceItems($invoices));
However, if they need to be separate (or if it is easier to handle item updates that way), then I would only store the IDs of the items in the invoice store and create a derived store which uses invoices + items to build the full invoices.
// items.ts
const _items = writable([]);

// public API of your items store
export const items = {
  subscribe: _items.subscribe,
  update: (item) => {...},
  ...
};

// invoices.ts
import { items } from './items';

const _invoices = writable([]);

// public API of your invoices store
export const invoices = {
  // Assuming you never want the underlying _invoices state available publicly
  subscribe: derived([_invoices, items], ([$invoices, $items]) => mergeItemsIntoInvoices($invoices, $items)).subscribe,
  addItemToInvoice: (invoiceId, item) => {...},
  ...
};
In both cases you can use invoices and items in your Svelte components as you like, interact with a nice public API, and the derived stores will ensure everything stays in sync.
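A minimal runnable sketch of the second variant, with the assumed helper mergeItemsIntoInvoices inlined as a map/find (store shapes follow the question's types; both stores are shown in one module for brevity):

import { writable, derived } from 'svelte/store';

const _items = writable([
  { id: 'item-01', name: 'Item number 01', price: 100 },
]);

export const items = {
  subscribe: _items.subscribe,
  // update an existing item or append a new one
  upsert: (item) => _items.update((list) =>
    list.some((i) => i.id === item.id)
      ? list.map((i) => (i.id === item.id ? { ...i, ...item } : i))
      : [...list, item]),
};

// invoices only hold item IDs plus a quantity per line
const _invoices = writable([
  { id: 'invoice-0', customer: 'customer1', lines: [{ itemId: 'item-01', quantity: 2 }] },
]);

// resolve each line's itemId against the current items whenever either store changes
const merged = derived([_invoices, _items], ([$invoices, $items]) =>
  $invoices.map((inv) => ({
    ...inv,
    lines: inv.lines.map((line) => ({
      ...line,
      item: $items.find((i) => i.id === line.itemId),
    })),
  })));

export const invoices = {
  subscribe: merged.subscribe,
  addLine: (invoiceId, itemId, quantity) => _invoices.update((list) =>
    list.map((inv) =>
      inv.id === invoiceId
        ? { ...inv, lines: [...inv.lines, { itemId, quantity }] }
        : inv)),
};

Calling items.upsert({ id: 'item-01', price: 150 }) now flows into every subscriber of invoices automatically, because the public subscribe is derived from both writables.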
You can use a derived store like this:
let pipe = derived([invoices, items], ([$invoices, $items]) => {
  return $invoices;
})

So $pipe will return an updated invoice if the invoice was changed. $pipe will be triggered by both stores ($items and $invoices) but only produces a result if the invoice was changed. So $pipe will not produce a result when an item changes that is not part of the invoice.
Update: I expected no result from $pipe when $invoices does not change, as is the case for a writable store. But a derived store callback will always run when $invoices or $items changes. So we have to check whether $invoices changed and call set only when we have a change.
let cache = "";
let pipe = derived([invoices, items], ([$invoices, $items], set) => {
  if (JSON.stringify($invoices) !== cache) {
    cache = JSON.stringify($invoices);
    set($invoices);
  }
}, {})

Amplify AppSync: custom sorting and filtering with pagination

I'm trying to write a schema so that I can query models filtered by multiple keys, sorted by a custom key and paginated.
An example of my model:
type Article {
  id: ID!
  category: String!
  area: String!
  publishOn: AWSDate!
}
And an example of the query I would like to do is: retrieve all the Articles which are part of both a given category AND area, returned in descending order by publishOn in chunks of 10 items each (to implement pagination server-side, and have a lightweight UI).
The response should also include the nextToken attribute, which can be used to load the "next" page of the filtered articles list.
I have multiple problems with what I can do with the automatically generated schema, and I can't find a way to implement a solution manually that does everything I want. Here is a list of what goes wrong:
Filtering
Let's say I want to query 10 articles that belong to the category "Holiday":
listArticles(filter: { category: { eq: "Holiday" } }, limit: 10)
I won't get the first 10 articles that match that category; instead, it seems that AppSync selects the first 10 items in the table and then filters those 10 items by the filter criteria.
In other words, the sequence in which filtering and limiting are applied is the opposite of what I expected. Expected: first filter the table by the filter criteria, then return the first 10 items of the filtered result set.
Sorting
I couldn't find a way to add sorting with AppSync, so I added @searchable:
type Article @searchable {
  id: ID!
  category: String!
  area: String!
  publishOn: AWSDate!
}
Now if I sort by date, that key will be used as the nextToken and break the pagination. This is a known issue: https://github.com/aws-amplify/amplify-cli/issues/4434
Do you have any good tips on how to work around these bugs? I dug into the documentation and a couple of issues, but didn't come up with a solution that works well...
Thanks in advance,
Matteo
Filtering
You will need a Global Secondary Index in DynamoDB to achieve such behaviour. You can create one with the @key directive. In your case I would create a composite key consisting of category as the partition key, with area and publishOn as the sort key(s).
type Article
  @model
  @key(fields: ["id"])
  @key(name: "byCategory", fields: ["category", "publishOn"])
  @key(name: "byCategoryArea", fields: ["category", "area", "publishOn"])
{
  id: ID!
  category: String!
  area: String!
  publishOn: AWSDate!
}
Sorting
Sorting is done with the sortDirection argument, which is either DESC or ASC, and it can only be applied to the sort key.
The @searchable directive enables Elasticsearch on the table, which is a full-text search engine and probably a bit pricey for small applications; it wouldn't be required here unless you want to query based on e.g. the article's description text.
listArticles(filter: {category: {eq: "Holiday"} }, limit: 10, sortDirection: DESC)
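To expose the GSI as its own query, the @key directive also accepts a queryField argument; the query name below is a hypothetical choice:

type Article
  @model
  @key(fields: ["id"])
  @key(name: "byCategory", fields: ["category", "publishOn"], queryField: "articlesByCategory")
{
  id: ID!
  category: String!
  area: String!
  publishOn: AWSDate!
}

The generated query then hits the index directly, so the limit applies to the already-filtered set, and the returned nextToken can be fed back in for the next page:

query {
  articlesByCategory(category: "Holiday", sortDirection: DESC, limit: 10) {
    items { id category area publishOn }
    nextToken
  }
}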
Amplify AppSync: filtering with pagination
let allClubsList = async (sport) => {
  try {
    let clubsList;
    let clubsInfoList = [];
    let nextTokenInfo = null;
    do {
      let clubs = await client.query({
        query: gql(clubBySportStatus),
        variables: {
          // limit and nextToken are GraphQL variables, so they belong inside `variables`
          sport: sport,
          status: { eq: "ACTIVE" },
          limit: 100,
          nextToken: nextTokenInfo,
        },
        fetchPolicy: "network-only",
      });
      clubsList = clubs.data.clubBySportStatus.items;
      clubsList.forEach((item) => clubsInfoList.push(item));
      nextTokenInfo = clubs.data.clubBySportStatus.nextToken;
    } while (Boolean(nextTokenInfo));
    if (clubsInfoList && clubsInfoList.length) {
      return {
        success: true,
        data: clubsInfoList,
      };
    }
    return { success: true, data: [] };
  } catch (eX) {
    console.error(`Error in allClubsList: ${JSON.stringify(eX)}`);
    return {
      success: false,
      message: eX.message,
    };
  }
};

TypeORM: How to set ForeignKey explicitly without having property for loading relations?

I don't want to create a property for loading the relation into it (as shown in all the examples). The only thing I need is an explicit foreign key property, so that the migration will be able to create the appropriate constraint for it in the database. The closest decorator to the one I need is @RelationId, but it still requires the presence of a property of the related class.
For clarity let's take the example from the documentation:
@Entity()
export class Post {
  @ManyToOne(type => Category)
  category: Category;

  @RelationId((post: Post) => post.category) // it still requires the presence of the `category` property
  categoryId: number;
}
I don't need the category property here. I want to have the categoryId property and mark it as a foreign key to Category.Id. It should look like this:
@Entity()
export class Post {
  @ForeignKey((category: Category) => category.Id) // it's a foreign key to Category.Id
  categoryId: number;
}
Is it possible?
"I need is to have an explicit foreign key property"...
No, you could not. TypeOrm will automatically create foreign key property when you use #ManyToOne decorator. Just combine #ManyToOne and #JoinColumn decorators together like this:
@ManyToOne(type => Category)
@JoinColumn({ name: 'custom_field_name_if_you_want' })
category: Category;
Alternatively, you can write your own migration and use it like this:
const queryRunner = connection.createQueryRunner();

await queryRunner.createTable(new Table({
    name: "question",
    columns: [
        {
            name: "id",
            type: "int",
            isPrimary: true
        },
        {
            name: "name",
            type: "varchar",
        }
    ]
}), true);

await queryRunner.createTable(new Table({
    name: "answer",
    columns: [
        {
            name: "id",
            type: "int",
            isPrimary: true
        },
        {
            name: "name",
            type: "varchar",
        },
        {
            name: "questionId",
            isUnique: connection.driver instanceof CockroachDriver, // CockroachDB requires UNIQUE constraints on referenced columns
            type: "int",
        }
    ]
}), true);

// clear sqls in memory to avoid removing tables when down queries executed.
queryRunner.clearSqlMemory();

const foreignKey = new TableForeignKey({
    columnNames: ["questionId"],
    referencedColumnNames: ["id"],
    referencedTableName: "question",
    onDelete: "CASCADE"
});
await queryRunner.createForeignKey("answer", foreignKey);
This code snippet is extracted from TypeORM's functional tests, and I think you can use it to create your own constraint on the database.
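Wrapped in an actual migration class, the same call would look roughly like this (class name and timestamp are placeholders):

import { MigrationInterface, QueryRunner, TableForeignKey } from "typeorm";

export class AddAnswerQuestionFk1700000000000 implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<void> {
        // creates the constraint on the existing questionId column;
        // no relation property on the entity is required
        await queryRunner.createForeignKey("answer", new TableForeignKey({
            columnNames: ["questionId"],
            referencedColumnNames: ["id"],
            referencedTableName: "question",
            onDelete: "CASCADE"
        }));
    }

    public async down(queryRunner: QueryRunner): Promise<void> {
        const table = await queryRunner.getTable("answer");
        const foreignKey = table?.foreignKeys.find((fk) => fk.columnNames.indexOf("questionId") !== -1);
        if (foreignKey) {
            await queryRunner.dropForeignKey("answer", foreignKey);
        }
    }
}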
It's actually possible to do so:
@Entity()
export class Post {
  // this will add categoryId
  @ManyToOne(type => Category)
  category: Category;

  // and you can use this for accessing post.categoryId
  // only columns you mark with the @Column decorator are mapped to database columns
  // Ref: https://typeorm.io/#/entities
  categoryId: number;
}
The added categoryId won't be mapped to a column, and it can then be used for setting the id explicitly or for accessing its value, as in:
post.categoryId = 1;
// or
const id = post.categoryId
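A related pattern that is common in TypeORM code (an addition of mine, not part of the answer above) is to map the foreign key column explicitly with @Column and bind the relation to it via @JoinColumn; then categoryId is also hydrated by find():

import { Entity, PrimaryGeneratedColumn, Column, ManyToOne, JoinColumn } from "typeorm";
import { Category } from "./Category";

@Entity()
export class Post {
    @PrimaryGeneratedColumn()
    id: number;

    // the relation and the explicit column share the same database column
    @ManyToOne(() => Category)
    @JoinColumn({ name: "categoryId" })
    category: Category;

    @Column()
    categoryId: number;
}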
I encountered the same problem recently.
I still use the entity, but only with the primary key value of the referenced entity; i.e. I do not query the database for the referenced entity.
Suppose your category entity looks like this:
@Entity()
export class Category {
  @PrimaryGeneratedColumn()
  id: number;

  // ... other stuff
}
Now, using your code as an example, directly assigning the relation using a foreign key value looks like this:
// You wish to assign category #12 to a certain post
post.category = { id: 12 } as Category
