Record update timestamp not being updated with DynamoDB Enhanced Client

My Kotlin Spring Boot application uses the DynamoDB Enhanced Client to persist a record. The object has createdAt and updatedAt fields which I want DynamoDB to set automatically whenever the record is created or modified, so I annotated both timestamp fields with @get:DynamoDbAutoGeneratedTimestampAttribute.
While both fields are set on create, when the record is updated the updatedAt field is never modified to reflect the latest timestamp, even though its update behavior is set to WRITE_ALWAYS. What am I missing here?
@DynamoDbBean
data class Record(
    @get:DynamoDbPartitionKey
    var uuid: String = UUID.randomUUID().toString(),
    @get:DynamoDbUpdateBehavior(UpdateBehavior.WRITE_IF_NOT_EXISTS)
    @get:DynamoDbAutoGeneratedTimestampAttribute
    var createdAt: Instant? = null,
    @get:DynamoDbUpdateBehavior(UpdateBehavior.WRITE_ALWAYS)
    @get:DynamoDbAutoGeneratedTimestampAttribute
    var updatedAt: Instant? = createdAt
)
@Bean
fun dynamoDbEnhancedClient(): DynamoDbEnhancedClient =
    DynamoDbEnhancedClient.builder()
        .dynamoDbClient(dynamoDbClient())
        .extensions(AutoGeneratedTimestampRecordExtension.create())
        .build()


Filtering List Query By Another Table's Field (a.k.a Cross-Table or Nested Filtering) in AWS Amplify GraphQL DynamoDB

Which Category is your question related to?
DynamoDB, AppSync(GraphQL)
Amplify CLI Version
4.50.2
Provide additional details e.g. code snippets
BACKGROUND:
I'm new to AWS serverless app development, and as a frontend dev I'm quite enjoying it thanks to the auto-generated APIs, tables, connections, resolvers, etc. I'm using Angular/Ionic on the frontend and S3, DynamoDB, AppSync, Cognito, and the Amplify CLI for the backend.
WHAT I HAVE:
Here is part of my schema. I can easily use the auto-generated APIs to List/Get Feedbacks with additional filters (e.g. score: { ge: 3 }), and thanks to the @connection I can see the User's details in the listed Feedback items.
type User @model @auth(rules: [{ allow: owner }]) {
  id: ID!
  email: String!
  name: String!
  region: String!
  sector: String!
  companyType: String!
}

type Feedback @model @auth(rules: [{ allow: owner }]) {
  id: ID!
  user: User @connection
  score: Int!
  content: String
}
WHAT I WANT:
I want to list Feedbacks based on several fields of the User type, such as the user's region (e.g. user.region: { contains: 'United States' }). I searched quite a lot for a solution (e.g. #2311), and I learned that amplify codegen only creates top-level filtering. In order to use cross-table filtering, I believe I need to modify resolvers, Lambda functions, queries, and inputs, which, for a beginner, looks quite complex.
WHAT I TRIED/CONSIDERED:
I tried listing all Users and Feedbacks separately and filtering them on the frontend. But then the client downloads all this unnecessary data, and because of the pagination limit the user experience takes a hit: users see an empty list and repeatedly need to click the Load More button.
Thanks to some suggestions, I also considered duplicating the User details in the Feedback table so I could search/filter on them. The problem is that if a User updates his/her info, the duplicated values go stale. There would also be a lot of duplicated data, since I need this feature for other tables too.
I also heard about using ElasticSearch for this problem, but someone mentioned a $30 monthly cost for simple filtering, so I got cold feet.
I tried adding custom filtering to the resolver, but I found that quite complex for a beginner. I will also need this cross-table filtering on many other tables, so I think it would be hard to manage. If that is the best practice, I'd appreciate it if someone could guide me through it.
QUESTIONS:
What would be the easiest, most beginner-friendly way to achieve this cross-table filtering? I am open to alternative solutions.
Is this cross-table filtering a bad approach for a NoSQL setup, since I need some relationship between two tables? (I thought @connection would be enough.) Should I switch to an SQL setup before it is too late?
Is it possible for Amplify to auto-generate a solution for this in the future? I feel like many people are experiencing the same issue.
Thank you in advance.
Amplify, and really DynamoDB in general, requires you to think about your access patterns ahead of time. There is a lot of really good information out there to guide you through this thought process. In particular, I like Nader Dabit's https://dev.to/dabit3/data-modeling-in-depth-with-graphql-aws-amplify-17-data-access-patterns-4meh
At first glance, I think I would add a new @key called byCountry to the User model, which will create a new Global Secondary Index on that property for you in DDB and will give you some new query methods as well. Check out https://docs.amplify.aws/cli/graphql-transformer/key#designing-data-models-using-key for more examples.
Once you have User.getByCountry in place, you should then be able to also bring back each user's Feedbacks.
query USAUsersWithFeedbacks {
  listUsersByCountry(country: "USA") {
    items {
      feedbacks {
        items {
          content
        }
        nextToken
      }
    }
    nextToken
  }
}
Finally, you can use JavaScript to keep fetching while nextToken is not null. You will be able to reuse this function for each country you are interested in, and you can extend this example to other properties by adding additional @keys.
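That fetch-until-null loop might look like the following sketch, where `queryPage` is a hypothetical stand-in for the generated listUsersByCountry call (it is not part of the Amplify-generated API):

```javascript
// Keep requesting pages until the server stops returning a nextToken.
// `queryPage` takes the previous page's token and resolves to
// { items, nextToken }, mirroring the shape of the generated response.
async function fetchAllPages(queryPage) {
  const items = [];
  let nextToken = null;
  do {
    const page = await queryPage(nextToken);
    items.push(...page.items);
    nextToken = page.nextToken;
  } while (nextToken != null);
  return items;
}
```

The same helper can be reused per country by closing over the country argument inside `queryPage`.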
My former answer can still be useful for others in specific scenarios, but I found a better way to achieve nested filtering when I realized you can filter nested items in custom queries.
Schema:
type User @model {
  id: ID!
  email: String!
  name: String!
  region: String!
  sector: String!
  companyType: String!
  feedbacks: [Feedback] @connection # <-- User has many feedbacks
}
Custom query:
query ListUserWithFeedback(
  $filter: ModelUserFilterInput # <-- filter Users by region or any other User field
  $limit: Int
  $nextToken: String
  $filterFeedback: ModelFeedbackFilterInput # <-- filter inner Feedbacks by Feedback fields
  $nextTokenFeedback: String
) {
  listUsers(filter: $filter, limit: $limit, nextToken: $nextToken) {
    items {
      id
      email
      name
      region
      sector
      companyType
      feedbacks(filter: $filterFeedback, nextToken: $nextTokenFeedback) {
        items {
          content
          createdAt
          id
          score
        }
        nextToken
      }
      createdAt
      updatedAt
    }
    nextToken
  }
}
$filter can be something like:
{ region: { contains: 'Turkey' } }
$filterFeedback can be like:
{
  and: [{ content: { contains: 'hello' }, score: { ge: 4 } }]
}
This way both Users and Feedbacks can be filtered at the same time.
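For illustration, the two filters might be assembled in JavaScript like this (a sketch; the variable names follow the custom query above, but `buildVariables` itself is hypothetical):

```javascript
// Build the variables object for the custom query: the outer filter
// narrows Users, the inner one narrows each user's feedbacks.
function buildVariables(region, minScore) {
  return {
    filter: { region: { contains: region } },
    limit: 20, // page size for Users; an arbitrary example value
    filterFeedback: { score: { ge: minScore } },
  };
}
```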
OK, thanks to @alex's answers I implemented the following. The idea is that instead of listing Feedbacks and trying to filter them by User fields, we list Users and collect their Feedbacks from the response:
Updated schema.graphql as follows:
type User
  @model
  @auth(rules: [{ allow: owner }])
  @key(name: "byRegion", fields: ["region"], queryField: "userByRegion") # <-- added byRegion key
{
  id: ID!
  email: String!
  name: String!
  region: String!
  sector: String!
  companyType: String!
  feedbacks: [Feedback] @connection # <-- added feedbacks connection
}
Added the userFeedbacksId parameter when calling CreateFeedback, so the feedbacks appear when listing Users.
Added a custom query UserByRegionWithFeedback under src/graphql/custom-queries.graphql and ran amplify codegen to build it:
query UserByRegionWithFeedback(
  $region: String
  $sortDirection: ModelSortDirection
  $filter: ModelUserFilterInput
  $limit: Int
  $nextToken: String # <-- nextToken for getting more Users
  $nextTokenFeedback: String # <-- nextToken for getting more Feedbacks
) {
  userByRegion(
    region: $region
    sortDirection: $sortDirection
    filter: $filter
    limit: $limit
    nextToken: $nextToken
  ) {
    items {
      id
      email
      name
      region
      sector
      companyType
      feedbacks(nextToken: $nextTokenFeedback) {
        items {
          content
          createdAt
          id
          score
        }
        nextToken
      }
      createdAt
      updatedAt
      owner
    }
    nextToken
  }
}
Now I call this API like the following:
nextToken = {
  user: null,
  feedback: null
};
feedbacks: any;

async listFeedbacks() {
  try {
    const res = await this.api.UserByRegionWithFeedback(
      'Turkey', // <-- region: filter Users by their region, I will add a UI input later
      null, // <-- sortDirection
      null, // <-- filter
      null, // <-- limit
      this.nextToken.feedback == null ? this.nextToken.user : null, // <-- User nextToken: only sent if the Feedback nextToken is null
      this.nextToken.feedback // <-- Feedback nextToken
    );
    // Get the User nextToken
    this.nextToken.user = res.nextToken;
    // Initialize the Feedback nextToken as null
    this.nextToken.feedback = null;
    // Loop over the Users in the response
    res.items.map((user) => {
      // Get the Feedback nextToken from the User if it is not null (or else the last User in the list could overwrite it)
      if (user.feedbacks.nextToken) {
        this.nextToken.feedback = user.feedbacks.nextToken;
      }
      // Push the feedback items into the list to display in the UI
      this.feedbacks.push(...user.feedbacks.items);
    });
  } catch (error) {
    this.handleError.show(error);
  }
}
Lastly, I added a Load More button in the UI which calls the listFeedbacks() function. If there is any Feedback nextToken, I send it to the API. (Note that multiple users' feedbacks can each have a nextToken.)
If all feedbacks have been fetched and there is a User nextToken, I send that to the API and repeat the process for new Users.
I believe this could be much simpler with an SQL setup, but it will work for now. I hope it helps others in my situation, and if there are any ideas to make this better, I'm all ears.
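The token bookkeeping above (drain Feedback pages first, then advance to the next page of Users) can be captured in a small pure helper; the names here are hypothetical, not from the generated API:

```javascript
// Given the last-seen tokens, decide what to send on the next "Load More":
// while a feedback token remains, keep paging feedbacks for the current
// users; only when it is exhausted do we advance to the next page of users.
function nextRequestTokens(tokens) {
  if (tokens.feedback != null) {
    return { userToken: null, feedbackToken: tokens.feedback };
  }
  return { userToken: tokens.user, feedbackToken: null };
}
```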

Store error: the application attempted to write an object with no provided id but the store already contains an id of <connectionName> for this object

Note: I'm quite new to GraphQL. I've seen other Stack Overflow questions reporting this error, but they were using Apollo; here I am using AWS Amplify and AppSync's own GraphQL client, so I couldn't use those solutions.
tl;dr: I'm trying to fetch a list of items from the DB, but I keep getting a cryptic network error and a store error that I don't understand. Details:
This is my client definition for the AWS AppSync client:
export const client = new AWSAppSyncClient({
  url: awsconfig.aws_appsync_graphqlEndpoint,
  region: awsconfig.aws_appsync_region,
  auth: {
    type: awsconfig.aws_appsync_authenticationType,
    jwtToken: async () =>
      (await Auth.currentSession()).getAccessToken().getJwtToken(),
  },
});
This is my query method:-
listInstitutions = () => {
client
.query({
query: queries.ListInstitutions,
})
.then((res: any) => {
this.institutions = res.data.listInstitutions.items;
console.log('this.institutions', this.institutions);
})
.catch((err) => {
console.error(err);
this.institutions = [];
});
};
This is my query definition:
query ListInstitutions(
  $filter: ModelInstitutionFilterInput
  $limit: Int
  $nextToken: String
) {
  listInstitutions(filter: $filter, limit: $limit, nextToken: $nextToken) {
    items {
      id
      name
      location
      city
      website
      phone
      logo
      bio
      admins {
        id
        name
        title
        bio
        createdAt
        updatedAt
        owner
      }
      classes {
        nextToken
      }
      learners {
        nextToken
      }
      createdAt
      updatedAt
    }
    nextToken
  }
}
The error in the console looks like this:
Error: Network error: Error writing result to store for query:
query ListInstitutions($filter: ModelInstitutionFilterInput, $limit: Int, $nextToken: String) {
listInstitutions(filter: $filter, limit: $limit, nextToken: $nextToken) {
items {
id
name
location
city
website
phone
logo
bio
admins {
id
name
title
bio
createdAt
updatedAt
owner
__typename
}
classes {
nextToken
__typename
}
learners {
nextToken
__typename
}
createdAt
updatedAt
__typename
}
nextToken
__typename
}
}
Store error: the application attempted to write an object with no provided id but the store already contains an id of ModelInstitutionConnection:undefined for this object. The selectionSet that was trying to be written is:
listInstitutions(filter: $filter, limit: $limit, nextToken: $nextToken) {
items {
id
name
location
city
website
phone
logo
bio
admins {
id
name
title
bio
createdAt
updatedAt
owner
__typename
}
classes {
nextToken
__typename
}
learners {
nextToken
__typename
}
createdAt
updatedAt
__typename
}
nextToken
__typename
}
at new ApolloError (ApolloError.js:37)
at QueryManager.js:326
at QueryManager.js:698
at Array.forEach (<anonymous>)
at QueryManager.js:697
at Map.forEach (<anonymous>)
at QueryManager.push.lq9a.QueryManager.broadcastQueries (QueryManager.js:692)
at QueryManager.js:275
at ZoneDelegate.invoke (zone-evergreen.js:372)
at Object.onInvoke (core.js:28510)
I have to note that this error vanishes when I add the cacheOptions configuration to the AWS AppSync client definition, like so:
export const client = new AWSAppSyncClient({
  url: awsconfig.aws_appsync_graphqlEndpoint,
  region: awsconfig.aws_appsync_region,
  auth: {
    type: awsconfig.aws_appsync_authenticationType,
    jwtToken: async () =>
      (await Auth.currentSession()).getAccessToken().getJwtToken(),
  },
  cacheOptions: {
    dataIdFromObject: (obj: any) => `${obj.__typename}:${obj.myKey}`,
  },
});
But even though the error goes away, it doesn't actually fetch the items from DynamoDB; it always returns an empty array.
I don't know why I'm getting this kind of error, even though all of my GraphQL code is auto-generated by the Amplify CLI and I'm following the approach shown in the documentation.
I just want the query to fetch the items from the database. What should I do?
I figured it out. The issue was in the GraphQL schema definition, where I had set the @auth parameter to only allow a certain admin to access the list; that's why I was getting back an empty array. I suppose this particular store error was caused by me not having the cacheOptions in the client definition. The cacheOptions specified at the end of the question will fix this issue.
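For reference, a dataIdFromObject along these lines (a sketch, not AppSync's documented default; obj.myKey in the snippet above is presumably a placeholder for your real key field) keys objects by __typename plus id and declines to normalize objects that have neither, which is what connection types like ModelInstitutionConnection need:

```javascript
// Compute a cache key from __typename and id; returning null tells the
// cache not to normalize the object (it stays embedded in its parent),
// which avoids the "ModelInstitutionConnection:undefined" collision.
function dataIdFromObject(obj) {
  if (obj && obj.__typename && obj.id != null) {
    return obj.__typename + ':' + obj.id;
  }
  return null;
}
```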

How to configure apollo cache to uniquely identify a child elements based on their parent primary key

What is the proper way to configure Apollo's cache normalization for child array fields that do not have an ID of their own but are unique within the structure of their parent?
Let's say we have the following schema:
type Query {
  clients: [Client!]!
}

type Client {
  clientId: ID
  name: String!
  events: [Events!]!
}

type Events {
  month: String!
  year: Int!
  day: Int!
  clients: [Client!]!
}
At first I thought I could use multiple keyFields to achieve a unique identifier, like this:
const createCache = () => new InMemoryCache({
  typePolicies: {
    Event: {
      keyFields: ['year', 'month', 'name'],
    },
  },
});
There would never be more than one event per day, so it's safe to say that an event is unique for a client based on its date.
But the created cache entries lack a clientId (in the cache key), so two events on the same date but for different clients cannot be distinguished.
Is there a proper way to configure typePolicies for this relationship?
For example the key field can be set to use a subfield:
const cache = new InMemoryCache({
  typePolicies: {
    Book: {
      keyFields: ["title", "author", ["name"]],
    },
  },
});
The Book type above uses a subfield as part of its primary key. The ["name"] item indicates that the name field of the previous field in the array (author) is part of the primary key. The Book's author field must be an object that includes a name field for this to be valid.
In my case I'd like to use a parent field as part of the primary key
If you can't add a unique event id, then the fallback is to disable normalization:
Objects that are not normalized are instead embedded within their parent object in the cache. You can't access these objects directly, but you can access them via their parent.
To do this you set keyFields to false:
const createCache = () => new InMemoryCache({
  typePolicies: {
    Event: {
      keyFields: false,
    },
  },
});
Essentially each Event object will be stored in the cache under its parent Client object.
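If you do control the data layer, another option (an assumption on my part, not from the Apollo docs) is to synthesize a unique event id from the parent's key before the result reaches the cache, so the events can be normalized after all:

```javascript
// Attach a synthetic id to each event, derived from the parent client's
// key plus the event date, so the cache can normalize events safely.
function withEventIds(client) {
  return {
    ...client,
    events: client.events.map((ev) => ({
      ...ev,
      id: client.clientId + ':' + ev.year + '-' + ev.month + '-' + ev.day,
    })),
  };
}
```

You would map query results through this helper (e.g. in a response transform) and then use the synthesized id as a normal keyFields entry.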

AWS AppSync only returns 10 items on query on connection

I'm new to AppSync and trying to see how this works and what's the proper way to set this up.
I created a schema.graphql that looks like the below.
type User @model {
  id: String!
  following: [String]
  follower: [String]
  journals: [Journal] @connection(name: "UserJournals", sortField: "createdAt")
  notifications: [Notification] @connection(name: "UserNotifications", sortField: "createdAt")
}

type Journal @model {
  id: ID!
  author: User! @connection(name: "UserJournals")
  privacy: String!
  content: AWSJSON!
  loved: [String]
  createdAt: String
  updatedAt: String
}
and AppSync automatically created this queries.js.
export const getUser = `query GetUser($id: ID!) {
  getUser(id: $id) {
    id
    following
    follower
    journals {
      items {
        id
        privacy
        content
        loved
        createdAt
        updatedAt
      }
      nextToken
    }
    notifications {
      items {
        id
        content
        category
        link
        createdAt
      }
      nextToken
    }
  }
}
`;
I noticed that querying getUser only returns 10 journal items, and I'm not sure how to raise that above 10, or what the proper way is to query and load more journals beyond the 10 items returned by getUser.
Since you do not pass the limit argument explicitly in your query, the Request Mapping Template of the journals resolver defaults it to 10 items. If you would like to change this default value, go to your schema page on the AppSync console, navigate to the journals field, found under the Resolvers section of the schema page. This will then show the resolver definition for this field, and you can then update the default value of 10 to anything you like. Alternatively, you can pass this as your query argument.
FYI - This default value is defined in the amplify-cli repo on GitHub and can be found here.
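Passing the limit as a query argument might look like the following sketch of a hand-edited query; whether the nested journals connection field accepts a limit argument depends on your generated schema, so treat $journalLimit as an assumption:

```javascript
// A hand-edited variant of the generated query that exposes the nested
// connection's limit argument instead of relying on the default of 10.
const getUserWithJournals = /* GraphQL */ `
  query GetUser($id: ID!, $journalLimit: Int) {
    getUser(id: $id) {
      id
      journals(limit: $journalLimit) {
        items {
          id
          content
          createdAt
        }
        nextToken
      }
    }
  }
`;
```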

AWS Lambda update DynamoDB item

I have a DynamoDB table called Wishlist, and an existing DynamoDB Item which I'm calling "monitor".
I am trying to write a Lambda function that updates the "monitor" item as follows:
takes the user's login ID, appends @gmail.com to it, and writes it to a new email attribute
writes a timestamp to the item
Here is my code:
console.log('Loading function');

var doc = require('dynamodb-doc');
var db = new doc.DynamoDB();

exports.handler = function (event, context) {
  var username = event.username;
  var email = event.username + "@gmail.com";
  console.log(username + "," + email);
  var tableName = "WishList";
  var item = {
    "username": username,
    "email": email,
  };
  var params = {
    TableName: tableName,
    Item: item,
  };
  console.log(params);
  db.putItem(params, function (err, data) {
    if (err) console.log(err);
    else console.log(data);
  });
};
How do I read the existing "monitor" item so that I can update it with putItem?
If I understand your question, you need to:
get the existing item by its key using getItem
modify the returned item
put the modified item using putItem
Alternatively, you can simply use updateItem which will edit an existing item's attributes, or add a new item to the table if it does not already exist.
You can see sample code here.
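As a sketch of that updateItem route (parameter shapes only; the table and key names follow the question, while the updatedAt attribute name is my assumption), the parameters for the AWS SDK DocumentClient's update call could be built like this:

```javascript
// Build UpdateItem parameters that set the email and a timestamp on the
// existing "monitor" item in a single call, with no read-modify-write.
function buildUpdateParams(username) {
  return {
    TableName: 'WishList',
    Key: { username: 'monitor' },
    UpdateExpression: 'SET email = :email, updatedAt = :now',
    ExpressionAttributeValues: {
      ':email': username + '@gmail.com',
      ':now': new Date().toISOString(),
    },
  };
}
```

These params would then be passed to new AWS.DynamoDB.DocumentClient().update(params, callback), which edits the named attributes in place, or creates the item if it does not already exist.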