AppSync query on relation child object - amazon-web-services

I am working with mutations that create relations on AWS AppSync.
I have events in one table and comments in another. In my GraphQL Event type, I have a field for the list of comments, with its own resolver.
I would like to know how I can request, in the same way, only the events that have comments.
My situation also requires me to filter which events I want to show, so I cannot start by getting the list of distinct events from the comments table.
If it can be avoided, I would prefer not to create a pipeline with a specific resolver, so as to avoid code duplication.
The code can be found in the linked Stack Overflow question.
Thank you for your help.

Related

Single table db architecture with AWS Amplify

By default, the AWS Amplify transformers create one table per GraphQL type.
But according to the DynamoDB documentation, it is best practice to:
Keep the number of tables as small as possible
Keep entries that are often queried together in the same table
I have the impression that the Amplify way of doing things contradicts the statements above.
I am new to both NoSQL and Amplify.
Can someone suggest ways to address these issues?
I think we're in a bit of a transition or gray area here. I'm very new to Amplify and have been investigating moving to a single-table design. There are sources (below) indicating that this has always been possible, but you'd have to write everything in VTL templates. In 2020, however, they released direct Lambda resolver support: https://youtu.be/EOQqi6Yun7g?t=960 (clip)
However, it seems like you lose access to the @auth directive (and probably others, because you're no longer going to use @model), along with a lot of the nice out-of-the-box functionality that comes with Amplify's multi-table approach.
At this point, since I'm developing a new app, I'm going to stick with the default multi-table design to hasten the process of getting the app functional.
Trying to implement the single-table design seems to go against what the Amplify team recommends and requires more manual work. You'd have to manually create custom Lambda functions (AppSync), code the DynamoDB queries for each data access element, and manage authorization through some other means which I'm not aware of at this time. Maybe someone can chime in here...
Single table vs multi table info
Using Amplify with single table:
https://youtu.be/EOQqi6Yun7g
Single vs Multi Clip:
https://youtu.be/1WF_wped808?t=1251 (clip)
https://www.alexdebrie.com/posts/dynamodb-single-table/ (towards bottom)
https://youtu.be/EOQqi6Yun7g?t=1288 (clip)
Example single table design by Alex Debrie:
https://gist.github.com/dabit3/96dc51e688b18a7d40fc534331758c56
More Discussion:
https://stackoverflow.com/a/56438716/1956540
Basic Setup steps
I set up a single table by following the instructions below. Again, you don't use @model for this. Also, I think you have to include a type Query {} in your schema for it to compile, but I could be wrong here.
So the basic steps are:
Create a single table (amplify add storage)
amplify push
Create your schema in the schema.graphql file.
Create a supporting Lambda function (amplify add function)
Note: if you look at the example here, I believe you can create an entry point that routes to all the other methods: https://gist.github.com/dabit3/96dc51e688b18a7d40fc534331758c56#lambda
Add the DynamoDB query code to the function (see the sketch after these steps).
amplify push
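As a rough illustration of the last two steps, here is a minimal sketch of a direct Lambda resolver over a single DynamoDB table. The table name, the PK/SK key schema, and the GraphQL field names are all assumptions made up for the example, not something Amplify generates for you.

```python
# Minimal sketch of a direct Lambda resolver over a single DynamoDB table.
# Table name, key schema (PK/SK) and field names are assumptions.
import os
import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "SingleTable"))


def get_customer(args):
    # One profile item per customer, keyed as CUSTOMER#<id>.
    resp = table.get_item(Key={"PK": f"CUSTOMER#{args['id']}", "SK": "PROFILE"})
    return resp.get("Item")


def orders_by_customer(args):
    # All of a customer's orders share the customer's partition key.
    resp = table.query(
        KeyConditionExpression=Key("PK").eq(f"CUSTOMER#{args['customerId']}")
        & Key("SK").begins_with("ORDER#")
    )
    return resp.get("Items", [])


# Single entry point that routes each GraphQL field to its handler,
# in the spirit of the gist linked above.
RESOLVERS = {
    "getCustomer": get_customer,
    "ordersByCustomer": orders_by_customer,
}


def handler(event, context):
    # AppSync direct Lambda resolvers pass the field name and the
    # arguments in the event payload.
    field = event["info"]["fieldName"]
    return RESOLVERS[field](event.get("arguments", {}))
```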
Complete steps for setting up a single table:
https://catalog.us-east-1.prod.workshops.aws/workshops/53b10bf8-2271-4ab4-bfd2-39e878a90dc8/en-US/lab2/1-vtl (both "Connecting to an existing DynamoDB table" and "Direct Lambda Resolver" steps)
I'm not trying to be negative about Amplify; it is awesome, and I love what they are doing with this product. I just think it's very new to everyone, and I'm hoping this post is no longer valid next year as we continue to see great progress from the team.

AWS AppSync: multiple DynamoDB requests in one DynamoDB resolver

I would like to know if it is possible to make multiple DynamoDB requests using only one DynamoDB resolver in AppSync.
Or is the only/best way to do more complicated processing to use a Lambda function?
Practically, no. You cannot even query multiple indexes in a single resolver definition for a query.
However, if you want to use that structure to join multiple DynamoDB tables, you can attach resolvers not to the query entry, but to the field you want to relate to the other table.
I had a similar issue, relating users to another table containing their posts, and I got past it by attaching a resolver to the Posts field of the User type.
This issue refers to a similar problem and is quite helpful for that kind of case: https://github.com/awslabs/aws-mobile-appsync-sdk-js/issues/17
If that is not your case, you can elaborate on the question; I may well be guessing at your purpose for relating the tables.
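To make the field-resolver idea concrete, here is a hypothetical Lambda-based sketch of a resolver attached to the Posts field of the User type. The "Posts" table, the "byUser" index, and the attribute names are assumptions for illustration only.

```python
# Hypothetical Lambda resolver attached to User.posts rather than a Query field.
# The "Posts" table, "byUser" index and "userId" attribute are assumptions.
import boto3
from boto3.dynamodb.conditions import Key

posts_table = boto3.resource("dynamodb").Table("Posts")


def handler(event, context):
    # For a field-level resolver, event["source"] is the already-resolved
    # parent User object.
    user_id = event["source"]["id"]
    resp = posts_table.query(
        IndexName="byUser",
        KeyConditionExpression=Key("userId").eq(user_id),
    )
    return resp.get("Items", [])
```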
Have you looked at batch resolvers with AWS AppSync? https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-dynamodb-batch.html
This will allow you to write to one or more tables in a single request, and also allow you to do multiple write/read/delete operations in a single request.
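For a sense of what such a batched request does at the DynamoDB level, here is a rough boto3 sketch of a single call reading from two tables. The table and key names are invented; the AppSync tutorial above expresses the same kind of operation through a resolver mapping template rather than code like this.

```python
# Rough sketch of one request reading items from two tables at once.
# Table names and key attributes are invented for the example.
import boto3

client = boto3.client("dynamodb")

resp = client.batch_get_item(
    RequestItems={
        "Events": {"Keys": [{"id": {"S": "event-1"}}]},
        "Comments": {"Keys": [{"eventId": {"S": "event-1"}, "commentId": {"S": "c-1"}}]},
    }
)
items_by_table = resp["Responses"]  # {"Events": [...], "Comments": [...]}
```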
You can do it with pipeline resolvers
https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-pipeline-resolvers.html

WSO2 DAS SiddhiQL: Dynamic event table / persist event stream

I would like to know whether WSO2 Data Analytics Server allows defining dynamic event tables or dynamic streams.
For example, imagine one event represents a car, and one attribute of this event is the 'brand' of the car (Ford, Mercedes, Audi ...).
I would like to add a column each time a new, different brand appears, so my table would look like this:
And thus, if I receive an event with the brand 'Toyota', it would add a column to my table, which would then look like this:
Considering that I don't know in advance the number of different brands I will receive, I need this to be dynamic.
Dynamically changing the schema of an event table is not possible.
This is because the schema of the event table is defined when the Siddhi execution plan is deployed. Once it is deployed, the schema cannot be changed.
On the other hand, it looks like what you need here is not an event table.
Perhaps what you need to do is update the schema of a table (an RDBMS table) when a certain event happens (for example, when a car event arrives with a new brand). Do you use this updated table in your Siddhi execution plan? If you do not use it, then you do not need an event table.
Please correct me if I have misunderstood your requirement.
If your requirement is to update the schema of a table when a certain event happens, then you might need to write a custom event publisher to do that. If so, please refer to the documentation on the topic: Building Custom Event Publishers.
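Purely to illustrate the "update the table schema when an event arrives" idea (and not the WSO2 publisher API itself), a sketch along these lines, here using sqlite3 and invented names, might look like:

```python
# Illustration only: add a column for each brand the first time it is seen.
# Uses sqlite3 and invented names; a real custom event publisher would be
# written against the WSO2 DAS publisher API instead.
import sqlite3

conn = sqlite3.connect("cars.db")
conn.execute("CREATE TABLE IF NOT EXISTS car_counts (day TEXT PRIMARY KEY)")


def ensure_brand_column(brand: str) -> None:
    existing = {row[1] for row in conn.execute("PRAGMA table_info(car_counts)")}
    if brand not in existing:
        # Column names come from event data, so validate them before use.
        if not brand.isidentifier():
            raise ValueError(f"unexpected brand name: {brand!r}")
        conn.execute(f'ALTER TABLE car_counts ADD COLUMN "{brand}" INTEGER DEFAULT 0')
        conn.commit()


def on_car_event(event: dict) -> None:
    # Called for every incoming car event, e.g. {"brand": "Toyota", ...}
    ensure_brand_column(event["brand"])
```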

Associate a model with a list of other models

I have two models in my Django app: Schedule and Topic. I want to associate a Schedule with a sequence of Topics. This seems like a simple problem, but I'm blanking on it. What is the best way to do this?
I was going to use a ForeignKey in Topic to refer to a Schedule, so it would correctly be a many-to-one relationship of Topics to a Schedule. But I also need to keep them correctly ordered, and I'm not sure how to do this. If the db wasn't involved, I would use a linked list, but I don't think that's the ideal solution here. Do I simply give each topic a field to track its index within the list? I would be a little worried about losing consistency, but I can probably update all of them at once.
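For what it's worth, the ForeignKey-plus-position approach described in the question might be sketched like this (model and field names are placeholders, not taken from the actual app):

```python
# Sketch of the ForeignKey-plus-explicit-position approach described above.
# Model and field names are placeholders.
from django.db import models


class Schedule(models.Model):
    name = models.CharField(max_length=200)


class Topic(models.Model):
    schedule = models.ForeignKey(
        Schedule, related_name="topics", on_delete=models.CASCADE
    )
    title = models.CharField(max_length=200)
    # Explicit position within the schedule's sequence of topics.
    position = models.PositiveIntegerField()

    class Meta:
        ordering = ["position"]
        # Guards against two topics claiming the same slot in one schedule.
        constraints = [
            models.UniqueConstraint(
                fields=["schedule", "position"], name="unique_topic_position"
            )
        ]
```

With this, schedule.topics.all() comes back in order, and reordering can be done by rewriting the position values for the whole schedule inside a single transaction.atomic() block, which addresses the consistency worry.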

Multiple SqlDependencies firing the same OnChange Event

I have a product catalog with a few hundred categories in it, and I am dynamically creating an SqlDependency for each category in the catalog. The SqlCommands that these dependencies are based on differ only in the categoryID. The problem I have is that I want all these dependencies to perform different actions depending on the SqlDependency that fired them. How can I do that? Do I have to create a different OnChange event handler for each SqlDependency? Is there a way for all these dependencies to fire the same OnChange event handler, and for this handler to know which dependency fired it or to receive a parameter passed in during the dependency's creation?
This problem arose while trying to create an SqlDependency mechanism that will work with AppFabric Cache.
Thank you in advance.
See if you can look into the cache tables that ASP.NET is creating for you, and the triggers that are being created on the original tables. Once you see what is going on, you can create the tables and triggers yourself and implement the caching through your own ASP.NET code. It really is not that hard. Then, instead of reacting when a table is updated (as with SqlDependency), you can react when the relevant rows in that table are updated, refresh the relevant cache, or write your own code to perform whatever unique actions you want. You're better off doing it yourself once you learn how!