Updating objects in an Ember many-to-many relationship - ember.js

I have an app that lets me create a message which can be sent to any number of social networks, modeled as follows:
Social.Account = DS.Model.extend({
  username: DS.attr("string"),
  messages: DS.hasMany("Social.Message")
});

Social.Message = DS.Model.extend({
  text: DS.attr("string"),
  account: DS.belongsTo("Social.Account")
});
My code for creating a record looks like this:
saveMessage: function(){
  var account = Social.Account.find(this.get("id")),
      msg = Social.store.createRecord(Social.Message, {
        text: postingWindow.get('text'),
        account: account
      });

  account.get("messages").addObject(msg);
  Social.store.commit();
}
With the form data looking like this:
text:testing one three five seven
message_key:
tags:
user_id:
created:Wed, 19 Jun 2013 21:39:14 GMT
scheduled_at:2013-06-20T03:00:00.000Z
is_editing:false
status:C
account:56
That part works well. Now we're working on scheduling future messages, which gives the user a chance to edit them after they've been saved but before they've been sent. On the surface my implementation seems to work, but I noticed that Ember was not sending the account information along with the post, which meant that when my server code ran, the message was orphaned in the database. Here's the update code:
sendNow: function(message){
  var account = Social.Account.find( this.get('controllers.account.id') );

  message.setProperties({
    scheduled_at: '',
    accounts: [account]
    // I've also tried accounts: this.get('controllers.account.id')
  });

  Social.store.commit();
}
And the corresponding form data:
text:testing one three five seven
message_key:
tags:
user_id:17
created:Wed, 19 Jun 2013 21:39:14 GMT
scheduled_at:
is_editing:false
No matter what I do, Ember isn't sending the account data to the server. Is there a more appropriate way to update a record and cause it to be saved properly?

You could try:
message.get('accounts').pushObject(account)

Are you sure the var account is being set? Seems like you are inconsistent with the get statements:
var account = Social.Account.find(this.get("id")) , <- double quotes
var account = Social.Account.find(this.get('controllers.account.id')); <- single quotes and different object structure?
Also, did you try:
message.set('accounts', account);
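
One thing worth double-checking: the model at the top defines a singular belongsTo named account, while the update sets an accounts property that the model doesn't declare. A minimal sketch of the update assuming that model (same pre-1.0 ember-data API as the question):

// Hypothetical sketch: update the declared belongsTo ("account") rather than
// an undeclared "accounts" property, then commit.
sendNow: function(message){
  var account = Social.Account.find(this.get('controllers.account.id'));

  message.set('scheduled_at', '');
  message.set('account', account);             // matches the belongsTo on Social.Message
  account.get('messages').addObject(message);  // keep the inverse hasMany in sync

  Social.store.commit();
}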

Related

Best approach for Posts and PostReactions in AWS Amplify and DynamoDB

I am working on chat functionality using AWS Amplify and I have a simple Post model in my graphql schema:
type Post
...
{
  id: ID!
  channelId: ID @index(
    name: "byChannel", sortKeyFields: ["createdAt"],
    queryField: "listPostsByChannel"
  )
  customerId: ID @index(
    name: "byCustomer", sortKeyFields: ["postType", "createdAt"]
  )
  text: String!
  postTempId: String
  postType: String
  reactions: [PostReaction] @hasMany(fields: ["id"])
  createdAt: AWSDateTime
  updatedAt: AWSDateTime
}
What I want to achieve is to have similar to other popular chat apps - reactions with emojis attached to each post, so I've created another table and the PostReaction model.
type PostReaction
...
{
  postId: ID! @primaryKey(sortKeyFields: ["customerId", "emojiUnicode"])
  customerId: String!
  customerMeta: CustomerMeta
  emojiUnicode: String!
  createdAt: AWSDateTime
  updatedAt: AWSDateTime
}
Of course, each customer could add multiple emojis to a single post; the custom primary key is for handling duplicates later.
There is one disadvantage here.
Emojis will be listed in an array in the reactions field of the post, even if it's the same emoji added by many people.
Instead of a simple array of reactions that the frontend would need to merge for each post, it would be best to get a result from the AppSync query for each Post like:
...
reactions: [{
emojiUnicode: "U+1F44D",
customerIds: ["ID1234", "ID5678"],
...
}, {...}]
I thought I could use a JSON object in the reactions field, but DynamoDB has a maximum size limit of 400KB for a single item. That's not a problem for now, but once I add more attributes to the Post model, and when there are many reactions from many people at the same time, it could become an issue.
Is there a simple way to achieve this?
The best way to avoid over-complicating your schema would be to enforce a maximum number of emojis, just as Slack does, for example:
You can add up to 23 emoji reactions to any message, but the maximum per message is 50 unique emoji.
Other than that, you could keep a separate item for each emoji reaction:
pk        | sk                      | data
----------|-------------------------|--------------------------------
thread123 | metadata                | metadata about thread
thread123 | post#001                | First message in thread
thread123 | post#002                | Second message in thread
thread123 | post#003                | Third message in thread
thread123 | post#003#emoji#U+1F44D  | [user1, user2, user45]
thread123 | post#003#emoji#U+1F33R  | [user56, user8, user7, user10]
Now when you want all the data to populate a given thread on your UI, you just issue a query with the pk as a parameter:
SELECT * FROM table WHERE PK = 'thread123'
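
For illustration, the same single-partition query could be issued from JavaScript with the DynamoDB DocumentClient (a sketch; the table name and key names are assumptions based on the layout above):

// Hypothetical sketch: fetch a thread's metadata, posts and emoji-reaction
// items with one partition-key query.
const { DynamoDBClient } = require("@aws-sdk/client-dynamodb");
const { DynamoDBDocumentClient, QueryCommand } = require("@aws-sdk/lib-dynamodb");

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

async function loadThread(threadId) {
  const result = await ddb.send(new QueryCommand({
    TableName: "ChatTable",                        // assumed table name
    KeyConditionExpression: "pk = :pk",
    ExpressionAttributeValues: { ":pk": threadId } // e.g. "thread123"
  }));
  return result.Items; // posts plus post#...#emoji#... reaction items
}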

Filtering List Query By Another Table's Field (a.k.a Cross-Table or Nested Filtering) in AWS Amplify GraphQL DynamoDB

Which Category is your question related to?
DynamoDB, AppSync(GraphQL)
Amplify CLI Version
4.50.2
Provide additional details e.g. code snippets
BACKGROUND:
I'm new to AWS serverless systems and, as a frontend dev, I'm quite enjoying them thanks to the auto-generated APIs, tables, connections, resolvers, etc. I'm using Angular/Ionic on the frontend and S3, DynamoDB, AppSync, Cognito, and Amplify-cli for the backend.
WHAT I HAVE:
Here is a part of my schema. I can easily use the auto-generated APIs to List/Get Feedbacks with additional filters (e.g. score: { ge: 3 }). And thanks to the @connection I can see the User's details in the listed Feedback items.
type User @model @auth(rules: [{ allow: owner }]) {
  id: ID!
  email: String!
  name: String!
  region: String!
  sector: String!
  companyType: String!
}

type Feedback @model @auth(rules: [{ allow: owner }]) {
  id: ID!
  user: User @connection
  score: Int!
  content: String
}
WHAT I WANT:
I want to list Feedbacks based on several fields of the User type, such as the user's region (e.g. user.region: { contains: 'United States' }). I searched quite a lot for a solution (e.g. #2311) and learned that amplify codegen only creates top-level filtering. In order to use cross-table filtering, I believe I need to modify resolvers, lambda functions, queries and inputs, which looks quite complex for a beginner.
WHAT I TRIED/CONSIDERED:
I tried listing all Users and Feedbacks separately and filtering them on the frontend. But then the client downloads all of this unnecessary data, and because of the pagination limit the user experience takes a hit: users see an empty list and repeatedly have to click the Load More button.
Thanks to some suggestions, I also considered duplicating the User details in the Feedback table so I could search/filter on them. The problem is that if a User updates their info, the duplicated values will be out of date. There would also be a lot of duplicated data, as I need this feature for other tables too.
I also heard about using ElasticSearch for this problem, but someone mentioned getting a roughly $30 monthly cost for simple filtering, so I got cold feet.
I tried the resolver solution to add custom filtering, but I found that quite complex for a beginner. I will also need this cross-table filtering on many other tables, so I think it would be hard to manage. If that is the best practice, I'd appreciate it if someone could guide me through it.
QUESTIONS:
What would be the easiest/beginner-friendly solution for me to achieve this cross-table filtering? I am open to alternative solutions.
Is this cross-table filtering a bad approach for a NoSQL setup, since I need some relationship between two tables? (I thought @connection would be enough.) Should I switch to an SQL setup before it is too late?
Is it possible for Amplify to auto-generate a solution for this in the future? I feel like many people are experiencing the same issue.
Thank you in advance.
Amplify, and really DynamoDB in general, requires you to think about your access patterns ahead of time. There is a lot of really good information out there to help guide you through what this thought process can look like. Particularly, I like Nader Dabit's https://dev.to/dabit3/data-modeling-in-depth-with-graphql-aws-amplify-17-data-access-patterns-4meh
At first glance, I think I would add a new @key called byCountry to the User model, which will create a new Global Secondary Index on that property for you in DDB and will give you some new query methods as well. Check out https://docs.amplify.aws/cli/graphql-transformer/key#designing-data-models-using-key for more examples.
Once you have User.getByCountry in place, you should then be able to also bring back each user's Feedbacks.
query USAUsersWithFeedbacks {
  listUsersByCountry(country: "USA") {
    items {
      feedbacks {
        items {
          content
        }
        nextToken
      }
    }
    nextToken
  }
}
Finally, you can use JavaScript to keep fetching while the nextToken is not null. You will be able to reuse this function for each country you are interested in, and you should be able to extend this example to other properties by adding additional @keys.
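A rough sketch of that pagination loop (the ListUsersByCountry call below is a placeholder for whatever the generated client exposes for the new @key):

// Hypothetical sketch: keep requesting pages until nextToken comes back null,
// collecting every user's feedback items along the way.
async function fetchAllFeedbacksByCountry(api, country) {
  const feedbacks = [];
  let nextToken = null;

  do {
    // Placeholder call for the generated listUsersByCountry query client.
    const page = await api.ListUsersByCountry(country, { nextToken });
    page.items.forEach((user) => {
      feedbacks.push(...user.feedbacks.items);
    });
    nextToken = page.nextToken;
  } while (nextToken !== null);

  return feedbacks;
}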
My former answer can still be useful for others in specific scenarios, but I found a better way to achieve nested filtering when I realized you can filter nested items in custom queries.
Schema:
type User @model {
  id: ID!
  email: String!
  name: String!
  region: String!
  sector: String!
  companyType: String!
  feedbacks: [Feedback] @connection # <-- User has many feedbacks
}
Custom query:
query ListUserWithFeedback(
  $filter: ModelUserFilterInput # <-- Filter Users by Region or any other User field
  $limit: Int
  $nextToken: String
  $filterFeedback: ModelFeedbackFilterInput # <-- Filter inner Feedbacks by Feedback fields
  $nextTokenFeedback: String
) {
  listUsers(filter: $filter, limit: $limit, nextToken: $nextToken) {
    items {
      id
      email
      name
      region
      sector
      companyType
      feedbacks(filter: $filterFeedback, nextToken: $nextTokenFeedback) {
        items {
          content
          createdAt
          id
          score
        }
        nextToken
      }
      createdAt
      updatedAt
    }
    nextToken
  }
}
$filter can be something like:
{ region: { contains: 'Turkey' } }
$filterFeedback can be like:
{
and: [{ content: { contains: 'hello' }, score: { ge: 4 } }]
}
This way both Users and Feedbacks can be filtered at the same time.
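
As a sketch, calling that custom query with both filters through the Amplify API category might look like this (the import path for the codegen output is an assumption):

// Hypothetical sketch: run the custom ListUserWithFeedback query with a
// User filter and a nested Feedback filter in a single request.
import { API } from 'aws-amplify';
import { listUserWithFeedback } from './graphql/custom-queries'; // assumed codegen output

async function loadFilteredUsersWithFeedback() {
  const result = await API.graphql({
    query: listUserWithFeedback,
    variables: {
      filter: { region: { contains: 'Turkey' } },
      filterFeedback: { and: [{ content: { contains: 'hello' }, score: { ge: 4 } }] },
      limit: 50
    }
  });
  return result.data.listUsers.items; // each user carries its filtered feedbacks
}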
OK, thanks to @alex's answers I implemented the following. The idea is that instead of listing Feedbacks and trying to filter them by User fields, we list Users and collect their Feedbacks from the response:
Updated schema.graphql as follows:
type User
  @model
  @auth(rules: [{ allow: owner }])
  @key(name: "byRegion", fields: ["region"], queryField: "userByRegion") # <-- added byRegion key
{
  id: ID!
  email: String!
  name: String!
  region: String!
  sector: String!
  companyType: String!
  feedbacks: [Feedback] @connection # <-- added feedbacks connection
}
Added the userFeedbacksId parameter while calling CreateFeedback, so the new Feedbacks appear while listing Users (a sketch of the call follows below).
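For illustration, such a call might look like this (a sketch; createFeedback from src/graphql/mutations is the usual codegen output, and userFeedbacksId is the connection field mentioned above):

// Hypothetical sketch: link the new Feedback to its parent User so that it
// shows up under user.feedbacks when listing Users by region.
import { API, graphqlOperation } from 'aws-amplify';
import { createFeedback } from './graphql/mutations';

async function addFeedback(userId, score, content) {
  return API.graphql(graphqlOperation(createFeedback, {
    input: {
      score,
      content,
      userFeedbacksId: userId // foreign key generated for the feedbacks @connection
    }
  }));
}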
Added a custom query UserByRegionWithFeedback under src/graphql/custom-queries.graphql and used amplify codegen to build it:
query UserByRegionWithFeedback(
  $region: String
  $sortDirection: ModelSortDirection
  $filter: ModelUserFilterInput
  $limit: Int
  $nextToken: String # <-- nextToken for getting more Users
  $nextTokenFeedback: String # <-- nextToken for getting more Feedbacks
) {
  userByRegion(
    region: $region
    sortDirection: $sortDirection
    filter: $filter
    limit: $limit
    nextToken: $nextToken
  ) {
    items {
      id
      email
      name
      region
      sector
      companyType
      feedbacks(nextToken: $nextTokenFeedback) {
        items {
          content
          createdAt
          id
          score
        }
        nextToken
      }
      createdAt
      updatedAt
      owner
    }
    nextToken
  }
}
Now I call this API like the following:
nextToken = {
  user: null,
  feedback: null
};
feedbacks: any[] = []; // initialized so push() below works

async listFeedbacks() {
  try {
    const res = await this.api.UserByRegionWithFeedback(
      'Turkey', // <-- region: filter Users by their region, I will add UI input later
      null,     // <-- sortDirection
      null,     // <-- filter
      null,     // <-- limit
      this.nextToken.feedback == null ? this.nextToken.user : null, // <-- User nextToken: only send if Feedback nextToken is null
      this.nextToken.feedback // <-- Feedback nextToken
    );
    // Get the User nextToken
    this.nextToken.user = res.nextToken;
    // Initialize the Feedback nextToken as null
    this.nextToken.feedback = null;
    // Loop over the Users in the response
    res.items.map((user) => {
      // Get the Feedback nextToken from the User if it is not null (or else the last User in the list could overwrite it)
      if (user.feedbacks.nextToken) {
        this.nextToken.feedback = user.feedbacks.nextToken;
      }
      // Push the feedback items into the list to display in the UI
      this.feedbacks.push(...user.feedbacks.items);
    });
  } catch (error) {
    this.handleError.show(error);
  }
}
Lastly, I added a Load More button in the UI which calls the listFeedbacks() function, so if there is any Feedback nextToken I send it to the API (note that multiple users' feedbacks can each have a nextToken).
If all feedbacks have been fetched and there is a User nextToken, I send that to the API and repeat the process for the new Users.
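A rough sketch of that button handler, reusing listFeedbacks() above (the method name loadMore is an assumption; the token bookkeeping already lives inside listFeedbacks):

// Hypothetical sketch: Load More just calls listFeedbacks() again, which
// decides whether to send the Feedback nextToken or the User nextToken.
async loadMore() {
  if (this.nextToken.feedback || this.nextToken.user) {
    await this.listFeedbacks();
  }
}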
I believe this could be much simpler with an SQL setup, but it will work for now. I hope it helps others in my situation, and if there are any ideas to make this better, I'm all ears.

Ember - Issue with HTTP POST request

I have written a (very) simple RESTful web service to retrieve data from MongoDB using Node, Express and Mongoose.
On the server side, I have this code:
router.route('/products').post(function(req,res){
  var product = new Product(req.body);
  product.save(function(err){
    if(err)
      res.send(err);
    res.send({message:'Product Added'});
  });
When I submit a request from my Ember client, the req.body contains something like the following:
{ attributes:
{ category: 1,
name: 'y',
price: 1,
active: false,
notes: null } }
The attribute names are exactly the same as my mongoose schema. I get no error but the document created in MongoDB is empty (just get the _id and __v fields).
What am I doing wrong? Should I convert req.body into something else first?
A couple of things that will help debug:
1) From a quick glance (I haven't used mongoose before), it looks like the callback function passed to save takes two arguments.
2) I don't know if your code got cut off, but the sample above is missing a matching });
3) I made the function short-circuit itself on error, so you will not see 'Product Added' unless that is truly the case.
Try these fixes.
router.route('/products').post(function(req,res){
  var product = new Product(req.body);
  product.save(function(err, product){
    if(err){
      return res.send(err);
    }
    return res.send({message:'Product Added'});
  });
});
The issue was related to my lack of familiarity with Ember and Node+Express. The data received on the server is slightly different from what I first indicated (the first line was missing):
{ product:
{ attributes:
{ category: ... } } }
On the server side I can access my data using req.body.product.attributes (instead of req.body):
router.route('/products').post(function(req,res){
  var product = new Product(req.body.product.attributes);
  product.save(function(err){
    if(err)
      res.send(err);
    res.send({message:'Product Added'});
  });
});
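One assumption worth making explicit: for req.body to be populated at all, the Express app needs JSON body parsing enabled before the router is mounted. A minimal sketch of that surrounding setup (not the poster's actual code):

// Hypothetical sketch of the Express wiring the /products route relies on.
var express = require('express');
var bodyParser = require('body-parser');

var app = express();
var router = express.Router();

app.use(bodyParser.json()); // without this, req.body stays undefined

// router.route('/products').post(...) as defined above

app.use('/api', router);
app.listen(3000);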

Ember Websocket...can't find any records?

So, I'm stuck not being able to bring in arrays from my livestream websocket, which is coming through as JSON.
I'm not seeing any records in the Ember Inspector, but plenty is printing out with console.log(data). I'm getting this error:
Uncaught Error: Assertion Failed: You must include an `id` in a hash passed to `push`
(but there is an ID included in each livestream update).
Here's the code: http://jsbin.com/qapik/1/edit?html,js,output
JSON looks like...
{
  "group":{
    "usage":{
      "case1":0,
      "case2":0,
      "case3":0
    },
    "sunshine":"00/00/0000",
    "id":1010,
    "device_info":11.5
  }
}
With the console showing updates...
Tue Apr 01 2014 09:22:09 GMT-0400 (EDT): group update: {"group": ...
At the end of the day, I want to show {{#each}} {{device_info}}... and more.
Where am I going wrong?
Thanks!
Edit - Solution:
App.ApplicationRoute = Ember.Route.extend({
  activate: function() {
    var socket = window.io.connect('http://localhost:8887');
    var self = this;
    socket.on('group_live_stream', function(data){
      var dataObj = JSON.parse(data); // data happens to be a JSON string
      self.store.push('group', dataObj.group);
    });
  }
});
Objects pushed onto the store should be formatted like in the example here: http://emberjs.com/api/data/classes/DS.Store.html#method_push
{
  "usage":{
    "case1":0,
    "case2":0,
    "case3":0
  },
  "sunshine":"00/00/0000",
  "id":1010,
  "device_info":11.5
}
In other words, when you're pushing it onto the store, the object should not be wrapped in group. This is notably different from how Ember Data expects JSON responses through its REST adapter (when ED gets a group record there, it does indeed expect an object like {group:{...}}).
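To actually render the pushed records with {{#each}}, a route's model hook can hand the template a live record array. A sketch (assuming an IndexRoute and the ember-data 1.0 beta API used above, where store.all returns a live array):

// Hypothetical sketch: store.all('group') returns a live record array, so
// records pushed from the websocket show up in the template automatically.
App.IndexRoute = Ember.Route.extend({
  model: function() {
    return this.store.all('group');
  }
});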

Ember data sideloaded models ignored

I'm new to Ember and am having a problem I haven't seen described anywhere else, so I'm sure I'm doing something silly. I have these models:
App.User = DS.Model.extend({
  username: DS.attr("string"),
  userId: DS.attr("number"),
  modules: DS.hasMany("App.Module")
});

App.Module = DS.Model.extend({
  moduleId: DS.attr("number"),
  name: DS.attr("string")
});
Note that my Module model is simply a container that a User can have a few of, and many users might share the same modules - they're actually represented by an Enum on the server. As such, the backend does not have a mapping of Module > Users, so I've left the DS.ownedBy or DS.hasMany out of App.Module. I have, however, tried my test with a "users: DS.hasMany('App.User')" in there as well, with the same result. If it turns out to be necessary, I could maintain such a mapping, but I have no other reason for doing so right now, so it would be a bit unfortunate to be forced to add such a table to my database.
I'm using Ember.Auth for authentication, and when my app loads up and I log in, the server requests (as expected):
users/nathan?authToken=<token>
The result is also as I think it should be, according to the ember data docs:
{
  "user": {
    "username": "nathan",
    "user_id": 1,
    "module_ids": [1,2]
  },
  "modules": [
    {"module_id": 1, "name": "Account Settings"},
    {"module_id": 2, "name": "Manage Websites"}
  ]
}
I'm then testing in the Chrome console with:
App.Auth.get("user").get("modules");
or
App.User.find("nathan").get("modules");
and in both cases, Ember makes a request to my server to get the data for Modules 1 and 2. If I repeat the same request, it doesn't go back to the server again, so it is storing the result properly that time, it's simply the sideload that it's ignoring.
I'm using ember-1.0.0-rc4 with ember-data-0.13.
In your sideload response, module_id should be id.
(or you can configure ember-data to use module_id, but formatting the server response should be the easier way?)
--- edit for explanation ---
I'm not sure the original REST call is "working perfectly". Without the id, ember-data does see the two modules that you originally sideloaded, but it sees no id, so it does not know that they are modules 1 and 2 respectively. By default (this is changeable), ember-data expects the id to be called id; your module_id (and user_id) are just treated as regular properties.
On your next API call (to /modules?ids[]=1&ids[]=2 presumably), ember-data silently assumes that, since you requested two modules and two modules came back, they must be the two that you requested. Try sending back this
{
  "modules": [
    {"module_id": 3, "name": "Foo module"},
    {"module_id": 4, "name": "Bar module"}
  ]
}
for the request /modules?ids[]=1&ids[]=2, you will still get your "successful" observation.
Try App.Module.find(1).get('name') with the module_id-response - you will get undefined; now try it with the id-response, and you will get Account settings as expected.
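If you do want to keep module_id in the payload, the configuration route mentioned above would look roughly like this under the pre-1.0 ember-data serializer config (a sketch only; that API changed frequently around 0.13, so double-check against your version):

// Hypothetical sketch: map App.Module's primary key to "module_id"
// instead of the default "id".
DS.RESTAdapter.map('App.Module', {
  primaryKey: 'module_id'
});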
Have you configured your RestAdapter to sideload modules?
Like this:
DS.RESTAdapter.configure('App.Module', {
  sideloadsAs: 'modules'
});