I'm retrieving messages from my page, but notice that they come in two different ID formats.
Most of the messages have an ID like this:
"id": "m_id.363733227087070",
however, some of them are formatted as
"id": "m_mid.1374165333681:48689d90e8f7fa5e77",
What is the difference between m_mid and m_id? I cannot find any documentation explaining why some messages have an m_mid ID and others an m_id ID. Additionally, the m_mid IDs have some sort of hash appended; any idea what this hex value represents?
Did you find a definitive answer for this?
From my observation, the first message in a conversation has an ID of the format m_id.<numeric id>; all subsequent messages have IDs of the format m_mid.<numeric id>:<hex hash>.
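A small sketch of that observation as a classifier, assuming the two formats above are the only ones; the helper name and the interpretation of the trailing hex are not documented anywhere, so treat this as illustrative:

```python
import re

# Patterns based purely on the two observed formats; not official API documentation.
FIRST_MSG = re.compile(r"^m_id\.(?P<num>\d+)$")
LATER_MSG = re.compile(r"^m_mid\.(?P<num>\d+):(?P<hex>[0-9a-f]+)$")

def classify(message_id):
    """Return 'first' for m_id.<numeric>, 'subsequent' for m_mid.<numeric>:<hex>."""
    if FIRST_MSG.match(message_id):
        return "first"
    if LATER_MSG.match(message_id):
        return "subsequent"
    return "unknown"
```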
I am new to NoSQL and specifically to DynamoDB single-table design. I have been going through a lot of videos and articles about single-table design, and I have finally put together a small design for a chat application that I am planning to build in the future.
The access patterns I have thought about so far are:
Get user details by User Id.
Get list of conversations the user is part of.
Get list of messages the user has created
Get all members of a conversation
Get all messages of a conversation
I also want to access messages of a conversation by date range, but I haven't figured that one out yet.
As per the below design, if I were to pull all messages of a conversation, will that pull the actual message text from the message attribute, which lives in the message partition?
Here is a snip of the model I have created with some sample data. Please let me know if I am headed in the right direction.
As per the below design, if I were to pull all messages of a conversation, will that pull the actual message text from the message attribute, which lives in the message partition?
No, it will only return the IDs of the messages, since the actual content is stored in a separate partition.
I'd propose a different model: a table with one Global Secondary Index (GSI1). The layout is as follows:
Base Table:
Partition Key: PK
Sort Key: SK
Global Secondary Index GSI1:
Partition Key: GSI1PK
Sort Key: GSI1SK
Base Table
GSI 1
Access Patterns
1.) Get user details by User Id.
GetItem on Base Table with Partition Key = PK = U#<id> and Sort Key SK = USER
2.) Get list of conversations the user is part of.
Query on Base Table with Partition Key = PK = U#<id> and Sort Key SK = starts_with(CONV#)
3.) Get list of messages the user has created
Query on GSI1 with Partition Key GSI1PK = U#<id>
4.) Get all members of a conversation
Query on Base Table with Partition Key = PK = CONV#<id> and Sort Key SK starts_with(U#)
5.) Get all messages of a conversation
Query on Base Table with Partition Key PK = CONV#<id> and Sort Key SK starts_with(MSG#)
6.) Also want to access messages of a conversation by a date range, so far I haven't figured out that one.
DynamoDB sorts items within a partition by the sort key in byte order - if you format all dates according to ISO 8601 in the UTC timezone, lexicographic order matches chronological order, so you can make the range query, e.g.:
Query on Base Table with Partition Key PK = CONV#<id> and Sort Key SK between(MSG#2021-09-20, MSG#2021-09-30)
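The key construction for this layout can be sketched in plain Python. The prefixes (U#, CONV#, MSG#) follow the model above; the helper names are illustrative, and this is a sketch of the key scheme rather than a full DynamoDB client integration:

```python
from datetime import datetime, timezone

def user_key(user_id):
    # Base table key for the user item itself (access pattern 1).
    return {"PK": f"U#{user_id}", "SK": "USER"}

def message_key(conv_id, sent_at):
    # ISO 8601 UTC timestamps sort lexicographically in chronological order,
    # which is what makes the between() range query on SK work (pattern 6).
    ts = sent_at.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return {"PK": f"CONV#{conv_id}", "SK": f"MSG#{ts}"}
```

Because the timestamps sort in byte order, a BETWEEN condition on the SK (e.g. MSG#2021-09-20 to MSG#2021-09-30) returns exactly the messages in that date range.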
I have an order I want to store in DynamoDB, with the following fields:
Order date: 2019-03-27 02:09pm
First Name: John
Last Name: Doe
Email: john@example.com
Phone: 555-11434
Address: 13 Lorong K Changi, Sunny Shores
City: Singapore
Zip: 654321
Country: Singapore
Status: new, confirmed, delivered
(There is no unique order identifier decreed)
At first I combined the first and last name ("John Doe") as the partition key and used the order date as the sort key. That worked quite well until:
I realized I can't search on the partition key (the customer's name), and I want to be able to look up orders by customer!
Secondly, URLs addressing the order would look like https://example.com/2019-03-27/John%20Doe..., i.e. the space causes some confusion. Is there a more efficient way to encode the name?
I am most keen on using the email address, but from my research it seems email is a bad field to key on.
The access patterns are pretty simple. Need a way to:
Look up an order
Search by customer (could be name, could be email)
Query by order status
I tried making a composite key with order status and order date, but that has not gone well: Replace an old item with a new item in DynamoDB
Most people in this scenario generate a UUID for the user, and make that the partition key.
If you use an email address as the partition key, it means your user cannot ever change their email address, at least not without some creative coding on your part.
It might be valid to use an email address in your case, for example if a user can never change their email address. In that case you should just be able to URL-encode the email address on your client. If you want to avoid that altogether, you could accept the parameter in a Base64-encoded format and decode it before use with DynamoDB.
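A minimal sketch of the Base64 approach using the URL-safe variant, which avoids the characters (+, /) that would need escaping in a URL path segment; the helper names are illustrative:

```python
import base64

def encode_email(email):
    # URL-safe Base64 uses '-' and '_' instead of '+' and '/',
    # so the token can sit directly in a URL path or query string.
    return base64.urlsafe_b64encode(email.encode("utf-8")).decode("ascii")

def decode_email(token):
    # Reverse the encoding before using the email as a DynamoDB key value.
    return base64.urlsafe_b64decode(token.encode("ascii")).decode("utf-8")
```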
If you decide to generate UUIDs and make these your partition keys, you would probably then create GSIs with partition keys of email address and order state. You can use these GSIs to access your data quickly with your specified access patterns.
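A sketch of what such an item could look like, with a generated UUID as the partition key and extra attributes feeding the two GSIs; the attribute names (GSI1PK, GSI2PK, ...) and prefixes are illustrative conventions, not fixed by DynamoDB:

```python
import uuid

def new_order_item(email, status, order_date):
    # A UUID partition key decouples the order's identity from mutable
    # fields like email, so the customer can change their address later.
    return {
        "PK": f"ORDER#{uuid.uuid4()}",
        "OrderDate": order_date,      # ISO 8601 string, e.g. "2019-03-27T14:09:00Z"
        "GSI1PK": f"EMAIL#{email}",   # GSI1: look up orders by customer email
        "GSI2PK": f"STATUS#{status}", # GSI2: query orders by status...
        "GSI2SK": order_date,         # ...sorted by date within a status
    }
```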
I'm having issues getting an FQL query with an IN clause to work for multiple pages. The pages are publicly accessible, so I'm not sure whether there is a permission issue or not.
Ex:
SELECT post_id, source_id, message FROM stream WHERE source_id in (40796308305, 56381779049) order by updated_time;
This is using the Graph API Explorer with all permissions enabled. I either get no data, or just a few posts from the last page ID specified in the IN clause.
Thoughts? This doesn't seem to be well documented in the FQL documentation.
You can do this as a multiquery; you'll get all your data in one shot. You will need to sort the results together in your script to get them into chronological order.
{
  "coke": "SELECT post_id, source_id, message FROM stream WHERE source_id = 40796308305 order by updated_time",
  "pepsi": "SELECT post_id, source_id, message FROM stream WHERE source_id = 56381779049 order by updated_time"
}
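A sketch of building the multiquery request URL and merging the per-query result sets chronologically. The endpoint shape matches the fql?q= usage above; the access token is a placeholder, and the merge helper is illustrative (FQL itself has long been deprecated):

```python
import json
import urllib.parse

queries = {
    "coke": "SELECT post_id, source_id, message, updated_time FROM stream "
            "WHERE source_id = 40796308305 ORDER BY updated_time",
    "pepsi": "SELECT post_id, source_id, message, updated_time FROM stream "
             "WHERE source_id = 56381779049 ORDER BY updated_time",
}

# A multiquery is passed as a JSON object in the q parameter.
url = "https://graph.facebook.com/fql?" + urllib.parse.urlencode(
    {"q": json.dumps(queries), "access_token": "YOUR_TOKEN"}  # token is a placeholder
)

def merge_chronologically(result_sets):
    """Flatten the per-query result sets and sort by updated_time."""
    posts = [p for rs in result_sets for p in rs]
    return sorted(posts, key=lambda p: p["updated_time"])
```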
I have been using the Graph API to obtain feed/post information for pages, but have started to use FQL instead, as I needed to sort by updated_time rather than the standard created_time sort returned by the Graph API.
I am using the stream table in FQL and I can get all the information from this I require except the equivalent of the 'type' field (i.e. Status, Link, Photo, Video etc).
When I add type to the field list in the FQL, I get back an int value (or null) which seems to roughly translate as 46 => page status, 56 => user status, 80 => link, etc., but this field is not documented and the values do not seem to be fully consistent. I've seen a user status equal to 56 or 237, but I'm not sure what contextual difference makes them change.
The FQL I'm using is:
"SELECT post_id, type, message, description, comments, likes, created_time, updated_time FROM stream WHERE source_id = 40796308305 ORDER BY updated_time DESC" which I'm viewing through the Graph API Explorer /fql?q=
I can get the type information by storing up the ids and making an additional graph call such as "?ids=12345,23456,34567&fields=type" but the goal is to get this in the same call.
Does anybody know how / if this can be achieved?
Many Thanks
This was acknowledged as a bug. See https://developers.facebook.com/bugs/223855374358566
I am interested in returning a few metrics from Insights, but I am having difficulty getting results from FQL.
I need to get back the following.
page impressions/likes/shares
posts impressions/likes/share/comments
For post impressions
fql?q=SELECT metric, value FROM insights WHERE object_id=#### AND metric='post_impressions_unique' AND end_time=end_time_date('2011-10-30') AND period=period('lifetime')
For page impressions
fql?q=SELECT metric, value FROM insights WHERE object_id=#### AND metric='page_impressions' AND end_time=end_time_date('2011-10-30') AND period=period('lifetime')
I get back empty data sets:
{
"data": [
]
}
What am I doing wrong?
I had the same problem at first. According to the documentation, 'page_impressions' and 'page_impressions_unique' only support the periods 'day', 'week', and 'days_28'. If that doesn't fix the problem, try changing to a different end date (which was my problem).
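For example, the page-impressions query above rewritten with one of the supported periods (a sketch; the object ID placeholder is unchanged from the question):

```
SELECT metric, value FROM insights
WHERE object_id = #### AND metric = 'page_impressions'
  AND end_time = end_time_date('2011-10-30')
  AND period = period('days_28')
```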