From a POS system's point of view, how do I list Postmates deliveries that were created from the Postmates app or website? From the Postmates documentation, it seems I can only list deliveries that I created myself, not deliveries that customers created directly from the Postmates website/app.
I'm trying to write an application for a POS system that polls Postmates for deliveries and prints them directly to a kitchen printer.
We have created a contact centre with two contact flows and one customer queue flow. Under Metrics there are multiple report types that focus on the queue or agents, but what I need is a report based on the customer's choices in the Get customer input blocks, i.e., the path taken by the customer at each intersection. Is there a way to achieve this?
Example:
The customer selected option A at level 1 and option 3 at level 2, plus the contact flow name, etc. I believe this information resides in the CTRs (contact attributes), but how do I get a cumulative report across all records? As far as I can tell, Contact search only returns one contact at a time.
You can stream the CTR data to a Redshift database using the streaming feature in Amazon Connect. The general steps are to create a Kinesis stream, then turn on streaming in Amazon Connect and attach the Kinesis stream.
There is a quickstart to help set this up here: https://aws.amazon.com/quickstart/connect/data-streaming/
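Once the CTRs are flowing, the contact attributes travel inside each record. As an illustration only (not part of the Quick Start), here is a sketch of a C# Lambda subscribed to that Kinesis stream which pulls the "Attributes" field out of each CTR; the package names come from the AWS Lambda .NET event libraries, and the "Attributes"/"ContactId" fields follow the Amazon Connect CTR data model.

// Sketch: Lambda consumer for the Amazon Connect CTR Kinesis stream.
// Requires the Amazon.Lambda.KinesisEvents and Newtonsoft.Json packages.
using System.IO;
using System.Text;
using Amazon.Lambda.Core;
using Amazon.Lambda.KinesisEvents;
using Newtonsoft.Json.Linq;

public class CtrProcessor
{
    public void Handler(KinesisEvent kinesisEvent, ILambdaContext context)
    {
        foreach (var record in kinesisEvent.Records)
        {
            // Each Kinesis record carries one CTR as UTF-8 JSON.
            string json;
            using (var reader = new StreamReader(record.Kinesis.Data, Encoding.UTF8))
            {
                json = reader.ReadToEnd();
            }
            JObject ctr = JObject.Parse(json);

            // Contact attributes set in the flow (e.g. the options chosen in
            // the Get customer input blocks) land under "Attributes".
            context.Logger.LogLine($"Contact {ctr["ContactId"]}: {ctr["Attributes"]}");
        }
    }
}

Once the records are in Redshift, you can aggregate those attribute values across all contacts to build the cumulative path report you're after.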
Is it possible to have multiple sessions running in parallel against a single AWS Lex bot? I have a chatbot application with two intents, Order Pizza and Book Ticket, in a single bot. One user sends a query to book a ticket while, at the same time, a different user on a different machine asks about ordering pizza. How do I track both requests as separate sessions in Lex?
Thanks in advance.
In the AWS docs you can see the "userId" attribute in the Input Event Format. The userId identifies the session (which is what you mean); it can be, for example, a Slack user ID or a property of the user's account, so you can track each conversation by its userId.

Parallel processing is well supported by AWS: Lex keeps a separate session per userId, so you just need to write your Lambda (or other hook function) to key its logic off the userId.
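As an illustration, here is a minimal sketch of such a Lambda hook in C# (types from the Amazon.Lambda.LexEvents NuGet package; the reply text is a placeholder) that keys its behavior off the userId:

// Sketch: Lex fulfillment Lambda that separates parallel users by userId.
using Amazon.Lambda.Core;
using Amazon.Lambda.LexEvents;

public class BotHandler
{
    public LexResponse Handle(LexEvent lexEvent, ILambdaContext context)
    {
        // Lex supplies one userId per client session, so two users talking
        // to the same bot at once arrive here as two independent events.
        string userId = lexEvent.UserId;
        string intent = lexEvent.CurrentIntent.Name;
        context.Logger.LogLine($"Intent {intent} for session {userId}");

        return new LexResponse
        {
            DialogAction = new LexResponse.LexDialogAction
            {
                Type = "Close",
                FulfillmentState = "Fulfilled",
                Message = new LexResponse.LexMessage
                {
                    ContentType = "PlainText",
                    Content = $"Handled {intent} for session {userId}." // placeholder reply
                }
            }
        };
    }
}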
I'm currently working with the Microsoft Graph Calendar API and I ran into some problems when trying to synchronize events.
Basically, I have two users on my web service; they both have an Outlook address. User A creates an event on my web service and adds User B as an attendee. The event is then sent to Outlook using the REST API. It's created on User A's Outlook calendar, and User B receives an invitation on Outlook, so the event is created in his Outlook calendar as well.
Now, on my web service, when the event is created I retrieve the Microsoft ID of the event and store it in my database. This way, when I want to synchronize events between an Outlook calendar and my calendar, all I do is retrieve all Outlook events and check whether their IDs are stored in my database: if they are, the event already exists on my web service; if they are not, I create the event in my web service.
The problem I have is that when I try to get the Outlook events from User B, the ID of the event is different from the one I stored earlier, and therefore it creates a duplicate that I don't need.
Basically, the event on User A's Outlook calendar and the one on User B's Outlook calendar have different IDs. I don't understand the point of this, because they are the same event; the one on User B's Outlook calendar is just an invitation to the one from User A's Outlook calendar.
My question is: is there any way I can get around that? Is there any kind of ID that is shared across users in Outlook/Office 365? Thanks a lot!
This is precisely what the iCalUId property is for. The id is different by design, as it's a sort of "primary key" within each user's mailbox database. The iCalUId is supposed to be the same across calendars.
From https://developer.microsoft.com/en-us/graph/docs/api-reference/v1.0/resources/event:
iCalUId - String - A unique identifier that is shared by all instances of an event across different calendars.
There is, AFAIK, no property that is both unique and constant across different mailboxes. You can add custom properties to events and make sure the property value is unique and constant, but that would require you to be able to write to User A's events.
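For example, once you have stored the iCalUId from User A's copy of the event, you can look up the matching invitation on User B's calendar with a plain REST call. This is a hedged sketch using HttpClient; token acquisition is omitted, and $filter support on iCalUId is assumed here, so verify it against the Graph docs.

// Sketch: find an event on the signed-in user's calendar by iCalUId.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class EventLookup
{
    static async Task Main()
    {
        string accessToken = "<token for User B>";            // placeholder
        string iCalUId = "<iCalUId stored in your database>"; // placeholder

        using (var http = new HttpClient())
        {
            http.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            // The same iCalUId should match User B's copy (the invitation)
            // of the event User A created, even though the ids differ.
            string url = "https://graph.microsoft.com/v1.0/me/events" +
                         $"?$filter=iCalUId eq '{iCalUId}'";
            Console.WriteLine(await http.GetStringAsync(url));
        }
    }
}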
I am working on a project that uses an EventHub -> Stream Analytics Job -> Table/Blob storage pipeline, and I want to write a couple of unit tests for it.
I can test the EventHub sender status to see if queries have the expected behavior, but how can I check whether the data is being sent to Table storage? The whole process doesn't happen instantly, and there is a pretty long delay between the moment I hit the EventHub and the moment the data is saved in storage.
First create a new Azure Table storage account and then create a new Azure table within that account. In your Stream Analytics job, add a new output for Table storage. When you set up the output details, you will need to specify the storage account, account key, table name, and which column names in the event will represent the Azure Table partition and row keys. As an example, I set mine up so that the 'pk' column is the partition key and the 'rk' column is the row key.
After the output is set up, you can create a simple Stream Analytics query that maps input events from the Event Hub to the Azure Table output. I also have an Event Hub input named 'eventhub' with Send/Listen permissions. My query looks like this:
SELECT
*
INTO
tableoutput
FROM
eventhub
At this point, hit the 'Start' button in the Azure portal to run the Stream Analytics job. To generate the events, you can follow the instructions here but change the event message to this:
// Event Hub connection string and client, set up as in the linked walkthrough;
// the entity name "eventhub" matches the Stream Analytics input above.
string connectionString = "<your Event Hub connection string>";
var eventHubClient = EventHubClient.CreateFromConnectionString(connectionString, "eventhub");
string guid = Guid.NewGuid().ToString();
var message = "pk,rk,value\n" + guid + ",1,hello";
Console.WriteLine("{0} > Sending message: {1}", DateTime.Now, message);
eventHubClient.Send(new EventData(Encoding.UTF8.GetBytes(message)));
To eyeball the Azure Table results, download a tool like TableXplorer and enter the storage account details. Double-click your Azure table and you should see the row appear. Keep in mind you may need to periodically hit F5 on your TableXplorer query for 10-60 seconds until the data gets pushed through.
For programmatic unit testing, you will need to push the partition key / row key values generated in your Event Hub code into a data structure and have a worker poll the Azure Table using point queries, as in the sketch below. A good overview of Azure Table usage is here.
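Here is a rough sketch of that polling worker against the classic WindowsAzure.Storage SDK; it assumes the Azure table is named 'tableoutput' like the output alias above, and the (guid, "1") keys match the message generated by the sender code.

// Sketch: poll the Azure Table until Stream Analytics pushes the row through.
using System;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

class TableOutputPoller
{
    static bool WaitForRow(string connectionString, string guid)
    {
        CloudTable table = CloudStorageAccount.Parse(connectionString)
            .CreateCloudTableClient()
            .GetTableReference("tableoutput"); // assumed table name

        // Point query on (pk, rk) = (guid, "1"), matching the sent message.
        TableOperation retrieve =
            TableOperation.Retrieve<DynamicTableEntity>(guid, "1");

        // Poll for up to a minute; the pipeline is not instantaneous.
        for (int attempt = 0; attempt < 12; attempt++)
        {
            if (table.Execute(retrieve).Result != null)
                return true;
            Thread.Sleep(TimeSpan.FromSeconds(5));
        }
        return false;
    }
}

A unit test can then assert that WaitForRow returns true for each guid the sender pushed.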
I'm creating an application where the user will type in the name of a video game and a query will be sent to a servlet. I want this query to search the Amazon product database, and if the game is found I want to grab information such as the name, publisher, platform, genre, etc. and add it to my database, just like PriceGrabber does. There is an example below.
http://video-games.pricegrabber.co.uk/nintendo-ds-games/m/25813985/details/st=product_tab/
Can this be done, and if so, what will I need to know and learn to do it?
I believe you should be able to use the Amazon Product Advertising API to query for this kind of product information. However, the current Terms of Use appear to restrict you from storing the data in your database for more than 24 hours.
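If you go that route, here is a rough sketch of a signed ItemSearch request under the legacy PA-API signing scheme, written in C# to match the other snippets on this page (your servlet-side code would be the Java equivalent). The credentials are placeholders, and the exact parameters and endpoints may have changed, so check the current documentation.

// Sketch: signed ItemSearch request for video games (legacy PA-API).
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Security.Cryptography;
using System.Text;

class ItemSearchExample
{
    static void Main()
    {
        const string host = "webservices.amazon.com";
        const string path = "/onca/xml";

        // Parameters must be sorted bytewise for the signature to validate.
        var query = new SortedDictionary<string, string>(StringComparer.Ordinal)
        {
            { "AWSAccessKeyId", "YOUR_ACCESS_KEY" },   // placeholder
            { "AssociateTag", "YOUR_ASSOCIATE_TAG" },  // placeholder
            { "Keywords", "zelda" },                   // the user's search term
            { "Operation", "ItemSearch" },
            { "ResponseGroup", "ItemAttributes" },
            { "SearchIndex", "VideoGames" },
            { "Service", "AWSECommerceService" },
            { "Timestamp", DateTime.UtcNow.ToString("yyyy-MM-ddTHH:mm:ssZ") }
        };

        string canonical = string.Join("&",
            query.Select(kv => kv.Key + "=" + Uri.EscapeDataString(kv.Value)));

        // The signature is an HMAC-SHA256 over the canonical GET request.
        string toSign = "GET\n" + host + "\n" + path + "\n" + canonical;
        string signature;
        using (var hmac = new HMACSHA256(Encoding.UTF8.GetBytes("YOUR_SECRET_KEY")))
        {
            signature = Convert.ToBase64String(
                hmac.ComputeHash(Encoding.UTF8.GetBytes(toSign)));
        }

        string url = "https://" + host + path + "?" + canonical +
                     "&Signature=" + Uri.EscapeDataString(signature);

        // The XML response carries ItemAttributes such as Title, Publisher,
        // Platform and Genre, which you could parse into your own database
        // (subject to the 24-hour caching limit in the Terms of Use).
        using (var client = new WebClient())
        {
            Console.WriteLine(client.DownloadString(url));
        }
    }
}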