Using the 'newUUID()' AWS IoT function in the AWS SiteWise service to generate a random 16-byte UUID to store as a partition key

I am trying to use the 'newUUID()' AWS IoT function in the AWS SiteWise service (as part of an alarm action). It returns a random 16-byte UUID that I want to store in the partition key column of a DynamoDB table.
In the 'PartitionKeyValue' field, I am trying to use the value returned by the newUUID() function, which is then passed to DynamoDB as part of the action trigger.
However, this gives the following error:
"Invalid Request exception: Failed to parse expression due to: Invalid expression. Unrecognized function: newUUID".
I understand the error, but I am not sure how I can solve this and still use a random UUID generator. Kindly note that I do not want to use a timestamp, because multiple events could get triggered at the same time and would therefore carry the same timestamp.
Any ideas on how I can use this function, or any other information that helps me achieve the above, would be appreciated.

The docs you refer to say that the function is all lowercase: newuuid().
Perhaps that will work, but I believe that function is only available in IoT Core SQL statements. With event notifications, I think you only have these expressions to work with, which is not much. Essentially, you need to get what you need from the alarm event itself.
You may need the alarm event to invoke Lambda rather than write directly to DynamoDB. Your Lambda function can create a UUID and write the alarm record to DynamoDB using the SDK, along the lines of the sketch below.
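A minimal sketch of such a function in Python, assuming a DynamoDB table named AlarmEvents with partition key id (both names are placeholders):

    import json
    import uuid

    import boto3

    # Hypothetical table name and key; adjust to your schema
    table = boto3.resource("dynamodb").Table("AlarmEvents")

    def lambda_handler(event, context):
        # Generate the random UUID here instead of in the action expression
        item = {
            "id": str(uuid.uuid4()),       # partition key
            "payload": json.dumps(event),  # raw alarm event for later inspection
        }
        table.put_item(Item=item)
        return {"id": item["id"]}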

Related

AWS Step Function manual approval process

I am working on a requirement where the data entered in a form needs to be validated manually. Once validated, an approval mail will be sent out, and then the data will be stored in the database. I plan to use an AWS Step Function for this, with a task token.
https://aws.amazon.com/blogs/compute/implementing-serverless-manual-approval-steps-in-aws-step-functions-and-amazon-api-gateway/
I plan to use a similar design to the one in the link above. However, is there a way not to use API Gateway for sending the task token back to the Step Function to resume processing? Has anybody worked on a similar requirement, and how was the functionality achieved? Thank you.
A Step Function can be started by an AWS Lambda function as well.
Once the form is validated and stored in the database, you can trigger a Lambda function based on database events (for example, DynamoDB Streams if DynamoDB is used), and that Lambda can start the Step Function.
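To address the "without API Gateway" part directly: if the paused execution's task token was written to the item (via the waitForTaskToken integration), the stream-triggered Lambda can instead resume the workflow with send_task_success rather than starting a new execution. A rough sketch, where the taskToken and approved attributes are assumptions about your item layout:

    import json

    import boto3

    sfn = boto3.client("stepfunctions")

    def lambda_handler(event, context):
        # Triggered by the DynamoDB stream on the validation table
        for record in event["Records"]:
            if record["eventName"] != "MODIFY":
                continue
            new_image = record["dynamodb"]["NewImage"]
            # Assumes the task token was stored on the item when the workflow
            # paused, and an 'approved' flag marks a successful manual validation
            if new_image.get("approved", {}).get("BOOL"):
                sfn.send_task_success(
                    taskToken=new_image["taskToken"]["S"],
                    output=json.dumps({"status": "approved"}),
                )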

Amazon Connect date calculations

I am writing a call flow in Amazon Connect. I am using Lex to get a date from the caller into a slot and then setting a call attribute in Connect equal to the value of the slot. I need to calculate how many years have passed between the date the caller provides and today.
Can this be done within Connect and if yes, how? Or do I need to write a Lambda function?
You would need to do this in a Lambda function, as there is no access to date-time functions or ad-hoc programmatic mechanisms within the Amazon Connect contact flow blocks (actions). The contact flow blocks only provide a set of comparison operators for contact attributes or metrics.
You could potentially invoke this Lambda function from within Lex, so that the slot data is returned as the time difference you need, or call it from the contact flow after you get the Lex slot data with the captured date. Either way, it needs to be done in Lambda.
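As an illustration, a handler that computes the full years elapsed; the callerDate attribute name is an assumption, and note that Connect expects a flat map of string values back from Lambda:

    from datetime import date

    def lambda_handler(event, context):
        # Connect passes contact attributes under Details.ContactData.Attributes;
        # Lex date slots arrive as ISO strings, e.g. "1985-06-15"
        provided = date.fromisoformat(
            event["Details"]["ContactData"]["Attributes"]["callerDate"]
        )
        today = date.today()
        # Subtract one if this year's anniversary hasn't happened yet
        years = today.year - provided.year - (
            (today.month, today.day) < (provided.month, provided.day)
        )
        return {"yearsElapsed": str(years)}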

Get all items in DynamoDB with API Gateway's Mapping Template

Is there a simple way to retrieve all items from a DynamoDB table using a mapping template in an API Gateway endpoint? I usually use a Lambda to process the data before returning it, but this is such a simple task that a Lambda seems like overkill.
I have a table that contains data with the following format:
roleAttributeName   roleHierarchyLevel   roleIsActive   roleName
"admin"             99                   true           "Admin"
"director"          90                   true           "Director"
"areaManager"       80                   false          "Area Manager"
I'm happy just getting the data; the representation doesn't matter, as I can transform it later in my code.
I've been looking around, but all the tutorials explain how to get specific bits of data through queries and params like roles/{roleAttributeName}, whereas I just want to hit roles/ and get all items.
All you need to do is:
create a resource (without curly braces, since we don't need a particular item)
create a GET method
use Scan instead of Query as the Action when configuring the integration request.
Now run a test; you should get the response. A sketch of the mapping templates follows below.
To try it out in Postman, deploy the API first, then use the provided link in Postman followed by your resource name.
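For illustration, the integration request template for the Scan action can be as small as this (the roles table name is a placeholder, and Limit is optional):

    {
        "TableName": "roles",
        "Limit": 100
    }

An integration response template can then flatten DynamoDB's typed attributes into plain JSON; a sketch matching the sample table above:

    #set($inputRoot = $input.path('$'))
    [
    #foreach($item in $inputRoot.Items)
      {
        "roleAttributeName": "$item.roleAttributeName.S",
        "roleHierarchyLevel": $item.roleHierarchyLevel.N,
        "roleIsActive": $item.roleIsActive.BOOL,
        "roleName": "$item.roleName.S"
      }#if($foreach.hasNext),#end
    #end
    ]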
API Gateway allows you to Proxy DynamoDB as a service. Here you have an interesting tutorial on how to do it (you can ignore the part related to index to make it work).
To retrieve all the items from a table, you can use Scan as the action in API Gateway. Keep in mind that DynamoDB caps result sets at 1 MB for both Scan and Query actions.
You can also cap the result yourself, before that limit kicks in, by using the Limit parameter.
AWS DynamoDB Scan Reference

Can I create temporary users through Amazon Cognito?

Does Amazon Cognito support temporary users? For my use case, I want to be able to give external users access, but limited to a time period (e.g. 7 days).
Currently, my solution is something like:
Create User in User Group
Schedule cron job to run in x days
Job will disable/remove User from User Group
This all seems to be quite manual and I was hoping Cognito provides something similar automatically.
Unfortunately, there is no built-in functionality to automate this workflow, so you would need to devise your own solution.
I would suggest the approach below:
Create a Lambda function that post-processes a user sign-up. This function would create a CloudWatch Event with a schedule for 7 days in the future. Using the SDK, you would create the event and assign another Lambda function as its target. When you specify the target in the put_targets call, use the Input parameter to pass in your own JSON; this should contain a metadata item related to the user.
You would then create a post confirmation Lambda trigger which invokes the Lambda created in the step above. This allows you to schedule an event every time a user signs up.
Finally, create the target Lambda for the CloudWatch event. It will receive the input passed from the trigger and can use the AWS SDK to perform any Cognito operations you might want, such as deleting the user.
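A rough sketch of the scheduling step with boto3; the rule naming scheme and the target function ARN are placeholders, and the target Lambda additionally needs a resource policy allowing events.amazonaws.com to invoke it. Since classic CloudWatch Events rules only take cron/rate expressions, a "one-time" schedule is approximated with a cron that includes the year:

    import json
    from datetime import datetime, timedelta, timezone

    import boto3

    events = boto3.client("events")

    def schedule_user_expiry(username, days=7):
        when = datetime.now(timezone.utc) + timedelta(days=days)
        # EventBridge cron fields: minute hour day-of-month month day-of-week year
        cron = "cron({} {} {} {} ? {})".format(
            when.minute, when.hour, when.day, when.month, when.year
        )
        rule_name = "expire-user-{}".format(username)  # hypothetical naming scheme
        events.put_rule(Name=rule_name, ScheduleExpression=cron)
        events.put_targets(
            Rule=rule_name,
            Targets=[{
                "Id": "expire-user-target",
                # Placeholder ARN of the target Lambda from the final step
                "Arn": "arn:aws:lambda:us-east-1:123456789012:function:ExpireUser",
                # Custom JSON delivered to the target instead of the event payload
                "Input": json.dumps({"username": username, "rule": rule_name}),
            }],
        )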
The benefit of using these services rather than a cron is that you only perform processing when it is required. With a one-time cron script, if you have many users in this temporary group, you would need to loop through every user and check whether each is ready to be removed (and perhaps some would never be removed).
My solution for this is the following: instead of creating a post confirmation Lambda trigger, you can also create a pre authentication Lambda trigger. This trigger checks the user attribute "valid_until", which contains a Unix timestamp, and only lets the user in if the value of "valid_until" is in the future. The main benefit of this solution is that you don't need any cron jobs.
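A minimal sketch of such a pre authentication trigger; Cognito exposes custom attributes with a custom: prefix, and raising an exception fails the sign-in:

    import time

    def lambda_handler(event, context):
        attrs = event["request"]["userAttributes"]
        # "valid_until" is the custom attribute from the approach above
        valid_until = int(attrs.get("custom:valid_until", "0"))
        if valid_until < time.time():
            raise Exception("Account expired")  # surfaces as a failed authentication
        return event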

Triggering AWS Lambda when a DynamoDB table grows to a certain size

I'm interested in seeing whether I can invoke an AWS Lambda when one of my DynamoDB tables grows to a certain size. Nothing in the DynamoDB Events/Triggers docs nor the Lambda Developer Guide suggests this is possible, but I find that hard to believe. Anyone ever deal with anything like this before?
You will have to do it manually.
I see two out-of-the-box ways to achieve this, though:
1) You can create a CloudWatch Event that runs every X minutes (replace X with whatever makes sense for your business case) to trigger your Lambda function. The function then calls the DescribeTable API and checks the returned size. Once the table has reached the size you want to be notified about, you can disable the event. This is the easiest and most cost-effective option, since most of the time your table's size will be below your predefined limit. (See the sketch after this list.)
2) You could also use DynamoDB Streams and call the DescribeTable API, but then your function would be triggered on every new event in your table. This is cost-ineffective and, in my opinion, overkill.
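A sketch of the scheduled checker from option 1; the table name, threshold, and rule name are placeholders:

    import boto3

    dynamodb = boto3.client("dynamodb")
    events = boto3.client("events")

    SIZE_LIMIT_BYTES = 1 * 1024 ** 3  # hypothetical 1 GiB threshold

    def lambda_handler(event, context):
        # TableSizeBytes is refreshed roughly every six hours, so this is approximate
        table = dynamodb.describe_table(TableName="my-table")["Table"]
        if table["TableSizeBytes"] >= SIZE_LIMIT_BYTES:
            # ...notify or act here, then stop the scheduled checks...
            events.disable_rule(Name="table-size-check")  # hypothetical rule name
        return {"sizeBytes": table["TableSizeBytes"]}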