I'm attempting to trigger an IoT event via Alexa. I've created an Alexa skill that triggers a Lambda written in Go, which publishes an event to SNS. However, I can't see a way to get an IoT device to listen to an SNS event.
Similarly, there doesn't seem to be a Go library for IoT, so I can't get the Lambda to interact directly with the IoT MQTT protocol as a client either.
I just wondered if I'd need to rewrite my Lambda in JavaScript to interact with IoT Core, or if there is a better way to trigger an IoT event from a Lambda?
You should be able to publish to an IoT topic from within your Lambda function via the IoTDataPlane.Publish method in the AWS SDK for Go.
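A minimal sketch of what that looks like inside a Go Lambda; the endpoint, topic, and payload below are placeholders for your own values (the data-plane client needs your account-specific IoT data endpoint, which you can look up with `aws iot describe-endpoint`):

```go
package main

import (
	"github.com/aws/aws-lambda-go/lambda"
	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/iotdataplane"
)

// handler publishes a small JSON payload to an IoT topic.
// The endpoint and topic are placeholders for your own values.
func handler() error {
	sess := session.Must(session.NewSession())

	// The IoT data plane requires your account's data endpoint.
	svc := iotdataplane.New(sess, &aws.Config{
		Endpoint: aws.String("https://your-prefix-ats.iot.eu-west-1.amazonaws.com"),
	})

	_, err := svc.Publish(&iotdataplane.PublishInput{
		Topic:   aws.String("devices/my-device/commands"), // hypothetical topic
		Qos:     aws.Int64(0),
		Payload: []byte(`{"action":"turn_on"}`),
	})
	return err
}

func main() {
	lambda.Start(handler)
}
```

The device then subscribes to that topic over MQTT as usual, so no SNS hop is needed.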
Related
I have an AWS Lambda function (triggered by an API call via API Gateway) that publishes messages to a collection of related IoT devices using an IoT topic based on the parameters received (the topic being a serial number that uniquely identifies a specific IoT device). I do this using boto3's IoT data-plane client method publish(). Is there a way for this same Lambda function to subscribe to the same IoT topic so that it can receive the response from the IoT device?
I am aware that the prescribed way for an IoT message to trigger a Lambda is to create an IoT Rule Action that calls a Lambda, but that doesn't really work for me (as far as I can tell): what I want is for the original Lambda (the one triggered by the external API call via API Gateway) to listen for the IoT device's response and send an HTTP response back to the external caller based on it. That doesn't seem achievable if a separate Lambda gets triggered when the IoT device responds on its topic. Any ideas on how to do this?
Why does it have to be the original Lambda? I'm guessing there's some state you want to keep, in which case you should either persist it or pass it along in the messages.
Though not pretty, this enables me to do what I posted in my question: I added an IoT rule which passes the IoT device's response on to an SQS queue. The Lambda (the one invoked by API Gateway to handle the request at the start) polls that SQS queue for a message, consumes it, and uses it to construct and send a response back via API Gateway to the external client that sent the HTTP POST request. To be sure, this is not a great way to use AWS Lambda, but it seems to be the only way to use Lambda + API Gateway to hook up HTTP request/response with IoT publish/subscribe.
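The question above uses boto3, but for anyone wanting to see the shape of that polling half, here is a rough sketch in Go, assuming a hypothetical response queue URL. In practice you would also need some way to correlate a response with the request that caused it (e.g. a per-request queue or a correlation ID in the message), which this sketch glosses over:

```go
package main

import (
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/sqs"
)

// waitForDeviceResponse long-polls the queue that the IoT rule writes to
// and returns the first message body it sees, or an error if none arrives.
func waitForDeviceResponse(queueURL string) (string, error) {
	svc := sqs.New(session.Must(session.NewSession()))

	out, err := svc.ReceiveMessage(&sqs.ReceiveMessageInput{
		QueueUrl:            aws.String(queueURL),
		MaxNumberOfMessages: aws.Int64(1),
		WaitTimeSeconds:     aws.Int64(10), // long poll so we don't spin
	})
	if err != nil {
		return "", err
	}
	if len(out.Messages) == 0 {
		return "", fmt.Errorf("no response from device before timeout")
	}

	msg := out.Messages[0]
	// Delete the message so it is not redelivered to a later invocation.
	if _, err := svc.DeleteMessage(&sqs.DeleteMessageInput{
		QueueUrl:      aws.String(queueURL),
		ReceiptHandle: msg.ReceiptHandle,
	}); err != nil {
		log.Printf("failed to delete message: %v", err)
	}
	return aws.StringValue(msg.Body), nil
}

func main() {
	// Hypothetical queue URL; in the real setup this call sits inside
	// the API Gateway-triggered Lambda handler.
	body, err := waitForDeviceResponse("https://sqs.eu-west-1.amazonaws.com/123456789012/device-responses")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("device said:", body)
}
```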
Can someone explain to me the advantage or disadvantage of using SNS -> Lambda vs. SNS -> SQS -> Lambda.
I'm looking to set up an architecture for pub/sub micro-service messaging, but having a queue in front of every Lambda seems excessive.
Unless something has changed, the question of whether it makes more sense to deploy SNS → Lambda or SNS → SQS → Lambda is based on a premise with a significant flaw.
As is indicated in Supported Event Sources in the Lambda documentation, Lambda events can be sourced from S3, DynamoDB, Kinesis, SNS, SES, Cognito, CloudFormation, CloudWatch Logs & CloudWatch Events (including Scheduled Events), AWS Config, Amazon Echo, and API Gateway.
And, of course, you can invoke them directly.
But SQS is not a supported Lambda event source.
Amazon SQS is a message queue service used by distributed applications to exchange messages through a polling model, and can be used to decouple sending and receiving components—without requiring each component to be concurrently available. By using Amazon SNS and Amazon SQS together, messages can be delivered to applications that require immediate notification of an event, and also persisted in an Amazon SQS queue for other applications to process at a later time.
Unless you need to decouple sending and receiving components, and just want to achieve the use case in your question, it will work in both cases: SNS → Lambda and SNS → SQS → Lambda.
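If you do end up wanting the SNS-and-SQS combination described above, the binding itself is just an SNS subscription with the `sqs` protocol. A minimal sketch with the AWS SDK for Go, where both ARNs are placeholders and the queue still needs a policy that allows the topic to send to it:

```go
package main

import (
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/sns"
)

func main() {
	svc := sns.New(session.Must(session.NewSession()))

	// Subscribe an existing SQS queue to an existing SNS topic.
	// Both ARNs are placeholders for your own resources.
	out, err := svc.Subscribe(&sns.SubscribeInput{
		TopicArn: aws.String("arn:aws:sns:eu-west-1:123456789012:my-topic"),
		Protocol: aws.String("sqs"),
		Endpoint: aws.String("arn:aws:sqs:eu-west-1:123456789012:my-queue"),
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("subscription ARN:", aws.StringValue(out.SubscriptionArn))
}
```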
What I intend to do:
I am receiving a file from a device connected to AWS IoT, and I would like to send that file to an AWS EC2 server using an AWS Lambda function. The Lambda function is the middle man that passes the file along.
My question is:
The rule in AWS IoT says "Insert this message into a code function and execute it (Lambda)". When I select this, will it send the file that the device sent to IoT on to Lambda, or am I supposed to write a function in Lambda that subscribes to the device?
In brief: when I create a rule in IoT and choose Lambda as the action, what happens? Does it forward the file to Lambda? If so, how do I receive it in the Lambda function? It would be great if I could get an example function for it.
When you create an IoT rule with a Lambda action, the Lambda you chose will receive the message sent to IoT as its event.
You may want to take a look at this tutorial, where IoT triggers a Lambda to update a DynamoDB table (swap DynamoDB for your EC2 call).
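Not from that tutorial, but here is a minimal sketch of the receiving side in Go: the rule passes the published JSON payload as the Lambda event, so the handler only has to accept and decode it. The payload fields below are hypothetical; match them to whatever JSON your device actually publishes.

```go
package main

import (
	"context"
	"encoding/json"
	"log"

	"github.com/aws/aws-lambda-go/lambda"
)

// The IoT rule delivers the published message payload as the Lambda event.
// json.RawMessage keeps the handler generic; a struct matching your
// device's payload works just as well.
func handler(ctx context.Context, event json.RawMessage) error {
	log.Printf("received IoT message: %s", string(event))

	// Hypothetical shape of the device's payload; adjust to your message.
	var msg struct {
		DeviceID string `json:"device_id"`
		FileName string `json:"file_name"`
		Data     string `json:"data"` // e.g. base64-encoded file contents
	}
	if err := json.Unmarshal(event, &msg); err != nil {
		return err
	}

	// From here you could forward msg.Data to your EC2 server,
	// e.g. with an HTTP POST to its endpoint.
	log.Printf("file %q from device %q", msg.FileName, msg.DeviceID)
	return nil
}

func main() {
	lambda.Start(handler)
}
```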
Is it possible to trigger an AWS Lambda function when a user creates a ticket in Freshdesk?
That shouldn't be too hard. I would recommend the following architecture:
1. Freshdesk ticket trigger
2. The Freshdesk ticket trigger handler publishes a message to an SNS topic
3. An AWS Lambda function is configured with the SNS topic as its event source
4. The Lambda code accepts the SNS message as input and performs the necessary processing (sketched below)
The advantages of using SNS rather than directly calling Lambda are:
1. Reduced exposure of the AWS API: only the SNS topic is exposed, and the rest of the API is sealed off via IAM privileges.
2. The possibility of a fan-out architecture: multiple Lambda functions can listen to the same SNS topic with near-zero configuration.
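A minimal sketch of step 4, a Go Lambda handler subscribed to the SNS topic. The `ticket` fields are an assumption about what the Freshdesk trigger handler would publish; adjust them to your actual message:

```go
package main

import (
	"context"
	"encoding/json"
	"log"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

// Hypothetical shape of the ticket data published to the SNS topic.
type ticket struct {
	ID      int    `json:"id"`
	Subject string `json:"subject"`
}

func handler(ctx context.Context, snsEvent events.SNSEvent) error {
	for _, record := range snsEvent.Records {
		var t ticket
		if err := json.Unmarshal([]byte(record.SNS.Message), &t); err != nil {
			return err
		}
		// Do the necessary processing here.
		log.Printf("new ticket %d: %s", t.ID, t.Subject)
	}
	return nil
}

func main() {
	lambda.Start(handler)
}
```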
For anyone landing on this topic:
It's possible with a Freshdesk Marketplace app. With the onTicketCreate product event, any action can be written to execute in a serverless function, which runs entirely in the Freshworks platform cloud.
If required, it can call your AWS Lambda.
Is it possible to automatically send/push messages from Amazon SQS to DynamoDB? I wish to send my messages to SQS and, after a period of time, move them to DynamoDB. Another service should then read the DynamoDB table and send the data as an email using SES.
Kindly help me out with achieving this. I will be using it for user notifications from a social networking site.
Thanks.
There is no AWS mechanism to automatically publish SQS messages to DynamoDB, but you can use an AWS Lambda event source mapping to automatically pull SQS messages and invoke a Lambda function, and it's straightforward to write a Lambda function that writes those messages to DynamoDB. (Here's an example using Node.js: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/dynamodb-example-table-read-write.html.)
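If you'd rather not use Node.js, here is a minimal sketch of the same idea in Go, assuming a hypothetical Notifications table and schema: the event source mapping delivers a batch of SQS messages and the function writes each one to DynamoDB.

```go
package main

import (
	"context"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/dynamodb"
)

var db = dynamodb.New(session.Must(session.NewSession()))

// handler receives a batch of SQS messages from the event source mapping
// and writes each one to DynamoDB. The table and attribute names are
// placeholders for your own schema.
func handler(ctx context.Context, sqsEvent events.SQSEvent) error {
	for _, msg := range sqsEvent.Records {
		_, err := db.PutItem(&dynamodb.PutItemInput{
			TableName: aws.String("Notifications"),
			Item: map[string]*dynamodb.AttributeValue{
				"MessageId": {S: aws.String(msg.MessageId)},
				"Body":      {S: aws.String(msg.Body)},
			},
		})
		if err != nil {
			// Returning an error tells Lambda the batch failed,
			// so SQS will redeliver it.
			return err
		}
	}
	return nil
}

func main() {
	lambda.Start(handler)
}
```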
Yes, I agree, Hyangelo: you can do this with Simple Workflow Service (SWF).
SWF gives you control over your application, enabling you to distribute and execute different services or tasks when you want.
Here is the link to the documentation: http://aws.amazon.com/swf/
Sounds like a workflow system from how you describe what you want. Have you considered Simple Workflow Service?
SQS can't be processed without pulling messages.
You can either use SWF to solve your use case or use SNS.
SNS-to-SQS delivery is free on AWS.
Send your messages to SNS, and bind that SNS topic to both an SQS queue and a Lambda function.
When the Lambda function is triggered, you can create the DynamoDB record and send it on to another SNS topic, SNS2 (the publish step is sketched below).
Bind SNS2 to SES, which will trigger the email.
Check out: https://aws.amazon.com/premiumsupport/knowledge-center/lambda-sns-ses-dynamodb/
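A minimal sketch of that publish step from inside the Lambda (the DynamoDB write itself would look like the DynamoDB example earlier in this thread); the SNS2 topic ARN and message content are placeholders:

```go
package main

import (
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/sns"
)

// publishToSecondTopic forwards the notification on to the second topic
// ("SNS2") after the DynamoDB record has been written. The topic ARN is
// a placeholder for your own.
func publishToSecondTopic(svc *sns.SNS, message string) error {
	_, err := svc.Publish(&sns.PublishInput{
		TopicArn: aws.String("arn:aws:sns:eu-west-1:123456789012:SNS2"),
		Subject:  aws.String("User notification"),
		Message:  aws.String(message),
	})
	return err
}

func main() {
	svc := sns.New(session.Must(session.NewSession()))
	// Hypothetical notification payload; in the real setup this call sits
	// inside the Lambda handler after the DynamoDB write.
	if err := publishToSecondTopic(svc, `{"user":"example","event":"new_follower"}`); err != nil {
		log.Fatal(err)
	}
}
```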