I have an AWS Lambda function which publishes data to an AWS IoT topic A and then waits for the result, which will be published to a different topic B.
I was wondering how the already-running Lambda function can get this data from topic B when the thing publishes it.
I was not able to find any equivalent of get_thing_shadow for a particular topic: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/iot-data.html#id4
E.g.:
Lambda1 -> IoT Topic A -> Thing
Lambda1 waits
Thing -> IoT Topic B
Lambda1 reads from Topic B, updates (say) a DB, and exits.
I was wondering how this can be done.
For various reasons we are unable to use the IoT Shadow anymore.
Current architecture:
Lambda1 -> IoT Shadow Desired -> Thing
Lambda1 -> waits for 5 sec
Lambda1 -> reads IoT Shadow Reported -> success or failure
If failure, Lambda1 -> resets IoT Desired to old state -> exits
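For reference, the success/failure check in the current architecture can be sketched as a pure function over the shadow document. This is only an illustrative sketch (the function name is mine); it assumes the standard `state.desired`/`state.reported` shadow layout:

```python
def shadow_update_succeeded(shadow_doc, desired):
    """Return True if every key we set in `desired` is now echoed
    back in the shadow's reported state."""
    reported = shadow_doc.get("state", {}).get("reported", {})
    return all(reported.get(k) == v for k, v in desired.items())
```

In the Lambda function this would be applied to the JSON payload returned by `get_thing_shadow`; on failure, the old desired state would be written back with `update_thing_shadow`.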
It is not possible to configure IoT to send the new message to the "already running" Lambda function; it will always trigger a new invocation. Isn't the previous state already in the IoT Shadow Update Failed message? Can't you just use that data in the new invocation to do whatever DB updates you need?
The AWS SDK available in Lambda (e.g. boto3 for Python) does not support subscribing to a topic;
it only supports publishing to one.
If you want to subscribe to a topic, you must use a device SDK
(ref. https://docs.aws.amazon.com/iot/latest/developerguide/iot-sdks.html ).
You can then both publish and subscribe with the device SDK inside the Lambda function.
If you don't want to use a device SDK, you have to use Redis or DynamoDB, like below:
device publishes response message -> AWS IoT Rule triggers some action (e.g. writes to a DB) -> Lambda polls the DB.
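The polling side of that last pattern can be sketched as below. The store lookup is injected as a plain callable (in practice it might wrap a DynamoDB GetItem keyed by a correlation ID); all names here are illustrative, not a fixed API:

```python
import time

def wait_for_response(get_item, correlation_id, timeout_s=5.0, interval_s=0.5):
    """Poll a store until the device's response for `correlation_id`
    appears, or give up after `timeout_s` seconds. `get_item` is
    whatever lookup you have, e.g. a wrapper around DynamoDB GetItem."""
    deadline = time.monotonic() + timeout_s
    while True:
        item = get_item(correlation_id)
        if item is not None:
            return item
        if time.monotonic() >= deadline:
            return None
        time.sleep(interval_s)
```

The Lambda function would publish to topic A with a correlation ID, then call this until the IoT rule has written the thing's response from topic B into the store.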
Related
Here's the scenario: after AWS Backup completes a backup of an Aurora RDS cluster, it sends one event message to an SNS topic. Finally, I use a Lambda function to get the cluster information.
AWS Backup job completed -> event message -> SNS topic -> trigger Lambda
The thing is that I cannot find any information related to the cluster ID (I tried with EventBridge). I've also tried an RDS Event Subscription, but it doesn't send any information when I perform an AWS Backup.
Please let me know if you have any other ideas; thanks in advance.
What I want is to send some metrics from a Greengrass device to a Greengrass Lambda function on the local network.
I'm using the subscribe mode of basicDiscovery.py to send device metrics to the Amazon Greengrass core device and invoke the Lambda function deployed on the core. The Lambda function simply prints the event.
My confusion is that when I use a subscription from IoT Cloud -> Lambda function it works fine, but it does not work when I configure a subscription from device -> Lambda function.
Also, when I disconnect the device from the internet (in case 1), it buffers the messages and sends them when it gets connected to the internet again.
I know that I'm passing xxxxxxx.iot.us-west-2.amazonaws.com as the endpoint, which is accessible over the internet, but I'm not sure how to get an endpoint for the core.
I have tried sending data from a device to a Lambda function and from the Lambda function to the cloud. As per my understanding, you have to take care of the following points:
1. The device that is connected to Greengrass should be publishing data to a topic.
2. For the same topic, a subscription has to be created between the device and the Lambda function.
3. To test this behavior, create another subscription between the Lambda function and the cloud.
Reference: https://medium.com/tensoriot/aws-greengrass-on-raspberry-pi-creating-core-and-node-devices-707a38452293
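On the Lambda side, the function deployed to the core only needs to handle whatever payload the subscription delivers. A minimal handler, assuming a JSON-serializable event (this is a sketch, not the exact code from the question):

```python
import json

def lambda_handler(event, context):
    # Greengrass invokes this with the device's published payload
    # as `event`; here we just log it and echo it back.
    print("received:", json.dumps(event))
    return event
```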
I had the same problem. I tried to trigger a Lambda function in Greengrass. If I created a subscription whose source was the cloud and whose target was the Lambda function, then publishing a message to the topic with my device credentials triggered the function.
But if the connection between Greengrass and AWS was lost, the Lambda function wasn't triggered. I needed the function to work even when Greengrass had no connection to AWS.
So I thought this issue could be fixed by changing the source of the subscription to the device.
However, if I create a subscription with the source set to the device itself, publishing a message does not trigger the Lambda function, even though I can still receive messages from the topic.
I have a Lambda function with 2 aliases.
Dev - points to the $LATEST version
Test - points to a specific version.
I have set up my SQS queue to trigger both aliases when it receives a message. However, it only seems to trigger one of them (whichever was registered most recently) and not both. Has anyone else come across this issue?
arn:aws:lambda:us-east-1:XXXXXXXXXXXXXX:function:Amoel:Dev
arn:aws:lambda:us-east-1:XXXXXXXXXXXXXX:function:Amoel:Test
A message on the queue would be consumed only once, by whichever Lambda function grabs it first.
If you wish to send the same message to multiple AWS Lambda functions, combine it with Amazon SNS:
SQS queue -> SNS topic -> 2 x Lambda function subscriptions
Be sure to activate Raw Message Delivery on the Amazon SNS topic to preserve the original format of the message from Amazon SQS.
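On the subscriber side, each Lambda function receives the standard SNS event shape; with Raw Message Delivery enabled, each record's `Message` is the original queue message body verbatim. A minimal extraction helper (a sketch against the documented SNS-to-Lambda event format):

```python
def extract_sns_messages(event):
    """Return the Message payload of every SNS record in a
    Lambda invocation event."""
    return [record["Sns"]["Message"] for record in event["Records"]]
```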
Superb news about the Amazon MQ service, but now the question arises of how to trigger a Lambda function (Node.js) on a message to a specific queue.
I was thinking I could somehow have an SNS topic notified when a message is PUT, or find some other trigger that can fire a Lambda function to consume the message from the queue...
Any suggestions?
There isn't a native way to do this. Amazon's managed ActiveMQ service is simply a managed deployment of ActiveMQ running in EC2. It has no integration with other services.
You'd need to write a queue consumer and have it running on a server and listening to the queue on ActiveMQ and publishing those messages to SNS or invoking the Lambda function directly via the Lambda API, etc.
(At least for now.)
AWS recently announced
https://aws.amazon.com/about-aws/whats-new/2020/11/aws-lambda-supports-amazon-mq-apache-activemq-event-source/
We can now add Amazon MQ as a trigger in Lambda, and then configure:
Broker URL
Batch size
Queue name
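With that trigger configured, Lambda delivers messages in batches where each message's `data` field is base64-encoded. A decoding sketch for an ActiveMQ event (field names should be verified against the current Amazon MQ event documentation):

```python
import base64

def decode_mq_messages(event):
    """Decode the base64-encoded `data` field of each message in an
    Amazon MQ (ActiveMQ) Lambda event into a UTF-8 string."""
    return [base64.b64decode(msg["data"]).decode("utf-8")
            for msg in event.get("messages", [])]
```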
Here is one approach that AWS describes - https://aws.amazon.com/blogs/compute/invoking-aws-lambda-from-amazon-mq/
Basically, set up a scheduled CloudWatch Events trigger for the Lambda function, which then polls the queue for MQ messages and processes them.
I want to trigger a Lambda function whenever a new message is added to SQS.
Note that I don't want to add new messages (events) to SQS.
What I'm trying to do:
My app will send a message to SQS
Whenever a new message is added to the queue, a CloudWatch event gets generated
The CloudWatch event triggers the Lambda function
Problem:
In the AWS console, while configuring CloudWatch Events, I haven't found any option to set the source of the event, i.e. the URL or name of my SQS queue.
I'm not sure if this use case is valid, but please help me out.
EDIT: AWS now supports SQS as an event source to trigger Lambda functions. See this blog post for more details.
ORIGINAL ANSWER:
SQS is not supported as a direct event source for AWS Lambda functions. If there are properties of a queueing system that you need for your use case, then you could have a "cron-job" type Lambda function that runs on a schedule, receives messages from the queue, and calls your worker Lambda function in response to each message received. The problem with this approach is that you must continually poll SQS even during periods when you don't expect messages, which incurs unnecessary cost.
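The cron-job workaround described above can be sketched with the SQS calls injected as plain callables (the callable names are illustrative; in practice they would wrap boto3's `receive_message` and `delete_message`):

```python
def drain_queue(receive, handle, delete, max_batches=10):
    """One scheduled run: fetch up to `max_batches` batches, pass each
    message to the worker, and delete it only after it was handled."""
    processed = 0
    for _ in range(max_batches):
        batch = receive()
        if not batch:
            break  # queue is empty for now; wait for the next schedule
        for msg in batch:
            handle(msg)   # e.g. invoke the worker Lambda function
            delete(msg)   # remove only after successful handling
            processed += 1
    return processed
```

Deleting only after handling means a failed message reappears after its visibility timeout, which preserves at-least-once processing.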
The easiest approach is to use SNS instead. Create a topic, publish events to that topic instead of adding a message to an SQS queue, and have your Lambda function subscribe to that SNS topic. It will then be invoked each time a message is published to that SNS topic. There's a tutorial on this approach here:
http://docs.aws.amazon.com/lambda/latest/dg/with-sns-example.html
I would recommend changing your approach.
Your application should publish a message to an existing SNS topic. Your SQS queue and your Lambda function should then subscribe to this SNS topic.
Application -> publish -> SNS_TOPIC
-> SQS is notified
-> Lambda is notified
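The publishing side can be sketched as follows; the request is built by a small helper so it can be inspected on its own, and the topic ARN below is a hypothetical placeholder:

```python
import json

def build_publish_request(topic_arn, payload):
    # Kwargs for SNS `publish`; the payload is serialized to JSON so
    # both the SQS and the Lambda subscribers receive the same body.
    return {"TopicArn": topic_arn, "Message": json.dumps(payload)}

# With boto3 (not run here), publishing would look like:
#   import boto3
#   sns = boto3.client("sns")
#   sns.publish(**build_publish_request(
#       "arn:aws:sns:us-east-1:123456789012:my-topic",  # hypothetical ARN
#       {"order_id": 42}))
```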