I'm exploring AWS IoT MQTT broker capabilities for my future use and I'm trying to figure out if AWS MQTT broker supports "shared subscriptions" functionality.
Shared subscriptions are a relatively new piece of functionality, introduced by IBM MessageSight. They provide "queue" or point-to-point behavior: multiple subscribers belonging to the same subscription group can be connected/subscribed to the same topic, and each message is delivered to only one subscriber instance (the load is balanced automatically). In contrast to the plain pub/sub topic model, where all subscribers receive copies, this is very convenient in a software architecture because we don't need to introduce one more "player" (a message queue) just to distribute load between parallel application instances.
In the case of IoT, messages published by devices can then be consumed by several application instances working in parallel, distributing the load between them.
AWS IoT does not support shared subscriptions, but there is a feature request, so we could expect it (soon):
https://forums.aws.amazon.com/thread.jspa?messageID=757689
I spent some time on the documentation and yes, unfortunately, AWS IoT MQTT does not support shared subscriptions at this point in time.
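For context, brokers that do implement the feature (MQTT 5 brokers, or IBM MessageSight with its own `$SharedSubscription` prefix) express a shared subscription as a special topic filter. A minimal sketch of the MQTT 5 `$share` convention; the group name and topic filter below are made-up examples:

```python
def shared_filter(group: str, topic_filter: str) -> str:
    """Build an MQTT 5 shared-subscription filter: $share/<group>/<filter>.

    Every client that subscribes with the same group name joins one
    subscription group; the broker delivers each message to only one
    member of that group (queue semantics instead of fan-out).
    """
    return f"$share/{group}/{topic_filter}"

# Instances subscribing with the same group share the load; a client
# using the plain filter "devices/+/telemetry" still gets every message.
print(shared_filter("workers", "devices/+/telemetry"))
# -> $share/workers/devices/+/telemetry
```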
Related
In AWS IOT we can make device subscribe to a topic. When a message is received on a topic, the device can be programmed to execute some code.
AWS IOT Jobs seems similar in that the device listens on the job and executes certain code when job is received.
How are AWS IOT Jobs different to Topic subscription?
The primary purpose of jobs is to notify devices of a software or firmware update.
AWS IOT Job Doc
AWS IoT events/activities (like subscribing to a topic) would be the generic mechanism for reacting when a device gets a message. IoT Jobs are more of a managed workflow for a specific activity, like notifying devices of a firmware update and using CodeSigning.
Just want to add an important point to what @Bobshark wrote.
Yes, Amazon engineers implemented a set of endpoints to manage a whole job lifecycle on a single device and the process of gradually rolling out jobs over a fleet of devices.
However, IoT jobs are not tied down to using MQTT as the transport protocol. As the AWS docs [1] mention:
Devices can communicate with the AWS IoT Jobs service through these methods:
MQTT
HTTP Signature Version 4
HTTP TLS
My personal advice: use Jobs if the alternative would be implementing your own update procedure (progress reporting, gradual rollouts, etc.) yourself.
[1] https://docs.aws.amazon.com/iot/latest/developerguide/jobs-devices.html
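To make the MQTT side of this concrete: from the device's point of view, the Jobs service is a set of reserved topics under `$aws/things/<thingName>/jobs/...`. A small sketch of the topics a device typically works with; the thing name and job id below are placeholders:

```python
def jobs_topics(thing_name: str) -> dict:
    """Reserved MQTT topics a device uses to talk to AWS IoT Jobs."""
    base = f"$aws/things/{thing_name}/jobs"
    return {
        # the service notifies the device when a new job execution is queued
        "notify_next": f"{base}/notify-next",
        # the device publishes here to start its next pending job
        "start_next": f"{base}/start-next",
        # the device reports progress for a specific job id
        "update": lambda job_id: f"{base}/{job_id}/update",
    }

t = jobs_topics("my-thermostat")
print(t["notify_next"])  # $aws/things/my-thermostat/jobs/notify-next
```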
I am trying to send IoT commands using a push subscription. I have two reasons for this. Firstly, my devices are often on unstable connections, so going through Pub/Sub gives me retries and I don't have to wait for the QoS 1 timeout at the time I send the message (I still need QoS 1, because I log the result for later use). Secondly, the push subscription can act as a load balancer. To my understanding, if multiple consumers listen to the same push subscription, each will receive a subset of the messages, effectively balancing my workload. This balancing is a behavior I have observed on pull subscriptions, so I want to know:
Do push subscriptions act the same way?
Are they a reliable way to balance a workload?
Am I guaranteed that these commands will be executed at most once if there are, let's say, 15 instances listening to that subscription?
Here's a diagram of what I'm trying to achieve:
The idea is that I only interact with IoT Core when instances receive a subset of the devices to handle (when the push subscription triggers). Also note that I don't need perfect one-instance-per-device balancing; I just need the workload to be split in a roughly equal manner.
EDIT: The question wasn't clear so I rewrote it.
I think you are a bit confused about the concepts behind Pub/Sub. In general, you publish messages to a topic for one or multiple subscribers. I like to compare Pub/Sub to a magazine published by a big publishing company. People who like the magazine can get a copy of it by taking out a subscription. When a new edition of the magazine comes out, a copy is sent to each subscriber, and every subscriber receives exactly the same content.
For Pub/Sub you can create multiple push subscriptions for a topic, up to a maximum of 10,000 subscriptions per topic (and also per project). You can read more about those quotas in the documentation. Those push subscriptions can have different endpoints, in your case representing your IoT devices. Referring back to the publishing-company example, the push endpoints can be seen as the addresses of the subscribers.
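As a sketch of what creating such a push subscription looks like with the `google-cloud-pubsub` Python client (the project, topic, subscription, and endpoint names below are made-up placeholders, and the actual API call needs GCP credentials):

```python
def topic_path(project: str, topic: str) -> str:
    """Fully qualified Pub/Sub topic resource name."""
    return f"projects/{project}/topics/{topic}"

def subscription_path(project: str, sub: str) -> str:
    """Fully qualified Pub/Sub subscription resource name."""
    return f"projects/{project}/subscriptions/{sub}"

def create_push_subscription(project, topic, sub, endpoint):
    # Requires GCP credentials; import kept local so the sketch can be
    # read and the helpers tested without the library installed.
    from google.cloud import pubsub_v1
    subscriber = pubsub_v1.SubscriberClient()
    return subscriber.create_subscription(
        request={
            "name": subscription_path(project, sub),
            "topic": topic_path(project, topic),
            # Pub/Sub will POST each message to this HTTPS endpoint
            "push_config": {"push_endpoint": endpoint},
        }
    )

print(subscription_path("my-project", "commands-push"))
# -> projects/my-project/subscriptions/commands-push
```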
Here is an example IoT Core architecture, which focuses on moving data from your devices to a store. The other way around also works: send a message (including the device/registry ID) from your front-end to a Cloud Function wrapped in API Gateway; this Cloud Function publishes the message to a topic, which in turn delivers it to another Cloud Function that posts the message to the device using the MQTT protocol. I worked out both flows for you; they are loosely coupled, so if anything goes wrong with your device or the processing, no data is lost.
Device to storage:
Device
IoT Core
Pub/Sub
Cloud Function / Dataflow
Storage (BigQuery etc.)
Front-end to device:
Front-end (click a button)
API Gateway / Cloud Endpoints
Cloud Function (send command to pub/sub)
Pub/Sub
Cloud Function (send command to device with MQTT)
Device (execute the command)
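The middle step of the second flow, the Cloud Function that forwards a command into Pub/Sub, could be sketched roughly like this (the topic name and IDs are assumptions, and the publish call requires GCP credentials):

```python
import json

def encode_command(device_id: str, registry_id: str, command: dict) -> bytes:
    """Serialize the command plus routing metadata for Pub/Sub
    (message data must be bytes)."""
    return json.dumps(
        {"device_id": device_id, "registry_id": registry_id, "command": command}
    ).encode("utf-8")

def send_command(project: str, topic: str, payload: bytes) -> str:
    # Requires GCP credentials; the topic name is a placeholder.
    from google.cloud import pubsub_v1
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project, topic)
    # publish() returns a future resolving to the message ID
    return publisher.publish(topic_path, payload).result()

data = encode_command("dev-1", "reg-1", {"action": "reboot"})
print(json.loads(data)["command"]["action"])  # reboot
```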
I am currently working on an IoT device using AWS IoT Core. I am new to working with IoT devices. What is the standard/best way to determine whether the device is online and connected to the internet?
Thank you!
Since you have been using AWS IoT Core, I would recommend that you stay within the fully managed services provided by the AWS IoT suite. There is no need to reinvent the wheel, such as provisioning a separate database for a basic requirement of pretty much every IoT-enabled solution.
What I understand is that you want to monitor your IoT device fleets for state changes or failures in operation, and to trigger actions when such events occur. To address this challenge, I'd suggest using AWS IoT Events. It accepts inputs from many different IoT telemetry data sources including smart sensors, edge devices, management applications, and other AWS IoT services. You can easily push any telemetry data input to AWS IoT Events by using a standard API interface.
In specific to device heartbeat, please take a look at this sample detector model. A detector model simply represents your equipment or process. On the console, you can find some other pre-made detector model templates which you can customize based on your use-case.
One way to know if a device is online is to check for a heartbeat.
A device heartbeat is a small MQTT message that the device sends to a dedicated topic every 5 minutes.
In IoT Core, you would configure a rule that would update a Dynamodb table with a timestamp each time a message is sent to the heartbeat topic.
By checking this timestamp in Dynamodb, you can confirm if your device is currently online.
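A rough sketch of the read side of this pattern; the DynamoDB table name and attribute names are assumptions that would have to match whatever your IoT rule writes:

```python
import time

HEARTBEAT_INTERVAL_S = 5 * 60  # device publishes every 5 minutes

def is_online(last_seen_epoch: float, now: float, grace: float = 1.5) -> bool:
    """Consider the device online if its last heartbeat is younger than
    1.5 heartbeat intervals (tolerates one slightly late message)."""
    return (now - last_seen_epoch) < HEARTBEAT_INTERVAL_S * grace

def last_heartbeat(device_id: str) -> float:
    # Requires AWS credentials; "device_heartbeats" / "ts" are assumed
    # names matching an IoT rule that writes a timestamp per message.
    import boto3
    table = boto3.resource("dynamodb").Table("device_heartbeats")
    item = table.get_item(Key={"device_id": device_id})["Item"]
    return float(item["ts"])

print(is_online(time.time() - 60, time.time()))  # True: seen 1 minute ago
```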
You can follow this Developer Guide to receive connect/disconnect events. It works via MQTT topics, so we can use rules to trigger Lambda or other services.
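For reference, those lifecycle events arrive on the reserved presence topics, and a rule can forward them to a Lambda. A minimal sketch of such a handler; the payload shape follows the documented lifecycle-event format, and the client ID is an example:

```python
import json

# AWS IoT publishes connect/disconnect lifecycle events on:
#   $aws/events/presence/connected/<clientId>
#   $aws/events/presence/disconnected/<clientId>
# A rule such as  SELECT * FROM '$aws/events/presence/#'
# can forward them to a Lambda like this one.

def handler(event, context=None):
    """Return (clientId, online?) from a presence event payload."""
    return event["clientId"], event["eventType"] == "connected"

sample = json.loads(
    '{"clientId": "sensor-7", "timestamp": 1573002340757, "eventType": "connected"}'
)
print(handler(sample))  # ('sensor-7', True)
```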
I've got a rather rare requirement: deliver an SNS topic message to all microservice instances.
Basically it's a kind of notification that related data has changed
and that all microservice instances should reload their internals from the data source.
We are using Terraform to create our infrastructure, with Kong as the API gateway.
Microservice instances can be created on the fly as system load increases,
so subscriptions to the topic cannot be created at the Terraform stage.
Each microservice is a standard Spring Boot app.
My first approach is:
each microservice exposes an HTTP endpoint that can be subscribed to the SNS topic
on startup, each microservice subscribes itself (i.e. the above endpoint) to the required SNS topic, and unsubscribes on shutdown.
My problem is determining the individual microservice instance URLs that can be used in the subscription process.
An alternative approach would be to use SQS: create one SQS queue per microservice instance and subscribe it to the SNS topic.
Am I doing this wrong on a conceptual level?
Is a different architectural approach required?
It might be easier for the microservices to check an object in Amazon S3 to "pull" the configuration updates (or at least call HeadObject to check if the configuration has changed) rather than trying to "push" the configuration update to all servers.
Or, use AWS Systems Manager Parameter Store and have the servers cache the parameter values for a period (eg 5 minutes) so they aren't constantly checking the configuration.
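A sketch of the pull-based S3 check, assuming boto3; the bucket and key names are placeholders, and comparing ETags is just one cheap way to detect a change:

```python
def should_check(last_check: float, now: float, ttl: float = 300.0) -> bool:
    """Only hit S3 every `ttl` seconds (a 5-minute cache window)."""
    return (now - last_check) >= ttl

def config_changed(bucket: str, key: str, last_etag=None):
    """Cheap change detection: HeadObject and compare ETags.
    Requires AWS credentials; bucket and key are placeholders."""
    import boto3
    head = boto3.client("s3").head_object(Bucket=bucket, Key=key)
    # (changed?, etag to remember for the next check)
    return head["ETag"] != last_etag, head["ETag"]

print(should_check(0.0, 600.0))  # True: the cache window has elapsed
```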
Kinda old now, but here is my solution:
create an SNS topic, subscribe an SQS queue to it, relay the SQS messages to Redis pub/sub, and have every instance subscribe to the Redis channel.
Now all your instances will get the event.
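A sketch of that relay, assuming boto3 and the `redis` client; the queue URL and channel name are placeholders. Note that a queue subscribed to SNS receives the notification wrapped in a JSON envelope:

```python
import json

def sns_body(sqs_message_body: str) -> str:
    """An SQS queue subscribed to SNS receives a JSON envelope;
    the actual notification text is under the "Message" key."""
    return json.loads(sqs_message_body)["Message"]

def relay(queue_url: str, channel: str):
    # Requires AWS credentials and a reachable Redis; the queue URL and
    # channel name are placeholders. Long-poll SQS, then fan out via
    # Redis pub/sub so every subscribed instance receives the event.
    import boto3
    import redis
    sqs = boto3.client("sqs")
    r = redis.Redis()
    while True:
        resp = sqs.receive_message(QueueUrl=queue_url, WaitTimeSeconds=20)
        for msg in resp.get("Messages", []):
            r.publish(channel, sns_body(msg["Body"]))
            sqs.delete_message(QueueUrl=queue_url,
                               ReceiptHandle=msg["ReceiptHandle"])

envelope = json.dumps({"Type": "Notification", "Message": "reload-config"})
print(sns_body(envelope))  # reload-config
```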
I'd like to use AWS IoT to manage a grid of devices. Data from each device must be sent to a queue service (RabbitMQ) hosted on an EC2 instance, which is the starting point for a real-time control application. I read how to create a rule to write data to another service: here
However there isn't an example for EC2. Using the AWS IoT service, how can I connect to a service on EC2?
Edit:
I have a real-time application developed with Storm that consumes data from RabbitMQ and puts the result of the computation into another RabbitMQ queue. RabbitMQ and Storm run on EC2. I have devices that produce data and are connected to IoT. The data produced by the devices must be redirected to the queue on EC2, which is the starting point of my application.
I'm sorry if I was not clear.
AWS IoT supports pushing data directly to other AWS services. As you have probably figured out by now, publishing to third-party APIs isn't directly supported.
From the choices AWS offers, Lambda, SQS, SNS, and Kinesis would probably work best for you.
With Lambda you could directly forward the incoming message using one of RabbitMQ's client libraries.
With SQS you would put it into an AWS queue first and then poll this queue, transferring the messages to RabbitMQ.
Kinesis would allow more sophisticated processing, but is probably too complex for this.
I suggest you write a Lambda in the programming language of your choice using one of the numerous RabbitMQ client libraries.
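For illustration, such a Lambda could look roughly like this with the `pika` client; the broker host and queue name are made-up, and the sketch assumes the IoT rule passes the message payload straight through as the event:

```python
import json

def to_amqp_payload(iot_event: dict) -> bytes:
    """Serialize the IoT rule payload for RabbitMQ (body must be bytes)."""
    return json.dumps(iot_event).encode("utf-8")

def handler(event, context=None):
    # Requires network access from Lambda to the broker (same VPC or a
    # public endpoint); host and queue name are placeholders.
    import pika
    conn = pika.BlockingConnection(
        pika.ConnectionParameters(host="ec2-broker.example.com")
    )
    channel = conn.channel()
    channel.queue_declare(queue="iot-ingest", durable=True)
    channel.basic_publish(exchange="", routing_key="iot-ingest",
                          body=to_amqp_payload(event))
    conn.close()

print(to_amqp_payload({"deviceId": "d1", "temp": 21.5}))
```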