I'm just starting out with AWS IoT and Arduino and have had a look at the SDK for the Arduino Yún on GitHub. I have data collected by the Arduino and its sensors, and a GPRS shield connected to the Arduino to send this data, ideally to AWS IoT.
The information on the GitHub page seems to relate to WiFi, judging by the installation instructions for setting up OpenWRT. Naive question, but if I want to use GPRS as my means of Internet connectivity, and therefore AWS IoT connectivity, how can I proceed? I'm guessing I won't need to use MQTT, as I only want to publish data and not subscribe to any topics, so I can use HTTPS?
The easiest approach is to use the AWS IoT Device SDK (http://docs.aws.amazon.com/iot/latest/developerguide/iot-device-sdk.html), which handles MQTT for you. This means you don't need to worry about which protocol to use; just call the bits you need. As long as you have Internet connectivity, it doesn't matter whether you go via WiFi or GPRS.
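That said, publishing over plain HTTPS (as the question suggests) is also possible: AWS IoT exposes an HTTPS publish endpoint on port 8443 that authenticates with the thing's X.509 certificate. A minimal Python sketch of that path; the endpoint, topic name, and certificate paths below are placeholders you would replace with your own values from the AWS IoT console:

```python
import json
import ssl
import http.client

# Placeholders -- substitute your own account's values.
IOT_ENDPOINT = "a1b2c3d4e5f6g7.iot.eu-west-1.amazonaws.com"
TOPIC = "sensors/gprs-node/readings"  # hypothetical topic name

def build_payload(sensor_id, value):
    """Serialize one sensor reading as the JSON body of the publish."""
    return json.dumps({"sensor": sensor_id, "value": value})

def publish_https(payload, certfile, keyfile, cafile):
    """POST the payload to AWS IoT's HTTPS publish endpoint (port 8443),
    authenticating with the thing's X.509 certificate (mutual TLS)."""
    ctx = ssl.create_default_context(cafile=cafile)
    ctx.load_cert_chain(certfile=certfile, keyfile=keyfile)
    conn = http.client.HTTPSConnection(IOT_ENDPOINT, 8443, context=ctx)
    conn.request("POST", "/topics/" + TOPIC + "?qos=1", body=payload,
                 headers={"Content-Type": "application/json"})
    status = conn.getresponse().status  # 200 on a successful publish
    conn.close()
    return status
```

The SDK is still the easier route, since it manages the connection and retries for you; the HTTPS path above mainly makes sense for fire-and-forget publishing over a constrained GPRS link.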
As announced by Google, Cloud IoT Core will be retired on August 16, 2023.
I already have some ESP32 devices using Mongoose OS and Google IoT Core to connect and to send and receive telemetry.
What will happen after August 16? Will all the connected devices be disconnected, so that I need another technical approach (rework needed)?
When Google Cloud IoT Core is retired, your existing devices will most likely be unable to use the service. Note, however, that Google has not provided specifics about what will happen to devices that were connected to IoT Core, so you may want to check with them or plan for an alternative solution before August 16.
Is there a way we can integrate on-premise IBM MQ with AWS SQS/API Gateway? I checked lots of links, but found only that we can migrate a whole IBM MQ setup to AWS MQ, not call from AWS to the on-premise MQ. Please suggest if anyone has tried this kind of integration.
I’m assuming you have an AWS based application that integrates with SQS and an on-premise application that integrates with IBM MQ, and ultimately you want to communicate effectively between the two applications.
At a functional level, IBM MQ provides a client interface, and a bridge between this and the AWS SQS interface is relatively straightforward to create. One important aspect to consider is the non-functional side. The IBM MQ client can communicate either directly back to the on-premise MQ instance, or via an AWS MQ instance. Although it may appear more straightforward to communicate directly with the on-premise MQ instance, there are a few considerations that may make an MQ instance in AWS the more sensible approach.
Applications often use IBM MQ for its assured-delivery capabilities; by building a bridge to AWS SQS, which is a non-assured-delivery provider, there is a risk that messages can be lost or duplicated (depending on the implementation of the bridging logic). To minimize the chance of this occurring, you want to ensure that you have a reliable network between MQ, the bridge and the SQS instance. This removes any fragile network links, as MQ can transfer the message reliably from on-premise to an MQ instance deployed in AWS, overcoming any network issues transparently.
The MQ client is relatively chatty compared to two MQ instances exchanging messages. Due to the network latency between the on-premise and AWS data centers, the chatty nature of the MQ client can impact the overall performance of the solution.
Therefore, it is often sensible to install a lightweight instance of MQ within your AWS availability zone and let MQ transfer the messages from on-premise to AWS efficiently and reliably. To help get you up and running quickly, you can grab the IBM MQ developer container for free on Docker Hub.
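To illustrate the lost/duplicated-message point above, here is a minimal sketch of the bridging logic in Python. The MQ get and SQS send are passed in as plain callables (in a real bridge they would be pymqi and boto3 calls); the message IDs and the dedup set are illustrative, not part of either product's API:

```python
def bridge_one(get_message, send_to_sqs, seen_ids):
    """Forward one message from MQ to SQS with at-least-once semantics.

    get_message  -- returns (msg_id, body), or None when the queue is empty
    send_to_sqs  -- sends the body; may raise, in which case the caller
                    retries and seen_ids suppresses the resulting duplicate
    Returns True if a message was forwarded, False otherwise.
    """
    msg = get_message()
    if msg is None:
        return False
    msg_id, body = msg
    if msg_id in seen_ids:      # already sent on an earlier attempt: skip
        return False
    send_to_sqs(body)           # the risky step: a crash after this call but
    seen_ids.add(msg_id)        # before this line would cause a duplicate
    return True

# In-memory stand-ins for MQ and SQS, just to exercise the logic
# (note the redelivered "id-1", as after a failed acknowledgement).
source = [("id-1", "hello"), ("id-1", "hello"), ("id-2", "world")]
sent = []
seen = set()
while bridge_one(lambda: source.pop(0) if source else None, sent.append, seen) or source:
    pass
```

The window between the send and recording the ID is exactly where a non-transactional bridge can duplicate a message, which is why the answer above stresses keeping the fragile network hop inside MQ's own assured-delivery channel.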
I created an SQS adapter on the on-premise server and called my SQS queue directly from there.
I have a smart plug (TP-Link HS110) installed on my local network. Now I want the power usage that this smart plug measures to be captured by my Raspberry Pi (also on my local network), and the Raspberry Pi should send this data to an InfluxDB or DynamoDB instance hosted on AWS.
How can I achieve this efficiently, with as few programs running on the Raspberry Pi as possible? Which OS would fit here?
Maybe someone has done something similar and can help me out?
I would prefer a Node.js solution, since I later want to work with the data from the DB in an AWS Lambda function (which will probably be written in Node.js) and process it further.
Thanks!
Two options:
Option 1: Send the data via MQTT to AWS Internet of Things (IoT), which can then store the data in DynamoDB.
Option 2: Put an AWS SDK on the device and communicate with DynamoDB directly.
See:
Using the AWS IoT device SDKs on a Raspberry Pi - AWS IoT
Setting up your Raspberry Pi and moisture sensor - AWS IoT
Use a Raspberry Pi to communicate with Amazon AWS IoT
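A rough sketch of option 2 on the Pi, in Python for brevity (the same flow exists in the AWS SDK for JavaScript if you prefer Node.js). The table and attribute names are made-up examples, and the boto3 call assumes AWS credentials are already configured on the device:

```python
import time
from decimal import Decimal

def build_item(plug_id, watts, ts=None):
    """Shape one power reading as a DynamoDB item.
    DynamoDB expects numbers as Decimal, not float."""
    return {
        "plug_id": plug_id,                               # partition key
        "ts": int(ts if ts is not None else time.time()), # sort key
        "watts": Decimal(str(watts)),
    }

def put_reading(table_name, item):
    """Write the item; requires AWS credentials and network access."""
    import boto3  # imported lazily so build_item stays dependency-free
    boto3.resource("dynamodb").Table(table_name).put_item(Item=item)
```

Option 1 has the advantage that the Pi only needs MQTT credentials and an IoT rule does the DynamoDB write server-side, which fits the "as little running on the Pi as possible" goal.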
I'm new to AWS IoT and want to know how to get my things' connectivity status.
I've read about managing indexes and believe this is what I'm looking for.
However, in my architecture I have an IoT Greengrass core, an edge device that is directly connected to the AWS Cloud, and Greengrass devices that are connected to this edge device over Bluetooth and that I'm also creating in AWS IoT as IoT Things (an IoT Thing for both the Greengrass core and the Greengrass devices).
I believe that once the edge device is connected to AWS, its connectivity status in the AWS_Things index will be updated to "true". But what about the IoT devices that are not directly connected to AWS but go through the edge device? Will their connectivity states be updated too? How does it work?
Or should I make use of shadow attributes for the connectivity states of those IoT Things that don't connect directly to the AWS IoT Cloud platform?
So I would do this in one of two ways:
One is with attributes; probably the best and "proper" way if you only need the connected/disconnected status.
The other way is to have the Greengrass device query the Bluetooth-attached device locally in the background, perhaps with bt-device -l (from the bluez-tools apt package). Upon detecting that it is no longer attached, you could publish an alert on a separate topic. The benefit of this method is that you could also periodically query battery status or other properties and publish them under a device-specific topic, i.e.:
IoTDevice1/BTDevice1/running True
IoTDevice1/BTDevice1/battery 80%
etc.
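A sketch of that polling check in Python. The parsing assumes bt-device -l prints each device as Name (AA:BB:CC:DD:EE:FF), which you should verify against your bluez-tools version:

```python
import re
import subprocess

MAC_RE = re.compile(r"\(([0-9A-F]{2}(?::[0-9A-F]{2}){5})\)")

def parse_bt_device_list(output):
    """Extract MAC addresses from `bt-device -l` output (format assumed)."""
    return set(MAC_RE.findall(output))

def is_attached(mac, output):
    """True if the given MAC appears in the listing."""
    return mac.upper() in parse_bt_device_list(output)

def check_device(mac):
    """Run bt-device -l and report whether the MAC is still listed."""
    out = subprocess.run(["bt-device", "-l"],
                         capture_output=True, text=True).stdout
    return is_attached(mac, out)
```

You would run this on a timer and publish on the alert topic when check_device flips from True to False.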
Regarding getting the state of GG* devices, I suggest the following:
The physical GG devices communicate with the GG Core via local Lambda functions. Once the Core gets the state from a given device, a Lambda function (which runs on the Core) sets the status of the corresponding IoT Thing in the AWS Cloud platform.
*GG : Greengrass
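If you take the shadow route mentioned in the question, the Core-side Lambda could report connectivity roughly like this (the thing name and the "connected" attribute are made-up examples, and the boto3 call assumes the function has permission to update shadows):

```python
import json

def build_shadow_update(connected):
    """Shadow document reporting the device's connectivity state
    under a made-up 'connected' reported attribute."""
    return json.dumps({"state": {"reported": {"connected": bool(connected)}}})

def report_state(thing_name, connected):
    """Push the state to the thing's shadow from the Core-side Lambda."""
    import boto3  # imported lazily; only needed at runtime on the Core
    client = boto3.client("iot-data")
    client.update_thing_shadow(thingName=thing_name,
                               payload=build_shadow_update(connected).encode())
```

This keeps the fleet index reflecting the Core's real connection while per-device reachability lives in each device's shadow.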
According to https://docs.wso2.com/display/IoTS310/Analyzing+Data I should be able to do some machine learning tasks in IoT Server, but the menu usually available in WSO2 DAS is missing, as are the Machine Learner features in "Configure->Features->Installed features" and "Configure->Features->Available features".
What can I do?
Should I use an external DAS, as described here https://docs.wso2.com/display/IoTS310/Configuring+WSO2+IoT+Server+with+WSO2+Data+Analytics+Server?
It depends on the event load and the number of IoT devices you are dealing with. If the load is not significant, you can install the WSO2 DAS features in the IoT Server node and operate that way.
Going forward, this becomes difficult to scale: if the IoT event throughput is high, you will need multiple nodes and clustering. In that case you can simply set up another DAS node as described in the documentation, publish events to it from the IoT Server, and leave the analytics part there. When scaling up, you can run separate clusters for IoT and for analytics, depending on the load.