(5) Azure IoT Hub

Caio Gasparine
8 min read · Nov 3, 2023


Full solution implementation

This is part of a series of articles called Azure Challenges. You can refer to the Intro Page to understand more about how the challenges work.

IN THIS CHALLENGE:

  • 1-Create IoT Hub
  • 2-Create IoT Edge Device + Add Consumer Group
  • 3-Create a Stream Analytics Job
  • 4-Create your Storage Account
  • 5-Setup the Output file
  • 6-Test I/O
  • 7-Monitor / Query Setup
  • 8-Start your process
  • 9-Check/validate the results

The Solution Scope

This challenge requires collecting data from a simulated IoT device using Azure IoT Hub and forwarding it from the IoT Hub for analysis. The data collected from the IoT device comprises two metrics: the temperature and the humidity of the air.

This data will be sent every 10 seconds, so analyzing these metrics in tabular form would be inefficient and tedious. A visual report makes studying and analyzing the temperature and humidity much easier.
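For reference, each message the simulator emits is a small JSON document. The sketch below (in Python, with illustrative field names and value ranges; the online simulator's actual payload may differ slightly) builds messages of that shape:

```python
import json
import random

def build_telemetry(message_id):
    """Build one telemetry message. Field names and value ranges are
    illustrative; the online simulator's real payload may differ."""
    return {
        "messageId": message_id,
        "deviceId": "Raspberry Pi Web Client",
        "temperature": round(25 + random.uniform(-2, 2), 2),
        "humidity": round(60 + random.uniform(-5, 5), 2),
    }

if __name__ == "__main__":
    for i in range(3):  # the real device loops forever, one message every 10 seconds
        print(json.dumps(build_telemetry(i)))
```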

The snippet shows only 30 rows, whereas the dataset we will be using contains more than 250 rows; it would not be easy to analyze these metrics simply by looking at the numbers.

Main Components

We are going to make use of four main components:

  • Device: The simulated IoT device is responsible for sending the data: information about the temperature and the humidity.
  • Azure IoT Hub: The hub provides a cloud-hosted back end to connect virtually any device. For this experiment, a virtual device is created in the IoT Hub and connected to the simulated IoT device, routing the messages from the online simulator to the IoT Hub.
  • Azure Stream Analytics: Stream Analytics is a real-time analytics service for streaming data. It is important for us because we can draw insights by combining historical and streaming data, and we can use a query to get the desired output or specific rows.
  • Storage Account: In our challenge, we use a storage account as our output, but this could be another service, such as a SQL database or Power BI.
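The query mentioned above is written in Stream Analytics' SQL-like language. As a rough illustration, this plain-Python sketch mimics a simple filter of the kind such a query performs (the threshold and field names are illustrative, not part of the challenge):

```python
# A Stream Analytics job applies a SQL-like query to the stream; this
# plain-Python sketch mimics a "keep only hot readings" filter,
# roughly: SELECT * FROM input WHERE temperature > 27.
# The threshold value is illustrative.

TEMPERATURE_THRESHOLD = 27.0

def filter_hot_readings(records):
    """Keep only readings whose temperature exceeds the threshold."""
    return [r for r in records if r["temperature"] > TEMPERATURE_THRESHOLD]

sample = [
    {"deviceId": "pi-1", "temperature": 26.1, "humidity": 61.0},
    {"deviceId": "pi-1", "temperature": 28.4, "humidity": 59.2},
]
print(filter_hot_readings(sample))
```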

Solution Architecture

1-Create IoT Hub

Go to the Azure Portal and select Create a resource

Search for IoT Hub

When you find the IoT Hub, select Create.

Fill out the info about your IoT hub resource and select Review + create.

After you confirm all the information you can hit Create.

Now, wait for the deployment of your resource…

2-Create IoT Edge Device + Add Consumer Group

Select your Azure IoT Hub and then choose IoT Edge in the menu.

After you select IoT Edge select + Add IoT Edge Device

Create your device and Save.

Select your device created from the device list.

Take a look at the main features of your device.

This edge device is a virtual device, and the primary connection string of this edge device would be responsible for connecting the Online Raspberry Pi Simulator to the IoT hub.

In the Online Raspberry Pi Simulator, a script can be seen. In that script, replace the placeholder connection string with the one copied from the edge device.
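A device connection string follows the pattern `HostName=...;DeviceId=...;SharedAccessKey=...`. As a quick sanity check before pasting it into the simulator script, this small Python sketch (with placeholder values, not real credentials) splits one into its parts:

```python
def parse_connection_string(conn_str):
    """Split an IoT Hub device connection string of the form
    HostName=...;DeviceId=...;SharedAccessKey=... into a dict.
    partition('=') splits only at the first '=', so base64 key
    padding ('=' at the end of the key) is preserved."""
    parts = {}
    for segment in conn_str.split(";"):
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

# Placeholder values for illustration only -- use your own device's string.
example = ("HostName=my-hub.azure-devices.net;"
           "DeviceId=my-edge-device;"
           "SharedAccessKey=base64keyhere=")
print(parse_connection_string(example))
```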

A screenshot of the Raspberry Pi Azure IoT Online Simulator after connecting the IoT Edge device to the Simulator.

And now you are ready to start the Simulator. Hit Run.

You will notice that the data will start flowing from the Simulator to your Azure device.

Go back to your device in the Azure Portal and select Built-in endpoints.

And then create a new consumer group…

3-Stream Analytics Job

Go to the Azure Portal and select Create a resource

Search for Stream Analytics Job

Select Create.

Fill out the info about your Stream Analytics

Wait for the deployment of your resource and then select Go to resource.

When in your resource choose Inputs from the menu.

Select where the data is coming from; in this example, it is coming from the IoT Hub. Choose it from + Add stream input.

Enter the info about your IoT Hub and Save. Do not forget to choose the correct consumer group. ;-)

Now you have your Raspberry Pi input created!

4-Create your Storage Account / Output

Go to your Azure Portal, select + create, and search for Storage Account.

Define the info about your storage account and Review + create and then Create.

Now you have a storage account ready to be used!

5-Setup the Output file

Go to your Azure Portal, select your Stream Analytics job, and choose the option Outputs in the menu.

Select +Add and from the output list select Blob storage/ADLS Gen2. Please note that you have a bunch of other options to select as your data output, including SQL databases and Power BI.

Add your Storage account info.

Important: note the message below. Because my resources are in different regions, my subscription will incur additional costs for moving data between regions. :-|

Go to your Stream Analytics job, select Outputs, and the added Output.

Select the option to Test your Output.

6-Test Input and Output

Select your Stream Analytics job and the option Inputs, select the input you want to test, and hit the icon to test the connection.

You will be able to see a message confirming the test execution (successfully OR not).

The same step applies to the Output.

Select your Stream Analytics job and the option Outputs, select the output you want to test, and hit the icon to test the connection.

7-Monitor / Query Setup

To monitor the process select your IoT Hub resource

You will see the number of messages received by your IoT Hub solution.

By selecting your Stream Analytics job, you will be able to query the data as soon as it is streamed to your cloud resources.
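As a rough sketch of what such a query does, the following plain-Python function mimics a common Stream Analytics pattern, averaging the temperature over 30-second tumbling windows, roughly `AVG(temperature) ... GROUP BY TumblingWindow(second, 30)` (the window size and field layout are assumptions for illustration):

```python
from collections import defaultdict

def tumbling_window_avg(readings, window_seconds=30):
    """Average temperature per fixed, non-overlapping time window,
    a plain-Python analogue of Stream Analytics'
    AVG(temperature) ... GROUP BY TumblingWindow(second, 30).
    `readings` is a list of (epoch_seconds, temperature) pairs."""
    buckets = defaultdict(list)
    for ts, temp in readings:
        buckets[ts // window_seconds].append(temp)
    # Key each result by the window's start time in seconds.
    return {w * window_seconds: sum(v) / len(v)
            for w, v in sorted(buckets.items())}

data = [(0, 26.0), (10, 28.0), (20, 27.0), (30, 30.0), (40, 32.0)]
print(tumbling_window_avg(data))  # {0: 27.0, 30: 31.0}
```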

8-Start your process

Go to your Stream Analytics job and select Start to begin processing the incoming data.

9-Check and validate the result

To check and validate the process select your Storage account.

Using the Storage browser you can navigate through your Storage account and visualize the output container.

… and visualize the content of the file in JSON format.
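Assuming the job writes line-delimited JSON to the blob (the "Line separated" format), a small script like this sketch could parse and summarize the file; the field names mirror the simulator's payload and are illustrative:

```python
import io
import json

def load_ndjson(stream):
    """Parse line-delimited JSON, the layout Stream Analytics writes to
    blob storage when the output format is 'Line separated'."""
    return [json.loads(line) for line in stream if line.strip()]

# Stand-in for a downloaded blob; the values are made up for the demo.
sample_blob = io.StringIO(
    '{"deviceId": "pi-1", "temperature": 27.1, "humidity": 60.2}\n'
    '{"deviceId": "pi-1", "temperature": 27.6, "humidity": 59.8}\n'
)
records = load_ndjson(sample_blob)
avg_temp = sum(r["temperature"] for r in records) / len(records)
print(f"{len(records)} records, avg temperature {avg_temp:.2f}")
```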

Overall Architecture

This is the end of Challenge #5.

You can go to the Intro Page and start the next Challenge.


Written by Caio Gasparine

Project Manager | Data & AI | Professor
