The Benefits & Downfalls of Using Azure Stream Analytics for IoT Applications

What is Azure Stream Analytics?

Azure Stream Analytics is a high-volume, simple-to-use stream processing service. It is fully stand-alone and does not require any additional software or programming environment; it can be configured entirely in the Azure Management Portal. One of its touted use cases is IoT message processing. Real life, of course, is more complicated, and here we will examine the pros and cons of using Azure Stream Analytics in IoT applications.

Azure Stream Analytics lets you connect to an event hub, transform data as it arrives, and save it to a database. Transformations are written in a SQL-like language called Stream Analytics Query Language (SAQL), which is good for filtering, grouping, and so on. On top of standard SQL constructs, SAQL adds support for time-window aggregations, allowing you to group the stream into five-minute chunks, for example. This functionality is especially helpful in IoT applications, where devices may report readings every few seconds but you only care about filtered or aggregated data.
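As a rough sketch, assuming a stream input named iothub, an output named sqloutput, and events carrying deviceId, temperature, and eventTime fields (all of these names are illustrative, and the input and output aliases must match those configured on the job in the portal), a five-minute tumbling-window aggregation in SAQL looks something like this:

    -- Average and maximum temperature per device over five-minute windows.
    SELECT
        deviceId,
        AVG(temperature) AS avgTemperature,
        MAX(temperature) AS maxTemperature,
        System.Timestamp AS windowEnd       -- end time of each window
    INTO
        sqloutput
    FROM
        iothub TIMESTAMP BY eventTime       -- order by the event's own timestamp
    GROUP BY
        deviceId,
        TumblingWindow(minute, 5)           -- non-overlapping 5-minute windows

TumblingWindow slices the stream into fixed, non-overlapping intervals; SAQL also offers hopping and sliding windows when overlapping aggregates are needed.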


Let’s begin with the positives:

  1. Easy to use: It takes mere minutes to set up an Azure Stream Analytics job, and about a minute to start or stop it. Writing the SAQL query itself can take some time, but it is far shorter and more intuitive than writing actual code. You are only responsible for the business logic, which is exactly what you want.
  2. Easy to scale: Microsoft measures the capacity of a Stream Analytics job in a deliberately vague unit called a Streaming Unit, a “blended measure of CPU, memory, throughput” that translates to roughly 1 MB of data per second. Scaling a job is as simple as changing the number of Streaming Units from 1 to 5 to 10 or higher. The only caveat is that it cannot auto-scale.
  3. Good integration within Azure: Azure Stream Analytics can take Azure Event Hubs or Azure IoT Hub as its input, and can output to a variety of Azure services (Azure SQL, Azure DocumentDB, Azure Event Hubs, Azure Service Bus, etc.). If these services are in the same Azure account, they can be selected from a drop-down menu.
  4. Cheap: Running Azure Stream Analytics costs about 3 cents per Streaming Unit per hour, or roughly $22 a month (0.03 × 24 hours × 30 days ≈ $21.60) for the 1 MB of data per second that Microsoft claims one Streaming Unit can handle. A VM or cluster running a different stream processing solution would cost considerably more.

Now for the not-so-positive:

Azure Stream Analytics is strictly a stream-processing solution. However, when you compare what it can do against solutions like Spark Streaming or Apache Storm, you can see that it is much more limited.

  1. Unable to join dynamic data: Azure Stream Analytics gives you the option to join the stream against a file in blob storage, which they call Reference Data. In theory, this is their solution for enriching events: for example, you can receive bare readings from the event hub and join them against a device list to add the device name or other device metadata (see the first sketch after this list). However, this file is loaded once for the lifetime of the job. In an IoT solution, you add and remove devices dynamically, and the file will not be reloaded by the running job. This is a big problem when you need to add data from external providers.
  2. Unable to save state: Azure Stream Analytics gives you the option to aggregate your data into time-based windows. However, sometimes you want to keep state regardless of how much time has passed, for example for spike detection, or to track the all-time maximum value. Azure Stream Analytics has no place to store this kind of state.
  3. Limits of SQL: SAQL is based on SQL, which is well known and simple to use. However, it is also a limited language: it is good for querying, but it lacks the openness of a full programming language.
  4. Coupled to Azure and Microsoft: Azure Stream Analytics is a pure Microsoft service; you will only use it if you are already using other Azure products. Because both the language and the runtime are proprietary, you cannot take your queries out of Azure and reuse them elsewhere. By contrast, a cluster running on VMs could be moved to a different cloud provider with some work.
  5. Will crash on invalid data: One of the biggest quirks of Azure Stream Analytics is that malformed data or a type mismatch can crash the entire job. Since in IoT applications you can be getting data from sources you cannot control, connected directly to the event hub, you will need to either sanitize the data with a separate Azure Stream Analytics job or attempt it within the same job and hope it does not crash unexpectedly (see the second sketch after this list). This lack of recovery is a huge problem when relying on Azure Stream Analytics.
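To make the first point concrete, here is a sketch of a Reference Data join, assuming a stream input iothub, a blob-backed reference input devices keyed by deviceId, and an output sqloutput (all names are assumptions for illustration):

    -- Enrich raw readings with device metadata from a Reference Data input.
    -- The devices input must be configured as Reference Data (a file in blob
    -- storage); it is loaded when the job starts and, as described above,
    -- does not pick up devices added or removed while the job runs.
    SELECT
        r.deviceId,
        d.deviceName,          -- comes from the reference file, not the stream
        r.temperature
    INTO
        sqloutput
    FROM
        iothub r TIMESTAMP BY r.eventTime
    JOIN
        devices d              -- reference joins need no time-window condition
        ON r.deviceId = d.deviceId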
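And for the last point, here is a sketch of the sanitization pass mentioned above, using SAQL's TRY_CAST, which returns NULL on a type mismatch instead of failing (again, the aliases and field names are assumptions):

    -- Pass through only events whose temperature parses as a float,
    -- so malformed readings are dropped rather than crashing the job.
    SELECT
        deviceId,
        TRY_CAST(temperature AS float) AS temperature
    INTO
        cleanedoutput
    FROM
        iothub TIMESTAMP BY eventTime
    WHERE
        TRY_CAST(temperature AS float) IS NOT NULL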

Bottom Line

In summary, Azure Stream Analytics is an easy-to-use but very limited tool. It does the job it sets out to do well, but most real-world IoT applications need much stronger capabilities than Azure Stream Analytics currently provides. Coupled with the difficulties of managing it (crashes, logs, source control), our finding is that it is useful only for very specific use cases. For example, a great use would be simply saving all the data from the event hub to blob storage. For analytics and more advanced business logic, it will probably not suit your needs.

Want to learn more about the Axonize IoT orchestration platform? Schedule your deep-dive demo here.