Azure Stream Analytics is a way to run lightweight, SQL-like queries in the Azure cloud over event streams as they arrive in Azure. Azure Stream Analytics can run transformations on events, turning the input events into another data shape to be sent on to other data sinks in Azure. The raw events going into Azure can be telemetry, IoT events, data logs or other events originating from API calls, for example.
At the time of this writing, Azure Stream Analytics can run queries on input event data that arrives in Azure IoT Hub, Azure Event Hubs and Azure Blob Storage. After running the query, the transformed data can then be sent on to even more output data sinks such as Data Lake, Blob Storage, Functions, Power BI (for the DataViz wizards out there) and more.
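To make that concrete, here is a minimal sketch of the kind of query a Stream Analytics job runs. The field names and the input/output aliases ([events-input], [events-output]) are hypothetical; in a real job the aliases are configured on the job's Inputs and Outputs.

```sql
-- A minimal sketch: project and reshape incoming events,
-- then route them to an output sink.
SELECT
    DeviceId,
    Temperature * 9.0 / 5.0 + 32.0 AS TemperatureF, -- a reshaped, computed field
    EventEnqueuedUtcTime AS ReceivedAtUtc           -- system field on Event Hub inputs
INTO
    [events-output]
FROM
    [events-input]
```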
In my experience, the ultimate benefit of Azure Stream Analytics is that it can save on the overhead of writing boilerplate SQL database connection code in cases where a client application would otherwise need a SQL database to store events in and have to connect to it.
Here is an example setup:
The setup above involves:
A browser - the calling client that makes an HTTP call to the Azure Function
An Azure Function (using .NET Core 3.1, also see package references xml below) with an HTTP Trigger and Event Hub output binding
A store of raw data to use as events - in this case some World Bank Data retrieved by an API call
An Azure Event Hub - to initially store the incoming events; this Event Hub is used as the output binding on the Azure Function
An Azure Stream Analytics Job - to listen for events on the Event Hub, query them with a pre-defined query and send the query results to Blob Storage
Azure Blob Storage - to store output results
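The package references mentioned above, for a .NET Core 3.1 Functions project with the Event Hub output binding, might look something like this (a sketch; the package versions are indicative, not exact):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp3.1</TargetFramework>
    <AzureFunctionsVersion>v3</AzureFunctionsVersion>
  </PropertyGroup>
  <ItemGroup>
    <!-- Core SDK for in-process Azure Functions -->
    <PackageReference Include="Microsoft.NET.Sdk.Functions" Version="3.0.13" />
    <!-- Provides the Event Hub trigger and output binding -->
    <PackageReference Include="Microsoft.Azure.WebJobs.Extensions.EventHubs" Version="4.3.0" />
  </ItemGroup>
</Project>
```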
The Azure Function Code:
In my quick example, I use an Azure Function that responds to an HTTP trigger and makes a call to a data source (in this case some World Bank data) via an API call. The results from the World Bank are then organised into easy-to-manage WBData objects. The WBData objects become the events that get sent to the Azure Event Hub.
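A sketch of such a function is below. The WBData shape, the Event Hub name (worldbank-events), the connection app setting (EventHubConnection) and the particular World Bank endpoint and indicator are all illustrative assumptions, not the exact code:

```csharp
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

namespace WorldBankEvents
{
    // Illustrative shape for the events; the real WBData class may differ.
    public class WBData
    {
        public string CountryName { get; set; }
        public string IndicatorName { get; set; }
        public double? Value { get; set; }
        public string Year { get; set; }
    }

    public static class SendWorldBankEvents
    {
        private static readonly HttpClient Http = new HttpClient();

        [FunctionName("SendWorldBankEvents")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
            // Output binding: events collected here are sent to the Event Hub.
            // "worldbank-events" and "EventHubConnection" are assumed names.
            [EventHub("worldbank-events", Connection = "EventHubConnection")]
            IAsyncCollector<string> outputEvents,
            ILogger log)
        {
            // Fetch some World Bank data (total population, as an example indicator).
            var json = await Http.GetStringAsync(
                "https://api.worldbank.org/v2/country/all/indicator/SP.POP.TOTL?format=json&date=2019&per_page=100");

            // The World Bank API returns [pagingInfo, dataArray].
            var root = JArray.Parse(json);
            foreach (var item in (JArray)root[1])
            {
                var evt = new WBData
                {
                    CountryName = (string)item["country"]?["value"],
                    IndicatorName = (string)item["indicator"]?["value"],
                    Value = (double?)item["value"],
                    Year = (string)item["date"]
                };

                // Each serialised WBData object becomes one event on the hub.
                await outputEvents.AddAsync(JsonConvert.SerializeObject(evt));
            }

            log.LogInformation("World Bank events sent to Event Hub.");
            return new OkObjectResult("Events sent.");
        }
    }
}
```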
After the Azure Function URL is hit, the Azure Stream Analytics job listens for events as they arrive in the linked Azure Event Hub instance (not the Namespace!!), automatically runs the defined query and outputs the results into the defined output data sink (Blob Storage in my example).
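A hypothetical version of that query is below; the input alias [worldbank-input] (the Event Hub) and the output alias [blob-output] (Blob Storage) are names you configure on the job:

```sql
-- Read the WBData events from the Event Hub input and write
-- the matching rows out to Blob Storage.
SELECT
    CountryName,
    IndicatorName,
    Value,
    Year
INTO
    [blob-output]
FROM
    [worldbank-input]
WHERE
    Value IS NOT NULL
```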
This is a simple example, but it is possible to add multiple Inputs, to output to multiple Outputs, and to query from and into them, as in the sketch below. This means you can have multiple Event Hubs that get queried out to multiple blob stores, multiple Functions, a single Cosmos DB instance or an Azure SQL Database. Powerful!!!
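For instance, one job can fan a single Event Hub input out to two different sinks with one query per output (all aliases here are hypothetical):

```sql
-- Raw events go to Blob Storage...
SELECT CountryName, IndicatorName, Value, Year
INTO [blob-output]
FROM [worldbank-input]

-- ...while five-minute averages go to an Azure SQL Database output.
SELECT CountryName, AVG(Value) AS AvgValue, System.Timestamp() AS WindowEnd
INTO [sql-output]
FROM [worldbank-input]
GROUP BY CountryName, TumblingWindow(minute, 5)
```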
NOW REMEMBER TO DELETE ANY RESOURCES IN YOUR SUBSCRIPTION YOU NO LONGER NEED
Cover image by eberhard grossgasteiger from Pexels