Azure lines up new Stream Analytics features
Microsoft will launch a raft of new features for its Azure Stream Analytics platform in preview next week, though you’ll have to wait at least a few weeks for general availability.
Microsoft pitches Azure Stream Analytics as a platform for quickly developing and deploying complex serverless analytics on data streams.
Top of the list of new features is online scaling, which means users no longer have to stop and restart a Stream Analytics job to change the number of Streaming Units allocated to it. Streaming Units represent the amount of compute resources allocated to a job.
As Microsoft puts it, “This builds on the customer promise of long-running mission-critical pipelines that Stream Analytics offers today.”
Also new is the ability to implement custom deserializers in C#, “which can then be used to de-serialize events received by Azure Stream Analytics.” At the same time, developers can now create Stream Analytics modules that write or reuse custom C# functions and invoke them right in the query through User Defined Functions. Both of these changes will give developers more to play with when dealing with IoT or other edge applications.
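To give a flavour of what a custom deserializer looks like, here is a minimal sketch based on the shape of the Stream Analytics .NET deserializer API, which has implementations subclass `StreamDeserializer<T>`. The event type, class name, and pipe-delimited wire format below are illustrative assumptions, not details from Microsoft's announcement:

```csharp
using System.Collections.Generic;
using System.IO;
using Microsoft.Azure.StreamAnalytics;                // StreamingContext
using Microsoft.Azure.StreamAnalytics.Serialization;  // StreamDeserializer<T>

// Hypothetical event type; property names are assumptions for this sketch.
public class CustomEvent
{
    public string DeviceId { get; set; }
    public double Reading { get; set; }
}

// A minimal custom deserializer: one pipe-delimited event per line,
// e.g. "sensor-42|21.5".
public class PipeDelimitedDeserializer : StreamDeserializer<CustomEvent>
{
    // Called once before deserialization starts; diagnostics could be wired up here.
    public override void Initialize(StreamingContext streamingContext) { }

    // Turns the raw input stream into typed events for the query to consume.
    public override IEnumerable<CustomEvent> Deserialize(Stream stream)
    {
        using (var reader = new StreamReader(stream))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                var parts = line.Split('|');
                yield return new CustomEvent
                {
                    DeviceId = parts[0],
                    Reading = double.Parse(parts[1])
                };
            }
        }
    }
}
```

A job would reference a class like this in its input configuration instead of the built-in JSON, CSV, or Avro formats; the query then sees `DeviceId` and `Reading` as ordinary columns.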
Other new features include the ability to debug query steps in Visual Studio, local testing on live data in Visual Studio Code, and the option of managed identity authentication with Power BI.
The public preview of all these features will open on November 4, with general availability coming some weeks later.
Looking a little further ahead, Microsoft has pencilled in a January 2020 public preview of the ability to analyze ingress data from Event Hubs or IoT Hub on Azure Stack, and egress the results to blob storage or a SQL database.
And if you like playing secret squirrel, you might like to know the company is soliciting signups for a private preview of real-time scoring courtesy of Azure Machine Learning. The service will be based on custom pre-trained models managed by Azure Machine Learning and hosted in either Azure Kubernetes Service or Azure Container Instances. Models can be built with a range of Python libraries, including Scikit-learn, PyTorch, and TensorFlow, and trained on platforms including Azure Databricks, Azure Machine Learning Compute, and HDInsight.