Storing OPC UA Information Models in Azure Data Explorer
Most Azure users deploy Azure Data Explorer (ADX) for storing and analyzing OPC UA PubSub telemetry data sent from industrial sites via a cloud broker. For the last several years, customers have also added OPC UA PubSub metadata to ADX as documented here (https://www.linkedin.com/pulse/using-azure-data-explorer-opc-ua-erich-barnstedt).
However, many customers are unaware that they can store entire OPC UA Information Models in ADX, imported from the UA Cloud Library (https://uacloudlibrary.opcfoundation.org).
This has several advantages:
OPC UA PubSub metadata only describes the semantics of the associated OPC UA PubSub telemetry data, but not the entire OPC UA Information Model where the data originally came from. However, customers want to have all semantic information in one location, ideally in the cloud for global access.
OPC UA PubSub metadata only includes a subset of the rich OPC UA semantics. For example, OPC UA complex type definitions or references to other data within the Information Model are not included but needed for deeper analysis of the telemetry data.
Customers want to be able to see what other telemetry data is available from their sites for potential publishing to the cloud and need the entire OPC UA Information Model to make a selection.
To get started with importing OPC UA Information Models into ADX, you first need an instance of ADX in your Azure subscription as well as a login to the UA Cloud Library, hosted by the OPC Foundation. You can register for free access to the UA Cloud Library here: https://uacloudlibrary.opcfoundation.org/Identity/Account/Register
Once you have registered, you can browse the OPC UA Information Models you are interested in via the built-in browser accessible from here: https://uacloudlibrary.opcfoundation.org/Explorer
To get the unique ID of the OPC UA Information Model you are interested in, you can simply call the "namespaces" REST API here: https://uacloudlibrary.opcfoundation.org/infomodel/namespaces. For example, the "Robotics" Information Model has the unique ID 4172981173.
Configure an Azure Data Explorer callout policy for the UA Cloud Library by running the following query on your ADX cluster (make sure you are an ADX cluster administrator, configurable under Permissions in the ADX tab in the Azure Portal):
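The policy command itself did not survive in this copy of the post; a sketch of such a callout policy, assuming the `webapi` callout type and the UA Cloud Library host as the URI pattern, could look like this:

```kusto
.alter cluster policy callout
@'[{"CalloutType": "webapi", "CalloutUriRegex": "uacloudlibrary\\.opcfoundation\\.org", "CanCall": true}]'
```

This permits the `http_request` plugin used below to call out to that host; without a matching callout policy, the query is rejected by the cluster.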
Then, from the Azure Portal UI of your ADX instance, simply run the following query to import the OPC UA Information Model into ADX:
let uri='https://uacloudlibrary.opcfoundation.org/infomodel/download/<insert information model identifier from cloud library here>';
let headers=dynamic({'accept':'text/plain'});
let options=dynamic({'Authorization':'Basic <insert your cloud library credentials hash here>'});
evaluate http_request(uri, headers, options)
| project title = tostring(ResponseBody.['title']), contributor = tostring(ResponseBody.contributor.name), nodeset = parse_xml(tostring(ResponseBody.nodeset.nodesetXml))
| mv-expand UAVariable=nodeset.UANodeSet.UAVariable
| project-away nodeset
| extend NodeId = UAVariable.['@NodeId'], DisplayName = tostring(UAVariable.DisplayName.['#text']), BrowseName = tostring(UAVariable.['@BrowseName']), DataType = tostring(UAVariable.['@DataType'])
| project-away UAVariable
| take 10000
You need to provide two things in the query above:
The Information Model's unique ID from the UA Cloud Library, which replaces the <insert information model identifier from cloud library here> placeholder in the ADX query.
Your UA Cloud Library credentials (generated during registration), encoded as a Basic authorization header hash, which replaces the <insert your cloud library credentials hash here> placeholder. Tools such as https://www.debugbear.com/basic-auth-header-generator can generate this hash for you.
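As an alternative to an online generator, the Basic authorization hash is simply the Base64 encoding of "username:password", which you can compute yourself. A minimal sketch (the credentials below are placeholders, not real accounts):

```python
import base64

# Placeholder credentials; substitute your UA Cloud Library login.
username = "myuser"
password = "mypassword"

# A Basic auth header value is "Basic " + Base64("username:password").
token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
print(f"Authorization: Basic {token}")
```

The printed value after "Basic " is what goes into the placeholder in the ADX query above.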
And voila! You have just imported an entire OPC UA Information Model into a temporary table in Azure Data Explorer, which you can then use in your queries!
Microsoft Tech Community – Latest Blogs –Read More
AI in Operations (Part 2 of 2)
In part 1 we went through the use cases of AI in development; in this blog we will cover the final four stages of the DevOps lifecycle and look at how AI can be used in operations at scale.
Release: The release stage in DevOps refers to the phase in the Software Development Lifecycle (SDLC) where a new version or iteration of a product is cut and made available to the end users. Here are two examples of where AI can help in this stage:
Automated release note generation:
Natural Language Processing (NLP) and generative AI for release notes: Writing release notes can be an arduous task, and it is imperative to get them right. Using NLP and generative AI, you can analyse code changes and automatically generate comprehensive release notes in natural language for end users. This ensures that the release documentation is comprehensive, up to date, and easily understandable for your user base.
Deployment risk assessment:
Machine learning for risk prediction: Releasing a new iteration of a product is always exciting, but it comes with inherent risk. By implementing machine learning models that assess the risk associated with a release using historical data, the team gains insights into potential risks and can put mitigation contingencies in place ahead of time.
Deploy: The deploy stage in DevOps refers to the process of deploying the tested software or infrastructure changes from a development / test / pre-production environment to a production environment. Here are two examples of how AI can assist in this stage:
Dynamic rollback strategies:
AI-driven rollbacks: It is expected that at some point you will need to roll back a recently deployed environment. Mistakes happen, and that is OK. The hard part is that rollbacks are not always handled automatically, and the “what” and “why” are not always clear. Here you can utilise AI models to analyse real-time performance metrics during deployment. If anomalies or performance issues are detected post-deployment, the model can autonomously decide whether to initiate a rollback, ensuring a quick response to potential issues.
Deployment Optimisation:
Using AI for optimal traffic routing: There are several deployment methods in wide use, including canary, all-at-once, shadow deployments, and more. Blue-green is one of the most common in production systems, but that does not always mean it yields the expected results. By utilising AI tools, you can dynamically optimise the traffic distribution between the blue and green environments in a blue-green deployment, potentially better than a regular load balancer can. This ensures that the new version receives sufficient traffic for testing and validation without impacting the user experience.
Operate: In this stage the focus is on maintaining and managing the production environment. This is where you will triage and address any incidents that occur, and yes, AI can help!
Cognitive incident analysis:
Cognitive AI for incident triage: When an incident occurs, as a DevOps engineer you need to be able to categorise it and explain it in plain language in order to report it and help other team members understand. This can be a hard and time-consuming task, especially under pressure. This is a good time to implement cognitive AI tooling that can understand and categorise incidents based on natural-language descriptions, such as application logs. Doing so enables faster and reasonably accurate incident triage, allowing the team to prioritise and address critical issues promptly.
Monitor: In this stage of DevOps you are continuously checking the health, performance, and even the behaviour of the service. This can be time consuming and costly. You can do it in a few ways, from cherry-picking and analysing logs to reading user feedback and calculating costs. Here is how AI can help you:
Predictive cost analysis:
Cost prediction and optimisation: Building on cloud infrastructure comes with a sense of anxiety that you may unknowingly be charged for a service or tool you are not aware of. By integrating AI into monitoring tools, you can predict future resource allocation and associated costs without working through a manual cost calculator. This enables proactive cost management and optimisation with very little lift from you, the end user.
Sentiment analysis of user feedback:
AI-based sentiment analysis: User feedback is imperative to improving the product or service you are providing, and this is the stage in the DevOps lifecycle where you will review it and begin to plan any actionable items into the next sprint. By applying sentiment analysis to user feedback and logs, you can get an overall picture of how the product is being perceived and behaving at any given time. This leads to a quicker turnaround on the feedback loop and can help to prioritise feature improvements, bug fixes, or infrastructure changes.
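As a minimal illustration of the idea (not a production sentiment model; the word lists and feedback samples are invented for the example, and a real pipeline would use an NLP library or a hosted AI service), a keyword-based scorer over feedback strings could look like this:

```python
# Tiny keyword-based sentiment scorer for illustration only.
POSITIVE = {"great", "love", "fast", "helpful", "reliable"}
NEGATIVE = {"slow", "crash", "broken", "confusing", "bug"}

def sentiment(feedback: str) -> str:
    # Count keyword hits in the feedback and return an overall label.
    words = set(feedback.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

feedback_batch = [
    "Love the new dashboard, it is fast and helpful",
    "The app is slow and keeps hitting a bug",
]
for item in feedback_batch:
    print(item, "->", sentiment(item))
```

Aggregating these labels per release gives the "overall picture" described above, which can then feed the sprint-planning feedback loop.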
By incorporating AI into your DevOps and Software Development Lifecycle, you will be able to speed up and improve your delivery of services in several ways, as shown in this blog and in part 1. When using AI tools, there must always be human interaction and oversight to ensure that what is being changed, provided, or reported by the models is correct.
To fully immerse yourself in the different AI tools available to help at these different stages of operations, I would suggest visiting the Microsoft AI website.
Deploying Flask Apps to Azure Web App via Docker Hub
This tutorial will explore a step-by-step approach to deploying a Python-based Flask web application to Azure Web App using Docker Hub. Our project, a book recommendation system “BookBuddy”, represents a collaborative effort between myself, Haliunna Munkhuu, and Lilly Grella. We will guide you through each phase of the project: from the initial development of the recommendation logic in Python to creating the Flask framework, packaging our application into a Docker image, and finally deploying it to Azure.
Development with Flask on a Local IDE
Before we start, ensure that Python and pip are installed on our system. Then, install Flask using pip:
pip install Flask
Create a file named `flask_app.py` for our application “BookBuddy”. This code initializes the Flask app and sets up a simple route:
from flask import Flask

app = Flask(__name__)

@app.route('/')
def index():
    return "Welcome to BookBuddy!"

if __name__ == '__main__':
    app.run()
To ensure our Flask app is functioning correctly, we write a simple unit test using Python’s built-in `unittest` framework:
import unittest
from flask_app import app

class TestBookRecommendation(unittest.TestCase):
    def setUp(self):
        # Set up any variables you need for your tests
        self.test_genre = "Cookbooks"

    def test_index(self):
        # Flask's built-in test client lets us check routes without a server
        response = app.test_client().get('/')
        self.assertEqual(response.status_code, 200)
Containerization with Docker
Next, we need to create a Dockerfile to specify the environment of our Flask application:
# Dockerfile content
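The Dockerfile contents are not reproduced here; a minimal sketch for a Flask app like this one (the base image, port, and module name are assumptions for illustration) could be:

```dockerfile
# Lightweight Python base image
FROM python:3.11-slim
WORKDIR /app
COPY . .
RUN pip install Flask
# Flask's development server listens on port 5000 by default
EXPOSE 5000
CMD ["flask", "--app", "flask_app", "run", "--host=0.0.0.0"]
```

A production image would typically pin dependency versions in a requirements.txt and serve the app with a WSGI server such as gunicorn instead of the Flask development server.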
With Docker Desktop installed, build the Docker image with the following command:
docker build -t bookbuddy:latest .
After signing up and logging into Docker Hub, create a repository named `bookbuddy`, and ensure we have our tag name in lowercase:
docker login
docker tag bookbuddy:latest <your-docker-hub-username>/bookbuddy:latest
docker push <your-docker-hub-username>/bookbuddy:latest
Infrastructure as Code with Terraform
Before we deploy our Flask application to Azure, we need to set up the necessary infrastructure. We use Terraform, an open-source infrastructure-as-code tool, to write, plan, and create infrastructure efficiently.
To install Terraform on our local machine, we can refer to the official Terraform website.
Create a new directory for our Terraform configuration files. Within this directory, run:
terraform init
This command initializes a new Terraform project and sets up the necessary plugins.
Define our Azure resources in a file named `main.tf`. This file should include our Azure provider, resource groups, and App Service resources.
# main.tf content with Azure provider and resources
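The main.tf contents are not reproduced here; a sketch using the azurerm provider (the resource names, region, and SKU are placeholders, and attribute names vary slightly across provider versions) might look like:

```hcl
terraform {
  required_providers {
    azurerm = {
      source = "hashicorp/azurerm"
    }
  }
}

provider "azurerm" {
  features {}
}

# Placeholder names and region; adjust for your subscription.
resource "azurerm_resource_group" "bookbuddy" {
  name     = "bookbuddy-rg"
  location = "westeurope"
}

resource "azurerm_service_plan" "plan" {
  name                = "bookbuddy-plan"
  resource_group_name = azurerm_resource_group.bookbuddy.name
  location            = azurerm_resource_group.bookbuddy.location
  os_type             = "Linux"
  sku_name            = "B1"
}

resource "azurerm_linux_web_app" "app" {
  name                = "bookbuddy-app"
  resource_group_name = azurerm_resource_group.bookbuddy.name
  location            = azurerm_resource_group.bookbuddy.location
  service_plan_id     = azurerm_service_plan.plan.id

  site_config {
    application_stack {
      # Image pushed to Docker Hub earlier in this tutorial
      docker_image_name = "<your-docker-hub-username>/bookbuddy:latest"
    }
  }
}
```

The web app resource references the plan and resource group by attribute, so Terraform creates them in the right order automatically.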
Execute the following command to see the execution plan, which shows what Terraform will do when we apply our configuration:
terraform plan
If the plan looks correct, deploy our infrastructure with:
terraform apply
Confirm the action when prompted, and Terraform will begin creating the resources. Once Terraform has finished applying the changes, check the Azure portal to see our new resources.
Azure Web App Deployment
First, we need to install the Azure CLI. This can be done from the official website or via package managers:
# Azure CLI installation commands
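The exact commands depend on your platform; common documented options (shown here for illustration) include:

```shell
# Ubuntu/Debian (Microsoft's one-line installer script)
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash

# macOS via Homebrew
brew install azure-cli

# Windows via winget
winget install Microsoft.AzureCLI

# Verify the installation
az --version
```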
Log into our Azure portal. From the Azure services dashboard, click on “App Services” to start creating a new Web App.
In the App Services section, click on “Create” and select “Web App”.
Create Web App
Fill out the “Instance Details” section with our web app’s name, publish settings, and operating system. For example, choose “Docker Container” and the region closest to us for optimal performance.
Review our settings and click “Create” to provision the Web App with our configurations.
Once the Web App is provisioned, we will be directed to the deployment overview page. This page will indicate that our deployment is in progress.
After a short while, we will receive a notification that our deployment is complete. Click “Go to resource” to manage our deployed Web App.
In the Web App management section, we can view our Web App’s details, such as the default domain, status, and location. Here, we can also manage domain settings, scaling, and deployment slots.
Navigate to the default domain provided by Azure to view our running Flask application. We should see the landing page of our “BookBuddy” application.
Conclusion
We’ve now successfully deployed a Flask app to Azure Web App using a Docker container. Our book recommendation system is live and accessible. Toggle “On” for continuous deployment in Azure to allow automatic redeployment whenever the Docker image is updated on Docker Hub. For any updates, push the new Docker image to Docker Hub and Azure will handle the rest.
Cumulative Update #11 for SQL Server 2022 RTM
The 11th cumulative update release for SQL Server 2022 RTM is now available for download at the Microsoft Downloads site. Please note that registration is no longer required to download cumulative updates.
To learn more about the release or servicing model, please visit:
CU11 KB Article: https://learn.microsoft.com/troubleshoot/sql/releases/sqlserver-2022/cumulativeupdate11
Starting with SQL Server 2017, we adopted a new, modern servicing model. Please refer to our blog for more details on the Modern Servicing Model for SQL Server.
Microsoft® SQL Server® 2022 RTM Latest Cumulative Update: https://www.microsoft.com/download/details.aspx?id=105013
Update Center for Microsoft SQL Server: https://learn.microsoft.com/en-us/troubleshoot/sql/releases/download-and-install-latest-updates
Microsoft Tech Community – Latest Blogs –Read More
Removal of several Microsoft Graph beta APIs for Intune device configuration reports
In February 2024, the following Microsoft Graph beta APIs that leverage the old Intune reporting framework for device configuration policy reports will stop working:
Device configuration report:
https://graph.microsoft.com/beta/deviceManagement/managedDevices('device_id')/deviceConfigurationStates
Device status:
https://graph.microsoft.com/beta/deviceConfiguration/StatelessDeviceConfigurationFEService/deviceManagement/deviceConfigurations('policy_id')/deviceStatuses
If you’re impacted by this change, look for MC688107 in the Message center. If you’re using automation or scripts to retrieve reporting data from the Graph beta APIs listed above, we recommend moving to the newer Intune reporting framework by making POST requests to the corresponding endpoint for each report:
Device configuration report: getConfigurationPoliciesReportForDevice
Device and user check-in status report: getConfigurationPolicyDevicesReport
Device assignment status report: getCachedReport
For more information on the updated reporting experience, read Announcing updated policy reporting experience in Microsoft Intune.
Example: Device configuration report
POST: https://graph.microsoft.com/beta/deviceManagement/reports/getConfigurationPoliciesReportForDevice
Payload:
{
  "select": [
    "IntuneDeviceId",
    "PolicyBaseTypeName",
    "PolicyId",
    "PolicyStatus",
    "UPN",
    "UserId",
    "PspdpuLastModifiedTimeUtc",
    "PolicyName",
    "UnifiedPolicyType"
  ],
  "filter": "((PolicyBaseTypeName eq 'Microsoft.Management.Services.Api.DeviceConfiguration') or (PolicyBaseTypeName eq 'DeviceManagementConfigurationPolicy') or (PolicyBaseTypeName eq 'DeviceConfigurationAdmxPolicy') or (PolicyBaseTypeName eq 'Microsoft.Management.Services.Api.DeviceManagementIntent')) and (IntuneDeviceId eq 'adce2b4a-0000-0000-0000-0000000000')",
  "skip": 0,
  "top": 50,
  "orderBy": [
    "PolicyName"
  ]
}
Response:
{
  "TotalRowCount": 2,
  "Schema": [
    { "Column": "IntuneDeviceId", "PropertyType": "String" },
    { "Column": "PolicyBaseTypeName", "PropertyType": "String" },
    { "Column": "PolicyId", "PropertyType": "String" },
    { "Column": "PolicyName", "PropertyType": "String" },
    { "Column": "PolicyStatus", "PropertyType": "Int32" },
    { "Column": "PspdpuLastModifiedTimeUtc", "PropertyType": "DateTime" },
    { "Column": "UnifiedPolicyType", "PropertyType": "String" },
    { "Column": "UnifiedPolicyType_loc", "PropertyType": "String" },
    { "Column": "UPN", "PropertyType": "String" },
    { "Column": "UserId", "PropertyType": "String" }
  ],
  "Values": [
    ["adce2b4a-0000-0000-0000-0000000000", "DeviceManagementConfigurationPolicy", "fdb08003-0000-0000-0000-00000000000", "ASR Rules 02", 2, "2023-08-13T01:51:46", "SettingsCatalog", "Settings Catalog", "admin@xxx.net", "132aa545-0000-0000-0000-00000000000"],
    ["adce2b4a-0000-0000-0000-00000000000", "DeviceManagementConfigurationPolicy", "09e3c028-0000-0000-0000-00000000000", "Intent Policy with AF", 6, "2023-08-10T01:53:20", "MicrosoftDefenderAntivirus", "Microsoft Defender Antivirus", "admin@xxxx.net", "132aa545-0000-0000-0000-00000000000"]
  ],
  "SessionId": ""
}
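The request above can be issued from a script as well. The sketch below builds the same POST request with Python's standard library; it assumes you have already acquired a Microsoft Graph access token (for example via MSAL), and the token and device ID shown are placeholders. Only a subset of the select columns is used for brevity.

```python
import json
import urllib.request

ENDPOINT = ("https://graph.microsoft.com/beta/deviceManagement/"
            "reports/getConfigurationPoliciesReportForDevice")

def build_report_request(token, device_id, top=50):
    """Build the POST request for the device configuration report."""
    payload = {
        "select": ["IntuneDeviceId", "PolicyId", "PolicyName", "PolicyStatus"],
        "filter": f"(IntuneDeviceId eq '{device_id}')",
        "skip": 0,
        "top": top,
        "orderBy": ["PolicyName"],
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",   # token acquired elsewhere
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually send the request (requires a valid token and network access):
# req = build_report_request("<access-token>", "adce2b4a-0000-0000-0000-0000000000")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

The response body follows the schema/values shape shown above, so downstream code can zip each row in "Values" against the "Column" names in "Schema".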
If you have any questions, leave a comment below or reach out to us on X @IntuneSuppTeam!
Exciting news: Teams Essentials plus Teams Phone promotion extended!
PROMO EXTENDED!
We are excited to share that the Teams Essentials plus Teams Phone bundle promotion (available in the USA, UK, Puerto Rico, and Canada) has been extended through July 1st, 2024. Teams Phone is one of the biggest bets with the highest potential for partners in the new fiscal year and can help partners increase profitability.
Teams Phone enables customers to always be connected through a cloud-based phone solution. They will benefit from intelligent phone features such as automatic transcription of voicemail messages, smart call controls such as scheduled call queues, screen pop, and more.
We are providing CSP partners with the following tools to support Teams Phone enablement and sales guidance:
SMB Masters Program on-demand trainings | Microsoft Teams Phone Learning Path
New Teams Phone SMB 1:many customer workshop | book for partners as a part of the SMB workshop motion
With the promo, we launched several Teams Phone discounts; see the Teams Essentials plus Phone System Promo FAQ for details.
Call to Action
Download and share the SMB 1:Many Customer Workshops to help partners grow their Teams Essentials plus Phone System (TE+PS) business: Modern Work for Partners – SMB Briefings (microsoft.com)
Evangelize new resources on https://aka.ms/TeamsEssentialsPartner
Resources
Teams Phone Partner Portal
Teams Phone SMB partner opportunity deck & Teams Phone SMB pitch deck
Teams Essentials plus Phone System Promo FAQ
CSP Masters Program readiness: https://aka.ms/M365MastersProgram
What's new in security for Azure SQL and SQL Server | Data Exposed
Check out this episode to learn the newest information on security for Azure SQL and SQL Server!
View/share our latest episodes on Microsoft Learn and YouTube!