Category Archives: Microsoft
How to use Azure OpenAI GPT-4o with Function calling
Introduction
In this article we will demonstrate how to leverage GPT-4o's capabilities, using images with function calling to unlock multimodal use cases.
We will simulate a package routing service that routes packages based on the shipping label using OCR with GPT-4o.
The model will identify the appropriate function to call based on the image analysis and the predefined actions for routing to the appropriate continent.
Background
The new GPT-4o (“o” for “omni”) can reason across audio, vision, and text in real time.
It can respond to audio inputs in as little as 232 milliseconds, with an average of 320 milliseconds, which is similar to human response time in a conversation.
It matches GPT-4 Turbo performance on text in English and code, with significant improvement on text in non-English languages, while also being much faster and 50% cheaper in the API.
GPT-4o is significantly better at vision and audio understanding than existing models.
GPT-4o now enables function calling.
The application
We will run a Jupyter notebook that connects to GPT-4o to sort packages based on the printed labels with the shipping address.
Here are some sample labels: we will use GPT-4o for OCR to read the destination country from each label, and GPT-4o function calling to route the packages.
The environment
The code can be found here – Azure OpenAI code examples
Make sure you create your Python virtual environment and fill in the environment variables as described in the README.md file.
The code
Connecting to Azure OpenAI GPT-4o deployment.
from dotenv import load_dotenv
from IPython.display import display, HTML, Image
import os
from openai import AzureOpenAI
import json

load_dotenv()

GPT4o_API_KEY = os.getenv("GPT4o_API_KEY")
GPT4o_DEPLOYMENT_ENDPOINT = os.getenv("GPT4o_DEPLOYMENT_ENDPOINT")
GPT4o_DEPLOYMENT_NAME = os.getenv("GPT4o_DEPLOYMENT_NAME")

client = AzureOpenAI(
    azure_endpoint=GPT4o_DEPLOYMENT_ENDPOINT,
    api_key=GPT4o_API_KEY,
    api_version="2024-02-01",
)
Defining the functions to be called after GPT-4o answers.
# Defining the functions - in this case a toy example of shipping functions
def ship_to_Oceania(location):
    return f"Shipping to Oceania based on location {location}"

def ship_to_Europe(location):
    return f"Shipping to Europe based on location {location}"

def ship_to_US(location):
    return f"Shipping to Americas based on location {location}"
Defining the available functions to send to GPT-4o.
It is very important to include the function and parameter descriptions, so GPT-4o knows which method to call.
tools = [
    {
        "type": "function",
        "function": {
            "name": "ship_to_Oceania",
            "description": "Shipping the parcel to any country in Oceania",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The country to ship the parcel to.",
                    }
                },
                "required": ["location"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "ship_to_Europe",
            "description": "Shipping the parcel to any country in Europe",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The country to ship the parcel to.",
                    }
                },
                "required": ["location"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "ship_to_US",
            "description": "Shipping the parcel to the United States or any other country in the Americas",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The country to ship the parcel to.",
                    }
                },
                "required": ["location"],
            },
        },
    },
]

available_functions = {
    "ship_to_Oceania": ship_to_Oceania,
    "ship_to_Europe": ship_to_Europe,
    "ship_to_US": ship_to_US,
}
A helper function to base64-encode our images; this is the format GPT-4o accepts.
# Encoding the images to send to GPT-4o
import base64

def encode_image(image_path):
    with open(image_path, "rb") as image_file:
        return base64.b64encode(image_file.read()).decode("utf-8")
The method to call GPT-4o.
Notice below that we send the parameter “tools” with the JSON describing the functions to be called.
def call_OpenAI(messages, tools, available_functions):
    # Step 1: send the prompt and available functions to GPT
    response = client.chat.completions.create(
        model=GPT4o_DEPLOYMENT_NAME,
        messages=messages,
        tools=tools,
        tool_choice="auto",
    )
    response_message = response.choices[0].message

    # Step 2: check if GPT wanted to call a function
    if response_message.tool_calls:
        print("Recommended Function call:")
        print(response_message.tool_calls[0])
        print()

        # Step 3: call the function
        # Note: the JSON response may not always be valid; be sure to handle errors
        function_name = response_message.tool_calls[0].function.name

        # verify the function exists
        if function_name not in available_functions:
            return "Function " + function_name + " does not exist"
        function_to_call = available_functions[function_name]

        # verify the function is called with valid arguments
        function_args = json.loads(response_message.tool_calls[0].function.arguments)
        if check_args(function_to_call, function_args) is False:
            return "Invalid number of arguments for function: " + function_name

        # call the function
        function_response = function_to_call(**function_args)
        print("Output of function call:")
        print(function_response)
        print()
Please note that WE, and not GPT-4o, call the functions in our code, based on the answer returned by GPT-4o.
# call the function
function_response = function_to_call(**function_args)
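Note that call_OpenAI above also relies on a check_args helper that is not defined in this excerpt. A minimal sketch of such a validator, using Python's inspect module, could look like this (the implementation in the sample repository may differ):

```python
# Hypothetical sketch of the check_args helper referenced by call_OpenAI.
# It verifies that the arguments the model returned match the target
# function's signature before the function is actually invoked.
import inspect

def check_args(function, args):
    sig = inspect.signature(function)
    params = sig.parameters
    # reject arguments the function does not accept
    for name in args:
        if name not in params:
            return False
    # reject calls that are missing a required (no-default) parameter
    for name, param in params.items():
        if param.default is inspect.Parameter.empty and name not in args:
            return False
    return True
```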
Iterate through all the images in the folder.
Notice the system prompt, where we tell GPT-4o what we need it to do: sort labels for package routing by calling functions.
# iterate through all the images in the data folder
import os

data_folder = "./data"

for image in os.listdir(data_folder):
    if image.endswith(".png"):
        IMAGE_PATH = os.path.join(data_folder, image)
        base64_image = encode_image(IMAGE_PATH)
        display(Image(IMAGE_PATH))

        messages = [
            {"role": "system", "content": "You are a customer service assistant for a delivery service, equipped to analyze images of package labels. Based on the country to ship the package to, you must always ship to the corresponding continent. You must always use tools!"},
            {"role": "user", "content": [
                {"type": "image_url", "image_url": {
                    "url": f"data:image/png;base64,{base64_image}"}
                }
            ]}
        ]

        call_OpenAI(messages, tools, available_functions)
Let’s run our notebook!
Running our code for the label above produces the following output:
Recommended Function call:
ChatCompletionMessageToolCall(id='call_lH2G1bh2j1IfBRzZcw84wg0x', function=Function(arguments='{"location":"United States"}', name='ship_to_US'), type='function')
Output of function call:
Shipping to Americas based on location United States
That’s all folks!
Thanks
Denise
Microsoft Tech Community – Latest Blogs –Read More
How to Implement and Manage Your Azure Costs with FinOps Hub in the FOCUS Format 🚀
Tutorial: Implementing the FinOps Hub for Cost Analysis
If you are looking for a reliable platform to analyze your costs, gain insights, and take data-driven actions, you are in the right place. In this article, I will show you step by step how to implement the FinOps Hub solution in your environment and take advantage of the various cost reports available in this project's repository.
What Is FinOps?
Before we begin, it is important to understand what FinOps is and how the FinOps Hub toolkit will help you with cost analysis.
FinOps is an operational framework and cultural practice that maximizes the business value of the cloud. It enables data-driven decision-making and creates financial accountability through collaboration between engineering, finance, technology, and business operations teams. This discipline involves using cloud cost management tools, such as Microsoft Cost Management, and best practices such as:
Analyzing and tracking cloud spend.
Identifying cost-reduction opportunities.
Allocating costs to specific teams, projects, or products.
Microsoft recently announced a strategic partnership with FinOps.org, a leading organization in the development of cloud financial management practices. This collaboration aims to enhance financial governance and cost optimization capabilities for companies using the Azure platform.
By combining Microsoft's expertise in cloud technology with the best practices and frameworks developed by FinOps.org, organizations will be able to achieve greater transparency, control, and efficiency in their cloud investments. This alliance promises to strengthen the tools and resources available to companies, enabling smarter, more strategic financial management in the digital environment.
Benefits of the FinOps Hub
The FinOps Hub extends cost management by exporting details to a consolidated storage account, overcoming some of the limitations of collecting this data through the API. In its basic form, it enables additional reporting options in Power BI. At an advanced level, it serves as a foundation for building your own cost management and optimization solution.
FinOps Hub Design Principles
The FinOps Hub focuses on three basic design principles:
Standardized: strives to be the definitive example of the FinOps Framework, demonstrating its principles, practices, and values in an exemplary way.
Built for scale: designed to support the largest accounts and organizations.
Open and extensible: embraces the ecosystem and prioritizes platform enablement.
Comparison with Microsoft Cost Management
A common question is why you would use the FinOps Hub when Cost Management already exists. Many organizations that use Microsoft Cost Management run into obstacles when they need capabilities that are not available natively. In those situations, the options are limited to using third-party tools or building a solution from scratch. The FinOps Hub provides a foundation that makes it easier to build custom cost management solutions.
Advantages of the FinOps Hub
Simplified access: you do not need to grant Azure portal access in order to use Cost Management data.
Report templates: it provides Power BI report templates that can be published online.
How the FinOps Hub Works
The export is based on the FinOps Open Cost and Usage Specification (FOCUS), an initiative to define a common format for billing data. FOCUS includes both actual and amortized data, reducing storage-account data and processing by up to 30%, and is aligned with the FinOps Framework.
Data Export and Ingestion Process
Cost Management exports raw cost details to the msexports container.
The msexports_ExecuteETL pipeline kicks off the extract-transform-load (ETL) process when files are added to storage.
The msexports_ETL_ingestion pipeline saves the exported data in parquet format in the ingestion container.
Power BI reads cost data from the ingestion container.
FinOps Hub Costs
The average cost is 25 US dollars per 1 million rows, but the exact cost of the solution can vary, mainly due to data storage and the frequency of data ingestion. The pipelines run once per day per export, helping keep costs under control and ensuring efficient data management.
Conclusion
Considering the gains in efficiency, automation, and accuracy this solution can deliver, the investment can be justified, paying off in the medium to long term. Now, let's get hands-on!
The FinOps Hub is designed by the community. You can join the discussion through the link in the video description and comment on what you would like to see next, or learn how to contribute and become part of the team.
Want to watch the step-by-step video on implementing the FOCUS-based FinOps Hub? Press play on the video below or, if you prefer, access it directly.
Want to contribute to the project and engage with the community? Visit FinOps hubs – FinOps toolkit (microsoft.github.io)
See you next time!
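To put the pricing above in perspective, here is a quick back-of-the-envelope estimate based on the roughly 25 US dollars per 1 million rows figure. The daily row volume is an invented example, and real costs also depend on storage and ingestion frequency:

```python
# Rough cost estimate for a FinOps Hub deployment, based on the
# ~$25 per 1 million rows figure quoted above. The per-row rate is from
# the article; the row counts are illustrative assumptions.
COST_PER_MILLION_ROWS_USD = 25.0

def estimate_monthly_cost(rows_per_day: int, days: int = 30) -> float:
    """Estimate ingestion cost in USD for a given daily row volume."""
    total_rows = rows_per_day * days
    return total_rows / 1_000_000 * COST_PER_MILLION_ROWS_USD

print(f"${estimate_monthly_cost(2_000_000):.2f}")  # 2M rows/day for 30 days → $1500.00
```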
Single Line of Text Changed to Multiple Lines of Text Character Limitation not adjusted
I had a flow fail due to the column setting of Single Line of Text (maximum 255 characters). I adjusted the column type to Multiple Lines of Text, which should give me 69K’ish characters. Upon resubmitting the flow, it continues to fail, still rejecting anything over 255 characters. I also have another column set up exactly the same way, and that column accepts more than 255 characters; I see no difference in the settings between the two columns.
Can someone shed any light on this for me please?
Spam
Folks, there has been an increase in spam posts recently; can you please adjust the protection settings and start blocking repeat offenders? Here are some examples from today:
7 Best OST to PST Converter Software for Outlook – Microsoft Community Hub
List of 7 Best MBOX to PST Converter for Outlook – Microsoft Community Hub
5 Best MBOX to PDF Converter – Microsoft Community Hub
Top 5 Best OST Repair tool for Outlook Application – Microsoft Community Hub
What to do? 3 basic buttons missing from Bluetooth settings
I wasn’t doing anything and the three basic buttons were gone! I have disabled and enabled Bluetooth multiple times, but this does not work at all. As you can see in the image, I only have 2 rows; the one with Bluetooth is completely gone.
SQL Server on Azure VMs: I/O analysis (preview)
Analyzing I/O problems just got easier for SQL Server on Azure VMs
It is not easy to understand what’s going on when you run into an I/O-related performance problem on an Azure virtual machine. It is a common but complex problem. You need to figure out what is happening at both the host level and in your SQL Server instance, and correlating host metrics with SQL Server workloads can be a challenge.
We developed a new experience that helps you do exactly that.
When you visit the Storage blade of your SQL virtual machine resource page on Azure portal, you will see two new tabs:
I/O analysis
I/O related best practices
The I/O analysis tab will tell you if you are having a performance issue stemming from IOPS and/or throughput throttling, caused by exceeding virtual machine or data disks limits. It will further show you the exact metric(s) and time where this issue shows itself down to the disk or VM level. Once you identify the problem on this tab, you might want to go to our documentation by following the “Learn more” link on the page as shown in the image above. The documentation details each scenario, what might have caused the problem, and provides guidance on how to resolve it.
How does it determine if there is a problem or not? It uses Azure metrics to understand what is going on in the system. I/O analysis checks metric health data for the last 24 hours. You will see the Azure metric charts on the page.
If you click on these charts, they will take you directly to the Azure metrics page as it is simply leveraging what’s available in Azure already.
It first looks to see if there is disk latency above a certain threshold. Throttling might occur but it is not considered problematic unless it results in a latency condition. Once latency is detected, it then analyzes Azure Metrics and shows you which one(s) demonstrate the problem.
The Azure metrics are:
VM Cached IOPS Consumed Percentage
VM Cached Bandwidth Consumed Percentage
VM Uncached IOPS Consumed Percentage
VM Uncached Bandwidth Consumed Percentage
Data Disk IOPS Consumed Percentage
Data Disk Bandwidth Consumed Percentage
You can find detailed information about the metrics and the algorithms used in the documentation.
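To make the approach concrete, here is a self-contained sketch of the kind of check described above: latency samples are compared against a threshold, and only when latency is elevated are the consumed-percentage metrics inspected for throttling. The thresholds and data below are illustrative assumptions, not the actual algorithm the portal uses:

```python
# Illustrative sketch of the I/O analysis logic described above.
# Thresholds and sample data are invented for the example.
LATENCY_THRESHOLD_MS = 20    # assumed latency threshold, not the real one
THROTTLE_THRESHOLD_PCT = 95  # consumed % treated as "at the limit"

def analyze_io(latency_ms, consumed_metrics):
    """Return the metrics that likely explain an I/O latency issue.

    latency_ms: disk-latency samples over the analysis window
    consumed_metrics: metric name -> peak consumed percentage
    """
    if max(latency_ms) < LATENCY_THRESHOLD_MS:
        # throttling without a latency condition is not flagged as a problem
        return []
    return [name for name, pct in consumed_metrics.items()
            if pct >= THROTTLE_THRESHOLD_PCT]

culprits = analyze_io(
    [5, 8, 42, 37],
    {"VM Uncached Bandwidth Consumed Percentage": 100,
     "Data Disk Bandwidth Consumed Percentage": 98,
     "VM Cached IOPS Consumed Percentage": 40},
)
print(culprits)  # the two bandwidth metrics at or above the limit
```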
Detecting Latency
In the example below, you see that it detected an issue (disk latency was over the threshold for a certain amount of time). The problem occurred on May 20th at 12:56pm UTC. If there are multiple spikes in the chart, I/O Analysis helps you pinpoint the issue. Two metrics show why the latency occurred.
In this case, it is a throughput problem both at VM level and disk level. The disk related charts have a chart line for each disk you have in your virtual machine, labeled with the LUN number. In the below graphic, you can see there is an I/O latency issue due to throughput for ‘VM Uncached Bandwidth Consumed Percentage’ and ‘Data Disk Bandwidth Consumed Percentage’.
You can then explore the details further by expanding the VM level metrics and / or disk level metrics sections below the Disk Latency chart.
For this scenario, expanding the VM level metrics reveals the following data on cached and uncached IOPS and throughput health, where you again can go to the metrics data, for further analysis, by clicking the chart.
You may also want to explore the disk level, by expanding the disk level metrics section as shown below.
The I/O related best practices tab checks that your system follows the configuration best practices relating to I/O for SQL Server on an Azure VM. Poorly configured systems tend to lead to performance problems, which often get exposed under workload pressure. Running an assessment will give you recommendations with various severity ratings based on risk and impact.
We recommend you implement them starting with the highest severity.
PowerShell script
If you prefer scripting to using Azure portal, you can also use the I/O Analysis PowerShell script to analyze the I/O performance of your SQL Server VM.
We would love to hear your feedback. Please feel free to comment here.
Microsoft Federal Azure Developer Connect – Upcoming Events
Join us monthly for the Microsoft Federal Azure Developer Connect, a virtual webinar series focused on accelerating cloud adoption and empowering every developer to innovate and build software on the Azure Platform.
What’s it about?
This series is dedicated to the developer community, whether you are a developer building or supporting apps or a Program Manager overseeing a portfolio of apps. We will provide Federal-relevant briefings to:
Share industry best practices
Enable developer productivity and accelerate software delivery
Deliver new features to end users more securely and efficiently
Boost application scalability and reliability
Upcoming Events
These events are open to all Federal employees and contractors; click each hyperlink below to register for an individual event.
Once you complete the registration form, please allow the team up to 2 business days to process your registration request. Once approved, you’ll receive an email with the calendar invite.
Potential Use Cases for Generative AI
Azure’s generative AI is a powerful and versatile technology that can help users to create and deploy intelligent applications that can generate content, insights, and solutions from your own data. It can be applied to almost all industries and domains, such as education, healthcare, media, entertainment, gaming, marketing, public sector and more. Azure’s generative AI can help users to automate repetitive tasks, enhance creativity, and solve complex problems. GenAI can be used as a co-pilot or a custom co-pilot (bespoke build), depending on the level of control and customization that the user needs.
Co-pilot: This is the default mode of GenAI, where a user can enter a prompt or a partial text and GenAI will complete it with relevant and coherent content. It uses a general-purpose model that can handle a wide range of topics and domains. The co-pilot mode is useful for tasks such as writing emails, blog posts, social media posts and product descriptions.
Custom co-pilot: This is an advanced mode of GenAI, where the user can create their own models by fine-tuning general-purpose models on their own domain specific data. The user can also specify the style, tone, format, length, and other parameters of the generated content. The custom co-pilot mode allows the user to train a specialized model that can capture the nuances and specificities of their use case. The custom co-pilot mode is useful for tasks such as creating personalized and targeted content for specific audiences or scenarios.
What are some potential use cases of Azure’s generative AI?
We have been working with many customers and industry sectors and have come across numerous use cases. Here are some examples of how Azure’s generative AI can help users in different scenarios.
Manufacturing: As the industry manufactures finished products or parts rather than services, multiple use cases can be seen:
Many organisations have historical and technical documentation and may want a way to surface useful information from these documents and query it in natural language. This not only reduces administrative effort but also cuts labour costs and overheads (such as physical storage for files). An internal Copilot can be created so employees can ask questions in natural language and get back an answer. This involves deploying a solution to extract relevant contextual information from a knowledge base; using this tool, a custom co-pilot can be built to answer your organisation’s specific questions.
A Copilot could provide valuable assistance to manufacturers by suggesting designs and recommending optimal materials, taking cost, sustainability, and durability into account. For example, Rockwell Automation, a leading US provider of industrial automation technology, leverages Microsoft Copilot within its FactoryTalk Design Studio. Copilot assists engineers by generating code through natural language prompts, automating routine tasks, and enhancing design efficiency.
Copilot can enhance innovation and operational efficiency in any organisation. For example, Siemens is integrating its Teamcenter software (used for product lifecycle management) with Microsoft Teams and Copilot. This solution allows:
Production operatives to use their devices to report design concerns in natural language, with summaries generated from the reports they send in. GPT-4 with vision (GPT-4V) can assist by analysing the images and visual data, helping to detect defects or inconsistencies on the production line.
Generative AI can also be used to build cloud-native systems that improve efficiency by providing real-time insights into production lines or industrial equipment. Moving from batch processing to real time allows for an improved customer experience.
Retail: It’s essential for retailers to stand out by bringing appropriate products to customers at speed. This can be accelerated with the help of generative AI. Some use cases seen in this industry include:
Personalised product recommendations – to maximise sales, tailored advertising and marketing recommend products based on a customer’s purchase history, preferences, and behaviour, aligning promotions with the customer. Azure’s generative AI can help marketers and advertisers create and test content on various user groups. Custom co-pilots can let users chat with the system database, searching the retailer’s catalogue in natural language to find the products best suited to their needs.
Forecasting & inventory management – generative AI can help retailers predict future demand more precisely by analysing historical data, market trends, and external factors. This accuracy helps optimise inventory levels, reducing cost and waste by preventing stockouts (which disappoint customers) and overstocks (which lead to waste).
A customer example of how generative AI is used in retail is Estee Lauder.
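The forecasting idea above can be sketched in a few lines. This toy example predicts next-period demand with a simple moving average; a production system would use far richer models and external signals, and the sales numbers here are invented:

```python
# Toy illustration of demand forecasting: predict next-period demand as
# the mean of the most recent periods. Numbers are invented for the example.
def forecast_next(history, window=3):
    """Forecast the next period's demand from the last `window` periods."""
    recent = history[-window:]
    return sum(recent) / len(recent)

weekly_units_sold = [120, 135, 128, 150, 160, 155]
print(forecast_next(weekly_units_sold))  # mean of the last 3 weeks → 155.0
```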
Public Sector: Generative AI has the potential to revolutionize how challenges are addressed in the public sector. Government departments aligning to the Cabinet directives can utilise generative AI to increase their efficiency (by more than 30% in some estimates).
Using Azure OpenAI, chatbots can provide better customer service, giving citizens information (e.g. via Gov.UK). Chatbots can understand and interpret natural language queries from citizens: NLU (natural language understanding) models process user input, extract intent, identify relevant entities, and relay the answer back to the user. Chatbots can also share knowledge internally with more people and, in some instances, can surface trends in data that might not be detected at first glance.
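As a toy illustration of the intent-extraction step described above: a real deployment would use Azure OpenAI or a trained NLU model rather than keyword matching, and the intents and keywords below are invented for the example.

```python
# Invented intents and keywords, purely for illustration of the NLU flow.
INTENTS = {
    "renew_driving_licence": ["driving licence", "renew licence"],
    "tax_question": ["tax", "self assessment", "hmrc"],
}

def extract_intent(utterance: str) -> str:
    """Map a citizen's natural-language query to an intent label."""
    text = utterance.lower()
    for intent, keywords in INTENTS.items():
        if any(kw in text for kw in keywords):
            return intent
    return "unknown"

print(extract_intent("How do I renew my driving licence?"))  # → renew_driving_licence
```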
Automations, applications, and processes infused with the Azure OpenAI Service can unlock high levels of efficiency in key areas such as call centres, citizen services, and borders. Azure OpenAI can simplify call centres (e.g. the HMRC tax helpline) and automate manual processes (e.g. DVLA driving licence applications). Phone calls between agents and customers may be recorded and stored, and Azure AI Speech can later transcribe the audio files asynchronously while identifying different speakers, languages, and sentiment.
Financial Sector: the finance industry encompasses institutions and services involved in the management of money. Generative AI has the potential to transform the industry.
In portfolio management, Copilots can assist portfolio managers by analysing market trends, suggesting investment strategies, and providing real-time insights. For instance, a custom Copilot could monitor stock prices, analyse financial news, and recommend adjustments to investment portfolios.
Augmenting human capabilities through automation allows for more focus on strategic activities. Extraction of insights from documents and summarisation can be done using Azure’s generative AI capabilities, which can analyse and synthesise vast amounts of financial documents such as reports, contracts, and regulatory filings. The ability to extract information and identify patterns aids in the prevention of fraud and increases the efficiency of the organisation. In claims, it can help claim handlers and claim adjusters better manage customer interactions and reduce fraud.
The potential use cases of Azure’s generative AI are vast and continually evolving, demonstrating its versatility and power in addressing industry-specific challenges and enhancing operational efficiency.
In the next article we will discuss how to begin building custom Co-pilots: https://techcommunity.microsoft.com/t5/ai-ai-platform-blog/the-evolution-of-genai-application-deployment-strategy-building/ba-p/4150525
@Paolo Colecchia @arung @Stephan Rhodes @Renata Bafaloukou @Morgan Gladwell
Mitigate Error 2767 “Could not locate statistics” when a query against the secondary replica fails
Scenario:
You may face error 2767, “Could not locate statistics”, when a query against the secondary replica fails.
A full discussion of this error and a possible workaround is available here.
A suggested mitigation is to convert auto-created statistics to user-created statistics.
Since this should be done for each and every auto-created statistic, it makes sense to automate the process.
If we keep auto-create statistics enabled for the database, we might face the issue again once new auto-created statistics are created in the database and are used on the secondary.
Solution:
To help automate the process, I am sharing a script that performs the steps suggested in the original article for every auto-created statistic.
You can also run the script (as a stored procedure in your database) on a schedule to monitor for newly auto-created statistics and automate their conversion to user-created statistics.
I hope you find it useful.
Please make sure to set the @WhatIf parameter to 0 to trigger the action; otherwise it will only report the suggested steps.
here is the stored procedure code:
CREATE OR ALTER PROCEDURE MigrateAutoCreatedStatsToUserCreated (@WhatIf bit = 1) AS
BEGIN
/*
Date: 2024-06-03 | V1.0
Authored by: Yochanan Rachamim

Purpose:
This procedure was created to help mitigate error 2767 as described here:
https://learn.microsoft.com/en-us/troubleshoot/sql/database-engine/availability-groups/error-2767-query-secondary-replica-fails
It drops auto-created statistics and recreates them as user-created statistics.
This script can be used as an automated process that runs on a regular basis to prevent recurrences of exception 2767.

DISCLAIMER: THIS CODE AND INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.
*/
SET NOCOUNT ON;

DECLARE @iPrefix varchar(5) = 'User_';

SELECT
    --ss.object_id
    --,table_name = st.name
    --,stats_name = ss.name
    --,ss.stats_id
    --,sc.stats_column_id
    --,sc.column_id
    --,scol.name column_name
    DROP_COMMAND = 'DROP STATISTICS [' + sschem.name + '].[' + st.name + '].[' + ss.name + ']',
    CREATE_COMMAND = 'CREATE STATISTICS [' + @iPrefix + ss.name + '] ON [' + sschem.name + '].[' + st.name + '](' + scol.name + ')'
INTO #StatsTemp
FROM sys.stats ss
JOIN sys.stats_columns sc ON ss.object_id = sc.object_id AND ss.stats_id = sc.stats_id
JOIN sys.tables st ON st.object_id = ss.object_id
JOIN sys.schemas sschem ON st.schema_id = sschem.schema_id
JOIN sys.columns scol ON st.object_id = scol.object_id AND sc.column_id = scol.column_id
WHERE ss.auto_created = 1;

DECLARE @t TABLE(DROP_COMMAND NVARCHAR(max), CREATE_COMMAND NVARCHAR(max));
DECLARE @cmd NVARCHAR(max);

IF @WhatIf = 1 RAISERROR('WhatIf mode activated, no actual command will be executed', 0, 1) WITH NOWAIT;
IF @WhatIf = 1 RAISERROR('WhatIf mode is the default, use @WhatIf = 0 for actual execution', 0, 1) WITH NOWAIT;

WHILE EXISTS (SELECT * FROM #StatsTemp)
BEGIN
    DELETE TOP (1) FROM #StatsTemp OUTPUT deleted.* INTO @t;
    SELECT @cmd = DROP_COMMAND + '; ' + CREATE_COMMAND FROM @t;
    IF @WhatIf = 0 EXEC(@cmd);
    RAISERROR(@cmd, 0, 1) WITH NOWAIT;
    DELETE FROM @t; -- clear the staged row so the next iteration sees only its own command
END

RAISERROR('Done', 0, 1) WITH NOWAIT;
END
GO
SAP on Azure Product Announcements Summary – SAP Sapphire 2024
We are very excited to be part of Sapphire 2024 this year as a diamond sponsor to “bring out the best in customer’s business” with SAP on Azure. We help customers get the most out of their current SAP investments, improve their operations significantly, and make them more agile and innovative. We also continue to build a robust portfolio of services to deploy, manage and extend and innovate customers’ SAP workload leveraging AI and Copilots to unlock insight and automation. Our comprehensive analytics platform on the Microsoft Cloud enables customers to extract more insights from their SAP and non-SAP data. Also, our unified security operations platform, powered by industry-leading AI, enables faster response to security threats and leverages the Microsoft Intelligent data platform to deploy machine learning models more efficiently and cost-effectively.
With these expanded capabilities, more and more customers are choosing SAP on Azure as their preferred cloud, whether running RISE with SAP or deploying SAP solutions natively on Azure.
When South32, a global mining and metals company, spun off from BHP, it needed an environment in which to build scalable, easy-to-manage SAP systems. RISE with SAP on Azure offered a transformation solution that minimized risk, with no trade-offs, for migrating its legacy systems.
“RISE with SAP on Azure and SAP’s Private Cloud Edition remove the infrastructure-level management activities and present us with a simple service catalogue–based approach for managing our application activities.”
– Stuart Munday, Group Manager, ERP, South32
For Kyndryl, it was important to have a platform that meets the diverse needs of its customers as it moved away from its legacy infrastructure after separating from IBM. To build a new reliable, secure and agile SAP platform, Kyndryl decided to deploy a greenfield solution on SAP on Azure leveraging the built-in functionality to simplify operations and improve efficiency.
“The ease of the rollouts of Microsoft 365 and our SAP on Azure solution, along with how easily interoperable they are, gave us confidence in our ability to drive sweeping, positive change.”
– Guido Reisch, Vice President of Enterprise Applications, Kyndryl
Building upon the momentum, we are happy to share product updates to further help our customers improve productivity and business outcomes in the era of AI.
Automatic attack disruption for SAP with Microsoft Sentinel for SAP and Microsoft Defender for Endpoint to protect SAP hosts.
Replacing SAP Identity and Access Management (IDM) with Entra ID. Furthermore, supporting MFA for SAPGUI using SAP Secure Login Service and Entra ID to mitigate sign-in risks for every SAP interface.
Public Preview of SAP OData Connector in Microsoft Power Automate and General Availability of OData capability in Azure API Management enabling customers and partners to quickly and securely connect to SAP OData services for Copilot Studio scenarios like “chatting with your sales orders” and more.
Integration of the API Management capability in SAP Integration Suite with Azure API Management. This boosts discoverability of Azure APIs for the SAP BTP developer persona.
Preview for Mv3 High Memory and General Availability of Mv3 Medium Memory and Mv3 Very High Memory series to provide faster insights, more uptime, and improved price-performance.
Azure Center for SAP solutions now has quality checks for Db2 and Oracle databases as well as integrated VM health annotations in Quality Insights.
Enhanced the provider and alert management experience for Azure Monitor for SAP solutions by simplifying the creation and monitoring of providers and introducing a new central alert management experience.
The SAP Deployment Automation Framework is extended to support HANA Scale-out and integration of Azure Monitor for SAP into the deployment workflow.
Private Preview of SAP ASE (Sybase) database backup support on Azure Backup.
Public preview of Discovery and Assessment of SAP Systems in Azure Migrate to perform assessments of on-premises SAP inventory and workloads.
General Availability of Well-Architected Framework for SAP as part of Azure Center for SAP Tools and Frameworks to provide best practices to optimize SAP deployments.
With that let’s get into the summary of product updates and services.
Extend and Innovate
Microsoft Sentinel Solution for SAP
Business applications pose a unique security challenge with highly sensitive information that can make them prime targets for attacks. Microsoft offers best in class security solutions support for SAP business applications with Microsoft Sentinel.
We are announcing automatic attack disruption for SAP. Correlating signals from Microsoft Defender XDR with Sentinel for SAP provides the confidence to take drastic action, like isolating devices and disabling both the Entra ID user and the SAP backend user. A less disruptive, semi-automatic approach using approvals via Microsoft Teams is available too. See this video to learn more.
We also recently announced the General Availability of Copilot for Security. It easily analyzes incidents involving SAP, generates threat hunting queries, summarizes impact for leadership, and recommends remediation steps. See this video to learn more.
Furthermore, we are announcing additional detections for SAP AS JAVA. Update your Sentinel for SAP solution or simply browse the content hub to get them. The solution provides threat detection for SAP workloads running on Azure, in RISE, other clouds and on-premises.
Microsoft Defender for Endpoint for SAP applications
We are announcing new deployment guidance to protect SAP hosts with industry leading Endpoint Detection and Response (EDR) and next generation Antivirus. Deployment guidance validated with SAP workloads is available for Linux and Windows based SAP systems.
MFA for SAPGUI and SAP IDM Replacement
Conditional access and multi-factor authentication are the first line of defense against cyber-attacks on SAP systems. Protection of SAP Web applications like Fiori launchpad with Entra ID is already a common practice, but previously it was not possible to use MFA with the power-user experience in SAPGUI. Learn more about how to extend MFA and conditional access to SAPGUI using SAP Cloud Identity Services and Microsoft Entra ID (formerly Azure AD) in this blog.
SAP and Microsoft signed contracts to intensify collaboration in the identity space. SAP IDM will be replaced by Entra ID, and SAP Identity Access Governance will be more tightly integrated with Entra ID Governance.
Power Platform
Customers on or moving to SAP S/4HANA are recommended to follow a “keep the core clean” approach: develop extensions outside of the ECC or S/4HANA core system. Using Microsoft’s low-code Power Platform, customers follow exactly this paradigm, which enables them to easily build extensions of their SAP system without modifying the core. Business users and low-code developers can quickly enhance the SAP system, integrate with Microsoft 365 and many other solutions, and build new applications such as creating purchase orders with Power Apps, automate processes such as invoice processing with Power Automate, or even build chatbot copilots that let you query the latest status of a sales order using Copilot Studio.
So far it has been possible to connect to SAP RFCs and BAPIs using the SAP ERP Connector. While we are constantly enhancing this connector (for example, with simplified single sign-on support via Kerberos), customers have also asked for a connector to SAP OData services.
The new SAP OData Connector now in Public Preview enables customers and partners to quickly connect to SAP OData services. It also integrates with API Management solutions (like Azure API Management, or SAP API Management) to control access to your SAP system.
Try out and get started with the SAP OData Connector.
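Under the hood, an OData call is just an HTTP GET with standardized query options, which is what such a connector issues on your behalf. The sketch below builds a request URL of that shape in Python; the host, service path, and entity set names are hypothetical placeholders for illustration, not an actual connector API.

```python
from urllib.parse import urlencode, quote

# Hypothetical SAP Gateway service path and entity set, for illustration only.
base = "https://my-sap-host:443/sap/opu/odata/sap/API_SALES_ORDER_SRV"
entity = "A_SalesOrder"

# $filter / $select / $top / $format are standard OData query options.
options = {
    "$filter": "SalesOrderType eq 'OR'",
    "$select": "SalesOrder,SoldToParty,TotalNetAmount",
    "$top": "10",
    "$format": "json",
}

# quote_via=quote percent-encodes spaces as %20 (OData style) rather than '+'.
url = f"{base}/{entity}?{urlencode(options, quote_via=quote)}"
print(url)
```

A gateway such as Azure API Management would then sit in front of URLs like this to control access to the SAP system.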
API Management
We are sharing integration guidance for SAP API Management (part of SAP Integration Suite) and Azure API Center to govern all API gateways within an enterprise. Many customers use at least two integration platforms today, and API sprawl is a major driver of increased API security risk; “Improper Inventory Management” is part of the OWASP API Security Top 10 list. Learn more from this blog post.
Furthermore, we announced the integration of the API Management capability in SAP Integration Suite with Azure API Management. See this blog from SAP to learn more.
New SAP Certified Compute and Storage
Thousands of organizations today trust the Azure M-Series virtual machines to run some of their largest mission-critical SAP workloads, including SAP HANA.
Azure M-series Mv3 Family
The next generation of memory-optimized virtual machines gives customers faster insights, more uptime, a lower total cost of ownership, and improved price-performance for running SAP HANA with Azure IaaS deployments and SAP RISE on Azure. Our new Mv3 VMs, supporting up to 32 TB of memory, are powered by 4th-generation Intel® Xeon® Scalable processors and Azure Boost, one of Azure’s latest infrastructure innovations. The Mv3 family provides unparalleled computational performance to support large in-memory databases and workloads, with a high memory-to-CPU ratio that is ideal for relational database servers, large caches, and in-memory analytics. Mv3 has three virtual machine series: Medium Memory (MM), High Memory (HM), and Very High Memory (VHM). Today we are pleased to share that the Mv3 HM series is in preview, and the Mv3 MM and Mv3 VHM series are Generally Available.
Mv3 Key Features and Benefits:
Mv3 delivers quicker insights and faster performance than the previous-generation Mv2, enabling up to 30% faster SAP HANA data load times for SAP OLAP workloads and up to 15% higher performance per core for SAP OLTP workloads.
Powered by Azure Boost, Mv3 provides up to two times more throughput to Azure premium SSD storage and up to 25% improvement in network throughput over Mv2, with more deterministic performance.
Mv3 has been built from the ground up with a significant investment in stability and reliability. It offers improved resilience against common failures in memory, storage, and networking, resulting in minimized interruptions to your mission-critical workloads.
Using 16-socket hardware to host our 32 TB VMs gives customers the assurance of having enough CPU resources for even the most demanding S/4HANA systems.
Mv3 is available in many regions with zonal resiliency. All Mv3 virtual machine sizes can use Premium SSD v2, as well as Premium SSD v1 and Ultra SSD.
SAP on Azure Software Products and Services
Azure Center for SAP solutions
Azure Center for SAP solutions (ACSS) is an end-to-end solution to deploy and manage your SAP landscapes on Azure. ACSS is now available in 7 additional regions, for a total of 27 Azure regions. Now even more customers can take advantage of the rich management capabilities, either themselves or through partners.
“At SoftwareONE, integrating our customers’ SAP systems with ACSS has become standard practice. ACSS offers our teams and customers a centralized location to view status and vital information about their SAP estate. Through ACSS, we have increased our visibility of key SAP information, can perform mass actions on SAP systems such as start/stop, and check configuration of the systems against evolving Azure deployment best practices. We collaborate closely with Microsoft to enhance and further leverage ACSS as a powerful tool for managing SAP workloads on Azure.”
– Chris Kernaghan, Chief Product Owner (SAP Services), SoftwareONE
We are also happy to announce some of the new capabilities now available in ACSS. You can deploy new S/4HANA systems using the latest Mv3 virtual machine SKU for the HANA database; Mv3 offers improved performance and better reliability with reduced TCO. To ensure your SAP systems on IBM Db2 and Oracle databases are running per best practices, you can register your SAP systems with ACSS and take advantage of the expanded quality checks based on Azure Advisor recommendations, which now include checks specific to IBM Db2 and Oracle databases.
Apart from the Advisor recommendations, Quality Insights in ACSS provides an end-to-end availability and performance metrics view for all Virtual Machines of the SAP system along with Azure Resource Health VM annotations. When you are troubleshooting issues with availability or performance, these metrics at SID level help you quickly identify if the issue is due to an Azure infrastructure health event or not.
In addition, you can now improve security by limiting access to the managed storage accounts deployed by ACSS to only trusted virtual networks.
Please check out the product documentation to learn more or head over to the Azure Portal to try out the new features.
Azure Monitor for SAP solutions
Azure Monitor for SAP solutions is a cloud-based monitoring solution that helps you monitor and optimize the performance and availability of your SAP applications running on Azure. We are excited to share a few of the latest announcements for Azure Monitor for SAP solutions.
We have simplified providers creation by automating some of the steps for SAP NetWeaver, Operating System, and High-Availability providers, making it easier and faster to onboard your SAP systems to Azure Monitor for SAP solutions.
Through the recently added provider-level health details, you can now more efficiently verify that telemetry flows properly from your SAP systems to Azure Monitor for SAP solutions.
We have improved the alerts experience in Azure Monitor for SAP solutions by providing a centralized view of all the alerts generated by your SAP systems, databases, hosts, and infrastructure. You can also perform bulk operations, such as acknowledging or enabling/disabling multiple alerts at once.
Azure Monitor for SAP solutions has been expanded to 7 more regions, making it available in 30 regions now.
SAP Deployment Automation Framework
The SAP Deployment Automation Framework (SDAF) helps our customers rapidly deploy SAP S/4HANA and SAP NetWeaver at scale. The latest release of SDAF introduces a suite of enhancements that significantly improve the deployment of SAP systems on Azure, including HANA scale-out with worker/standby nodes utilizing Azure NetApp Files storage, fortifying the framework’s scalability and resilience. The integration of Azure Monitor for SAP solutions enables deployment of our first-party monitoring solution.
We also added a post-deployment playbook that lets customers apply custom configuration, further tailoring the deployment to their specific needs. Additionally, support for the systemd-based SAP startup framework on SLES and for encrypted Db2 databases is now available.
“By using the SAP Deployment Automation Framework, we have significantly sped up the deployment of our customers’ SAP applications on Azure. By rigorously adhering to recommended architecture and industry best practices, we’ve been able to streamline the modernization of SAP infrastructure on Azure, translating into significant cost reductions and heightened operational efficiency for our clients.”
– Jose A Hernandez, CTO, myCloudDoor
To learn more, please check out our product documentation.
Azure Backup for SAP
Since introducing Azure Backup for SAP HANA, we have extended the range of capabilities to increase backup performance and extend backup to other database workloads. We recently launched Private Preview of SAP ASE database backup support on Azure. SAP ASE Backup enables customers to schedule backups, retain those backups for longer periods and perform on demand ad-hoc backups. Customers can also restore ASE DBs to specific points in time using log backups or revert to a designated recovery point. ASE backups are streamed directly to a Recovery Services Vault, ensuring complete isolation of backup data from the production environment and enhanced protection.
We also continue to invest to make HANA backup faster and more cost-effective. We recently announced support for multi-streaming/multi-channel backup, which increased the backup throughputs from 420 MBps to ~1.5 GBps and reduced the time to back up a 1TB HANA database to less than 20 minutes. We also enabled support for HANA native compression on Azure Backup along with instant HANA snapshot backup, allowing users to achieve a significant reduction of 40-45% in storage usage, thereby lowering backup cost.
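As a back-of-envelope check of those numbers (a rough sketch that ignores compression and protocol overhead), the quoted throughputs line up with the stated backup window:

```python
def backup_minutes(db_gib: float, throughput_mib_s: float) -> float:
    """Rough wall-clock minutes to stream a backup of db_gib gibibytes
    at throughput_mib_s MiB/s, ignoring compression and overhead."""
    return db_gib * 1024 / throughput_mib_s / 60

# 1 TiB HANA database at the previous and new multi-stream rates
print(round(backup_minutes(1024, 420), 1))   # previous single-stream rate: ~41.6 minutes
print(round(backup_minutes(1024, 1500), 1))  # new multi-stream rate: ~11.7 minutes, under 20
```

At ~1.5 GBps the 1 TB backup indeed completes comfortably inside the 20-minute figure quoted above.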
Discovery and Assessment of SAP Systems with Azure Migrate
We are announcing the public preview of Discovery and Assessment of SAP Systems in Azure Migrate. Using this capability, customers can now perform import-based assessments of their on-premises SAP inventory and workloads. With a simple Excel-based discovery of your SAP system inventory, you will be able to see the SAP landscape in Azure Migrate. You can then run multiple assessments to understand the deployments on Azure that meet your business needs, with performance-optimized and cost-optimized recommendations for the various environments in your SAP estate.
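Conceptually, the import-based assessment starts from tabular inventory data that is then grouped and sized per environment. A minimal standard-library sketch of that idea follows; the column names are illustrative, not the official Azure Migrate import template.

```python
from collections import Counter

# Illustrative SAP inventory rows, as they might arrive from a discovery sheet.
inventory = [
    {"SID": "PRD", "Environment": "Production",  "SAPS": 45000},
    {"SID": "QAS", "Environment": "Quality",     "SAPS": 15000},
    {"SID": "DEV", "Environment": "Development", "SAPS": 8000},
]

# Group systems by environment and total the compute requirement.
by_env = Counter(row["Environment"] for row in inventory)
total_saps = sum(row["SAPS"] for row in inventory)
print(dict(by_env), total_saps)
```

An assessment tool would map aggregates like these onto VM SKU recommendations per environment.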
Azure Center for SAP solutions Tools and Framework
Central to our dedication to simplifying SAP migrations and operations on Azure are the Azure Center for SAP Solutions Tools and Frameworks. Continuing these efforts, we are delighted to announce the general availability of the Well-Architected Framework for SAP, bolstered by Microsoft Assessments. The Well-Architected Framework for SAP on Azure provides a comprehensive set of best practices to optimize SAP deployments in the cloud. It emphasizes five key pillars: cost management, operational excellence, performance efficiency, reliability, and security. By adhering to these principles, organizations can ensure robust, scalable, and efficient SAP environments on Azure.
Azure Inventory Checks for SAP is a health check tool designed to offer our customers and partners a comprehensive overview of their SAP deployment quality at a subscription level. We are thrilled to introduce a feature update for Inventory Checks for SAP, offering invaluable insights into the utilization of Azure NetApp Files. Customers gain the ability to track volume activity against predefined threshold values, empowering them to adjust settings accordingly.
SAP + Microsoft Co-Innovations
Microsoft and SAP have been partners and customers of each other for over 30 years, collaborating on innovative business solutions and helping thousands of joint customers accelerate their business transformation. The Microsoft Cloud is the market leader for running SAP workloads in the cloud, including RISE with SAP, and today at SAP Sapphire 2024, we are very excited to bring more amazing innovation to the market for our joint customers. Be sure to check out our blog to stay up-to-date on these important announcements.
Session Management via Defender for Cloud Apps
Hi,
We are testing the conditional access functionality in Defender for Cloud Apps. One of our use cases is to be able to sign out users from all active SSO sessions, but it appears that this is only possible for Microsoft apps. Is there a way for us to sign users out of all active sessions in, e.g., Workday, Slack, ServiceNow, or other SSO-enabled applications? During our testing it seems that this is not possible from any Microsoft resources such as Entra, Intune, or Defender.
Planning and Administering Microsoft Azure for SAP Blueprinting Opportunity
Microsoft is updating a certification for Planning and Administering Microsoft Azure for SAP Workloads, and we need your input through our exam blueprinting survey.
The blueprint determines how many questions each skill in the exam will be assigned. Please complete the online survey by June 14th, 2024. Please also feel free to forward the survey to any colleagues you consider subject matter experts for this certification. If you have any questions, feel free to contact Rohan Mahadevan rmahadevan@microsoft.com or John Sowles at josowles@microsoft.com.
Planning and Administering Microsoft Azure for SAP Workloads blueprint survey link:
https://microsoftlearning.co1.qualtrics.com/jfe/form/SV_cONTURE8I54piOG
Accessing Windows 365 PCs from non-AAD joined PCs
Greetings,
I was just wondering if anyone knows whether there are any problems with accessing a Windows 365 desktop from a machine not connected to the Azure domain where the W365 PC is running. This used to be (and may still be) a limitation with Azure virtual PCs: any machine you were trying to remote into an Azure virtual machine from had to be connected to the AAD tenant. I need to know if W365 Cloud PCs have this same limitation, or if I can sign into one from any PC (AAD-joined or not).
Error message AadGroupCrudService::GraphError
Hi. I’m trying to remove owners and members from a team. I deactivated the user and then tried to remove them from the team. When I removed them, I received an error that “something went wrong. We are looking into it.” When I try to switch them from owner to member (thinking this would be a good step), I receive “AadGroupCrudService::GraphError”.
Is there any way to reset this team other than deleting?
Thanks!
NTA in May: Foster care, mental health, and coding for youth
During May, I had the privilege to engage in transformative discussions and initiatives that highlight societal issues and showcase the power of community and technology in driving equity. From delving into gaps within the foster care system to partnering with organizations dedicated to mental health awareness and celebrating the fusion of sports and coding for youth empowerment, each event has been a testament to the collective efforts of our nonprofit community.
National Foster Care Month: A Call for Transformation
For National Foster Care Month, I hosted an enlightening conversation with Shantay Armstrong of the Kidsave EMBRACE Project and Mariah M. Jameson, a former foster youth. We explored the pressing gaps in the foster care system and discussed how Microsoft technology could power transformative solutions, especially for underestimated communities. The EMBRACE Project stands out as a beacon of hope for enhancing the foster system’s effectiveness. I encourage you to watch the replay of our Microsoft Nonprofits LinkedIn Live to see how technology can be a driving force for equity.
Watch the replay: Innovation in Foster Care: How Technology is Driving Equity | LinkedIn
Silencing the Shame: Mental Health Awareness
Mental well-being is a cornerstone of a thriving community. I’m proud to share that one of our program’s nonprofit organizations, Silence the Shame, has been featured in Forbes for their unwavering commitment to mental health awareness. In collaboration with Microsoft, they’ve launched an app that provides a safe space for the Black community to discuss mental health challenges and access education and support. Discover the resources available and the significant strides being made in mental health by reading the Forbes feature: For Mental Health Awareness Month, Let’s Vow to Silence the Shame
Harlem Codetrotters: Coding Meets Game Design in Norfolk
The Harlem Codetrotters event in Norfolk, VA, was a celebration of game design and coding, specifically tailored for Black and Brown youth. This initiative, in partnership with River City Dreams, ingeniously integrated the Harlem Globetrotters’ legacy with a technology-infused curriculum. It was a joy to witness the community’s enthusiasm and participation. I invite you to watch the event recap video and delve deeper into this innovative approach to education. Learn more at aka.ms/Codetrotters to be part of these impactful narratives.
The Nonprofit Tech Acceleration (NTA) program provides technology grants and technical consulting for nonprofits that support underestimated communities. Learn more by visiting aka.ms/NTA. Join the conversation in the comments and let us know how you are empowering your community!
Continue the conversation by joining us in the Nonprofit Community! Want to share best practices or join community events? Become a member by “Joining” the Nonprofit Community. To stay up to date on the latest nonprofit news, make sure to Follow or Subscribe to the Nonprofit Community Blog space!
Logic Apps Aviators Newsletter – June 2024
In this issue:
Ace Aviator of the Month
Customer Corner
News from our product group
News from our community
Ace Aviator of the Month
June’s Ace Aviator: Sebastian Meyer
What is your role and title? What are your responsibilities associated with your position?
I’m a Senior Lead Architect, responsible for solving our clients’ integration needs. I have over 15 years of experience in the integration space, mainly with BizTalk Server and Azure Integration Services. I also have knowledge of SAP Integration Suite, so I work closely with stakeholders from different departments and different systems to understand and translate their needs into appropriate integration solutions. I help design, implement, and optimize integration solutions, and I also mentor teams adopting new capabilities.
Can you provide some insights into your day-to-day activities and what a typical day in your role looks like?
Every day is different. I usually work on more than one project, and a typical day is fully packed with meetings with clients and co-workers, so there is a lot of communication to do. Other parts of my day include prototyping new approaches and technologies, and creating technical design documents and presentations for clients.
What motivates and inspires you to be an active member of the Aviators/Microsoft community?
Being an active member of the community is very important to me. Technology, especially integration technology, is my passion, and I have learned a lot from the community. I want to share my knowledge with others so they can also learn from my experience. I think of this as a circle of continuous learning in which everyone in the community plays a crucial role, no matter whether you are a beginner or an expert.
Looking back, what advice do you wish you would have been told earlier on that you would give to individuals looking to become involved in STEM/technology?
My advice is this: not every new technology or framework should be adopted without thinking it through and double-checking whether it is really necessary for your needs. But it is important to stay interested in and open-minded about new technologies, because technology evolves very fast.
What has helped you grow professionally?
I think the most important part of growing professionally is not focusing only on technology. Being a professional takes more than technology: soft skills like communication, kindness, and taking responsibility for your own work and your team members are just as important. Don’t stop learning and growing.
Imagine you had a magic wand that could create a feature in Logic Apps. What would this feature be and why?
If I had a magic wand to create a feature in Logic Apps, it would be full support for running Logic Apps on-premises to handle on-premises-only integration workloads, because this is the feature our clients request most.
Customer Corner:
hubergroup achieves operational excellence in 150 locations with a unified Microsoft solution
Discover this customer success story about hubergroup optimizing its manufacturing operations with Microsoft’s environment, including Azure and Logic Apps. By leveraging Logic Apps alongside Microsoft Power Apps and Power Automate, hubergroup streamlined operations, reducing costs by 10-15% and automating workflows for increased efficiency. Read more about how the Microsoft solution empowers hubergroup and join hubergroup’s journey of digital transformation with Azure and Logic Apps.
News from our product group:
Event Announcement: Logic Apps Community Day 2024
On September 26, 2024 (Pacific Time) the Logic Apps Product Group will host a full day of learning where you will be the star! We are currently hosting a call for speakers until June 23rd. Make sure to sign up now!
Business continuity and disaster recovery for Azure Logic Apps
Discover top strategies for disaster recovery in this must-watch video – boost your cloud resilience now!
Azure API Center: Your Comprehensive API Inventory and Governance Solution
Read more about Azure API Center and the importance in having a centralized management in this article.
Application Insights for Azure Logic Apps (Standard) – Correlation Tracking Update
Discover a new feature flag that will modify the default behavior of passing correlation between Azure services and Logic Apps when tracking data using Application Insights v2.
Time is almost up! Upgrade your Azure Logic Apps Integration Service Environment workloads
Are you an Azure Logic Apps Integration Service Environment user? Then make sure to complete your upgrade from Integration Service Environments (ISE) to Logic Apps Standard as the August 31st deadline quickly approaches. Read more in this article.
Upcoming Data Mapper improvements
Check out this article to learn about all the latest updates planned for Data Mapper and provide feedback.
Introducing GenAI Gateway Capabilities in Azure API Management
We are thrilled to announce GenAI Gateway capabilities in Azure API Management – a set of features designed specifically for GenAI use cases. Read more in this article!
Discover your next integration inspiration at this year’s Build!
This article covers the major announcements for Azure Integration Services from this year’s Build event – definitely a must-read!
Business Process Tracking Preview Update
Today, June 3rd, is the last day to complete the recommended call to action to prepare for upcoming changes to Business Process Tracking. Make sure to read this article for tips on how to avoid an error.
This article explores various techniques for cataloging APIs, sourced from a myriad of platforms, and demonstrates how to create a universal consolidated view using Azure API Center!
News from our community:
Aviators Germany Community Kickoff Event
Hosted by QUIBIC
Join your fellow Aviators Kent Weare and Sandro Pereira when they feature as special guests at the Aviators Germany Community Kickoff Event! This special opportunity will occur on June 13th from 12-5pm (CEST). Make sure to register for free and secure your place now!
Connecting With Azure Integration Accounts
Video by Stephen W. Thomas
Don’t know what Azure Integration Accounts are or how to use them? Watch Stephen’s video to learn the basics!
Integration Insider: Azure Integration Services, The Integration Platform You Didn’t Know You Had
Video by Derek Marley and Tim Bieber
Check out the first video of many to come from a new series by Derek and Tim. In their introductory episode, watch them highlight the powerful capabilities that Azure Integration Services has to offer.
Friday Fact: Logic App Consumption and Standard have different Action name restrictions
Post by Sandro Pereira
Setting proper names for your actions can be important, so make sure you learn about the different name restrictions for Consumption and Standard in this Friday Fact from Sandro!
Debug Azure API Management Policies | Send-Request APIM Policy | Managed Identity Authentication
Video by Sri Gunnala
For our fellow Aviators looking for ways to debug or simplify complex Liquid template transformations while working with Azure API Management Policies, make sure to watch Sri’s helpful video walkthrough.
Post by Luís Rigueira
Looking to merge different file types with different extensions into a final PDF Document? Then Luis and Logic Apps can help!
Microsoft Tech Community – Latest Blogs – Read More
Intelligent shared space solutions with Microsoft Teams
In today’s diverse work environment, we understand that there is no one-size-fits-all solution when it comes to shared spaces and devices. Each organization has its unique spectrum of needs, and at Microsoft, we believe in providing solutions that cater to this variety. Our offerings are designed to adapt to different settings, ensuring that whether it’s a shared workspace or a large conference room, the technology enhances collaborative experiences.
Shared spaces are hubs for collaboration, creativity, and connectivity wherever you are. From traditional conference rooms and boardrooms to small focus or huddle rooms, and even work and meeting spaces to which you bring your own device, AI is already improving how we work. Thought leadership on space planning and utilization, innovation from our OEM partners, and Microsoft solutions have helped customers reimagine how to make the most out of their shared spaces to meet present and future needs.
By implementing best practices and strategic approaches to enhance your shared spaces with Teams and Teams Rooms Pro Management, you unlock a host of tangible benefits. Boost collaboration, get actionable insights, and streamline space management for better efficiency.
Empower productivity and great meetings in every space
Microsoft Teams Rooms, an integral component of the Teams platform, is leading the charge in this transformation, providing a hub for hybrid collaboration that seamlessly integrates with the digital workplace. It remains the ultimate solution for maximizing collaboration and productivity in your shared spaces. Seventy percent of Fortune 500 companies are already utilizing Teams Rooms, and Microsoft Teams has been recognized by Gartner as a leader in the Magic Quadrant for Unified Communications as a Service for the fifth consecutive year. Customers are choosing Microsoft Teams and Microsoft Teams Rooms to successfully position themselves for the future of work.
As we talk to customers and better understand their needs, we continue to find new ways to innovate and deliver on those needs, catering to diverse budgets and space types. Whether with new entry-level Teams Rooms systems, which offer lower-cost solutions for focus and huddle rooms, or bring your own device (BYOD) meeting rooms solutions, providing better content sharing and collaboration, Microsoft Teams is still finding new ways to serve our customers.
Find a solution that best fits your needs
Customers often grapple with inconsistency, unpredictable meeting experiences, and a lack of visibility in shared spaces, posing significant challenges. Both end users and IT administrators seek solutions offering familiarity, ease-of-use, improved meeting experiences, and simplified inventory management.
Traditional and Signature Teams Rooms
Designed for the most inclusive hybrid meetings, Teams Rooms is the pinnacle of intelligence, inclusivity, and flexibility. These solutions boast audio and video (A/V) devices certified for Teams by an array of OEMs, delivering premium meeting experiences. Teams Rooms include compute with the benefits of one-touch join in the room, Front Row display, intelligent A/V processing power, and low-friction manageability for IT. With advanced A/V capabilities, and furniture and configurations that promote face-to-face engagement, your organization can engage in world-class meetings. We work closely with device partners to ensure seamless interactions no matter the OS or device type. Learn more about devices certified for Teams Rooms.
Entry-level Teams Rooms
Our customers tell us, “We love Teams Rooms, but some of our spaces need something more budget-friendly.” Entry-level Teams Rooms are the solution. Priced competitively at under $1,000 USD, they are perfect for upgrading bring your own device (BYOD) spaces and enhancing hybrid meeting experiences. These entry-level solutions come packed with Teams Rooms features like one-touch join and Front Row display, and offer great audio and video quality. Affordable and easy to deploy, they support user productivity while simplifying BYOD room upgrades. Installation is a breeze with a single USB cable, allowing organizations to seamlessly optimize shared spaces for enhanced collaboration.
Bring your own device (BYOD) meeting rooms
BYOD rooms are often used for impromptu meetings with 2-4 participants, where the organizer brings their own computer to run the meeting. About half of these rooms lack essential display or A/V equipment, making them inadequate for hybrid meetings in the modern workplace. The other half typically have a mix of peripherals, such as a large screen display or speaker puck, but the user experience can be inconsistent. Additionally, IT admins lack data on peripheral usage, making it challenging to optimize and support these spaces effectively.
We are determined to make the experience in BYOD meeting rooms so much better.
Now, you can utilize laptops to connect quickly and easily to Teams meetings. Add peripherals to a room, like intelligent speakers, to boost audio and provide speaker recognition,1 which enables the meeting intelligence of Copilot. With Teams solutions, you can experience shared display mode, getting the privacy and meeting features you need to lead meetings confidently. IT teams also benefit from the automatic device discovery and inventory in the Teams Rooms Pro Management portal, where they can track peripheral usage, including number of devices, call quantity, duration, and performance with the Teams Shared Devices license add-on for the room.2
Bookable desks
Ever needed to grab a desk quickly? Bookable desks are shared work desks, also known as hot desks, touch down spaces, or hoteling environments. With Teams on their desktop, users can make a bookable desk their personal workspace by reserving it in advance or simply connecting to the monitor or other peripherals at the desk when they arrive. Voice isolation and noise cancellation in Teams enables crystal clear meetings even in busy, high-traffic spaces. Later this year, users will be able to automatically update their location, making it easier to connect with co-workers when in the office and find a meeting room with the Booking Assist feature, which makes smart recommendations based on locations of attendees. Later, with Microsoft Places (currently available for public preview), users can take advantage of enhanced room finding and intelligent booking experiences.
As with BYOD rooms, IT can keep a pulse on peripherals in these spaces with auto-device discovery and visibility in the Teams Rooms Pro Management inventory, a capability that is available now. IT admins will also benefit from the space and device usage data and insights for intelligent management and planning that are coming later this year.
Intelligence for IT across workspaces
IT admins often lack holistic inventory and management approaches across shared workspaces, sacrificing visibility into room and device usage.
IT teams have a huge opportunity to add value for their organizations using holistic inventory views, space and device utilization data and insights, and AI for smarter planning. With Teams Rooms Pro Management, IT can prioritize investments intelligently to meet business needs and budgets. Microsoft Places can take intelligence to the next level for IT and real estate and facilities teams across their shared workspaces.
Gain insights and manage your spaces with Teams Rooms Pro Management
The best space and equipment planning starts with the holistic inventory, intelligence, and insights delivered in Teams Rooms Pro Management. Teams Rooms Pro Management provides comprehensive multi-OS, multi-brand device deployment, configuration, and proactive management capabilities for Teams Rooms. And, now with auto-discovery from the Teams desktop or bulk import for BYOD meeting rooms and bookable desks, devices in these spaces are visible and included in inventory in the Teams Rooms Pro Management portal,3 providing unprecedented insight for planning and asset management.
With this data, you’ll have more intelligence into which BYOD rooms and desks are great as they are, and which spaces are ready for an upgrade.
Track the inventory data of BYOD rooms and desks and their peripherals in the Teams Rooms Pro Management portal,3 and enable resource account configuration for rooms to make them visible and bookable.
Room and device usage data provides insight for intelligent planning. Add a Teams Shared Devices license to your BYOD room or desk pool (coming later this year) to get analytics like space utilization, number of peripherals, call quantity, and performance via reports available in the Teams Rooms Pro Management portal. These insights will help you target spaces that are ready for an upgrade.
Make data-informed upgrades to equipment and licenses. With insights and data, you can make informed decisions for equipment and space planning. It may be time for a next-gen peripheral, like an intelligent speaker or integrated video bar. Or, you may find some rooms need more – a one-stop room solution that delivers the compute and ultimate meeting, intelligence, and IT features for the space. The competitively priced, entry-level Teams Rooms device kits and a Teams Rooms Pro license could be the answer.
Easily associate devices with rooms or desk pools and make them bookable resources for users.
Take advantage of usage and utilization data to make data-driven decisions and improve your space planning.2
You can learn more about Teams Rooms Pro Management and how to get started at Microsoft Teams Rooms Pro | Microsoft Teams.
Start planning solutions for all your shared meeting and collaboration spaces
For BYOD rooms or desks with displays or other peripherals to connect laptops to, take advantage of the auto-discovery and inventory in the Teams Rooms Pro Management portal that comes with the core Teams user license. To make manual collection of devices in a space easier, customers and partners can utilize a PowerShell script. From there, you can begin gathering insights and intelligently planning which rooms are ready for more advanced solutions, like the analytics and reports unlocked with a Teams Shared Devices license, or the enhanced end user and IT experiences you get from an entry-level Teams Room.
Each space has different considerations and criteria when planning the right solution to best fit the need. If you have meeting and collaboration spaces without any display or in-room devices for users to connect to, think about how you can elevate that experience for users and start getting the intelligence IT needs for smarter space and equipment planning. Consider:
How the space will be used, in terms of the type of meetings (e.g., team collaboration or executives and board members)
The number of people that typically gather for the meeting type
The mix of in-person and remote attendees
The richness of the audio, video, and collaboration experience needed
Budgets
Usage data from the Teams Rooms Pro Management service and Microsoft Places
An authorized, expert partner can help you design and deploy your Teams solutions. You can find a partner in your area at Modern Work for Partners – Microsoft Teams Rooms Partner Locator. You can find devices certified for Teams at aka.ms/teamsdevices.
Microsoft and devices certified for Teams can help you update your shared spaces for hybrid and flexible work, enabling your users to have productive and engaging meetings and collaborate more effectively. By choosing the right Teams solution for your space and using Teams Rooms Pro Management to gain insights and manage devices, you can optimize your shared spaces and deliver a consistent and seamless Teams experience.
Resources:
Get started with Teams Rooms: Microsoft Teams Rooms | Microsoft Teams
Set up bring your own device rooms in Teams Rooms Pro Management: Bring Your Own Device Rooms in Pro Management Portal – Microsoft Teams | Microsoft Learn
Set up bookable desk spaces: https://learn.microsoft.com/microsoftteams/rooms/bookable-desks
1 Intelligent speaker support, via intelligent speakers or via cloud for existing devices, in a BYOD room requires the host to be licensed for Teams Premium or Microsoft Copilot.
2 BYOD meeting rooms and bookable desk pools require a Teams Shared Devices license to enable analytics and reports in the Teams Rooms Pro Management portal. Learn more at Microsoft Teams Shared Devices licensing – Microsoft Teams | Microsoft Learn.
3 Access to the Teams Rooms Pro Management portal requires at least one Teams Rooms Pro or Teams Shared Devices license on the customer tenant.
Windows 365 Frontline for FedRAMP is now generally available
If you have a US Government Community Cloud (GCC) environment and Microsoft Azure, you can now purchase Windows 365 Frontline for FedRAMP and deploy your Cloud PCs in Microsoft Azure commercial regions.
Windows 365 Frontline is the first Windows solution designed to meet the distinct needs of shift and part-time employees. Windows 365 Frontline for FedRAMP has been assessed by a Federal Risk and Authorization Management Program (FedRAMP) authorized auditor to meet FedRAMP High requirements at datacenters within the Continental US (CONUS) and has an agency authorization at FedRAMP Moderate.
Windows 365 Frontline is included in the Microsoft Office 365 Multi-Tenant & Supporting Services FedRAMP accreditation package. Whether your organization has a specific FedRAMP requirement or is using FedRAMP compliance as part of the overall evaluation criteria, Windows 365 Frontline for FedRAMP provides flexibility and security while simplifying the administration and enhancing the value and experience for end users.
With Windows 365 Frontline for FedRAMP, Cloud PCs are provisioned in an Azure Commercial datacenter and meet FedRAMP requirements when they are properly configured and used within CONUS. If your organization requires additional compliance or regulatory commitments, please see Windows 365 Government. Currently, it is not possible to have a tenant that includes Cloud PCs in both the Azure Commercial and Azure Government regions. Therefore, it is important to evaluate the full scope of compliance and regulatory requirements when deciding which product is appropriate for your organization.
Note: This announcement is for GCC customers only. This change applies to GCC customers who would like to purchase Windows 365 Frontline for FedRAMP in the Azure Commercial cloud. It is not applicable to GCC or GCC High customers that require services to be deployed and operated in US Government regions where additional compliance requirements can be met.
Read on to learn more about Windows 365 Frontline for FedRAMP, its available features, and how they could be a great solution for your organization.
Affordable, flexible Cloud PC access
With Windows 365 Frontline for FedRAMP, instead of purchasing a license for every shift worker, you only need to purchase enough licenses for the number of active employees at any given time. For example, if you have nine employees but only three of them work at the same time, you only need three licenses to meet the needs of all nine employees. IT admins can immediately deploy up to three Cloud PCs per purchased license within the Windows 365 provisioning experience using Microsoft Intune.
As employees sign in, the Cloud PC powers on, and a license is used for the duration of their work. When they sign off, the shared license is returned to the pool of shared licenses, and their Cloud PC is powered off. Any of the users within a defined group can access their Cloud PC without requiring a set schedule. This model empowers organizations to extend access to Cloud PCs to employees who may not have had such opportunities in the past and makes it a great solution for employees on a shift schedule—including customer representatives in call centers, help desk workers, and reception staff across many different verticals.
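The licensing model above sizes purchases to peak concurrency rather than headcount. A minimal sketch of that arithmetic (the helper function and its names are illustrative, not part of any Microsoft API):

```python
# Hypothetical helper illustrating the Frontline licensing model: licenses are
# sized to peak concurrent usage, not total headcount, and each purchased
# license lets IT provision up to three Cloud PCs.

def frontline_license_plan(total_employees: int, peak_concurrent: int) -> dict:
    """Return the licenses needed and the maximum Cloud PCs IT can deploy."""
    licenses_needed = peak_concurrent      # one license per concurrently active user
    max_cloud_pcs = licenses_needed * 3    # up to 3 provisioned Cloud PCs per license
    return {
        "employees": total_employees,
        "licenses_needed": licenses_needed,
        "max_cloud_pcs": max_cloud_pcs,
    }

# The example from the text: nine employees, three working at the same time.
plan = frontline_license_plan(total_employees=9, peak_concurrent=3)
print(plan)  # three licenses cover all nine employees
```

With three licenses, up to nine Cloud PCs can be provisioned, so every employee in the example gets a personal Cloud PC while the organization pays only for peak usage.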
The benefits in Windows 365 Frontline for FedRAMP begin with affordability and flexibility, but in the coming months that value will expand with even more capabilities tailored to meet the needs of frontline, shift, and part-time workers.
Provisioning Windows 365 Frontline for FedRAMP Cloud PCs
Windows 365 Frontline for FedRAMP licenses will appear in the Microsoft 365 admin center under the Products tab only, and do not need to be assigned to specific users. Licenses purchased will show the number of Cloud PCs you can deploy in the Windows 365 provisioning experience when choosing Frontline as a license type. This makes it easy to remove and add users to your workforce as it changes. Additionally, IT admins have the flexibility to provide each user with multiple frontline Cloud PCs to support scenarios such as consultants who work for many different organizations. For more information on how to provision Cloud PCs for frontline workers, review our provisioning documentation.
Setting up Windows 365 Frontline for FedRAMP Cloud PCs
You can follow these simple steps to set up a Cloud PC with Windows 365 Frontline for FedRAMP:
Create an Entra ID group that includes all the users you wish to provision with a Cloud PC of a given configuration such as 2 vCPU/8 GB/128 GB. In this example, Cloud PCs need to be provisioned for a group of customer service reps. Let’s call this group “Customer service reps Manila.”
In the Windows 365 blade of the Microsoft Intune admin center, select Create to create a new provisioning policy and set up the Cloud PCs. For this example, let’s call the provisioning policy “Customer service reps.”
Select Frontline as the license type, select the join type, and choose the region and network as you would for Windows 365 Enterprise scenarios.
Choose the image you want to use for the Cloud PC under the Image tab and the appropriate Language & Region in the Configuration tab.
Under the Assignments tab, you can see Windows 365 Frontline for FedRAMP Cloud PCs that have been purchased by your organization and target them to a specific group of individuals. Select the “Customer service reps Manila” group you created earlier and assign the 2 vCPU/8 GB/128 GB Cloud PC configuration to this group.
Preview the provisioning policy choices you have made in the Review + create tab and select Create to complete the provisioning process. The provisioning policy will begin calculating and will assign a Windows 365 Frontline for FedRAMP Cloud PC to each user in the group.
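The portal steps above can also be expressed as a Microsoft Graph call. The sketch below builds the request for the Graph beta `deviceManagement/virtualEndpoint/provisioningPolicies` endpoint; the endpoint path follows the documented Graph beta pattern, but treat the body's property names and values as assumptions and verify them against the current Graph reference before use:

```python
# Hedged sketch: assemble (without sending) a Graph beta request that would
# create a Frontline provisioning policy like the one built in the portal
# walkthrough above. Property names are assumptions based on the Graph beta
# cloudPcProvisioningPolicy resource -- verify before relying on them.

GRAPH_BASE = "https://graph.microsoft.com/beta"

def build_provisioning_policy_request(display_name: str) -> dict:
    """Build the URL and JSON body for a Frontline provisioning policy."""
    return {
        "method": "POST",
        "url": f"{GRAPH_BASE}/deviceManagement/virtualEndpoint/provisioningPolicies",
        "body": {
            "displayName": display_name,
            # Assumption: Frontline corresponds to the shared provisioning type.
            "provisioningType": "shared",
            "description": "Frontline Cloud PC provisioning policy",
        },
    }

req = build_provisioning_policy_request("Customer service reps")
print(req["url"])
```

In practice you would send this with an authenticated Graph client and then assign the policy to the “Customer service reps Manila” group in a separate call, mirroring the Assignments step above.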
Cloud PCs provisioned to these users will automatically be placed in a powered-off state until an employee signs in to their Cloud PC. Admins can view which users received a Frontline Cloud PC and which users did not by viewing the provisioning policy after creation.
Frontline Cloud PC concurrency report
To deliver cost optimization, admins can use the Frontline Cloud PC concurrency report to understand trends of license usage over time and to adjust the number of licenses to ensure access during peak usage. For more information on how to review frontline Cloud PC concurrent usage, please read the Cloud PC Utilization Report documentation.
Concurrency Alerts
The concurrency report provides alerts to IT admins if they are approaching or have already reached the concurrency limit. IT admins can configure alerts to ensure they are aware of when users may be impacted and can purchase additional licenses to meet demand.
Easily manage Cloud PCs using existing technology investments
You can use Microsoft Intune to configure, deploy, and manage Windows 365 Frontline for FedRAMP alongside your other Cloud PCs and physical endpoints without additional infrastructure components or special procedures. The key difference in management capabilities relative to Windows 365 Enterprise is that Cloud PCs for frontline workers are powered off when not in use. Some remote actions can only be completed after the Cloud PC is powered on. Restarting a Cloud PC is one such example.
Power on and Power off
There are situations in which you may wish to power on a Frontline Cloud PC to perform a time-sensitive action, or power off a Cloud PC to free up a session for another user. These new remote actions, along with the ability to view the power state of a Cloud PC, provide a way for you to respond quickly and efficiently. IT admins can power frontline Cloud PCs on and off in bulk using Bulk Remote Actions in Microsoft Intune or the Microsoft Graph API. This feature enables organizations to automate powering a group of frontline Cloud PCs on and off based on their shift and scheduling system.
Note: Powering on a Cloud PC for frontline workers will utilize an active session even if the user does not sign in. More information on how to use these remote actions can be found in this technical documentation.
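The shift-based automation described above could be scripted against the Graph beta Cloud PC actions. The sketch below only constructs the per-device requests; the `powerOn`/`powerOff` action paths follow the documented Graph beta pattern for `cloudPCs`, but verify them against the current Graph reference, and the device IDs shown are placeholders:

```python
# Hedged sketch: build one Graph beta POST request per Cloud PC to power a
# shift's Frontline Cloud PCs on or off in bulk. Nothing is sent here -- an
# authenticated Graph client would dispatch these requests.

GRAPH_BASE = "https://graph.microsoft.com/beta"

def build_power_requests(cloud_pc_ids: list[str], action: str) -> list[dict]:
    """Build Graph POST requests for 'powerOn' or 'powerOff' per Cloud PC."""
    if action not in ("powerOn", "powerOff"):
        raise ValueError("action must be 'powerOn' or 'powerOff'")
    return [
        {
            "method": "POST",
            "url": f"{GRAPH_BASE}/deviceManagement/virtualEndpoint/cloudPCs/{pc_id}/{action}",
        }
        for pc_id in cloud_pc_ids
    ]

# Example: power on a shift's Cloud PCs before employees arrive
# (the IDs below are placeholders, not real Cloud PC identifiers).
for req in build_power_requests(["pc-001", "pc-002"], "powerOn"):
    print(req["url"])
```

A scheduler tied to the shift system could call this with the morning shift's Cloud PC IDs before the shift starts and the `powerOff` action after it ends, remembering the note above that powering on a Cloud PC consumes an active session even before the user signs in.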
Idle timeout default
Windows 365 Frontline for FedRAMP relies on end users to sign out at the end of their Cloud PC session to make the license available for another user. However, frontline workers are often busy working away from their device and may forget to sign off. Windows 365 detects user inactivity and automatically disconnects after two hours by default. IT admins can use Microsoft Intune to modify the value and preferences for their organization to better meet the needs of specific workers. For more information on how to configure session timeout, review our technical documentation.
Optimized for frontline, shift, and part-time employees
Windows 365 Frontline for FedRAMP is designed for the way frontline, shift, and part-time workers work. Employees can use the Windows 365 app and web portal to connect to their Cloud PC. As employees connect, they are reminded to save their data and disconnect when they finish. Based on end-user behavior, an automated system will Power on and Power off frontline Cloud PCs. This functionality will continue to evolve, further optimizing Cloud PC power on based on shift patterns.
Note: Frontline employees can only access their Cloud PC using the Windows 365 app or windows365.microsoft.com. Frontline Cloud PCs are not accessible from the Remote Desktop app.
Uniquely designed features of Windows 365 Frontline for FedRAMP further enhance productivity for workers across various industries and use cases, including healthcare, manufacturing, retail, technical training, and more. Optimizations that deliver a better experience include:
Windows Update optimizations for Frontline Cloud PCs: Many shift workers perform mission-critical jobs. This feature works together with Windows Update for Business to apply OS reboots outside of work hours when the user disconnects, increasing their productivity and minimizing disruption. To take advantage of this feature, be sure to use the recommended settings.
Update freshness: Windows 365 will detect Frontline Cloud PCs that have not been powered on for seven days and will perform updates according to the organization’s policies, keeping occasional users productive by updating the Cloud PC outside of work hours before they next sign in.
FAQs
Q: Is Windows 365 Frontline for FedRAMP limited to frontline workers?
A: No, Windows 365 Frontline for FedRAMP is not limited to frontline workers. While it is designed for shift and part-time employees, many of whom are frontline workers, it is also for scenarios like contingent staff who may only require access to a Cloud PC for a limited part of the day.
Q: I purchased Windows 365 Frontline for FedRAMP licenses. Why don’t I see them when reviewing my licenses in Microsoft 365 admin center?
A: Windows 365 Frontline for FedRAMP licenses do not follow the same behavior as user-based licensing. As such, they do not show up in the Microsoft 365 admin center Licenses tab and cannot be assigned to users. You can find your Windows 365 licenses in Microsoft 365 admin center > Billing >Your products.
Q: I can’t access Windows 365 Frontline for FedRAMP from Remote Desktop app. Is this expected?
A: Windows 365 Frontline for FedRAMP is only supported through the Windows 365 app and windows365.microsoft.com. End users can open a session in the Remote Desktop app by choosing “Open in Remote Desktop app” on the Windows 365 web portal.
Q: When do I use Windows 365 Frontline for FedRAMP, and when do I use Windows 365 Enterprise for FedRAMP?
A: Windows 365 Enterprise for FedRAMP is for employees that need dedicated, anytime access to their Cloud PC. Windows 365 Frontline for FedRAMP is for workers that need access to a Cloud PC for a limited amount of time such as during their shift or part-time work. Each worker will receive a unique frontline Cloud PC, but licenses are shared.
Q: Are trials available?
A: Trial subscriptions are available for qualified customers. Please contact your Microsoft account representative for more information.
Continue the conversation. Find best practices. Bookmark the Windows Tech Community, then follow us @MSWindowsITPro on X and on LinkedIn. Looking for support? Visit Windows on Microsoft Q&A.
Full AI for ChangeMakers talks coming soon
It has been great receiving your messages asking for the videos from the AI for ChangeMakers event held on May 27th. We're happy to share that we will indeed publish the videos here soon, releasing one or two talks per week. Stay tuned!
Azure Migration without VM Shutdown
We’re in the process of shutting down the on-premises setup of our company environment running on VMware. We have a DC, SQL server, accounting server, and a couple of terminal servers used for remote access. I have created the necessary Azure migration assessment and am ready to start the replication of the first batch of servers. Because we have a legacy custom SQL app, we must maintain standard Windows AD, so I’ll be moving our DC virtual machine. Azure AD alone will not work, hence the need to migrate our on-prem DC.
My question is about replicating our on-prem domain controller from VMware up to Azure while keeping it running after the replication migration completes. I know the normal process is to shut down the running on-prem VMs once the final delta replication has completed. However, I need to keep my on-prem AD environment running for about another month as I build the Azure environment in parallel. I understand that the two environments will not be in sync if new domain users are created, but this likely will not need to occur given the small size of the company. Basically, does the Azure migration process allow us to keep the migrated VMs running following the final replication sync? I’ll do this replication during non-production hours so network activity will be zero.
Thanks for any suggestions or input.
Ken