Tag Archives: microsoft
how to get an Outline group to open downward
The default behavior for the +/- control of an outline group of rows is to be at the bottom of the group. E.g., if you group rows 1 through 5, the +/- is on row 6. When you want to collapse a group, you have to go to the bottom of the group (row 6) to click the minus. I want the +/- on the first row (the top) of the group. So if I’ve grouped rows 1-5, I would click on a control on row 1, not on row 6, and it would open downward.
Is this possible? I think I’ve seen it this way in the past, but I don’t know how to do it.
Pivot Table Summarize Values by Average Greyed Out
I am constructing a Recruitment Dashboard. One of the values I’m trying to include is the average time to fill, i.e., the time from a job’s open date to its close date. When I bring this data into the pivot table, it doesn’t offer the option to summarize values by Average; it’s greyed out. The only option is Count, which doesn’t do me any good.
I’ve done about all I can think to do, and nothing’s helped. I’m hoping someone out there has the answer. I’m also attaching the link to my workbook: Dashboard
Formatting Feet and Inches in one cell
Hi all,
I am working with a spreadsheet of archived baseball team rosters (each year’s roster is contained as one sheet in the larger workbook). These were previously on paper, but are now being digitized.
The problem we are running into is with player heights (i.e. feet and inches). As you can see in the example screenshot below, heights have been typed in feet-inches format; for example the first player has a height of 5 feet, 11 inches.
However, Excel is formatting these as dates rather than heights (i.e. the first player is showing May 11 rather than simply 5-11). Although the values read correctly in this spreadsheet, it presents an issue when we ingest this data into our website, which is expecting heights in #-# format with feet and inches. Instead, for the first player the website is receiving 5/11/2023 and so on. Because of this the website disregards the height column for most players. The exception is anyone whose height doesn’t conflict with a date, for example 6 feet even (6-0).
If I change the column format to either General or Text, the values change to what look like random numbers (these are Excel’s underlying date serial numbers).
Given that we have about 40 seasons worth of player rosters in this workbook, is there an efficient process that we could take this data and put it into a format suitable for importing (such as a simple 5-11, 6-3, etc)?
Thanks in advance for any insight, and happy to share additional context/details if needed.
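If the cells now hold genuine Excel dates, the original heights are still recoverable, because Excel stored the feet as the month and the inches as the day. A minimal Python sketch of the idea (assuming the misparsed values come through as datetime objects, e.g. via an export or a library like openpyxl):

```python
from datetime import datetime

def date_to_height(value):
    """Convert a height that Excel misparsed as a date back to 'feet-inches'.

    Excel turned '5-11' into the date May 11, so the month holds the feet
    and the day holds the inches.
    """
    return f"{value.month}-{value.day}"

# '5-11' became May 11, 2023; recover the original height string.
print(date_to_height(datetime(2023, 5, 11)))  # → 5-11
print(date_to_height(datetime(2023, 6, 3)))   # → 6-3
```

Inside Excel itself, the equivalent would be a helper column with a formula like =MONTH(A2)&"-"&DAY(A2), which can then be copied and pasted as values into a Text-formatted column.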
Excel PivotTable Field Selection
Every time I select a field for a pivot table, the PivotTable Fields dialogue box collapses back to its original state. I’d like to select multiple fields without having to drill to the specific field folder each time. Is there a setting that controls this? The data source is an attached database.
Power Automate for Paginated Reports
I am using a Power Automate to export a paginated report from Power BI and save the file to a SharePoint folder.
The flow runs perfectly, but when the file is created in the SharePoint folder, all of the headers are changed. The paginated file name is pre-pended to the header name. Because this file needs to have specific headers for import into another system, this is breaking the process.
What do I need to do to get the file with the headers as they appear in Power BI?
Actions:
Using intune to create application desktop shortcuts
I’m trying to use Intune to push an application shortcut to the public desktop. I created a .lnk file and saved it to a shared network drive. I then wrote a PowerShell script to copy that file to a folder on the C: drive of the PC, and then from that folder to the public desktop. Here is the script:
——————————————————————————————————————
#Create directory to hold icon file and copy file there
New-Item -Path "c:\" -Name "scut" -ItemType "directory" -Force
Copy-Item "S:\Shortcuts\PS1 Files\Fortinet Icon\FortiClientVPN.lnk" -Destination "c:\scut\FortiClientVPN.lnk"
$TargetFile = "c:\scut\FortiClientVPN.lnk"
$ShortcutFile = "$env:Public\Desktop\FortiClientVPN.lnk"
$WScriptShell = New-Object -ComObject WScript.Shell
$Shortcut = $WScriptShell.CreateShortcut($ShortcutFile)
$Shortcut.TargetPath = $TargetFile
$Shortcut.Save()
——————————————————————————————————————
I wrapped the script with the IntuneWinAppUtil (Win32 Content Prep) tool and created a Win32 app with the file. I use this as my install command (forti2.ps1 is my original PowerShell file):
——————————————————————————————————————
powershell.exe -ExecutionPolicy Bypass -File forti2.ps1
——————————————————————————————————————
My detection rule is just a custom rule that looks for the file on the desktop. When the app runs I get
“The application was not detected after installation completed successfully (0x87D1041C)” error. Any ideas what I’m missing?
NEW Level Up CSP | M365 & Copilot Sales and Technical Bootcamps!
Level Up your skills by joining us in the upcoming Microsoft 365 and Copilot bootcamps!
We are excited to announce our new FY25 Level Up CSP Bootcamps, built to help CSP partners grow sales and technical capabilities and accelerate new customer acquisition, upsell, and cross-sell. Join us for the upcoming Microsoft 365 and Copilot bootcamps to get ready to acquire, upsell, and expand with M365 Premium SKUs, Copilot, and Copilot Studio. Learn more by registering today!
Level Up CSP Sales Bootcamp
1-day sales Bootcamp: Get ready to go to market and sell Microsoft 365 and Copilot
Americas/EMEA region: August 28 | 8:00 AM – 12:00 PM, Pacific Time
APAC region: September 4 | 5:00 PM – 9:00 PM, Pacific Time
Who should attend: Sellers and sales managers
Level Up CSP Pre- and Post-Sales Technical Bootcamp
2-day technical bootcamp: Secure customers with Premium SKUs, deploy Copilot and extend with Copilot Studio
Americas/EMEA region: September 11 & 12 | 7:00 AM – 11:00 AM, Pacific Time
APAC region: September 18 & 19 | 5:00 PM – 9:00 PM, Pacific Time
Who should attend: Pre and post sales, IT admins and technical staff
Register at http://aka.ms/LevelUpCSPBootcamp
NEW! We will be offering voiceover and subtitles for ten languages to support our global CSP partners.
Thunderbird and Microsoft Exchange server 2019 phantom folder
I am using a Microsoft Exchange 2019 mail server with the Thunderbird mail client, over the IMAP protocol.
During folder moves, deletions, or renames, phantom folders are created.
If I set up this mailbox on another computer, the phantom folders load there too.
There are no such problems with Outlook. Can anyone tell me what to do about this?
JS Link field on Views and CEWP option suddenly disappeared from SharePoint site
Hello
Please i need your help on this issue.
Yesterday we noticed that the JSLink option is no longer visible when we edit view web parts.
We also found that we can no longer add a Content Editor Web Part (CEWP) to pages…
Scrollable Tables
I can take a table of data, create a new table with a scroll bar, and bring in each line using OFFSET from the original table.
The issue is that if I turn on the table total row at the bottom and use COUNT, it only returns the rows visible in the scrollable window, which is 20. But there are 100 rows of data, and I need COUNT to return 100.
Any ideas?
Excel: Keep cell formatting in a drop down list data validation??
Is it possible to keep cell formatting (bold & colored font) in a drop down box from a data validation list?
Collapsible Header Format
Hey everyone,
This is my first post here because I’ve encountered a very strange issue that Microsoft support and I have been unable to figure out.
To make a long story short, I’ve written a book and I’ve used Headings with the collapsible arrow on each chapter to make it easy to go through and edit things in order, and all was working fine until a few days ago when for some reason only the page underneath the header collapses, and the rest of the chapter stays visible.
I use Office 2021 Professional Plus and I have tried to reinstall it all but that hasn’t fixed the issue. I’ve also tried to create a new document with the new chapter headings and tried to copy and paste the body of text to each corresponding chapter, but now the body of text then loses all formatting. It removes all of the paragraphs breaks, and margin alignments that have been set on the new document.
I can’t figure this issue out, so I’m really hoping that one of you lovely people can, because my only other option would be to retype the entire book, and at ~350k words I’d very much like to avoid that.
Thank you!
Ismail
Integrating Azure Content Safety with API Management for Azure OpenAI Endpoints
In today’s digital landscape, ensuring the safety and integrity of AI-generated content is paramount. Azure Content Safety, combined with Azure API Management, provides a robust solution for managing and securing Azure OpenAI endpoints. This blog will guide you through the integration process, focusing on text analysis and prompt shields.
What is Azure Content Safety?
Azure AI Content Safety provides analysis for user- and AI-generated content. Currently available APIs include:
Prompt Shields: scans user text and document text for input attacks on LLMs.
Groundedness Detection: verifies whether responses generated by LLMs are grounded in the provided source.
Protected material text detection: checks for the presence of copyrighted material in AI-generated responses.
Analyze Text/Image: identifies and categorizes content severity across sexual content, hate, violence, and self-harm.
Why Integrate Azure Content Safety?
Azure Content Safety offers advanced algorithms to detect and mitigate harmful content in both user prompts and AI-generated outputs. By integrating this with Azure API Management, you can:
Enhance Security: Protect your applications from harmful content.
Ensure Compliance: Adhere to regulatory standards and guidelines.
Improve User Experience: Provide a safer and more reliable service to your users.
Onboard Azure Content Safety API to Azure API Management
Like any other API, the Azure Content Safety API can be onboarded to Azure APIM by importing the latest OpenAPI specification. API Management helps with enabling managed identity-based authentication to the Content Safety API, as well as communicating privately using Private Endpoints.
Onboard Azure OpenAI to Azure API Management
Onboarding AOAI to API Management comes with many benefits that have been discussed extensively; I have a blog and a GitHub repo that cover this in detail.
Integrate Content Safety with Azure OpenAI APIs in API Management
AI Gateway Labs is an amazing repository exploring various patterns through a series of labs. We have included two Content Safety scenarios as labs to demonstrate this integration.
The pattern behind this integration is to use the send-request policy in APIM to invoke the relevant Content Safety API, and forward the request to OpenAI only if the content is deemed safe.
The snippet below concatenates all the prompts in the incoming request to OpenAI and validates if there is any attack detected.
<send-request mode="new" response-variable-name="safetyResponse">
    <set-url>@("https://" + context.Request.Headers.GetValueOrDefault("Host") + "/contentsafety/text:shieldPrompt?api-version=2024-02-15-preview")</set-url>
    <set-method>POST</set-method>
    <set-header name="Ocp-Apim-Subscription-Key" exists-action="override">
        <value>@(context.Variables.GetValueOrDefault<string>("SubscriptionKey"))</value>
    </set-header>
    <set-header name="Content-Type" exists-action="override">
        <value>application/json</value>
    </set-header>
    <set-body>@{
        string[] documents = new string[] {};
        string[] messages = context.Request.Body.As<JObject>(preserveContent: true)["messages"].Select(m => m.Value<string>("content")).ToArray();
        JObject obj = new JObject();
        JProperty userProperty = new JProperty("userPrompt", string.Concat(messages));
        JProperty documentsProperty = new JProperty("documents", new JArray(documents));
        obj.Add(userProperty);
        obj.Add(documentsProperty);
        return obj.ToString();
    }</set-body>
</send-request>
<choose>
    <when condition="@(((IResponse)context.Variables["safetyResponse"]).StatusCode == 200)">
        <choose>
            <when condition="@((bool)((IResponse)context.Variables["safetyResponse"]).Body.As<JObject>()["userPromptAnalysis"]["attackDetected"] == true)">
                <!-- Return 400 if an attack is detected -->
                <return-response>
                    <set-status code="400" reason="Bad Request" />
                    <set-body>@{
                        var errorResponse = new
                        {
                            error = new
                            {
                                message = "The prompt was identified as an attack by the Azure AI Content Safety service."
                            }
                        };
                        return JsonConvert.SerializeObject(errorResponse);
                    }</set-body>
                </return-response>
            </when>
        </choose>
    </when>
    <otherwise>
        <return-response>
            <set-status code="500" reason="Internal Server Error" />
        </return-response>
    </otherwise>
</choose>
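The body-building logic in the policy above is easy to mirror in Python for local testing: concatenate every message’s content into a single userPrompt, alongside an empty documents array. A minimal sketch (only the payload construction is shown; actually calling the shieldPrompt endpoint would additionally need a real Content Safety host and subscription key):

```python
import json

def build_shield_prompt_body(request_body: dict) -> str:
    """Mirror the policy's <set-body>: concatenate all message contents
    into 'userPrompt' and send an empty 'documents' array."""
    messages = [m["content"] for m in request_body["messages"]]
    payload = {"userPrompt": "".join(messages), "documents": []}
    return json.dumps(payload)

# The same shape of request body that the policy receives from OpenAI clients.
body = build_shield_prompt_body(
    {"messages": [{"role": "user", "content": "Hello "},
                  {"role": "user", "content": "world"}]}
)
print(body)  # {"userPrompt": "Hello world", "documents": []}
```

This makes it straightforward to unit-test how multi-turn conversations are flattened before they reach Prompt Shields.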
The snippet below concatenates all the prompts in the incoming request to OpenAI and validates that the text is within the allowed limits for hate, sexual, self-harm, and violence content.
<send-request mode="new" response-variable-name="safetyResponse">
    <set-url>@("https://" + context.Request.Headers.GetValueOrDefault("Host") + "/contentsafety/text:analyze?api-version=2023-10-01")</set-url>
    <set-method>POST</set-method>
    <set-header name="Ocp-Apim-Subscription-Key" exists-action="override">
        <value>@(context.Variables.GetValueOrDefault<string>("SubscriptionKey"))</value>
    </set-header>
    <set-header name="Content-Type" exists-action="override">
        <value>application/json</value>
    </set-header>
    <set-body>@{
        string[] categories = new string[] {"Hate","Sexual","SelfHarm","Violence"};
        JObject obj = new JObject();
        JProperty textProperty = new JProperty("text", string.Concat(context.Request.Body.As<JObject>(preserveContent: true)["messages"].Select(m => m.Value<string>("content")).ToArray()));
        JProperty categoriesProperty = new JProperty("categories", new JArray(categories));
        JProperty outputTypeProperty = new JProperty("outputType", "EightSeverityLevels");
        obj.Add(textProperty);
        obj.Add(categoriesProperty);
        obj.Add(outputTypeProperty);
        return obj.ToString();
    }</set-body>
</send-request>
<choose>
    <when condition="@(((IResponse)context.Variables["safetyResponse"]).StatusCode == 200)">
        <set-variable name="thresholdExceededCategory" value="@{
            var thresholdExceededCategory = "";
            // Define the allowed threshold for each category
            Dictionary<string, int> categoryThresholds = new Dictionary<string, int>()
            {
                { "Hate", 0 },
                { "Sexual", 0 },
                { "SelfHarm", 0 },
                { "Violence", 0 }
            };
            foreach (var category in categoryThresholds)
            {
                var categoryAnalysis = ((JArray)((IResponse)context.Variables["safetyResponse"]).Body.As<JObject>(preserveContent: true)["categoriesAnalysis"]).FirstOrDefault(c => (string)c["category"] == category.Key);
                if (categoryAnalysis != null && (int)categoryAnalysis["severity"] > category.Value)
                {
                    // Threshold exceeded for the category
                    thresholdExceededCategory = category.Key;
                    break;
                }
            }
            return thresholdExceededCategory;
        }" />
        <choose>
            <when condition="@(context.Variables["thresholdExceededCategory"] != "")">
                <return-response>
                    <set-status code="400" reason="Bad Request" />
                    <set-body>@{
                        var errorResponse = new
                        {
                            error = new
                            {
                                message = "The content was filtered by the Azure AI Content Safety service for the category: " + (string)context.Variables["thresholdExceededCategory"]
                            }
                        };
                        return JsonConvert.SerializeObject(errorResponse);
                    }</set-body>
                </return-response>
            </when>
        </choose>
    </when>
    <otherwise>
        <return-response>
            <set-status code="500" reason="Internal Server Error" />
        </return-response>
    </otherwise>
</choose>
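The category-threshold check in the policy above is plain logic, so it can be reproduced in Python to experiment with threshold values before editing the policy. A sketch using the same zero thresholds (the sample analysis data below is made up for illustration):

```python
def threshold_exceeded_category(categories_analysis, thresholds=None):
    """Return the first category whose severity exceeds its allowed
    threshold, or '' if the text passes (mirrors the policy variable)."""
    if thresholds is None:
        thresholds = {"Hate": 0, "Sexual": 0, "SelfHarm": 0, "Violence": 0}
    for category, allowed in thresholds.items():
        # Find this category's entry in the Content Safety response, if any.
        analysis = next(
            (c for c in categories_analysis if c["category"] == category), None
        )
        if analysis is not None and analysis["severity"] > allowed:
            return category
    return ""

# Hypothetical 'categoriesAnalysis' payload from the text:analyze response.
sample = [{"category": "Hate", "severity": 0},
          {"category": "Violence", "severity": 2}]
print(threshold_exceeded_category(sample))  # → Violence
```

Raising a category’s threshold (the response uses eight severity levels, 0–7, per the EightSeverityLevels output type) loosens the filter for that category only.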
Conclusion
Integrating Azure Content Safety with API Management for Azure OpenAI endpoints is a powerful way to enhance the security and reliability of your AI applications. By following these steps, you can ensure that your AI-generated content is safe, compliant, and user-friendly.
For more detailed information, refer to the Azure Content Safety documentation and the Azure API Management documentation.
Build intelligent MySQL applications using semantic search and generative AI
Searching for content on a website typically relies on keyword-based search, a method with inherent limitations. For example, searching for a ‘rain jacket for women’ on an e-commerce website returns all jackets for both men and women, as the search focuses on just the keyword ‘jacket’. The search simply isn’t contextual enough to provide the desired results.
Semantic search is a technique that allows users to search for information using natural language queries rather than specific keywords. With semantic search, the meaning and intent of users’ queries are inferred, and personalized relevant results are returned. Generative AI, on the other hand, is a type of artificial intelligence that can generate new content from existing data, such as text, images, or audio. You can use generative AI to produce summaries, captions, recommendations, or responses based on a user’s input and preferences.
This blog post discusses how to build intelligent MySQL applications with semantic search and generative AI responses using Azure OpenAI and Azure Database for MySQL with Azure AI Search. As an example, we’ll use a Magento ecommerce app designed to sell jackets, and then build a “Product Recommender CoPilot” chat application that:
Recognizes the intent of a user’s natural language queries.
Generates custom responses to recommend suitable products using the product details and reviews data stored in Azure Database for MySQL.
Architecture
The simplest way to include the rich capabilities of semantic search and generative AI in your applications is to build a solution using the Retrieval Augmented Generation (RAG) architecture with Azure AI Search and Azure Open AI services.
What is a RAG architecture?
Retrieval Augmented Generation, or RAG, is an architecture that augments the natural language understanding and generation capabilities of LLMs like ChatGPT by adding an information retrieval system like Azure AI Search which works with your data stored in data sources like Azure Database for MySQL. In a typical RAG pattern:
A user submits a query or prompt in natural language.
The query is routed to Azure AI Search to find the relevant information.
Azure AI Search sends the top ranked semantic search results to a Large Language Model (LLM).
The LLM then processes the natural language query and uses reasoning capabilities to generate a response to the initial prompt.
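The four steps above can be sketched as one small function, with the retrieval and generation stages stubbed out (the stubs below stand in for Azure AI Search and an Azure OpenAI chat deployment; they are illustrative placeholders, not working service calls):

```python
def rag_answer(query, retrieve, generate, top_k=3):
    """Minimal RAG loop: retrieve relevant chunks for the query,
    then ask the LLM to answer grounded only in those chunks."""
    chunks = retrieve(query)[:top_k]
    context = "\n".join(chunks)
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return generate(prompt)

# Stub retrieval/generation to show the flow end to end.
docs = ["Rain jacket for women, waterproof.", "Men's fleece jacket."]
retrieve = lambda q: [d for d in docs if "women" in q and "women" in d] or docs
generate = lambda p: "Recommended: " + p.splitlines()[1]
print(rag_answer("rain jacket for women", retrieve, generate))
# → Recommended: Rain jacket for women, waterproof.
```

In the real solution, retrieve would be a vector/hybrid query against the Azure AI Search index, and generate would be a chat-completion call with the search results supplied as grounding data.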
Sample product recommender Copilot architecture
A sample RAG architecture for the AI solution we’ll show you how to build in this blog post appears in the following graphic:
Azure AI search pulls the content (in our case, the product details and reviews data) from a backend Azure Database for MySQL database by using an indexer that runs periodically.
The product details and reviews data are further chunked and vectorized using Azure OpenAI’s text embedding model.
Azure AI Search then persists this vectorized data in a vector search index.
When a user uses the “Product Recommender CoPilot” chat application, the query is sent to an Azure OpenAI Chat Completion Service.
Azure AI Search is now used as a data source to find the most relevant response using vector-search or hybrid search (vector + semantic search).
The Azure OpenAI Chat Completion service then uses these search results to generate a custom response back to the user query.
In this post, we’ll walk you through how to set up the backend data sources, indexers, and models required to build this solution. It is a detailed guide to the sample Python code hosted in our GitHub repository in a Jupyter Notebook: azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql.
For a short demo of the entire process using a Magento ecommerce application, watch the following video!
Prerequisites
Before getting started, you need to ensure that the following prerequisites are in place.
An Azure account with an active subscription. If you don’t have one, create one for free here.
An Azure AI Search resource. If you don’t have one, you can create one via the Azure portal or the Azure CLI, as explained in the article here.
An Azure Open AI Services resource. If you don’t have one, you can create one via the Azure portal or the Azure CLI, as explained in the article here.
A MySQL database (in Azure Database for MySQL or any database provider) populated with product and reviews data obtained from your ecommerce application like Magento.
To create an Azure Database for MySQL server, follow the instructions in the article here.
If you need some sample product and reviews data to try out this example, refer “Upload data to MySQL DB” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
Process
Supporting semantic search in an e-commerce application that leverages Azure AI search and Azure Open AI Services in the backend requires completing the following:
I. Set up data source connection in Azure AI Search.
II. Set up automatic chunking, vectorization and indexing.
III. Use vector search from a sample application.
IV. Generate a GPT response to the user.
V. Test the solution in the Azure OpenAI Studio playground.
Azure AI Search pulls the product details and reviews data from a backend MySQL flexible server by using an indexer that runs periodically. This data is further chunked and vectorized using Azure OpenAI’s text embedding model, and the vectorized data then persists in a vector search index in Azure AI Search.
I. Set up data source connection in Azure AI Search
The data source definition specifies the data to index, credentials, and policies for identifying changes in the data. The data source is defined as an independent resource so that it can be used by multiple indexers. In this example, we’ll use a custom table with product details and review data which is stored in a database in Azure Database for MySQL.
In the Azure AI service, before creating a search index, we’ll need to create a connection to your data source. We’ll import Azure AI classes like ‘SearchClient’, ‘SearchIndexerClient’, and ‘SearchIndexerDataSourceConnection’, and their functions like “create_or_update_data_source_connection()”, to set up the data source connection in Azure AI Search. We’ll also import several other models – the comprehensive list is shared in the following code sample.
Code: “1. Set up data source connection in Azure AI Search” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
II. Set up automatic chunking, vectorization and indexing
We are now ready to create an Azure Database for MySQL indexer that periodically pulls the product details and reviews data from the database in Azure Database for MySQL, chunks and vectorizes the data and persists it in a vector search index.
To do this, we’ll first create an index which takes the product details and reviews data for each product, splits the combined text into chunks and embeds each chunk as a vector. For any incoming user query, we’ll search for the most relevant chunk using vector search and semantic search.
Create an Azure AI Search index
In Azure AI Search, a search index is the searchable content made available to the search engine for indexing, full-text search, vector search, hybrid search, and filtered queries. An index is defined by a schema and is saved to the search service.
We’ll first create an index object from the ‘SearchIndexClient’ class.
Then, we define the field mappings to correlate the fields in the MySQL database with the fields in the AI Search index. Because the combined text is generally long, it needs to be chunked into smaller passages. To do this, we’ll add an additional search field called “chunk”.
Next, we’ll decide the searches that the index will support. In this example, we’ll use ‘vector search’ along with ‘semantic re-ranking’. Here is how the two work together in our solution:
Vector search is first performed on all the entries in the search index.
Semantic re-ranking uses a neural model that can only process a limited number of documents at a time. The top 50 results obtained using vector search are sent to this model, which re-ranks them and returns the best matches within that reduced context. Semantic ranking is better at surfacing the most relevant results, and it also produces short-form captions and answers that are useful as LLM inputs for generation.
We’ll then define the vector search and semantic configurations:
Vector search configuration – You can choose different algorithms, such as HNSW or exhaustive KNN, to perform the vector search. In this post, we’ll choose the most commonly used algorithm – HNSW. We’ll configure the HNSW algorithm to use the ‘COSINE’ metric. Considering each vector as a point in multi-dimensional space, the algorithm finds the cosine distance between the points: the lower the distance, the more similar the vectors.
Semantic configuration – Here, we’ll define the index field on which semantic re-ranking is performed.
Finally, we’ll create the search index using the above two configurations on the relevant MySQL DB table fields.
Code: “II. Set up automatic chunking, vectorization and indexing” -> “Create index” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
Chunking
We’ll create a skillset using two pre-built “skills”:
The “Split Skill”, which takes the concatenated text and divides it into chunks.
The “Azure OpenAI Embedding Skill”, which takes the outputs of the Split Skill and vectorizes them individually.
We’ll then apply an index projection so that our final index has one item for every chunk of text, rather than one item for every original row in the database.
Code: “II. Set up automatic chunking, vectorization and indexing” -> “Create skillset” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
Create and run the MySQL Indexer
After you define the data source and create the index, you’re ready to create the indexer. The indexer configuration specifies the inputs, parameters, and properties that control run-time behavior.
To create an indexer, we’ll provide the name of the data source, index and skillset that we created in the previous steps. We’ll then run the indexer at periodic intervals.
Code: “II. Set up automatic chunking, vectorization and indexing” -> “Create indexer” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
After the indexer is created, you can use the Azure portal to view the indexer under the Indexers blade in Azure AI Search. You can also run the indexer from the Azure portal or using Azure CLI. The runs can be configured to be ad hoc or scheduled:
Ad hoc – An indexer can be configured to run only once or on demand. You can use APIs (as we did in the code sample), the Azure portal, or the CLI to achieve this.
Scheduled – You can schedule an indexer to run at a certain time. Go to the “Settings” section in the Azure portal view of the indexer, and choose hourly, one-time, daily, or any other custom setting.
III. Use vector search from a sample application
With the Azure AI Search indexer ready and running, you can now use the vector search and semantic search capabilities from your application. To call the search function, provide the user query text, required query parameters like the number of nearest neighbors to return as top hits, and the columns or fields in the index to be considered. Also select query type “Semantic” to include both vector search and semantic search, and provide the name of the semantic search configuration object that you created in section II.
Code: “III. Use vector search from a sample application” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
For a user query like “suggest me some rain jackets for women”, this single API call performs vector search and semantic re-ranking, and returns the top-ranked results as shown in the following screenshot. You’ll also see the vector search score, the semantic re-ranker score, and all the details of the recommended product.
IV. Generate a GPT response to the user
To generate custom responses to users, we simply make a call to the Azure OpenAI chat completion service. To trigger the completion, we first input some text as a “prompt”. The LLM in the service then generates the completion and attempts to match our context or pattern. An example prompt for this scenario could be: “You are an AI assistant that recommends products to people based on the product reviews data matching their query. Your answer should summarize the review text, include the product ID, include the parent id as review id, and mention the overall sentiment of the review.”
To generate the GPT response, we’ll use Azure OpenAI’s chat.completions.create() API call and supply it with the user query, the OpenAI model to be used, the Azure AI Search index as the data source, and the prompt.
Code: “IV. Generate GPT Response to the user” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
V. Test the solution in the Azure OpenAI Studio playground
Finally, it’s important to test the solution in Azure OpenAI Studio, a user-friendly platform that allows developers to explore cutting-edge APIs and models and to build, test, deploy, and manage AI solutions for use in websites, applications, and other production environments.
For an end-to-end test of the AI solution in the Azure OpenAI Studio playground, perform the following steps:
Go to Azure OpenAI studio, select your subscription and the Azure AI Search resource, and navigate to the “Chat playground”.
In the Setup section, on the Add your data tab, select Add a data source.
In the pop-up window, select Azure AI Search as the data source.
Fill in subscription details, and then choose the Azure AI Search service and Azure AI Search index created in the previous sections.
Enable the Add vector search to this search resource option.
Select the Azure OpenAI – text-embedding-ada-002 model and select Next.
Select Hybrid + semantic search type.
Review all the details and add the data source.
In the Prompt tab under the Setup section, type your prompt message in the System message text box.
You can use the sample prompt from the previous step or modify the prompt to add more details, such as product images, in the response.
To test the solution, type a sample query in the chat box and watch your copilot generate a smart recommendation in response!
Now, you’re all set to deploy this Product Recommender CoPilot AI solution to production!
Conclusion
If you’re running applications, such as content management systems (CMS), e-commerce applications, or gaming sites, with data hosted in Azure Database for MySQL, you can enhance your user experience by building generative AI search and chat applications using LLMs available in Azure OpenAI and vector storage and indexing provided by Azure AI Search. Unleash the power of your data hosted on MySQL with the simple and seamless AI integrations on Azure!
If you have any queries or suggestions for more AI-related content, please let us know by contacting us at AskAzureDBforMySQL@service.microsoft.com. We’re also open to collaborating with you on technical collateral! Check out our Contributors initiative at aka.ms/mysql-contributors to learn more.
Announcing Windows Server Preview Build 26257
Hello Windows Server Insiders!
Today we are pleased to release a new build of the next Windows Server Long-Term Servicing Channel (LTSC) Preview that contains both the Desktop Experience and Server Core installation options for Datacenter and Standard editions, Annual Channel for Container Host and Azure Edition (for VM evaluation only). Branding has been updated for the upcoming release, Windows Server 2025, in this preview – when reporting issues please refer to Windows Server 2025 preview. If you signed up for Server Flighting, you should receive this new build automatically.
What’s New
Windows Admin Center (WAC)
Beginning with build 26252, Windows Server 2025 preview customers can download and install Windows Admin Center right from the Windows Server Desktop using the in-OS app that takes care of downloading and guides you through the installation process.
Note: You must be running a desktop version of Windows Server 2025 Datacenter or Standard preview to access this feature.
Delegated Managed Service Accounts (dMSA)
A new account type known as delegated Managed Service Account (dMSA) is now available that allows migration from a traditional service account to a machine account with managed and fully randomized keys, while disabling original service account passwords.
Authentication for dMSA is linked to the device identity, which means that only specified machine identities mapped in AD can access the account. Using dMSA helps to prevent harvesting credentials using a compromised account (kerberoasting), which is a common issue with traditional service accounts.
To learn more about dMSA, visit https://learn.microsoft.com/en-us/windows-server/security/delegated-managed-service-accounts/delegated-managed-service-accounts-overview.
Windows Server Flighting is here!!
If you signed up for Server Flighting, you should receive this new build automatically later today.
For more information, see Welcome to Windows Insider flighting on Windows Server – Microsoft Community Hub
The new Feedback Hub app is now available for Server Desktop users! The app should automatically update with the latest version, but if it does not, simply Check for updates in the app’s settings tab.
Known Issues
Upgrade does not complete: Some users may experience an issue when upgrading where the download process does not progress beyond 0%. If you encounter this issue, please upgrade to this newer build using the ISO media download option. Download Windows Server Insider Preview (microsoft.com)
Access denied error when using Diskpart –> Clean Image on Winpe.vhdx VMs created using WinPE: Create bootable media | Microsoft Learn. We are working to resolve this issue and expect to have it fixed in the next preview release.
Download Windows Server Insider Preview (microsoft.com)
Flighting: The label for this flight may incorrectly reference Windows 11. However, when selected, the package installed is the Windows Server update. Please ignore the label and proceed with installing your flight. This issue will be addressed in a future release.
Setup: Some users may experience overlapping rectangle voids following mouse clicks during “OOBE” setup. This is a graphics rendering issue and will not prevent setup from completing. This issue will be addressed in a future release.
WinPE – PowerShell Scripts: Applying the WinPE-PowerShell optional component does not properly install PowerShell in WinPE. As a result, PowerShell cmdlets will fail. Customers who depend on PowerShell in WinPE should not use this build.
If you are validating upgrades from Windows Server 2019 or 2022, we do not recommend that you use this build as intermittent upgrade failures have been identified for this build.
This build has an issue where archiving event logs with the “wevtutil al” command causes the Windows Event Log service to crash and the archive operation to fail. The service must be restarted by executing “Start-Service EventLog” from an administrative command prompt.
If you have Secure Launch/DRTM code path enabled, we do not recommend that you install this build.
Available Downloads
Downloads to certain countries may not be available. See Microsoft suspends new sales in Russia – Microsoft On the Issues.
Windows Server Long-Term Servicing Channel Preview in ISO format in 18 languages, and in VHDX format in English only.
Windows Server Datacenter Azure Edition Preview in ISO and VHDX format, English only.
Microsoft Server Languages and Optional Features Preview
Keys: Keys are valid for preview builds only
Server Standard: MFY9F-XBN2F-TYFMP-CCV49-RMYVH
Datacenter: 2KNJJ-33Y9H-2GXGX-KMQWH-G6H67
Azure Edition does not accept a key
Symbols: Available on the public symbol server – see Using the Microsoft Symbol Server.
Expiration: This Windows Server Preview will expire September 15, 2024.
How to Download
Registered Insiders may navigate directly to the Windows Server Insider Preview download page. If you have not yet registered as an Insider, see GETTING STARTED WITH SERVER on the Windows Insiders for Business portal.
We value your feedback!
The most important part of the release cycle is to hear what’s working and what needs to be improved, so your feedback is extremely valued. Beginning with Insider build 26063, please use the new Feedback Hub app for Windows Server if you are running a Desktop version of Server. If you are using a Core edition, or if you are unable to use the Feedback Hub app, you can use your registered Windows 10 or Windows 11 Insider device and use the Feedback Hub application. In the app, choose the Windows Server category and then the appropriate subcategory for your feedback. In the title of the Feedback, please indicate the build number you are providing feedback on as shown below to ensure that your issue is attributed to the right version:
[Server #####] Title of my feedback
See Give Feedback on Windows Server via Feedback Hub for specifics. The Windows Server Insiders space on the Microsoft Tech Communities supports preview builds of the next version of Windows Server. Use the forum to collaborate, share and learn from experts. For versions that have been released to general availability in market, try the Windows Server for IT Pro forum or contact Support for Business.
Diagnostic and Usage Information
Microsoft collects this information over the internet to help keep Windows secure and up to date, troubleshoot problems, and make product improvements. Microsoft server operating systems can be configured to turn diagnostic data off, send Required diagnostic data, or send Optional diagnostic data. During previews, Microsoft asks that you change the default setting to Optional to provide the best automatic feedback and help us improve the final product.
Administrators can change the level of information collection through Settings. For details, see http://aka.ms/winserverdata. Also see the Microsoft Privacy Statement.
Terms of Use
This is pre-release software – it is provided for use “as-is” and is not supported in production environments. Users are responsible for installing any updates that may be made available from Windows Update. All pre-release software made available to you via the Windows Server Insider program is governed by the Insider Terms of Use.
How to add Confluence as data source to MS Copilot Studio
How can I add Confluence as a data source to MS Copilot Studio? I have tried the Graph connection provided by Microsoft, but it’s not working.
Returning the Row/area based on search results
I am developing a spreadsheet to keep track of inventory at my work based on specific locations. I have all of the data here:
I also have a search function built with conditional formatting to highlight the inventory number if it is in stock.
I am wanting to have the Row and area (A,B,C,D, or E) returned below the search box, is that possible? If so how should I go about it?
Adjusting file path in power query so other users can refresh queries
So I have a file for work that I’ve developed and I need to configure it so that others can refresh it. We use Box for file sharing. The problem is that when others open the file up – the source filepath is always tied to my unique user id. I need to adjust the file path somehow so that it accounts for the fact that someone else is refreshing it.
Current file path syntax: C:\Users\MYUSERID\Box\Box_folder_1\Box_folder2\box_folder_3\box_folder_4
I need the syntax to be: C:\Users\ANYUSERID\Box\Box_folder_1\Box_folder2\box_folder_3\box_folder_4
I assume there is something I can do in the editor to make the file path something that is more relative, for example – adjusting the USER ID, or just giving the filepath a particular name so that it isn’t looking for my specific User ID. Any thoughts?
Keeping formatting on frozen columns across sheets?
Hello,
I’ve got the first three columns on sheet one formatted conditionally, and frozen. They are always the leading three columns on all following sheets in the workbook, and continue the color pattern, for example when I type “Conservation” in an empty cell in column B, it turns orange.
My intent, which I cannot figure out, is to be able to update the data in sheet one in these frozen columns, and have it update across all sheets, while keeping the source formatting; cell fill color, bold, and links. I’ve tried copy->paste as link for the column on the next sheets, but that doesn’t keep the formatting. I’ve also tried a few formulas that replicate the data accurately, but don’t keep the color coding, or the corresponding hyperlinks, for example.
Seems like this function should be hiding in plain sight, but I’m somewhat of an Excel newbie here, so please keep that in mind, if you’re kind enough to respond.
Thank you!
Migrating from more than 2 days and seems like there is some issue as the total size is 16 GB
Hello
Please, I need your help with this issue.
We did a migration, but we are getting an error.
We had a user for whom we scheduled this migration. The user is using a .onmicrosoft.com account. It’s been migrating for more than 2 days, and it seems like there is some issue, as the total size is 16 GB on the source end.
However, the attached results and snip shown below say 257 GB – how come? The total mailbox size on the source is 16 GB, so how is 257 GB possible? Seems like there is some discrepancy between these storage numbers.
I would like to know why it is taking so long to migrate 16 GB of storage.