Category Archives: Microsoft
Teams Voice – business recovery plan
Hello,
I have a mandate to draw up a business recovery plan. We use Teams phones with domestic calling licenses and Microsoft is our service provider.
I can’t find any information about vendor/customer responsibility for the Voice portion of Teams. I assume that, as a SaaS offering, Microsoft is responsible for keeping the service available, but what about configurations such as phone numbers, auto attendants, call queues, holidays, and resource accounts?
Regards
Eric
WebDav installation issue with IIS
Hi! I have a failed IIS (Web Server role) installation on a Windows Server 2016 virtual machine. The machine connects through a proxy, and I connect to it via xfreerdp.
What could the cause be?
Received email: Immediate Action Required: Update your sign-in technology before September 16th, 202
I’m not sure of:
1. Is this a legit email?
2. I’m not a tech. Pls use plain English.
3. How do I know I’m using 3rd party stuff? What’s a third party? I tap on my email icon on my iPhone and get my Hotmail emails. I also use a PC and sometimes get my Hotmail email through that (the link says Outlook.live.com, but my address is hotmail.com). Is it Outlook email or Hotmail or what?
4. Which one is going to be affected and how do I make it work? In plain English, not in tech terms please! 🙏🏼 I’m extremely lost because every search I’ve made so far comes up with tech language, acronyms, “download this and that”, all in terms that assume a person has extensive tech knowledge.
Please help.
Chief Noel…
Month Year as an ID
Hi Experts,
I have been fiddling around with this query for quite some time and I am getting nowhere with it. What I have resorted to is using MonthYearSort in a WHERE clause in the following OpeningBalance calculation (I would typically use an ID, but I don’t have one; I don’t know if this is a good idea, probably not, but the value is unique, just not a typical ID since it’s based on a date). I’m not sure whether some special kind of formatting is needed since it is based on a date. I am grabbing a sum from another query and summing by month based on the MonthYearSort.
OpeningBal: Format(Nz(DSum("SumOfAmount","qryDrawsDecliningCumDrawn","Type=" & [tblDraws].[Type] & " And [MonthYearSort] < " & Nz([qryDrawsDecliningCumDrawn].[MonthYearSort],0)),0),"Currency")
MonthYrSort: Year([FundingDate])*12+DatePart('m',[FundingDate])-1
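For context, with an assumed FundingDate of 15-Mar-2024 that expression gives Year = 2024 and month = 3, so MonthYrSort = 2024*12 + 3 - 1 = 24290. Consecutive months differ by exactly 1, which is why I am treating it as a sortable stand-in for an ID.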
thank you.
Analysis Services Community
Hi everyone,
I’m not sure if this is the right place for this question, but I can’t find a specific community for SSAS projects.
Anyway, I hope you can help me with this question:
I have a Tabular project using Visual Studio and SQL Server Analysis Services (SSAS), with the aim of viewing the data in a Power BI report in Direct Query mode. My question is, why do I have to process the table in Visual Studio, deploy the solution, and then process it again in SQL to see the data refreshed in Power BI when I make changes to the project?
Is there a more efficient way to handle this? Why do I need to process the table twice—once in Visual Studio and once again in SQL after deploying?
For example: I changed the data type of a column in my table, and I don’t see the change until I go through the previous steps I detailed.
Forms Attachment to SharePoint with a twist
Hello, I have a form that I use to collect responses for accident reports, and a flow set up to add them to a SharePoint list. The attachments used to work when I was the owner of the form, because they would save to my OneDrive folder. These pictures are usually of injuries, so I don’t want them on my drive. I have transferred ownership of the form to the SharePoint site where the list is held, and the attachments now go into a folder on SharePoint under Documents > Apps > etc. instead of my OneDrive > Apps > etc. I tried the SharePoint “Get file content” action, but I cannot get it to find the file I need.
Elastic pools for Azure SQL Database Hyperscale now Generally Available!
We are very pleased to announce General Availability (GA) for Azure SQL Database Hyperscale elastic pools (“Hyperscale elastic pools”).
Why use Hyperscale elastic pools?
Azure SQL Database is the preferred database technology for hundreds of thousands of customers. Built on top of the rock-solid SQL Server engine and leveraging cloud-native architecture and technologies, Azure SQL Database Hyperscale offers leading performance, scalability, and elasticity at one of the lowest TCOs in the industry.
While you may start with a standalone Hyperscale database, chances are that as your fleet of databases grows, you want to optimize price and performance across a set of Hyperscale databases. Elastic pools offer the convenience of pooling resources like CPU, memory, IO, while ensuring strong security isolation between those databases.
Here’s an example showing 8 standalone databases, each with an individually variable workload. Each database tends to spike in resource consumption at different points in time. Each database must therefore be allocated adequate resources (CPU, data IO, log IO, etc.) to accommodate the individual peak resource requirement. Accordingly, the total cost of these databases is directly proportional to the number of databases, while the average utilization of each database is low.
vCore configuration shown is for demonstration purposes. Prices as of Sep 12, 2024, and only represent compute costs for Azure East US. Storage costs are extra and are billed per database. Actual configuration depends on workload profiles and performance requirements.
With the shared resource model offered by Hyperscale elastic pools, the aggregate performance requirements of the workloads become much “smoother”, as seen in the white line chart. Correspondingly, the elastic pool only needs to be provisioned for the maximum combined resource requirement. This way, overall cost is lower, and average resource utilization is much higher than the standalone database scenario.
vCore configuration shown is for demonstration purposes. Prices as of Sep 12, 2024, and only represent compute costs for Azure East US. Savings are indicative. Storage costs are extra and billed per database. Actual configuration depends on workload profiles and performance requirements.
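As a purely illustrative example (the numbers are hypothetical): if each of the 8 standalone databases must be provisioned for a peak of 8 vCores but their peaks never overlap, standalone provisioning requires 8 x 8 = 64 vCores in total, whereas a pool sized for a combined peak of, say, 16 vCores serves the same workloads with a quarter of the provisioned compute.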
Customer feedback
For many customers, elastic pools are an essential tool to stay competitive from price and performance perspectives. Adam Wiedenhaefer, Principal Data Architect, Covetrus Global Technology Solutions, says:
“Elastic pools on Azure SQL Database Hyperscale has provided us a solid blend between performance, storage, and overall flexibility. This allows us to scale our systems in ways we couldn’t before at a price point that reduces overall costs, savings we can pass on to Covetrus customers.“
The cloud-native architecture for Hyperscale elastic pools enables independent scaling of compute and storage in a fast and predictable manner. This allows customers to perfectly optimize their compute resources while relying on auto-scaling storage, which provides hands-off scalability and great performance as their databases grow. Nick Olsen, CTO, ResMan says:
“We have been users of Azure SQL Database elastic pools for over a decade now and have loved the ability to share resources amongst many databases. Our applications are such that only a few databases reach peak utilization simultaneously but we need to allow any given database to consume quite a bit of resources when bursts occur. As our requirements evolved, we found that we needed to go beyond the resource limits of our existing pools while controlling for the amount of time it would take to scale very large pools. The introduction of elastic pools in Azure SQL Database Hyperscale introduced much higher limits on pool size and the ability to scale in constant time, regardless of the size of the workload. We are now able to meet the evolving needs of our business while allowing us to achieve greater cost savings than we have had in the past.”
Throughout public preview, we have received overwhelmingly positive feedback from several customers about the superb reliability, great performance and scalability, and the value for money that Hyperscale elastic pools have provided. Many customers are already running production systems on Hyperscale elastic pools since public preview.
Availability
During a very successful public preview, we have seen tremendous adoption from many customers and have addressed top customer requests and improvements including:
Zone redundancy for Hyperscale elastic pools.
Premium-series (PRMS / MOPRMS) hardware for Hyperscale elastic pools.
Reserved capacity for these premium-series hardware options.
Maintenance window support for Hyperscale elastic pools.
All these capabilities for Hyperscale elastic pools are also Generally Available (GA) starting 12 September 2024. Hyperscale elastic pools are available in all supported Azure regions including US Government regions and Azure China.
Pricing
With GA we are also adjusting the pricing for Hyperscale elastic pools. Starting 12 September 2024, we will begin charging an additional $0.05 / vCore / hour for each Hyperscale elastic pool, compared to the preview price. The additional charge will not be eligible for reserved capacity discounts and will apply to the primary pool replica and any secondary pool replicas configured. The final pricing is visible in the Azure portal, Azure Pricing calculator and on the Azure SQL Database pricing page.
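For illustration only (assuming a pool with 8 vCores running for a full month of roughly 730 hours): the additional charge works out to 8 x $0.05 x 730 ≈ $292 per month per pool replica, on top of the preview-level compute price. Actual charges depend on your configuration and are best confirmed in the Azure Pricing calculator.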
Resources
Learn more about the architecture of Hyperscale elastic pools.
[Docs] View examples of PowerShell and Azure CLI commands for managing Hyperscale elastic pools.
[Docs] Review supported capabilities and resource limits for Hyperscale elastic pools.
Our team is here to assist with any questions you may have. Please leave a comment on this blog and we’ll be happy to get back to you. Alternatively, you can also email us at sqlhsfeedback AT microsoft DOT com. We are eager to hear from you all!
From Feedback to Action: How Viva Pulse Empowers Managers at Microsoft to Drive Employee Engagement
In fall 2023, the Viva Pulse product team collaborated with Microsoft’s HRBI (HR Business Insights) team to develop a strategy for continuous listening using Viva Pulse. We designed a comprehensive employee data collection approach that includes the existing semi-annual Viva Glint engagement survey, supplemented by new, manager-driven Viva Pulse surveys in the interim.
“We leverage Viva Pulse as a strategic tool within our broader Employee Listening System. It provides special value between large survey cycles to track progress and drive action taking, and the opportunities are endless as this tool flexibly enables our managers/project leads democratized access to robust surveying functionality.” – Dante Myers, Director of Employee Listening Systems
The HRBI team created an organization-specific pulse survey template tailored for managers at Microsoft. This custom template was designed to help managers regularly check in on their team’s progress on the focus areas identified from the previous Viva Glint engagement survey cycle and to make any necessary adjustments to their action plans. Six weeks after the release of the Viva Glint engagement survey results, in early 2024, the HRBI team sent an email to all managers, encouraging them to use Viva Pulse to act on their Viva Glint engagement survey feedback.
Following the communication, Microsoft conducted a research study with managers who used Viva Pulse to learn more about the effectiveness of the tool to support actioning. From the study, the manager participants highlighted the importance of acting on survey feedback to build trust and demonstrate that leadership listens. They emphasized the value of regular surveys to provide leaders with ongoing insights and maintain connection with their teams in between survey cycles. Additionally, they see surveys as a tool to reinforce organizational culture, unify the team, and communicate that employee opinions are valued and impactful.
When the next Viva Glint engagement survey was released, the research team reconnected with some of the manager participants to follow up on their Viva Pulse experience and any impact it may have had on their Viva Glint engagement survey experience. The managers noted a significant increase in response rates in both their interim Pulses and the Glint engagement survey.
One manager participant had been unable to review his direct team’s fall 2023 Viva Glint engagement survey scores because there were too few responses. This round, 88% responded. He attributed this to the continuous listening and action taking. He said, “I’m just taking the time from those Pulses and bringing them up on team calls and immediately sharing the results. The higher result happened; I believe [it is] from people being more familiar & comfortable, developing that muscle with the process.”
The manager participants we spoke to attribute these response improvements to Pulse building trust within respondents, further emphasizing that their feedback is being heard, and action is being taken. Additionally, these manager participants took the time to process the Pulse data after each survey, share it with their team, and create an environment for discussion, which we believe is crucial to closing the feedback loop.
Another manager participant noted, “Pulse is a great way for people to know that leadership is doing [things] or trying to do [things]….It signals that we’re not just doing this because [our CEO] wants us to do all these surveys and check sentiment…No, we’re doing this independently.” Managers appreciated the opportunity to seek feedback directly, instead of waiting for the next engagement survey to receive sentiment from their teams. They found Viva Pulse’s stock templates and question library particularly useful as a starting point, and by customizing the survey to match their team’s needs and focus areas, the managers were able to gather insights during critical moments.
As this initial experiment concludes, the Viva Pulse team will continue to invest in Viva Pulse as a follow up to engagement surveys. With deeper integrations planned between Viva Pulse and Viva Glint, we aim to support manager enablement in driving employee engagement, culture, and inclusion. Our goal is to develop Viva Pulse in a way that empowers managers to take ownership of driving employee engagement.
Thank you for reading! If you have any questions or feedback, please don’t hesitate to comment. Furthermore, if you would like to get involved with Viva Pulse, please comment below to learn about our customer engagement opportunities.
Until Next Time,
Viva Pulse Team
Think like a People Scientist: How Microsoft used Viva Insights to understand organizational change
On September 11 we took a deep dive into the transformative power of Viva Glint, People Science and Viva Insights in understanding and driving organizational change. I was joined by our Viva People Scientists Keith Mcgrane, Beth Demko and Jennifer Stoll, and Todd Crutchfield (Principal Data Science Manager at Microsoft). The session unpacked the nuances of employee sentiment, organizational data, and the innovative use of Organizational Network Analysis (ONA) within Viva Insights.
Key points discussed during this webinar:
People Science and Change: Jennifer and Beth kicked off by exploring the People Science perspective behind organizational change, focusing on employee sentiment and the theories that guide our understanding during organizational transitions.
Introducing Viva Insights and Organizational Network Analysis: Keith introduced the concept of adding organizational data into your employee listening strategy. He spoke about how Viva Insights leverages organizational data to complement employee sentiment, with a special focus on ONA’s role in understanding and facilitating change.
Microsoft’s Use Case: Todd added a practical example from Microsoft, demonstrating how ONA was used internally to assess the impact of organizational restructuring on employee collaboration.
This session was a reminder of the power of data in understanding and navigating the human aspects of organizational change. It also highlighted the combined value of employee sentiment and organizational data to leaders during times of change.
We invite you to watch the recording and access the slides and other useful resources from this event below.
Read more about the Microsoft ONA use case HERE.
Learn more about our perspective on Holistic Listening HERE.
High CPU Consumption in IIS Worker Processes
Context:
High CPU consumption in IIS worker processes (w3wp.exe) can significantly impact the performance of your web applications. Here we will discuss identifying symptoms, initial troubleshooting steps, data collection methods, and the available tools.
Symptoms:
When IIS worker processes consume high CPU, you may notice:
Slow response times or timeouts in web applications.
High memory usage accompanying the high CPU.
Overall server performance degradation.
Initial isolation:
Troubleshooting high CPU involves two main tasks: issue identification, and log collection and analysis. Here are some isolation questions to help you get started with the IIS worker process:
Is high CPU accompanied by high memory usage or slowness? Observe the memory consumption along with CPU consumption.
Describe the steps leading to high CPU usage. Does the CPU consumption increase with load?
What is the CPU consumption by w3wp.exe alone and the total CPU consumption on the server during the issue? If overall CPU is high, then there may be an issue with server performance.
How does CPU consumption change over time? If it spikes up and down, that is usually expected as code executes. Keep monitoring, and if it stays high for 5-10 seconds or more, there is a scenario to troubleshoot further.
How often does the problem occur and how high does the CPU get? Observe any pattern or specific time; perhaps a scheduled task is executing or the load increases.
Data Collection
For .NET Core applications hosted on IIS, the data collection method depends on whether the app is in-process or out-of-process. In-process mode is recommended over out-of-proc.
Caution 1: For CPU analysis, either ETW trace or memory dumps can be useful. Avoid collecting both simultaneously to prevent additional CPU load or slowness.
ETW Trace vs. Memory Dumps:
ETW Trace: Useful in production scenarios with minimal performance impact. Captures specific events and performance counters.
Memory Dumps: Provides an in-depth view of objects, their values, and roots. Useful for detailed analysis.
Caution 2: Avoid collecting logs at very high CPU values (>=95%) as the server might hang or become unresponsive. Collect logs before CPU reaches that level.
The initial goal is to get logs that enable us to watch the operations on the same non-waiting thread(s) over a portion of the problematic time period where w3wp.exe CPU is highest. Therefore:
Multiple dumps are needed (3 is usually a good number).
Dumps should be taken of the same process ID.
Dumps should be close enough in time (usually 10 seconds apart).
Collect dumps within a CPU usage that is considered high and abnormal.
ETW Steps – PerfView
We will use the PerfView tool to collect traces.
Download PerfView from here.
Run PerfView.exe as admin.
During the occurrence of the issue:
Click the Collect Menu and select Collect option.
Check Zip, Merge, and Thread Time checkboxes.
Expand the Advanced Options tab and select the IIS checkbox.
Start collection and stop once the issue is produced.
Note: Do not capture this trace for too long as data gets overwritten once the circular MB limit is reached.
Memory Dumps Steps
We will use either Procdump.exe or the DebugDiag Collection tool. Procdump is preferred when installing applications is not possible.
Caution 3: Do not use Task Manager to collect dumps. Specialized tools like DebugDiag and Procdump provide more detailed information and handle process bitness correctly.
Option 1: Procdump
Download Procdump.exe from here.
In an elevated command prompt, go to the directory where you have extracted the downloaded procdump.zip and run the command:
procdump.exe -c w3wpCPUConsumptionTrigger -s NumOfSeconds -ma -n NumberOfDumpsToCollect PID
-c: The w3wp.exe CPU consumption threshold. Replace w3wpCPUConsumptionTrigger with the threshold, say 80 (percent).
-s: Number of consecutive seconds the CPU consumption must stay above the threshold. Replace NumOfSeconds with the number of seconds, say 10.
-ma: Full memory dump.
-n: Number of dumps to collect. Replace NumberOfDumpsToCollect with the number of dumps to collect; usually this is set to 3 for conclusive analysis.
PID is the process ID; you can find it in Task Manager or in the IIS Worker Processes view in IIS Manager.
Command example: procdump.exe -c 80 -s 10 -ma -n 3 1234
Option 2: DebugDiag
Install DebugDiag from here.
Open DebugDiag Collection from the start menu.
Change the path for dumps if needed via Tools > Options And Settings > Manual Userdump Save Folder.
Click the “Processes” tab.
Locate the w3wp process by its Process ID.
Right-click the process and select “Create Userdump Series”, then set the options.
Click “Save and close” to start generating dumps.
Finally: By following these steps, you can effectively isolate the issue and collect the necessary data for high CPU consumption in IIS worker processes. If you are comfortable analyzing this data, you should be able to draw insights from the PerfView traces and dumps. If you want us to do that, please open a case with Microsoft IIS Support and we will do it for you.
Quick Items to Review for a Smooth Microsoft 365 Copilot Deployment
As Microsoft 365 Copilot continues to transform the way people work, it is more important than ever to have a governance structure in place for your data. This guide will help you ensure a smooth deployment and maximize the benefits of Microsoft 365 Copilot as quickly as possible.
Archive or Purge Stale Data and Sites
Purging or archiving data from SharePoint that is no longer being used is the cornerstone of any data governance strategy. The links below highlight some of the native tools in Microsoft 365 to help manage stale content.
Use Expiration policies for Microsoft 365 Groups so users can easily participate in the lifecycle of their data.
Set expiration for Microsoft 365 groups – Microsoft Entra ID | Microsoft Learn
Create Lifecycle policies to identify inactive sites (SharePoint Premium required).
Manage site lifecycle policies – SharePoint in Microsoft 365 | Microsoft Learn
Configure Retention Policies to automatically expire and purge old content.
Configure Microsoft 365 retention settings to automatically retain or delete content | Microsoft Learn
Configure Microsoft 365 Archive for sites that are no longer being used, but need to be preserved for compliance or other reasons.
Manage Microsoft 365 Archive – Microsoft 365 Archive | Microsoft Learn
Review Sharing Settings
Below are settings a SharePoint administrator can configure to manage how users can share content with others.
If enabled, configure expiration for Anyone Links to ensure they are removed automatically.
Review Default Link Type to help guide users to the most appropriate experience for your organization.
Use sensitivity labels and configure a “default” in SharePoint to automate protection settings.
Configure a default sensitivity label for a SharePoint document library | Microsoft Learn
Find Overshared Data
Configure Purview policies to look for and restrict access to potentially overshared files. Typical policies for Microsoft 365 Copilot will look for:
Passwords
API keys
Personal information
What DLP policy templates include
Data loss prevention policy tip reference for SharePoint
Review reports for content shared with “Anyone” or “People in the organization” (SharePoint Premium required)
Data access governance reports for SharePoint sites
Manage SharePoint Search
Consider Restricted SharePoint Search as a temporary solution as you finish auditing and cleaning up permissions. Restricted Search will prevent users from discovering content that they have not interacted with; see the link below for details.
Restricted SharePoint Search
If applicable, remove sites from the SharePoint Search index.
Semantic Index for Copilot
By following the steps outlined in this guide, you’ll be well-prepared to leverage the full potential of Microsoft 365 Copilot. These essential reviews will help you streamline your data governance processes and ensure a smooth deployment, without impacting collaboration.
Auto response message
Hello,
We are currently migrating to Exchange 365 as our new mail filter. Our current mail filter has the option to send an auto-reply (for each incoming message) for chosen mailboxes.
As far as I know, Exchange 365 only supports the out-of-office feature for sending automated messages to external persons.
Is there an option to set up automated messages as in our current mail filter, either in Exchange or maybe in third-party tooling?
Kind regards,
Arjan
Using Copilot for Excel to create a chart
Hi everyone, over the last few weeks we have had a series of posts to show you some of the things that are possible to do with Copilot in Excel. I have a table that has life expectancy figures by year for both men and women.
With so much data, it is hard to visualize, so I ask Copilot:
Create a line chart of average life expectancy by year, with one line for men and another line for women
I click on the “Add to a new sheet” button in the Copilot pane, and the chart is inserted into my workbook.
Over the coming weeks I will continue to share more examples of what you can do with Copilot in Excel.
Thanks for reading,
Microsoft Excel Team
*Disclaimer: If you try these types of prompts and they do not work as expected, it is most likely due to our gradual feature rollout process. Please try again in a few weeks.
Service Issue Error Message
Hi, I keep getting an error message that states “Service Issue: this app could not be protected due to an issue with the Intune Service. Please try to sign in again in a few minutes.” I’m only getting this error with my work Teams app on my iPhone; all other Microsoft apps are working fine. Has anyone seen this before, and do you know how to fix it? It’s been saying this for two days now.
Synapse Serverless SQL Pool Processing Too Much Data From Cosmos
Hi,
We have an online platform for our clients that currently stores data they input in Cosmos, in JSON format, including some nested JSONs.
For these clients, we also provide a variety of Power BI reports based on this Cosmos data. To retrieve and sometimes transform the data from Cosmos, we use Synapse T-SQL queries to create views from the Cosmos data, that Power BI can then link to.
All Synapse queries use the following to flatten the Cosmos data into a suitable format for Power BI:
– OPENROWSET – to access the data in a specific Cosmos container
– CROSS APPLY/OUTER APPLY – to access data stored in nested JSONs
– WHERE clause to reduce the amount of data stored in the final view that is accessed by Power BI
However, we are encountering an issue with the amount of data processed each time a report is refreshed.
It appears that, despite the WHERE clause, OPENROWSET retrieves all of the data from the container before filtering the rows. For example, if the Cosmos container stores 16,000 rows and querying them all would require 3 GB of data to be processed, then even with a WHERE clause that reduces the final result to 1,000 rows, 3 GB of data is still processed each time the query runs.
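To illustrate, here is a simplified sketch of the query shape described above (the account, container, and column names are hypothetical, and this is only a sketch of the pattern, not one of our actual views):

SELECT c.clientId, o.orderDate, o.amount
FROM OPENROWSET(
    'CosmosDB',
    'Account=ourCosmosAccount;Database=PlatformDb;Key=<key>',
    ClientContainer
) WITH (
    clientId VARCHAR(50) '$.clientId',
    orders VARCHAR(MAX) '$.orders' -- nested JSON array kept as text for OPENJSON
) AS c
CROSS APPLY OPENJSON(c.orders) WITH (
    orderDate VARCHAR(30) '$.orderDate',
    amount FLOAT '$.amount'
) AS o
WHERE o.orderDate >= '2024-01-01';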
Consequently, we are encountering scenarios where a Power BI report containing fewer than 100 rows of data still processes massive amounts of data in the Synapse queries, incurring high costs.
As we have multiple queries running multiple times per day, we are facing escalating costs, e.g.
3GB per query x 15 queries x 5 refreshes per day = 225 GB data processed per day, when it should be significantly lower. This example is also conservative as it is based on test data rather than live, and it is after we have reduced the number of daily refreshes for our clients to reduce the costs.
As data is added to the container each month, the amount processed and the cost is only increasing.
Furthermore, this has led to situations where we have exceeded our Synapse capacity, causing the Power BI reports to be unable to refresh at all and preventing clients from reviewing their data.
Does anyone have any suggestions as to how we can limit the data being returned in the OPENROWSET clause so it does not process the whole Cosmos container’s worth of data when a query is run?
Clarification on Tracking Email Activity with Non-Microsoft Clients in Microsoft Graph API Reports
Dear Microsoft Support Team,
I am seeking clarification regarding how email activities are tracked in Microsoft 365 usage reports, specifically through the getM365AppUserDetail and getOffice365ActiveUserDetail Graph API endpoints.
We have observed cases where users show Exchange Online activity in the getOffice365ActiveUserDetail report, but no Outlook usage is reported across any platform (web, Windows, Mac, or desktop).
Our question is whether email actions taken using non-Microsoft clients (such as via SMTP or IMAP) are included in these reports. While we understand that activity within Microsoft 365 apps like Outlook and Exchange Online is tracked, we would like to confirm if email activity via third-party clients, PowerShell scripts, or external SMTP servers is reflected in the user activity reports, particularly when no Outlook usage is present.
Thank you for your assistance, and we look forward to your clarification.
Best regards,
Data of the origin email is not shown in the header after applying transport rules
Hello, I have an email account (Exchange Online) that has several forwardings applied to it through a transport rule, and when the email arrives in the mailbox, the original sender’s data does not appear in the header. This causes the anti-phishing filters not to be applied. For example, if that account receives an email from Gmail, after the forwarding rules are applied, the Gmail data does not appear in the header. I have read that this can be solved by configuring Enhanced Filtering for Connectors in Microsoft Defender. The problem I have is that this option applies to inbound connectors, and my email account does not go through any inbound connector. Does anyone know if there is a solution to keep the data of the original email in the header?
Thanks.
Issue creating an Azure deployment pipeline
Context
(I am trying to take over an application that the initial developer has left undocumented and almost no knowledge transfer. So, I’m exploring it bit by bit and I am not necessarily aware of the reasons for past decisions.)
The backend is a .NET Web API solution based on Entity Framework, developed with Visual Studio and hosted in Azure Web App deployment slots.
The frontend is an Angular/Nx application developed in Visual Studio Code and hosted on Azure Storage Accounts.
Both parts have a Staging and a Production environment (in addition to the local environment).
The code is also in a GitHub repository.
But many aspects of intended deployment are unknown.
Issue
After some reading and watching some videos, I think that Azure Pipelines could be a solution here. I went to portal.azure.com and searched for Pipelines; the Pipelines list is empty and I cannot create one from there. I went to dev.azure.com, but when I clicked on “Sign in”, I was redirected to portal.azure.com. I went back to dev.azure.com and tried the option “Start for free” (or something like that); it then asked for a company name. I stopped there because this is getting too confusing and I don’t want to create a wrong account in the name of the company.
Question
Should the company be able to use Azure deployment pipelines, or is Azure DevOps a completely different beast for which we should create a new account?
How am I supposed to create the pipeline?
And if you have any extra information that would help to solve the issue, it is welcome.
(Also, after this, when I went back to dev.azure.com, I was redirected to a blank page. But this seems to be solved now.)
(I tried to ask this on Azure Q&A, but it was instantly dismissed at posting time for “violating the Code of Conduct”. I checked the Code of Conduct but can’t pinpoint the reason.)
10 great ways to use Figma in the classroom – a partnership between Figma and Microsoft Education
Today’s guest post is from Lauren McCann from Figma Education
————————————————————————————
It’s back-to-school season, and we’re thrilled to announce our partnership with Figma to bring their professional-grade design and collaboration tools to Microsoft 365 schools! Administrators and school leaders can apply here to get started.
In today’s fast-paced world, skills like collaboration, creativity, and problem solving are more essential than ever. That’s why we’re offering free access to Figma and FigJam enterprise tiers for all K-12 educators and students within Microsoft schools. Figma and FigJam are design and collaboration software used by professional designers, engineers, and makers of all kinds. They can support students in building together—in a fun, interactive space that simultaneously prepares them for future career opportunities.
5 Creative Ways to Engage Students with FigJam
FigJam, Figma’s digital whiteboard, is a versatile collaboration tool that can bring a fresh dynamic to classrooms. Its intuitive design, flexibility, and joyous features make it perfect for a wide range of activities across subjects. Teachers and students have access to sticky notes, stamps, pen tools, diagramming tools, and interactive multimedia to make learning fun, engaging, and collaborative.
Here are five creative ways teachers can use FigJam to engage students and foster learning:
Community building ‘Get to Know Me’ Activities
At the start of the school year or when forming new groups, use FigJam to help students get to know one another. By using FigJam’s sticky notes or drawing tools, students can easily share their hobbies, favorite books, or fun facts in a more interactive way. This sets a positive classroom culture and helps create a sense of community right from the beginning.
Try out these templates:
All About Me activity
Let Me Introduce Myself Activity
Classmate Scavenger Hunt
Shoe Rack Icebreaker Activity
ELA/History Activities
For English Language Arts or History, FigJam is a great tool for dynamic graphic organizers, brainstorming, connecting ideas and themes, planning, and organizing thoughts. Students can collaboratively map out plotlines of novels, create character analysis boards, or even build timelines of historical events. Teachers can assign group projects where students organize key ideas from a text or debate historical perspectives, all while using FigJam’s collaborative features to capture their ideas in real time.
Try out these templates:
Dynamic Vocabulary List
Book Recap Activity
Classroom Debate Protocol
Student-led Book Review Cards
Story Arc Pre-Writing Planner
Book Chat Activity
Science and Math Activities
In STEM subjects, FigJam can be used for problem-solving, modeling, and visualizing complex concepts. For math, students can break down word problems by annotating them, using digital manipulatives, drawing diagrams, or collaborating on equations. In science, FigJam can be used for creating concept maps on topics like ecosystems or the periodic table. Teachers can also incorporate visual tools like flowcharts to show processes such as photosynthesis or the steps of a scientific experiment.
Try out these templates:
Base Ten Blocks
Scientific Method Flowchart
Virtual Lab Notebook
Math Card Game
Telling Time Activity
Classroom Connection and Brain Break Activities
Beyond academics, FigJam is perfect for fostering class spirit and creating bonding experiences. Teachers can organize quick games, icebreakers, or fun brainstorming sessions where students can contribute silly answers, vote on class nicknames, or design a collaborative mural. Activities like “Two Truths and a Lie” or classwide Pictionary are simple to set up and encourage participation, all while making learning fun.
Try out these templates:
Create Your Own Croc Charms
Friendship Bracelet Builder
Around the World Review Game
Choice Board Template
Pass the Doodle
Play a Game of Battleship
Check for Understanding Activities
FigJam’s collaborative board allows teachers to perform informal assessments in real-time. Use it for “exit tickets” where students post one thing they learned, or use the space for quick quizzes and polls to gauge understanding. This approach encourages participation and provides immediate feedback on student comprehension, helping teachers identify areas that need further clarification.
Try out these templates:
Bumper Sticker Exit Ticket
Text Me Your Takeaway Exit Ticket
Tweet Me Your Takeaway Exit Ticket
Inverted Pyramid Exit Ticket
Exit Ticket Bundle
Assignment Self Reflection
By incorporating FigJam into everyday classroom activities, teachers can make learning more interactive, engaging, collaborative, and fun for students across all subjects. Whether it’s for academic purposes or simply to build classroom community, FigJam’s versatile features provide endless possibilities for creative engagement.
5 Creative Figma Design Activities for Students
Figma is a powerful design tool that’s not only great for professional work but also for classroom activities that build both creative and technical skills. Teams across industries use Figma to create user interfaces and user experiences for websites, apps, and digital products. Figma provides tools to create wireframes, prototypes, and high-fidelity designs.
Here are five fun ways to use Figma to engage students while helping them develop future-ready skills and design literacy.
Student Trading Cards
Kick off the first week of class with an activity that combines getting to know your classmates with learning Figma basics. Have students create digital trading cards that mimic a social media-style profile. Using Figma’s Auto Layout and components, students can design profiles that showcase their interests, hobbies, and fun facts. By working in the same file, students get a sneak peek into each other’s design processes and styles. You can even set up a gallery to review each other’s work at the end!
Figma skill level: Beginner
Key Figma features: Auto Layout, Multiplayer Editing
Make a Heart
This activity is perfect for building confidence with Figma’s tools. Ask students to create a heart using the pen tool, layout grid, and various properties, then challenge them to customize it. It’s a quick way to introduce key Figma features that will come in handy for future design projects.
For a fun twist, students can animate the heart using Figma’s Smart Animate feature to make it pulse. This activity not only helps students get comfortable with design tools but also introduces them to animation and motion design basics.
Figma skill level: Beginner
Key Figma features: Pen Tool, Properties Panel, Layout Grid
Figma Tangrams
Using the classic tangram puzzle, students can explore form, geometry, and creativity in Figma. After a brief lesson on the history of tangrams, have students arrange the seven traditional pieces into recognizable animals or other shapes.
Encourage them to experiment by creating abstract designs as well. For an extra challenge, students can design their own reusable tangram sets by researching other tangram puzzles, pushing their creativity and research skills.
Figma skill level: Beginner
Key Figma features: Components, how to rotate and align
Loading Animations
This activity taps into a familiar experience: waiting for a page to load. Ask students to reflect on their favorite (and least favorite) loading animations and then design their own. This exercise gets them thinking about how to visually represent waiting or processing within an interactive app. It’s a great introduction to more conceptual and technical work in interaction design.
Whether it’s a simple spinning icon or a more elaborate animation, students can get creative and dive into the user experience aspect of design.
Figma skill level: Intermediate
Key Figma features: Prototyping, Smart Animate
Mobile Magazine Prototype
For this lesson, students create a simple mobile layout for a magazine, combining layout and prototyping to design an interactive experience. After laying out the gallery of magazine content, they can prototype navigation to simulate how users would flip through the pages.
Figma skill level: Intermediate
Key Figma features: Layout Grids, Prototyping
These activities let students jump into the role of a UX designer. Each project not only teaches students key design skills in Figma but also sparks creativity and collaboration. Whether it’s creating custom trading cards or designing interactive prototypes, students will walk away with practical skills and a deeper understanding of the design process.
Lauren McCann
Head of Figma Education
HelpDesk role assigned a group to manage a dynamic group
We are implementing a solution to manage a users group for each country, assigned to that country’s helpdesk group.
the idea is:
USA helpdesk group can manage USA users
FR helpdesk group can manage FR users
and so on
We created a “USA helpdesk” group (with some members) and a “USA users” group with dynamic membership.
We need to match the HelpDesk role, the “USA helpdesk” group, and the “USA users” group.
How can we do this?