Author: Tony Redmond
Version 1.5 of the Microsoft 365 User Password and Authentication Report
Microsoft Adds Last Used Property for Authentication Methods
The Microsoft 365 User Password and Authentication report is one of the scripts that I pay attention to and attempt to keep up to date as new developments emerge. The last version (V1.4) dealt with a change in how the default MFA method is reported; the version before added details of the authentication methods configured for user accounts.
Now it’s time to update the script again because Microsoft has refreshed the beta version of the list authentication methods API to add a last used date time property. Entra ID updates the property when an authentication method (SMS code, passkey, Microsoft Authenticator app, and so on) is used to authenticate an account. Looking at the dates in my tenant, I see last used dates going back to January 2023. There might be earlier dates noted for some authentication methods, but the point is that this information is now available.
The Value of the Last Used Property
The value of the last used property is simple: if you know which authentication methods are in active use, you can remove the unused methods from user accounts and so reduce the available attack surface for those accounts.
In any case, knowing when authentication methods are in active use for accounts is good information to have, especially if you want to encourage (“nag”) people to move away from weak secondary authentication methods like SMS and use something stronger, like the authenticator app or passkeys.
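As a sketch of how the new property might be used (assuming the beta cmdlet and an existing Graph connection with a suitable authentication methods read permission), this fragment flags SMS (phone) methods that haven’t authenticated an account in the last 90 days:

```powershell
# Sketch: flag phone (SMS) authentication methods not used in the last 90 days
# Assumes Connect-MgGraph has already run with an authentication methods read permission
$CheckDate = (Get-Date).AddDays(-90)
[array]$Users = Get-MgUser -Filter "userType eq 'Member'" -All
ForEach ($User in $Users) {
    [array]$Methods = Get-MgBetaUserAuthenticationMethod -UserId $User.Id
    ForEach ($Method in $Methods) {
        # Phone methods carry the phoneAuthenticationMethod OData type
        If ($Method.AdditionalProperties.'@odata.type' -eq '#microsoft.graph.phoneAuthenticationMethod' `
            -and $Method.LastUsedDateTime -and $Method.LastUsedDateTime -lt $CheckDate) {
            Write-Host ("{0}: SMS method last used {1}" -f $User.DisplayName, $Method.LastUsedDateTime)
        }
    }
}
```

The output makes a handy list of accounts to nag about moving to stronger methods.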
Updating the Script to V1.5
It was time to break out Visual Studio Code to update the Microsoft 365 User Password and Authentication script. The code uses the Get-MgUserAuthenticationMethod cmdlet to fetch authentication methods for an account. Each method has an identifier, and the more interesting information is found in the additionalProperties property (array). You’ll need at least the UserAuthMethod-MicrosoftAuthApp.Read.All Graph permission to access this information:
[array]$AuthMethods = Get-MgUserAuthenticationMethod -UserId $User.Id -ErrorAction Stop
$AuthMethods

Id
--
28c10230-6103-485e-b985-444c60001490
3ddfcfc8-9383-446f-83cc-3ab9be4be18f
338e704e-bb5c-4b0d-9c2e-458e630e4017
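To see what each identifier actually represents, a small sketch (assuming $AuthMethods holds the cmdlet output from a connected session) can dig the OData type out of the AdditionalProperties payload:

```powershell
# Sketch: examine the AdditionalProperties payload to see what each method is
# Assumes $AuthMethods holds the output of Get-MgUserAuthenticationMethod
ForEach ($Method in $AuthMethods) {
    $Type = $Method.AdditionalProperties.'@odata.type'
    Write-Host ("Method {0} is of type {1}" -f $Method.Id, $Type)
}
```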
Microsoft updated the Microsoft Graph PowerShell SDK to V2.32 earlier this month. So far, the release has proven stable, and I haven’t run into new problems. It includes the Get-MgBetaUserAuthenticationMethod cmdlet, which returns the last used property:
[array]$AuthMethods = Get-MgBetaUserAuthenticationMethod -UserId $User.Id -ErrorAction Stop
$AuthMethods

Id                                   CreatedDateTime     LastUsedDateTime
--                                   ---------------     ----------------
28c10230-6103-485e-b985-444c60001490 30/05/2020 07:48:05
3ddfcfc8-9383-446f-83cc-3ab9be4be18f
338e704e-bb5c-4b0d-9c2e-458e630e4017                     04/08/2025 06:27:29
Not all authentication methods update the created date and last used date properties, but enough do to make the properties worthwhile.
The interesting thing here is that the cmdlet now surfaces the created date time as a property instead of an item in the additionalProperties array. This change is likely due to an update to the underlying Graph API metadata, and it could result in some scripts breaking if, as expected, the change makes it through to production. I certainly had to make some code changes to accommodate the change in how the created date is exposed. Figure 1 shows some example output where the last used date is reported for two authentication methods.

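To guard against the created date moving between the typed property and the additionalProperties array as the change works its way to production, a defensive pattern (a sketch, not the script’s actual code) is to check the typed property first and fall back to AdditionalProperties:

```powershell
# Sketch: fetch the created date whether it's a typed property or in AdditionalProperties
# Assumes $AuthMethods holds the output of Get-MgBetaUserAuthenticationMethod
ForEach ($Method in $AuthMethods) {
    $Created = $Method.CreatedDateTime
    If (-not $Created) {
        # Fall back to the additionalProperties payload used by the V1.0 cmdlet
        $Created = $Method.AdditionalProperties.createdDateTime
    }
    Write-Host ("Method {0} created {1}" -f $Method.Id, $Created)
}
```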
It would be nice if the data provided for every authentication method was consistent, but it’s not. It’s just another challenge to solve when working with Graph data.
New Version Available from GitHub
The updated (V1.5) version of the script can be downloaded from the Office 365 for IT Pros GitHub repository. I make no claim of greatness for the code. It’s there for people to learn about how to access and use the Graph to interact with authentication methods. No doubt this will interest some and not others. Feel free to upgrade and enhance the code to meet your requirements.
Learn about managing Entra ID and the rest of the Microsoft 365 ecosystem by subscribing to the Office 365 for IT Pros eBook. Use our experience to understand what’s important and how best to protect your tenant.
Microsoft 365 Companion Apps Fail to Impress
Why do Microsoft 365 Companion Apps Even Exist?
Announced in message center notification MC1160180 (updated 30 September 2025, Microsoft 365 roadmap item 486856), the Microsoft 365 companion apps are a collection of apps designed to live in the Windows toolbar, each specializing in a single task. Currently, the suite spans the People, Files, and Calendar companions. Starting in October 2025, Microsoft began to install the companion apps automatically on Windows 11 devices that already have the Microsoft 365 desktop client apps.
According to Microsoft, “these lightweight apps integrate seamlessly with Microsoft 365, allowing users to efficiently look up contacts and navigate organization charts, locate files, view calendars and streamline workflows without distractions.” This text seems like a desperate justification for recreating three wheels. Why these apps exist when there are perfectly good other Microsoft 365 apps to do the same job is beyond me. The companion apps complicate an already complex app landscape.
I like to stay current with Windows, so the companion apps showed up earlier this month. Since then, I have struggled to make sense of what they do. The first thing I don’t like about the companion apps is their detachment from the rest of Microsoft 365. Typically, I have Outlook (classic), Teams, the OneDrive sync client, and a bunch of browser apps running (SharePoint sites, admin centers, Planner, and so on). The Microsoft 365 apps share a perfectly good single-sign-on experience, but the companion apps do their own thing and insist on separate authentication. It’s a jarring start.
The Files Companion App
The Files app depends on OneDrive for Business and is able to list cloud files owned or shared by the signed-in user (sounds a lot like the OneDrive browser client). You can view and share files or open the location where a file is stored. The single party trick I found was relatively fast searching. In Figure 1, I searched for Exchange Online and the app responded with alacrity.

But the big question is whether the Files app does enough to warrant keeping it around. After all, Microsoft 365 users have SharePoint search or the OneDrive for Business app or, if they have Microsoft 365 Copilot, Microsoft 365 Copilot Search. The latter is the best way that I have found to search for information, especially when it’s linked via a Copilot connector to important external websites.
The People Companion App
The People app is a way of browsing your Outlook contacts and the Entra ID directory with details of a contact presented through the Microsoft 365 user profile card (Figure 2). Once again, I wonder why I should use a separate app instead of Outlook. Or OWA? Or the new Outlook?

The Calendar Companion App
The Calendar app doesn’t even rate a screen shot. It’s a calendar app without the ability to create a new event or meeting. Opening the Outlook calendar in a new window gives access to more information and more capabilities.
Suppressing the Companion Apps
It didn’t take long to decide that the companion apps were toolbar clutter that I could live without. Tenant administrators can stop Microsoft 365 installing the apps by updating the companion apps setting in the Modern Apps settings tab of Deployment configurations in the Microsoft 365 apps admin center. By default, the setting is checked. Unchecking it stops the installation on workstations (Figure 3).

If the companion apps have already reached PCs, some PowerShell can clean things up by blocking the startup state for the companion apps in the system registry to stop the apps showing up in the toolbar. This code checks the registry to find the startup state for each app (0 = enabled, 1 = disabled) and disables the state for the three apps.
# Disable the People, Files, and Calendar Microsoft 365 Companion Apps from starting automatically
$RegistryKey = "HKCU:\Software\Classes\Local Settings\Software\Microsoft\Windows\CurrentVersion\AppModel\SystemAppData\Microsoft.M365Companions_8wekyb3d8bbwe"
[array]$AppStartUpIds = @("$RegistryKey\CalendarStartupId","$RegistryKey\FilesStartupId","$RegistryKey\PeopleStartupId")
ForEach ($AppStartupId in $AppStartUpIds) {
    Try {
        If (Test-Path $AppStartupId) {
            # Disable startup state for the app
            $AppName = ($AppStartupId -split '\\')[-1] -replace 'StartupId',''
            Write-Host ("Disabling startup state for the {0} companion app" -f $AppName) -ForegroundColor Green
            Set-ItemProperty -Path $AppStartupId -Name "State" -Value 1 -Type DWORD -ErrorAction Stop
        } Else {
            $AppName = ($AppStartupId -split '\\')[-1] -replace 'StartupId',''
            Write-Host ("Couldn't find path to disable startup for the {0} companion app" -f $AppName) -ForegroundColor Red
        }
    } Catch {
        Write-Error ("Failed to set State for {0} : {1}" -f $AppStartupId, $_)
    }
}
Write-Host "Completed suppressing the startup of the Calendar, Files, and People companion apps"
The apps are still present on the PC and can be started if the user wants to check them out.
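To confirm the change took effect, a quick check (a sketch against the same assumed registry paths) reads the State value back for each app:

```powershell
# Sketch: verify the startup state (0 = enabled, 1 = disabled) for each companion app
# Assumes $AppStartUpIds holds the registry paths built earlier
ForEach ($AppStartupId in $AppStartUpIds) {
    If (Test-Path $AppStartupId) {
        $State = (Get-ItemProperty -Path $AppStartupId -Name "State").State
        Write-Host ("{0}: State = {1}" -f $AppStartupId, $State)
    }
}
```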
Dump the Apps and Unclutter Your PC
I have no idea how long Microsoft will persist with the notion that these companion apps will improve the lives of Microsoft 365 users. The apps do nothing to keep me focused, streamlined, or any of the other fine words used as justification in MC1160180. But make your own mind up – and then dump the apps before Microsoft comes to their senses and cuts the apps in an effort to save engineering expenses.
So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive insights updated monthly into what happens within Microsoft 365, why it happens, and what new features and capabilities mean for your tenant.
Microsoft Won’t Dump Outlook for a New AI Client
New Management Wants to Reimagine Outlook but That Doesn’t Mean That the New Outlook Client is Dead

A Tom Warren report in TheVerge.com that Outlook is getting an AI overhaul under new leadership (reported here in an accessible form) certainly caused the imagination of some commentators to go into overdrive. Unfortunately, the conclusions reached are impractical and unlikely. Let me explain why.
The article reported that Gaurav Sareen, corporate VP of global experiences and platform at Microsoft, wants the Outlook developers to reimagine how Outlook can serve users by using AI to process email in a much more proactive manner than happens today.
Essentially, Outlook should be like a hyper-efficient assistant that processes email to relieve mailbox owners of the need to review and decide how to handle messages. According to the internal memo seen by Warren, Sareen wants developers to take a new approach: “Instead of bolting AI onto legacy experiences, we have the chance to reimagine Outlook from the ground up.” More importantly, Sareen wants work to happen faster with teams “prototyping and testing in days, not months.”
Senior managers have a habit of laying out grand plans when they take over new responsibilities. That’s OK, because it’s important to have a vision for where a product or technology is heading, so no one can criticize Microsoft executives for setting out how they think development teams should react to the current state of the market and customer demand.
However, Outlook is in the middle of a transition to fulfil the “One Outlook” vision of clients that deliver the same functionality on Windows, Mac, browsers, and mobile clients. The transition from Outlook classic is ongoing, and while I have been critical of the rate of progress and the implementation of some features (like the very slow export to PST), there’s no doubt that Microsoft is making progress. The eventual goal is to be able to transition away from Outlook classic by the time support for the classic client finishes in 2029.
The need to deliver certainty to corporate customers means that it makes zero sense for commentators to conclude that Microsoft will now dump the new Outlook in favor of some AI-infused client that must be designed from the ground up to replace Outlook classic, OWA, Outlook mobile, and the new Outlook.
Microsoft’s Outlook Commitment to Microsoft 365 Customers
Microsoft has a commitment to Microsoft 365 customers to deliver a solid version of Outlook as part of the Microsoft 365 enterprise apps suite. Changing course now to incorporate new AI-powered functionality might sound exciting, but it ignores the simple fact that many tenants don’t have Microsoft 365 Copilot licenses, and without Copilot and access to AI-powered features, the vision outlined by Sareen cannot be achieved.
I don’t think Microsoft is willing to give away Copilot licenses just to enable AI features in Outlook. That move wouldn’t go down well with shareholders who look at the massive investments made to build out datacenter capabilities for AI without a clear line of sight about how these investments will deliver revenues.
Microsoft is coy about how many Microsoft 365 Copilot licenses they’ve sold. No one knows how many Microsoft 365 Copilot licenses are in active use, but I’m willing to bet that the number of Copilot licensees is hundreds of millions removed from the number of Outlook users. The latest data from the Microsoft FY26 Q1 results indicate that Microsoft 365 has around 446 million paid seats. Let’s say that 400 million of these people use Outlook. That’s a lot of additional AI processing that might be required to deliver a new AI-infused Outlook client, which is why I think that any strategy based on dramatically increasing the amount of AI processing in Outlook will run into the cold brick wall of financial reality.
There’s also the need for Microsoft to deliver a client for Exchange Server after Outlook classic retires. This client is unlikely to have as many AI-powered features because it’s much harder to deliver those features in on-premises environments than it is in the cloud.
Evolutionary AI Additions to Outlook
What I think will happen is that Microsoft will continue to press ahead with its One Outlook strategy to equip the new Outlook with functionality equivalent to (and beyond) what is currently available in Outlook classic. It just makes sense for Microsoft to get Outlook to a common code base for multiple platforms.
At the same time, during the period up to 2029 when Microsoft’s committed support for Outlook classic ceases, Microsoft will implement important AI-powered features in Outlook classic to keep corporate customers happy.
I also believe that Sareen’s memo will force the Outlook development teams to respond with proposals to become more aggressive about bringing AI-powered features into Outlook based on the new Outlook framework. I don’t see any appetite for a third Outlook flavor over the next few years (Outlook classic, new Outlook, and Outlook AI++). That’s not how Microsoft works, especially in a space where they need to keep large corporate customers happy and don’t want to see support costs escalate due to client profusion.
Customer Support and Expectations Moderate Grand Plans
As noted above, new leaders invariably have new ideas about how to move products forward. Just because some of those ideas leak outside is no reason to conclude that Microsoft will suddenly switch course for a product used by hundreds of millions of people. The practical issues of customer expectations and long-term support are enough to moderate even the most radical of new leader ideas. I suspect that the same will happen here. Stay calm and take some happy pills.
Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
Office 365 for IT Pros November 2025 Update
Monthly Update #125 Now Available for Download

The Office 365 for IT Pros team is happy to announce the availability of the November 2025 update for the Office 365 for IT Pros (2026 edition) eBook. This is the 125th monthly update. We’ve also published an update for the Automating Microsoft 365 with PowerShell eBook (now at version 17.3).
Subscribers can download the updated PDF and EPUB files from Gumroad.com. The link in the receipt you received always fetches the latest files. See our FAQ for more information about downloading updates, and our change log for details of what’s changed in update #125.
Hype, BS, and Misunderstandings
Some of the commentary that appears on the internet is in a state of outrageous ignorance. Two recent examples involving Teams come to mind. The first was the revelation that attackers could extract and reuse access tokens from the local state file that Teams uses to track cookies and other metadata. The security researcher was very excited by this finding but quite forgot that an attacker needed physical control over a workstation to carry out the exploit.
The second was the hubris around the upcoming change to add automatic location updates so that when people come into the office, Teams will update their location to “Office” instead of the user doing so manually. This was interpreted as an example of employee surveillance, an assessment that rapidly fell apart once anyone with an ounce of sense and some knowledge about how Teams works looked at what actually happens. The sad quality of the material some publish in the pursuit of web page views…
Microsoft’s FY26 Q1 Results
Microsoft published their FY26 Q1 results on October 29, 2025. The Microsoft Cloud is now at an annualized revenue run rate of $196.4 billion (at a 68% margin) and the number of Microsoft 365 commercial seats seems to be around 446 million based on a 6% year-over-year growth (slowing gradually). No number was provided for the Teams user base, so we’re still stuck at the 320 million stated in October 2023. However, we did hear that Entra ID now has one billion monthly active users.
Apart from those numbers, there wasn’t much to get excited about from a Microsoft 365 perspective. All the vibe at the market analysts meeting was about how happy Microsoft is with Copilot’s progress. In reinforcing this impression, Microsoft misses no opportunity to push out data snippets that seem impressive but are pretty worthless.
Satya Nadella said: “Just nine months since release, tens of millions of users across Microsoft 365 customer base are already using Chat.” That seems good, but he didn’t specify how many of these people have licenses and how many use the free Microsoft Copilot Chat. The statement that “first party family of Copilots now has surpassed 150 million monthly active users” is similarly light on detail. For instance, what constitutes an active Copilot user?
Nadella went on to say that “Adoption is accelerating rapidly, growing 50% quarter over quarter, and we continue to see usage intensity increase.” That 50% growth appears impressive, but is the growth for free Copilot or the $360/year version? And without a base to measure against, it’s hard to know if Copilot grew from 40 to 60 seats or whether millions of seats are involved. Finally, the assertion that “more than 90% of the Fortune 500 now use Microsoft 365 Copilot” is another example of Microsoft’s undoubted skill at obfuscating market numbers because no one knows how many seats are involved and whether the Fortune 500 are seriously implementing Copilot or just kicking the tires.
Microsoft cited three customer examples: customers buying over 15,000 seats, one deploying 30,000 seats, and PwC with 200,000 seats. That’s not a lot to justify the $34.9 billion of capital expenditures in the quarter “driven by growing demand for our cloud and AI offerings.” I guess that spending so much to beef up datacenters for AI doesn’t matter so much when Microsoft is throwing off $45 billion in cash flow in a quarter.
I loved the assertion about “16 billion Copilot interactions audited by Purview.” Purview certainly captures audit events for Copilot interactions, but that’s not auditing. Unless you need some faux statistics, of course. And a Copilot interaction usually generates at least two audit events (prompt and response), so the big number isn’t quite as impressive as it seems.
Knowing How Technology Works
What all of this proves is that reading news published on the internet and taking everything at face value creates a certain impression. Reading the same news and knowing how Microsoft 365 works means that you’re not going to be caught out and impressed by bogus news or over-hyped data. The mission of Office 365 for IT Pros is to spread knowledge based on hard experience and expertise. Now on to monthly update #126.
Microsoft Issues Updated Guidance for Defender for Office 365 Licensing
Changes to MDO P2 to Remove Requirements to License All Shared Mailboxes
Last August, I wrote about the issue of unexpected costs for Microsoft 365 customers when Microsoft Defender for Office 365 Plan 2 (MDO P2) was enabled in a tenant because MDO P2 is included as a service plan in Office 365/Microsoft 365 E5 licenses. No administrator action is required to use MDO P2; the presence of an E5 license is enough to activate its protection.
According to the MDO service description (August 2025), when MDO P2 is used by a tenant, “licenses must be acquired for users or mailboxes falling under one or more of the following scenarios:
- All Exchange Online users on the tenant. This is because Plan 2 features and capabilities protect all users in the tenant.
- All shared mailboxes on the tenant.”
In other words, the presence of just one E5 license automatically invokes the need for MDO P2 licenses for every Exchange Online user and shared mailbox. Buying MDO P2 at $5/user/month to remain compliant quickly racks up a substantial bill.
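To put a number on the exposure under the old rules, a back-of-the-envelope sketch (the mailbox counts are illustrative; only the $5/user/month list price comes from the text) shows how quickly the bill scales with mailbox count:

```powershell
# Sketch: estimate the annual MDO P2 compliance cost under the old licensing rules
# Assumes the $5/user/month list price; mailbox counts are illustrative
$UserMailboxes = 5000
$SharedMailboxes = 250
$MonthlyCostPerMailbox = 5
$AnnualCost = ($UserMailboxes + $SharedMailboxes) * $MonthlyCostPerMailbox * 12
Write-Host ("Estimated annual MDO P2 cost: {0:N0}" -f $AnnualCost)
```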
Group mailboxes also benefit from MDO P2 protection, but the service description makes no mention of a license requirement for these mailboxes, despite the efforts made by Microsoft over the years to give group mailboxes equivalent functionality to shared mailboxes.
Removing Inconsistency and Incoherence
In short, inconsistencies and incoherence abounded in the MDO P2 licensing requirements. The MDO team agreed to take the issue away to see what could be done to improve matters, and now they’ve come back with a revised licensing scheme.
The big change is the removal of the requirement for MDO P2 licenses for all user and shared mailboxes when E5 licenses are present. The previous position was indefensible and it’s good that Microsoft agreed.
Instead of an “MDO P2 licenses required for all mailboxes” approach, Microsoft applies the “if you benefit from a feature, you pay for a feature” rule that already applied to MDO P1 licensing. The new licensing terms are shown in Figure 1:

Microsoft Defender for Office 365 P2 can be licensed through any of the following:
“Microsoft Defender for Office 365 Plan 2 standalone, Microsoft 365 E5/A5/G5, Office 365 E5/A5/G5, Microsoft Defender Suite/EDU/GOV/FLW, and Microsoft Defender + Purview Suite FLW provide the rights for a user to benefit from Microsoft Defender for Office 365 Plan 2.”
In other words, tenant administrators must decide which mailboxes should benefit from MDO P2 and then license those mailboxes accordingly. Licensing is automatic for accounts with E5 licenses because the MDO P2 service plan is already present. Shared mailboxes that tenants want to protect with MDO will need to be licensed.
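As a starting point for that decision, a sketch (assuming an Exchange Online PowerShell session is already open) lists the shared mailboxes in the tenant so that administrators can review which ones need MDO P2 protection:

```powershell
# Sketch: list shared mailboxes as candidates for MDO P2 licensing
# Assumes Connect-ExchangeOnline has already run
[array]$SharedMailboxes = Get-ExoMailbox -RecipientTypeDetails SharedMailbox -ResultSize Unlimited
$SharedMailboxes | Select-Object DisplayName, PrimarySmtpAddress | Sort-Object DisplayName
```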
Custom Policies Required to Scope MDO Coverage
Unless a tenant licenses every user and shared mailbox, the new licensing arrangement means that administrators must create custom scoped policies to enable the MDO P2 safe links, safe attachments, and anti-phishing features for target groups rather than using the scope of the default policy to “cover everyone.” The target group can include user and shared mailboxes.
In large tenants, several custom policies will probably be required to cover different target groups. Dynamic distribution groups aren’t supported for scoped policies, but dynamic Microsoft 365 Groups are. Using dynamic Microsoft 365 Groups creates the requirement for Entra P1 licenses for all users that are members of a dynamic group.
One issue is that the membership rules for dynamic Microsoft 365 Groups don’t offer an off-the-shelf way to find shared mailboxes. Shared mailboxes will need to be marked in some manner such as a value in a custom attribute to allow a membership rule to find and include their accounts in group membership. On the upside, a dynamic Microsoft 365 Group to find shared mailboxes for MDO protection can also assign the MDO P2 license to the mailboxes.
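One way to do the marking (a sketch; the choice of CustomAttribute10 and the “MDOP2” value are arbitrary examples) is to stamp a custom attribute on the target shared mailboxes and build the dynamic group’s membership rule on the matching Entra ID extension attribute:

```powershell
# Sketch: mark shared mailboxes for MDO protection with a custom attribute
# CustomAttribute10 and the "MDOP2" value are arbitrary example choices
Get-ExoMailbox -RecipientTypeDetails SharedMailbox -ResultSize Unlimited |
    ForEach-Object { Set-Mailbox -Identity $_.PrimarySmtpAddress -CustomAttribute10 "MDOP2" }

# Exchange custom attributes surface in Entra ID as extension attributes, so the
# matching dynamic Microsoft 365 Group membership rule would be:
# (user.extensionAttribute10 -eq "MDOP2")
```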
I can see why Microsoft has gone down the path of using custom scoped policies to target the mailboxes to receive MDO protection. It’s a feature that already exists and works, but I’m not sure how much use custom scoped MDO policies get in the real world because I have never used these kinds of policies. I’m also unsure about the amount of administrative effort that will be necessary to set up and maintain the policies, especially in large tenants.
Group Mailboxes Don’t Need MDO Licenses
No mention is made about the group mailboxes used by Microsoft 365 Groups. This might be because Microsoft 365 Groups come about through the creation of other Microsoft 365 objects, like Teams and group-connected SharePoint Online sites. By contrast, creating a shared mailbox is a standalone operation to support the work of a team or to preserve a leaver mailbox, so it could be argued that it would be unfair to insist on licensing the automatic operation. In any case, I suspect that some debate will continue on this point.
Guiding Principles
The new licensing arrangement for MDO P2 can be broken down into five guiding principles:
- MDO licenses are required for any mailbox (or rather, the user account that the mailbox belongs to) that comes within the scope of an MDO policy to enable features like safe links and safe attachments.
- The majority of MDO processing happens during mail flow delivery to mailboxes. If a mailbox comes within the scope of an MDO policy (including a policy covering all mailboxes), it gets the benefit of the MDO features. If the account isn’t within the scope of an MDO policy, it doesn’t.
- When considering the protection of shared mailboxes, only include shared mailboxes that actively receive external email and therefore need protection. Exclude shared mailboxes like those used to retain leaver data (use inactive mailboxes instead), defunct mailboxes (consider their removal), and mailboxes used exclusively to process internal email.
- MDO licenses don’t need to be assigned to the accounts that own shared mailboxes. All Microsoft requires is that the tenant has sufficient MDO licenses to cover the user and shared mailboxes that come within the scope of MDO policies.
- Accounts that benefit from MDO P2 features must be licensed for those features.
The new MDO licensing arrangement is better, but it requires more thought and action from tenant administrators, especially to configure and maintain policies to make MDO P2 features available to user accounts.
Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
Using the SharePoint Site Attestation Policy
Forcing Owners to Confirm Details of Their Sites
The site attestation policy is part of the site lifecycle management component of SharePoint advanced management (SAM). It’s also one of the SAM features available to tenants with Microsoft 365 Copilot licenses. The basic idea is to force site owners to periodically attest that the settings of their site, including its membership, remain valid. If the site owners can’t or don’t confirm the site details, SharePoint Online can enforce an action such as archiving the site.
Microsoft 365 roadmap item 494159 lists the site attestation policy as generally available from August 2025. However, that’s not quite the case, as the policy is still listed in the SharePoint admin center as a preview feature (Figure 1).

Imposing site attestation can clear out many sites that form the digital debris that clogs up Microsoft 365 tenants. Apart from releasing expensive SharePoint “hot” storage by moving the content of non-attested sites into “cold” archive storage, the biggest benefit is to remove the files held in these sites from Copilot processing. This reduces the risk that obsolete and incorrect information will find its way into Copilot responses and improves the overall quality of Copilot processing.
Configuring a Site Attestation Policy
Like the other site lifecycle policies, configuring a site attestation policy is pretty straightforward. The usual process is to configure a policy in simulation mode so that the policy runs to generate a report about the sites within the policy scope for administrators to review.
Scoping means defining what sites the policy should process, like all team-connected sites. In Figure 2, I combined several criteria to form a precise scope. You can select one or more container management sensitivity labels to use. Filtering by site creation source is interesting because it allows you to select sites created using methods like PnP, PowerShell, or the SharePoint admin center. Running the policy in simulation mode will create a report to tell you exactly what sites match the scope.

The policy configuration specifies how often the policy runs, who must attest sites, and what SharePoint Online should do if attestation doesn’t happen. In Figure 3, we see the configuration for an annual review where lack of attestation by site owners leads to sites being moved to Microsoft 365 Archive.

Given that most SharePoint Online sites are used with Teams and that many Microsoft 365 tenant administrators probably couldn’t differentiate between site owners and site administrators, I wonder if the configuration could be simplified to a single option that combines the two. Just a thought.
After running in simulation to identify any issues and making necessary tweaks, such as including or excluding certain sites, the attestation policy can be launched to do its business.
Site Owner Actions
Turning on the site attestation policy causes SharePoint Online to send Outlook actionable messages to site owners to ask them to confirm site details. I received 63 messages within ten minutes, including duplicate messages for a couple of sites.
The initial message (Figure 4, left) informs the site owners about their responsibilities and sets an attestation deadline. Pressing the “Yes, settings are accurate” button allows the owner to attest that everything is OK without leaving the message. Acknowledgement happens automatically by updating the same message (Figure 4, right).

You’ll notice that no button exists for a site owner to declare that the site settings are inaccurate. The assumption is that a site owner who can’t attest will simply ignore the messages sent by SharePoint Online. After three monthly warnings, SharePoint will enforce the action set in the policy. It would be nice to give site owners the ability to accelerate the process with an option to take the policy action immediately. Maybe that will come in a future release.
Removing Digital Debris is Goodness
Regular site attestation seems like a solid idea. Anything to remove debris from a tenant is goodness. One concern that I have is that moving a team-connected site to Microsoft 365 Archive does nothing to affect the team. Users won’t be able to access files in the SharePoint site, but shouldn’t an archive action process everything? After all, Teams supports team archiving.
Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
Modernizing Sensitivity Label Grouping for App Display
The End of Parent-Child Label Relationships
Message center notification MC1111778 (last updated 24 September 2025, Microsoft 365 roadmap item 386900) announces the modernization of sensitivity label grouping to a “dynamic architecture” consisting of labels and label grouping rather than parent and child labels. The new architecture supports moving sensitivity labels between groups “without losing referential integrity.” In other words, the settings of sensitivity labels remain intact when they are moved from one label group to another.
Removing the Last Vestiges of AIP
When Microsoft launched Azure Information Protection (AIP) labels in 2016, they adopted a two-tier parent-child model for organizing the display of labels. In this model, the parent label functions as a navigation location for child labels and cannot be applied to files. When sensitivity labels took over from AIP labels, the same arrangement was kept. In Figure 1, the Secret label is the parent and the All Company and My Team labels are its children.

When details of an assigned label are viewed in client user interfaces, the structure is displayed as Parent/Child (Figure 2).

The problem with the parent-child structure is its strict nature. Once a child label is created and deployed in active use, it becomes very difficult (if not practically impossible) to change the labeling structure to reflect current business requirements. The inflexible nature of the parent-child structure is the main reason why I never recommended its use to customers. It’s difficult enough to construct a workable labeling structure for a tenant without having to deal with inflexible groupings.
Public Preview and Migration
Microsoft is currently deploying the modern label architecture in public preview with the aim of attaining general availability in December 2025. New tenants created after 1 October 2025 must use the new architecture. No administrator action is required before general availability occurs, but it might be a good idea afterwards to review the current label structure to see if sensitivity labels can be presented in a more effective manner to end users.
When a tenant is upgraded, any existing parent-child groups are migrated to the new architecture. During the preview, if a tenant has parent-child label groups, they can use the manual migration method invoked from the Information Protection section of the Purview portal (Figure 3). Migration is an irreversible process, so take the time to read up before plunging ahead and migrate a set of sensitivity labels in a test tenant first.

Launching the migration is preceded by notification of what the expected outcome will be (Figure 4). My tenant has used sensitivity labels since their AIP predecessors and has accumulated many different sensitivity labels used for content protection and container management over the years, including two parent-child groups (for testing only).

The migration took just a few seconds, and the only difference seen afterwards is that the parent labels are now label groups and the child labels are members of those groups. The Secret parent viewed earlier became a label group and also a standalone sensitivity label. The standalone label takes the same name, GUID, and settings as the original parent label. Following the migration, I updated the display names of the affected labels and label groups to make their function obvious.
The new architecture exposes options in the Purview portal to move sensitivity labels into and out of groups. This is the big advantage of the change: administrators can now easily construct and change label groups according to business demands. For instance, I created a label group called Email Labels to organize the sensitivity labels most appropriately used for email to give additional guidance to end users. Figure 5 shows how the new label group appears in OWA.

Notice how all the sensitivity labels in the Email Labels group have the same label color. This might affect any carefully-crafted custom colors assigned to sensitivity labels in the past. Another important change is that the standalone labels moved into the label group have priority orders based on the priority assigned to the label group. Label priority is supposed to indicate the degree of confidentiality or sensitivity of files that labels are applied to, so some rearrangement of labels is probably needed here. A change in label priority can lead to an increase in document mismatch notifications, and that’s not a good thing.
Although you can move container management labels into label groups, there’s no point in doing so. First, organizations tend to have relatively few container management labels, so there’s no need for grouping. Second, the applications that use container management labels, like Teams and SharePoint Online, display container management labels in a simple list.
PowerShell Changes
A set of cmdlets in the security and compliance module supports sensitivity labels. The label settings manipulated by the cmdlets use the same properties to update label group membership as were used to associate a child label with a parent label. For instance, a label group has the isParent and isLabelGroup settings set to true:
$Label = Get-Label -Identity 'Email Labels'
$Label.Settings

[isparent, True]
[islabelgroup, True]
A sensitivity label in a label group has the isParent property set to false and the identifier for the label group in its ParentId property:
$Label = Get-Label -Identity '1b070e6f-4b3c-4534-95c4-08335a5ca610'
$Label.Settings

[contenttype, File, Email]
[isparent, False]
[parentid, 62acd157-1757-4361-9a53-71ea316279ca]
To move a label into a label group, run the Set-Label cmdlet and update the ParentId parameter with the identifier for the label group. Here’s an example of moving a label into the Email Labels group:
Set-Label -Identity 'Employee Confidential' -ParentId (Get-Label -Identity 'Email Labels').ImmutableId
To move a sensitivity label out of a label group, pass $null or the identifier for another label group as the parent identifier.
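Based on the Set-Label syntax shown above, moving a label out of a group might look like this sketch (the Document Labels group is a hypothetical example, not a label that exists in any tenant):

```powershell
# Remove the Employee Confidential label from its current label group
Set-Label -Identity 'Employee Confidential' -ParentId $null

# Or move the label into a different group (Document Labels is a hypothetical group name)
Set-Label -Identity 'Employee Confidential' -ParentId (Get-Label -Identity 'Document Labels').ImmutableId
```

As always with Set-Label, the change takes a little while to replicate to client applications.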
Heading to a New Architecture
Referring to a new way to manage sensitivity labels for display in applications as a new architecture is a stretch. However, it’s still a good change. It will take time for tenants to figure out how to best use label groups, but that will come in time. In the meantime, the task is to migrate to the new architecture, either manually or by waiting just a few more weeks.
Auto-Updating Teams Work Location is Not Employee Monitoring
Setting Teams Work Location by Reference to a Wi-Fi Network
I’m amazed at some of the commentary flowing from MC1081568 (last updated 24 October 2025, Microsoft 365 roadmap item 488800) about a new Teams feature to automatically set a work location based on connecting to a Wi-Fi network or known peripherals such as Teams Rooms devices. The way some people described it, you’d think that this is tantamount to Microsoft making a method available for managers to keep an eye on employee work habits. The simple truth is that automatic work location detection is not employee monitoring, and anyone who thinks that it is reveals a woeful lack of knowledge about how Teams works.
Setting work location has been a feature in Teams and Outlook for quite a while (Figure 1). The idea is that people can collaborate more effectively with co-workers if everyone knows where everyone is. Knowing where people are is important from a support perspective too, especially when Teams Phone serves as the corporate phone system.

Today, users must set their location manually. I forget to do so as a matter of course, just like I suspect many others do. But Teams knows when people connect to a work network. At least, it can if automatic detection is configured in Microsoft Places. In addition, the tenant must configure a Teams work location detection policy to enable automatic detection because by default, the feature is off.
Managing the Work Location Detection Policy with PowerShell
To configure the policy, connect to Microsoft Teams PowerShell and either run the Set-CsTeamsWorkLocationDetectionPolicy cmdlet to switch automatic detection on by default for all users or (recommended) run the New-CsTeamsWorkLocationDetectionPolicy cmdlet to create a new work location detection policy and assign that policy to the users you want it to apply to. This command creates a new policy:
New-CsTeamsWorkLocationDetectionPolicy -Identity AutoDetectNetwork -EnableWorkLocationDetection $true
To assign the policy to user accounts, use the Grant-CsTeamsWorkLocationDetectionPolicy cmdlet:
Grant-CsTeamsWorkLocationDetectionPolicy -Identity Lotte.Vetler@office365itpros.com -Policy AutoDetectNetwork
The Get-CsTeamsWorkLocationDetectionPolicy cmdlet reports which work location detection policies enable automatic detection:
Identity              EnableWorkLocationDetection
--------              ---------------------------
Global                False
Tag:NetworkDetectOn   True
Tag:AutoDetectNetwork True
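To report which accounts a custom policy is assigned to, something like the following sketch should work, assuming that the TeamsWorkLocationDetectionPolicy property returned by Get-CsOnlineUser is populated for accounts with a direct policy assignment:

```powershell
# Sketch: find accounts assigned the AutoDetectNetwork policy
[array]$Users = Get-CsOnlineUser -ResultSize 5000 |
    Where-Object {$_.TeamsWorkLocationDetectionPolicy -eq 'AutoDetectNetwork'}
$Users | Sort-Object DisplayName | Select-Object DisplayName, UserPrincipalName
```

Accounts covered only by the Global policy return a blank value for the property, so the filter finds direct assignments.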
It’s important to remember that Teams clears location information at the end of the working day and does not update locations outside working hours (based on Outlook settings).
Keeping an Eye on User Locations
For those who suspect that managers will monitor their locations to check where people are, my response is that managers can do this today by checking the user profiles for their employees where their location is displayed (Figure 2).

Having been a senior manager in several organizations, my view is that any manager who devotes time to this kind of checking needs to reevaluate how they allocate their time. It might be justified when monitoring a problem employee, but not otherwise. If people are really worried about management oversight, they can use the Teams browser or mobile clients: detecting location automatically only works for the Teams desktop clients for Windows and macOS.
Privacy is Important
People are right to worry about their privacy, and they should understand the potential impact of new functionality on how they work. In this case, I don’t think that there’s much to complain about. There are better tools available if an organization wants to monitor employee productivity. Automatic work location detection by Teams to register if someone is in the office is not going to worry the people who build employee monitoring software. It shouldn’t worry you either.
Learn about managing Teams and the rest of Microsoft 365 by subscribing to the Office 365 for IT Pros eBook. Use our experience to understand what’s important and how best to protect your tenant.
Stealing Access Token Secrets from Teams is Hard Unless a Workstation is Compromised
French Security Company Highlights Stealing Teams Access Tokens from the Local State File
On October 23, 2025, a French security company called Randorisec published an article about stealing Microsoft Teams access tokens in 2025. Over the next few hours, I received several messages asking if the news as reported was serious and required action. My response was “Nope.”
I don’t think that the article surfaces any new information. More importantly, the compromise as described is only possible if attackers first manage to gain control over a workstation running Teams. In that scenario, the problem is more serious than fetching a few access tokens to use to send messages with the Graph API. Let’s discuss what the article reveals and why I’m sanguine about its findings.
The Teams Local State File
The discussion centers on fetching content from the local state file used by Teams, which is found in:
%LocalAppData%\Packages\MSTeams_8wekyb3d8bbwe\LocalCache\Microsoft\MSTeams\EBWebView\Local State
The article explains how to fetch and decrypt cookies protected using the Chromium Data Protection API (DPAPI), which in turn are used to fetch access tokens. I’m not sure that there’s anything new here because I found several articles to explain the process (here’s a good example). Chromium-based browsers use JSON-formatted local state files to store information needed for browser sessions, including encrypted keys used to protect sensitive information like user passwords.
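Because the local state file is plain JSON, it’s easy to inspect. This read-only sketch lists the file’s top-level sections (the path assumes the standard Teams V2 installation location):

```powershell
# Read the Teams local state file and list its top-level sections (read-only)
$StatePath = Join-Path $env:LOCALAPPDATA 'Packages\MSTeams_8wekyb3d8bbwe\LocalCache\Microsoft\MSTeams\EBWebView\Local State'
$State = Get-Content -Path $StatePath -Raw | ConvertFrom-Json
$State.PSObject.Properties.Name
```

The encrypted keys held in the file are protected with DPAPI, so seeing a section name reveals nothing sensitive by itself.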
Why Does Teams Use a Local State File?
What people might not understand is why Teams uses a local state file to hold information about the current client configuration, software version, other client settings, and encrypted content (Figure 1). The answer is that the Teams V2 client architecture depends on the WebView2 component. WebView2 uses the Edge rendering engine to display content within apps, including Teams, the new Outlook for Windows, and features shared between Outlook clients like the Room Finder. Microsoft includes the WebView2 component with Office and other products.

Because the Teams clients are deeply integrated with WebView2, it makes sense to adopt other Chromium constructs, like the local state file and DPAPI, and that’s probably why you end up with a Teams-specific local state file that behaves much like the local state file used by Chromium browsers.
Access Tokens for Teams
Eventually, the researchers end up with access tokens that can be used to interact with Teams via the Graph API. Getting to the access tokens requires fetching them from the cookies SQLite database. This file is found in the %LocalAppData%\Packages\MSTeams_8wekyb3d8bbwe\LocalCache\Microsoft\MSTeams\EBWebView\WV2Profile_tfw\Network folder and is locked when a Teams client is active.
The assertion that they can use the tokens to send email is erroneous. As pointed out in the article, the tokens are for use with Teams, not Exchange Online, so the permissions granted in the tokens do not permit use of the Mail Send API.
Local State File is Inaccessible Unless a Device is Compromised
Don’t get me wrong. Security researchers do a great job of finding weaknesses in products before attackers figure out how to use those weaknesses to do damage. I applaud the efforts of the Randorisec team, but I just don’t think that there’s anything surprising to become too concerned about. The attempt to hype the problem by Cyber Security News is also regrettable. I wonder if either the researchers or reporter actually know anything about how Teams works, but hey, all publicity is good.
I keep on going back to the simple fact that before an attacker can access the Teams local state file and cookies database, they’ve broken into the workstation and therefore have full access to whatever’s on that device. In all probability, they can start the Teams client and can send chats and channel messages without needing to fetch and decrypt information.
The best defence is to stop attackers from compromising user accounts by deploying strong multifactor authentication. If you can do that, you shouldn’t need to worry about the details of Teams, WebView2, and the cookies file.
So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive insights updated monthly into what happens within Microsoft 365, why it happens, and what new features and capabilities mean for your tenant.
Allowing Users to Add Enterprise Apps to Entra ID is a Bad Idea
Reviewing Enterprise Apps is a Good Idea
Over the years, I have advised Microsoft 365 tenants to check and clean up enterprise apps regularly. Initially, the Graph APIs available to report information about enterprise apps weren’t too approachable and lacked some data. However, the situation has improved and it’s now easier to get a solid handle on the enterprise apps present in a tenant, the usage of those apps, and the permissions used by apps to access data.
Given that the original clean-up script dates back to April 2020, I’ve been writing a new script based on the Microsoft Graph PowerShell SDK to demonstrate how to generate review data. (Microsoft released V2.32 of the SDK on October 20, 2025; so far, the new version appears to be solid.) In any case, once I’ve finished tweaking the code, I’ll write up details about what the script does and release it via the Office 365 for IT Pros GitHub repository.
The Case of the Newly-Added Enterprise Application
One of the checks performed by the script highlights recently added service principals. After writing the code, I was interested to discover the presence of an enterprise app called GuideAnts, added on 15 October 2025 by my account. I couldn’t remember anything about adding such an app. Advancing age has a nasty habit of eroding immediate recall.
In any case, running an audit log search confirmed that my account had added the service principal (use the Search-UnifiedAuditLog cmdlet to search the audit log for events with the “Add service principal.” operation). Here’s an extract from the audit log:
Actor : {@{ID=Tony.Redmond@office365itpros; Type=5}, @{ID=1003BFFD805C87B0; Type=3}, @{ID=Azure ESTS Service; Type=1}, @{ID=00000001-0000-0000-c000-000000000000; Type=2}…}
InterSystemsId : e5fce0de-688c-4e1e-bf64-22d9246ba0e6
IntraSystemId : 00000000-0000-0000-0000-000000000000
SupportTicketId :
Target : {@{ID=ServicePrincipal_d448e5cc-80cc-4c95-8aca-356068dc2972; Type=2},@{ID=d448e5cc-80cc-4c95-8aca-356068dc2972; Type=2}, @{ID=ServicePrincipal; Type=2},@{ID=guideants; Type=1}…}
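For reference, the audit search looked something like this sketch (the date range and result size are my choices; check the exact operation string in your tenant before relying on it):

```powershell
# Search the unified audit log for service principal additions in the last 90 days
[array]$Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-90) -EndDate (Get-Date) `
    -Operations 'Add service principal.' -ResultSize 1000 -Formatted
$Records | Select-Object CreationDate, UserIds, Operations
```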
Having still no memory of doing such a thing, I exported my browser history and loaded the CSV file into PowerShell to check it:
$History = Import-CSV browserhistory.csv
$History | Where-Object {$_.PageTitle -like "*GuideAnts*"} | Format-Table DateTime, PageTitle, NavigatedToUrl
DateTime PageTitle NavigatedToUrl
-------- --------- --------------
2025-10-15T20:26:54.855Z GuideAnts Notebooks https://go.guideants.ai/access
2025-10-15T20:26:30.514Z GuideAnts Notebooks https://go.guideants.ai/login
2025-10-15T20:26:29.801Z GuideAnts Notebooks https://go.guideants.ai/
This is the kind of interaction captured when someone goes through the consent process to add an enterprise app (Figure 1) and consents on behalf of the organization. There was no doubt. I was the culprit.

This is an example of bad practice in action. I might have been tired, and I might have wanted to check out the app because I was writing about ISV AI-powered add-ins for Microsoft 365 at the time, but these are not acceptable excuses.
Consent Approval Workflow for Enterprise Apps
I violated my personal standards in three ways. First, I added an enterprise app without much consideration, perhaps because the permissions sought for the app were pretty benign. Second, I added an unverified app. Enterprise apps published by ISVs should go through the Microsoft verification process to give tenants some additional trust that the app comes from a reputable publisher.
Third, I used my administrator account. Had I used my normal account, I wouldn’t have been able to add an enterprise app because the tenant settings would block immediate app creation by users. Instead, a request to add the app would have gone through a consent approval workflow for approval by an administrator (Figure 2). Even if that administrator was me, being forced to go through the approval process might have caused me to think about why an enterprise app was needed, or to review the reply URLs used by the app and ask myself why these URLs are required.

We live and learn from our mistakes. I hope that I won’t make the same mistake again!
GuideAnts AI Notebooks
Apart from noting the unverified nature of the enterprise app, none of the above is criticism of the GuideAnts app (an AI-powered notebook). The app’s author is Doug Ware, an ex-MVP, who publishes some interesting AI-related content on Elumenotion.com. The app is currently in preview. You can read more about GuideAnts here and decide if you want its enterprise app to exist in your tenant. Use invite code 22VG6Y if you want to join the preview.
Learn how to exploit the data available to Microsoft 365 tenant administrators through the Office 365 for IT Pros eBook. We love figuring out how things work.
Updating the Entra ID Password Protection Policy with the Microsoft Graph PowerShell SDK
Use SDK Cmdlets to Create or Update Password Protection Policy Settings
A reader asks if the script written for the article about updating the Entra ID banned password list can be used to update other settings in the Entra ID password protection policy. The answer is “of course.” The code is PowerShell, and it can be adapted to update any of the password protection settings found in the Entra admin center (Figure 1).

A few considerations must be remembered when updating the Entra ID password protection policy:
- You don’t need additional licenses to use the default password protection policy. If you create a custom policy by updating settings, user accounts must be licensed with Entra P1 or P2.
- Custom password policy settings are immediately effective across the entire tenant. You can’t assign a custom password policy to specific users or groups.
- In a hybrid environment, password protection can extend to Active Directory.
Creating a Password Protection Policy
The underlying concepts for creating a custom password policy are similar to the management of other Entra ID policies (like the Microsoft 365 groups policy):
Check if a custom policy exists, or rather, a directory setting object created using the directory setting template for password rules. The template always has the identifier 5cf42378-d67d-4f36-ba46-e8b86229381d, so we can check if a custom password protection policy exists as follows:
$Policy = (Get-MgBetaDirectorySetting | Where-Object {$_.TemplateId -eq "5cf42378-d67d-4f36-ba46-e8b86229381d"})

A client-side filter is used because the Graph API does not support server-side filtering against template identifiers.
If a password policy object is not available, you can create one. The values for the policy settings are passed in a hash table containing an array of values; each value (a setting) is itself a hash table consisting of the setting name and its value. For example, this code creates the hash table to hold the setting for lockout duration:
$Value5 = @{}
$Value5.Add("Name", "LockoutDurationInSeconds")
$Value5.Add("Value", $LockoutDuration -as [int32])
After populating values for all settings (or just the ones that are different from the default), run the New-MgBetaDirectorySetting cmdlet to create the new custom password policy:
$NewBannedListParameters = @{}
$NewBannedListParameters.Add("templateId", "5cf42378-d67d-4f36-ba46-e8b86229381d")
$NewBannedListParameters.Add("values", ($Value1, $Value2, $Value3, $Value4, $Value5, $Value6))
$Policy = New-MgBetaDirectorySetting -BodyParameter $NewBannedListParameters -ErrorAction Stop
Updating the Password Protection Policy
If a custom policy already exists, fetch the policy settings, update the value for the settings that you want to change, and use the Update-MgBetaDirectorySetting cmdlet to update the policy. This example changes the lockout duration to 120 seconds (the default is 60 seconds):
[array]$PolicyValues = Get-MgBetaDirectorySetting -DirectorySettingId $Policy.Id | Select-Object -ExpandProperty Values
($PolicyValues | Where-Object {$_.Name -eq "LockOutDurationInSeconds"}).Value = 120
Update-MgBetaDirectorySetting -DirectorySettingId $Policy.id -Values $PolicyValues -ErrorAction Stop
The code for these operations is the same as used in the script to update the banned passwords list. Grab what you need from that script and repurpose it to do whatever you need to. For instance, some organizations like to validate that the password policy settings in the tenants that they manage are consistent and up to date. This is easily done on a periodic basis by creating a PowerShell runbook in Azure Automation. I imagine that checking the password policy would only be one of the Entra ID configuration checks that such a runbook would process. At least, that’s how I would do it.
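A runbook check along those lines could compare the current settings against an expected baseline. This is a sketch: the baseline names and values here are examples, not recommendations, and it assumes a custom policy already exists in the tenant:

```powershell
# Sketch: flag password protection settings that differ from an expected baseline
$Expected = @{ 'LockoutDurationInSeconds' = '120'; 'LockoutThreshold' = '10' }
$Policy = Get-MgBetaDirectorySetting | Where-Object {$_.TemplateId -eq "5cf42378-d67d-4f36-ba46-e8b86229381d"}
ForEach ($Name in $Expected.Keys) {
    $Current = ($Policy.Values | Where-Object {$_.Name -eq $Name}).Value
    If ($Current -ne $Expected[$Name]) {
        Write-Output ("{0}: expected {1}, found {2}" -f $Name, $Expected[$Name], $Current)
    }
}
```

In an Azure Automation runbook, Write-Output could be swapped for whatever alerting mechanism the organization uses.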
Next Step – Testing Configurations
The Maester utility includes some checks against the password policy and it would be easy to expand test coverage to whatever aspect of the password policy you consider needs to be checked. Once you’ve mastered programmatic manipulation of the Entra ID password protection policy settings, anything is possible.
Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365. Only humans contribute to our work!
Important Change Coming for Entra ID Passkeys in November 2025
Passkey Settings Behavior Change After Introduction of New Passkey Profiles
If your focus is on Entra ID or security, you probably agree with the statement that passkeys are the future for authentication. Or at least, the immediate next step. Who knows what might happen after passkeys are fully deployed? After all, it wasn’t so long ago that people congratulated themselves for using SMS messages for multifactor authentication.
In any case, message center notification MC1097225 (first published 17 June 2025, updated 20 October 2025) marks an important point in the evolution of passkey support within Entra ID. Where today Entra ID supports tenant-wide controls for passkeys as an authentication method, from November 2025 (December 2025 for government clouds), the preview Entra ID feature will support up to ten passkey profiles per tenant. The intention behind the change is to allow tenants to exert more granular control over which users can use what passkeys for authentication.
Granular control is usually goodness, and there’s goodness in this change. You’ll be able to create a passkey profile for departments or other groups and dictate what kind of passkeys the users within the scope of the profile can use.
Passkey Authenticator Attestation
A potential downside exists that should be understood before rushing to embrace the change. When a tenant opts in to use the new approach, Entra ID switches to a new schema to describe what passkey policies are. Logically enough, the existing passkey settings become the default passkey policy, and if the setting to enforce attestation is disabled, Entra ID will become less strict about the kind of passkeys it accepts as an authentication method.
Passkeys have an Authenticator Attestation GUID (AAGUID), a 128-bit identifier to identify the make and model. In enterprise environments, it is common practice to decide on a set of passkeys or FIDO2 keys that the tenant wishes to support. This decision is enforced by specifying the AAGUIDs in the passkey settings.
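To see which AAGUIDs are in use for an account, the Microsoft Graph PowerShell SDK can list registered FIDO2 methods. This sketch requires the UserAuthenticationMethod.Read.All permission, and the account name is an example:

```powershell
# List the passkeys/FIDO2 keys registered for an account with their AAGUIDs
Connect-MgGraph -Scopes UserAuthenticationMethod.Read.All -NoWelcome
Get-MgUserAuthenticationFido2Method -UserId 'Lotte.Vetler@office365itpros.com' |
    Select-Object DisplayName, Model, AaGuid
```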
But as part of the change to the new passkey schema, Microsoft says that “if Enforce attestation is disabled (in a policy), we (Entra ID) will start accepting security key or passkey providers using the following attestation statements:
- “none”
- “tpm”
- “packed” (AttCA type only)
- Custom attestation formats ≤ 32 characters
This will allow a wider range of security keys and passkey providers to be accepted for registration and authentication in Microsoft Entra ID.”
That doesn’t sound too serious, but it does mean that if your current passkey settings do not enforce attestation (Figure 1), anyone covered by the default policy created when the switchover happens will be able to choose whatever passkey type they like.

A Passkey Setting Worth Checking
Some tenants might not care very much about the non-enforcement of attestation. Others will care deeply because of the work they’ve done previously to figure out what kind of passkeys should be used within the tenant. In either case, it’s worthwhile considering the topic and deciding if attestation should be enforced.
Microsoft says that there’s no administrator action necessary for the change. It will be deployed automatically to tenants, and you might not realize that anything has happened if you don’t have the need to review authentication methods.
APIs Not Ready for Change
MC1097225 contains an important note: “If you continue using Graph API or third-party tools to modify the policy, the schema will not change until General Availability.” Remember, what comes in November is a preview and it takes time for APIs to catch up with change. Customers who have built tools to manage authentication methods can continue to use those methods until general availability happens, which will probably be in early to mid-2026 (my guess). When that happens, I guess I’ll revisit my password and authentication methods report script.
Automating Microsoft 365 with PowerShell November 2025 Update
Updated PDF and EPUB Files Available for Automating Microsoft 365 with PowerShell eBook
The Office 365 for IT Pros team is happy to announce the availability of version 17 of the Automating Microsoft 365 with PowerShell eBook. Updated PDF and EPUB files are available for download from Gumroad.com by subscribers of the Office 365 for IT Pros eBook and by those who bought the PowerShell book separately. Remember, when you subscribe to these books, you’re entitled to receive any updates we release for the edition.
We’re still working on the November 2025 update of the main Office 365 for IT Pros eBook and anticipate that it will be ready for subscribers to download on November 1, 2025.
Final Retirement of AzureAD and AzureADPreview Modules
This month marks the final retirement of the AzureAD and AzureADPreview modules. Microsoft made the original announcement about the retirement of these modules and the MSOL module on August 26, 2021. Fifty months and multiple postponements later, Microsoft has eventually managed to cajole, persuade, and force customers to dump the old modules to embrace the Graph. At least, Microsoft wants customers to replace old code with cmdlets from the Microsoft Graph PowerShell SDK or the Microsoft Entra module. Naturally, Automating Microsoft 365 with PowerShell is absolutely the best text to consult for anyone who needs to upgrade old scripts. The worked-out code examples are of great help when figuring out cmdlet syntax.
The Entra module is based on the Microsoft Graph PowerShell SDK. It features cmdlets to work with Entra objects like users, groups, and devices, with aliases to make the cmdlets work like their AzureAD equivalents, where such equivalents exist. I don’t recommend using the Entra module because I think it’s better that administrators and developers understand how to use the full Graph.
Paperbacks at TEC
The TEC 2025 conference was at the start of October. During the event (enjoyable as always), I ordered some copies of the paperback version of Automating Microsoft 365 with PowerShell for delivery to the hotel (Figure 1).

After looking at the Word and PDF versions of the book for months, I wanted to see how the content looked after going through Amazon’s print-on-demand process to verify that people who buy the paperback will be happy. I think they will because the quality surpassed my expectations. It’s definitely not in the same class as the production quality seen in books like the Microsoft Press Inside Out series, but the book is perfectly acceptable.
Point Updates
Those who pay close attention (or who have time to spare) might notice that point releases appear for Automating Microsoft 365 with PowerShell. For instance, the current release is version 17.2, two point releases on from version 17.0. Last month, we issued versions 16.0 through 16.4.
We issue point releases when we correct minor errors or add some material that’s important and we want readers to benefit from without waiting for a monthly update. Minor errors include grammatical and spelling errors, like an annoying “Get-MgServicePrincipall” discovered in V17.0. Code errors like an incorrect parameter also justify a point release, as does the inclusion of a new example. There’s no point in using electronic publishing if you can’t take advantage of the mechanism to improve the quality and content of the book on an ongoing basis.
Our release cadence poses problems for the paperback version because we obviously can’t update printed books. The books I had delivered to TEC 2025 were version 16.0 and the text printed on those pages will always remain the same. Such is the downside of committing words to print instead of an electronic medium.
Sharing Knowledge
We continue to add content to Automating Microsoft 365 with PowerShell. It’s become my go-to notebook to capture experiences, hints, and insights acquired by working with different Graph APIs and SDK cmdlets. It’s been quite a journey so far and I anticipate that there’s much more to come. Stay tuned.
New Audio-Only Recording Option for Teams Meetings
Audio-Only Recording to Protect User Privacy During Recording Playback
In 2023, mesh avatars were the focus for helping people who didn’t like to appear with their video turned on in Teams meetings. To some, it seemed utterly cool to be able to hand over their visual online presence to an avatar that they created with care to be broadly similar to their real self. Avatars are dumb (your voice remains your voice), but they can express some visual reactions to what happened during meetings.
Earlier this year, Microsoft released the ability to create a mesh avatar from a photo in an attempt to make the avatars more realistic. Figure 1 shows the avatar I created from my photo. My efforts didn’t create a very realistic digital presence.

The Avatars for Teams service plan is included with many Microsoft 365 and Office 365 products, so most of the Teams installed base of 320 million monthly active users can use avatars. According to the Teams Avatar app, 83.8K people installed the app to create or update their avatars in the last month, so interest remains in having a way to attend meetings in a visual sense without projecting our real self, flawed and imperfect as that might be.
Audio-Only Recording for Teams Meetings
Which brings us neatly to the news announced in Microsoft 365 message notification MC1173926 (16 October 2025) that the Teams meeting recording feature will soon be able to create an audio-only recording. Deployment of the feature to make it generally available has started and should be complete in late November 2025.
What’s interesting is that Microsoft says that making an audio-only recording for a meeting offers “a more comfortable and convenient recording experience.” Microsoft goes on to note that audio-only recording “alleviates concerns about facial information exposure when recording is necessary, offering a more privacy-conscious approach to recording meetings.”
I thought avatars were all about making the visual side of meetings more comfortable for users. However, it’s important to remember that using avatars is a personal choice to customize the video feed for people who opt to use avatars. Audio-only recording is a meeting option to suppress the video feed that flows into the meeting recording for all users, no matter whether they use Teams desktop, browser, or mobile clients. Participants can have their cameras turned on during meetings, but only the audio feeds will make it into the .MP4 file created in the meeting organizer’s OneDrive for Business account.
Suppressing the video feed for the recording means that anyone who plays the recording afterwards cannot see how the participants appeared during the meeting, including if any avatars are used. All the playback can deliver is the audio stream. This is what Microsoft means when they refer to a more privacy-conscious approach. It seems reasonable to say that if you’re not in a meeting, privacy of the participants is better respected if you cannot see how people appeared during the meeting.
The ability to generate a meeting transcript depends on the audio feed, so suppressing the video feed has no impact on the transcript.
No Administrative Controls
There doesn’t seem to be any administrative control in the Teams meeting policy for an organization to decide that audio-only recording is the default or only option for Teams meetings. Microsoft says that administrator intervention is unnecessary because audio-only recording “integrates into existing workflows.” In other words, “Audio only” is an option in a drop-down menu for an organizer to decide what to record for a meeting (Figure 2).

See the Microsoft documentation for more information about recordings for Teams meetings.
Little Value in the Video Stream in Recordings
It took me a little while to work out why Microsoft wanted to introduce audio-only recordings for Teams meetings. After thinking things through, I think this is a good idea. Few of us really want our visual appearance to be replayed in recordings, and it’s uncertain if the video stream adds much value to those who listen to recordings after an event. The transcript is a much more valuable artifact, especially if Microsoft 365 Copilot can reason over it to produce a summary and action items.
Learn about managing Teams and the rest of Microsoft 365 by subscribing to the Office 365 for IT Pros eBook. Use our experience to understand what’s important and how best to protect your tenant.
Outlook Gets AI Drafting of Meeting Agendas
Agenda Auto-Draft Available for OWA and the New Outlook
Microsoft is doing its level best to convince Microsoft 365 tenants to invest in Copilot. Given the massive capital investment in datacenters to power AI experiences, it’s unsurprising that engineering groups are busy infusing Copilot features into as many applications as possible. Features like Copilot memory add value and help dissuade tenants from investigating other options, such as the ChatGPT Connector for SharePoint Online and OneDrive for Business.
Of course, a SharePoint connector is limited when compared to the breadth of integration of Copilot across the Microsoft 365 apps. Because Copilot works well for some and not for others, work continues apace to find new ways to integrate AI in daily tasks. This brings me to message center notification MC1171854 (13 Oct 2025), which describes “Intelligent agenda suggestions for calendar events.” The feature is available now, but only to users with Microsoft 365 Copilot licenses.
Agenda Auto-Draft Uses AI to Generate Some Bullet Points
At first glance, I didn’t see much to get excited about. The description says that AI is used “to automatically generate a proposed agenda when users create or edit a calendar event, making it easier to align meeting goals, participants, and discussion topics.” I’ve never had any problems coming up with a few salient points for a draft meeting agenda, and agendas have a nasty habit of changing as soon as meetings start. However, I can see the value of being able to create some bullet points to frame an agenda.
What happens is that Microsoft has updated the calendar scheduling form to add an auto-draft agenda option to the set of prompts available through the Draft with Copilot button. When the auto-draft option is used, Copilot uses the meeting subject to generate an agenda composed of some introductory text and some bullet points. Copilot has always been good at generating bullet points in document and message summaries!
In Figure 1, the meeting subject is Review Chapter Updates for Office 365 for IT Pros. Copilot’s suggested agenda items seem reasonable, and it looks as if Copilot discovered that Office 365 for IT Pros is an eBook from information found internally or on the web (Bing search).

If the meeting organizer doesn’t like the draft agenda, they can simply instruct Copilot to retry or adjust the text by making the agenda longer or shorter. The changes proposed in further versions are not dramatic, likely due to using the meeting subject as the core input to the AI processing.
Eventually, the suggested text is accepted or rejected. If accepted, it can be further edited before the meeting notice is sent.
Now Available Worldwide
Auto-draft of meeting agendas is now a default feature that is enabled in OWA and the new Outlook. According to Microsoft, the feature was enabled worldwide from October 9, 2025.
There’s no administrative control to enable or disable auto-draft for meeting agendas. Given the dramatic difference between the scheduling interface of Outlook classic and that of the new Outlook, it’s unlikely that auto-draft of agendas will find its way into that client.
New Feature that Won’t Move the Needle
Agenda auto-draft won’t move the needle at all when the time comes for Microsoft 365 tenants to decide whether to embrace Microsoft 365 Copilot. It’s a feature that will please some people (those who schedule meetings and discover how to use agenda auto-draft). For most, I suspect that this is one of the Copilot features that will pass them by because they never need to create an agenda. But that’s always true for new software features.
So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive insights updated monthly into what happens within Microsoft 365, why it happens, and what new features and capabilities mean for your tenant.
Using the Secret Management PowerShell Module with Azure Key Vault and Azure Automation
Use Secret Management to Store and Manage Secrets Needed by Azure Automation Runbooks
Storing hard-coded account credentials in PowerShell scripts is a big security no-no. Previously, I’ve discussed using Azure Key Vault to store passwords and other credentials that might be needed by PowerShell scripts or Azure Automation runbooks. Another method that I’ve used with runbooks is to store the credentials as an automation account resource, most recently when using the Connect-IPPSSession cmdlet to update a Microsoft 365 retention policy. Unhappily, the security and compliance cmdlets don’t currently support managed identities.
Although storing credential objects is the easiest way to make credentials available to an automation account, I would prefer to avoid duplication by using Azure Key Vault everywhere. This decision brought me to the Secret Management PowerShell module. Essentially, the module supports an easy-to-use connection to Azure Key Vault (and other repositories) to access secrets stored in the vault, like usernames and passwords.
Installing the Secret Management Module
First, install the necessary module from the PowerShell gallery.
Install-Module Microsoft.PowerShell.SecretManagement -Repository PSGallery -Force -Scope AllUsers
Remember to make the module available in any Azure Automation runtime environments where runbooks will use the module to fetch secrets. The module only supports PowerShell core, so I tested runbooks with a custom PowerShell 7.4 runtime environment.
Registering Azure Key Vault with Secret Management
Before you can fetch any secrets from a vault, you must register the vault for the current session. The Secret Management module supports access to Azure Key Vault through one of its default extensions, but first a connection is needed to an Azure account that’s linked to a subscription. Interactively, you’d do something like this:
Connect-AzAccount -Subscription 25429342-a1a5-4427-9e2d-551840f2ad25
In an Azure automation runbook, you can use a managed identity:
Connect-AzAccount -Identity
In this case, the secrets I need to use are stored in an Azure Key Vault called “Office365ITPros.” To access Azure Key Vault, the signed-in account must have permission to the target vault granted via a legacy access policy or an appropriate Azure RBAC role. This requirement also applies to the automation account used to execute runbooks, where permission is granted to the automation account’s service principal.
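As a sketch of the RBAC route (the vault name comes from this article; the automation account display name is a hypothetical placeholder), granting the automation account’s managed identity access to the vault might look like this:

```powershell
# Sketch: grant an automation account's managed identity the Key Vault Secrets User
# role (assumes the Az module and that the vault uses Azure RBAC authorization)
$Vault = Get-AzKeyVault -VaultName 'Office365ITPros'
# 'MyAutomationAccount' is a hypothetical display name - use your automation account's name
$AutomationSP = Get-AzADServicePrincipal -DisplayName 'MyAutomationAccount'
New-AzRoleAssignment -ObjectId $AutomationSP.Id `
    -RoleDefinitionName 'Key Vault Secrets User' `
    -Scope $Vault.ResourceId
```

If the vault uses a legacy access policy instead of RBAC, the equivalent step is to add a policy granting the Get and List secret permissions to the service principal.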
With the necessary access, I can use the Register-SecretVault cmdlet to connect to Azure Key Vault for the current session as follows. The call to the Get-SecretVault cmdlet is to confirm that the registration worked.
$parameters = @{
    Name            = 'Azure'
    ModuleName      = 'Az.KeyVault'
    VaultParameters = @{
        AZKVaultName   = 'Office365ITPros'
        SubscriptionId = (Get-AzContext).Subscription.Id
    }
    DefaultVault    = $true
}
Register-SecretVault @parameters
Get-SecretVault

Name  ModuleName  IsDefaultVault
----  ----------  --------------
Azure Az.KeyVault           True
Fetching and Using Secrets in an Azure Automation Runbook
Once a vault is properly registered, the Get-Secret cmdlet can fetch secrets from the target vault. We need to combine the secrets holding the username and password for an Exchange Online administrator account into a credentials object. The object can then be used with the Connect-ExchangeOnline and Connect-IPPSSession cmdlets to connect to Exchange Online and the compliance endpoint before running the cmdlets necessary to complete whatever task is required.
This example shows how to list the sensitivity labels defined in the tenant after making all the necessary connections and registrations. The full code is listed below. Figure 1 shows the output from the Azure automation test pane.
# Authenticate with the automation account's managed identity
Connect-AzAccount -Identity
# Register Azure Key Vault as the default secret vault for this session
$parameters = @{
    Name            = 'Azure'
    ModuleName      = 'Az.KeyVault'
    VaultParameters = @{
        AZKVaultName   = 'Office365ITPros'
        SubscriptionId = (Get-AzContext).Subscription.Id
    }
    DefaultVault    = $true
}
Register-SecretVault @parameters
Get-SecretVault
# Fetch the username and password secrets and build a credential object
$UserName = Get-Secret -Name ExoAccountName -AsPlainText -Vault Azure
$Password = Get-Secret -Name ExoAccountPassword -Vault Azure
$Credentials = New-Object System.Management.Automation.PSCredential ($UserName, $Password)
# Connect to Exchange Online and the compliance endpoint, then run the task
Connect-ExchangeOnline -Credential $Credentials -DisableWAM
Connect-IPPSSession -Credential $Credentials
Get-Label | Format-Table ImmutableId, DisplayName

Secret Management is an Alternative to Credential Resources
There’s no doubt that storing credential objects as Azure Automation resources is the easiest way to manage credentials used with runbooks. However, the credential objects are associated with individual automation accounts and not shared elsewhere. Putting credentials in Azure Key Vault and accessing those credentials using the Secret Management module isn’t much harder, and those credentials are available to any user or service principal that’s allowed access to the key vault. You pay your money and make your choice…
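For comparison, this is roughly what the credential-resource approach looks like inside a runbook (assuming a credential asset named ‘ExoAdmin’ exists in the automation account; the name is a placeholder):

```powershell
# Alternative: fetch a credential asset stored in the automation account
# Get-AutomationPSCredential is only available inside Azure Automation runbooks
# ('ExoAdmin' is a hypothetical asset name)
$Credentials = Get-AutomationPSCredential -Name 'ExoAdmin'
Connect-ExchangeOnline -Credential $Credentials
Connect-IPPSSession -Credential $Credentials
```

Less code, but the credential asset lives in a single automation account, while a key vault secret can serve many accounts and interactive scripts alike.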
Need help to write and manage PowerShell scripts for Microsoft 365, including Azure Automation runbooks? Get a copy of the Automating Microsoft 365 with PowerShell eBook, available standalone or as part of the Office 365 for IT Pros eBook bundle.
The My Sign-Ins Portal, Applications, and Conditional Access
Making Conditional Access and the My Sign-Ins Portal Work Better
A couple of weeks ago, I attended a keynote at the TEC 2025 conference where Alex Simons, Microsoft Corporate VP for Entra, discussed the investments Entra is making to develop agents to help tenant administrators to work smarter. There’s a cost to these agents in the form of Entra premium licenses and the security compute units required to run the agents. Microsoft’s bet is that they can deliver sufficient value to customers through agents to take the cost question off the table. Time will tell.
The Conditional Access optimization agent is one of the agents Microsoft has available in preview. I think these agents can do more and have said so both in print and in person. At this point, the conditional access agent seems more practical and likely to have an impact simply because it’s so easy to screw up conditional access policies.
Which brings me to a LinkedIn post by David Nündel reporting that Microsoft has exposed several additional first-party applications in the Entra admin center. There’s nothing really surprising here because Microsoft 365 and Entra ID are constructed from many multitenant applications. Instances of these applications exist in customer tenants (or rather, service principals for the applications) that can then be used in different aspects of tenant management.
Applications and the My Sign-Ins Portal
What is surprising and useful is that the newly-exposed applications relate to the My Sign-ins portal where users can perform actions such as changing their password, removing themselves as guest accounts from other Microsoft 365 tenants, and viewing recent sign-in activity (Figure 1).

The point is that the My Sign-ins portal relies on access to several applications to display the information revealed by the various menu options. If access to the applications is blocked by something like a conditional access policy, then the portal cannot function. And as it so happens, the newly revealed applications are those that are needed by the My Sign-Ins portal. Six applications are in the set with the following display names and application identifiers:
- My Signins: 19db86c3-b2b9-44cc-b339-36da233a3be2
- My Profile: 8c59ead7-d703-4a27-9e55-c96a0054c8d2
- Microsoft App Access Panel: 0000000c-0000-0000-c000-000000000000
- AADReporting: 1b912ec3-a9dd-4c4d-a53e-76aa7adb28d7
- Windows Azure Active Directory: 00000002-0000-0000-c000-000000000000
- Azure Credential Configuration Endpoint Service: ea890292-c8c8-4433-b5ea-b09d0668e1a6
Checking Service Principals for the My Sign-Ins Portal Applications
Service principals for most or maybe all of these applications are likely already present in your tenant. When I checked using the Microsoft Graph PowerShell SDK command shown below, only the My SignIns application was missing:
Get-MgServicePrincipal -Filter "displayName eq 'Azure Credential Configuration Endpoint Service' or displayName eq 'Windows Azure Active Directory' or displayName eq 'AADReporting' or displayName eq 'Microsoft App Access Panel' or displayName eq 'My Profile' or displayName eq 'My SignIns'" | Format-Table DisplayName, Id, AppId

DisplayName                                     Id                                   AppId
-----------                                     --                                   -----
My Profile                                      1f1f813e-0778-4b5b-a379-a924c97e023f 8c59ead7-d703-4a27-9e55-c96a0054c8d2
AADReporting                                    31bd9b44-bc6b-42df-9be6-3030109b84a5 1b912ec3-a9dd-4c4d-a53e-76aa7adb28d7
Microsoft App Access Panel                      10334c63-ac46-4b2a-a80a-dc9c62e34dd8 0000000c-0000-0000-c000-000000000000
Windows Azure Active Directory                  2be71509-6ab9-44d7-bfd8-eff4e50bfc7c 00000002-0000-0000-c000-000000000000
Azure Credential Configuration Endpoint Service 6d1fdc7c-f64b-4aeb-9133-5246b467035c ea890292-c8c8-4433-b5ea-b09d0668e1a6
The problem was easily fixed by running the New-MgServicePrincipal cmdlet:
New-MgServicePrincipal -AppId 19db86c3-b2b9-44cc-b339-36da233a3be2

DisplayName Id                                   AppId                                SignInAudience      ServicePrincipalType
----------- --                                   -----                                --------------      --------------------
My Signins  a7cda215-2932-4042-8e3e-631ecf7ae23b 19db86c3-b2b9-44cc-b339-36da233a3be2 AzureADMultipleOrgs Application
The command to create a service principal from an application identifier works because the My SignIns application is a multitenant application owned by Microsoft. We can prove this by using the tenant relationship API to check the value of the identifier for the owning tenant. Using the Find-MgTenantRelationshipTenantInformationByTenantId cmdlet requires the Graph CrossTenantInformation.ReadBasic.All permission:
$AppTenantOwner = (Get-MgServicePrincipal -ServicePrincipalId a7cda215-2932-4042-8e3e-631ecf7ae23b).AppOwnerOrganizationId
$TenantInfo = Find-MgTenantRelationshipTenantInformationByTenantId -TenantId $AppTenantOwner
Write-Host ("The tenant name is {0} and its default domain is {1}" -f $TenantInfo.DisplayName, $TenantInfo.DefaultDomainName)
The tenant name is Microsoft Services and its default domain is sharepoint.com
No Point in Repeating What’s Already Available
With all the applications in place, you can use them in conditional access policies. I don’t like repeating information that’s already online, and I hate seeing many different descriptions of a new feature published by people who haven’t bothered to add any personal insight or knowledge to help others understand the technology better.
With that point in mind, you can read about how these applications could be used in a description of configuring conditional access for guest users by MVP Kenneth Van Surksum. Kenneth adds a few more applications to the “must exclude from blocking” list, so it’s important that you read the article. Excluding applications in conditional access policies simply allows users to access applications that they need to do their jobs, or to make functionality work, like the exclusion required by Outlook to handle sensitivity labels.
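As a sketch of what such an exclusion might look like when updating a blocking policy with the Graph PowerShell SDK (the policy identifier is a placeholder, and a real update must supply a complete conditions object for the policy):

```powershell
# Sketch: exclude the My Sign-Ins applications from a blocking conditional access
# policy (requires the Policy.ReadWrite.ConditionalAccess permission; '<policy id>'
# is a placeholder for the target policy's identifier)
$ExcludedApps = @(
    '19db86c3-b2b9-44cc-b339-36da233a3be2', # My Signins
    '8c59ead7-d703-4a27-9e55-c96a0054c8d2', # My Profile
    '0000000c-0000-0000-c000-000000000000', # Microsoft App Access Panel
    '1b912ec3-a9dd-4c4d-a53e-76aa7adb28d7', # AADReporting
    '00000002-0000-0000-c000-000000000000', # Windows Azure Active Directory
    'ea890292-c8c8-4433-b5ea-b09d0668e1a6'  # Azure Credential Configuration Endpoint Service
)
$Conditions = @{
    Applications = @{
        IncludeApplications = @('All')
        ExcludeApplications = $ExcludedApps
    }
}
Update-MgIdentityConditionalAccessPolicy -ConditionalAccessPolicyId '<policy id>' `
    -Conditions $Conditions
```

Test any change like this against a policy in report-only mode first; a mistake in a blocking policy can lock users (or administrators) out.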
Now all I want to know is whether the Entra conditional access optimization agent is ready to optimize for this condition. I suspect not, because it’s clear that first generation agents solve immediate issues (like stopping people from locking themselves out) rather than delivering great insight into more subtle policy details.
Changing the Offline Access Period for Sensitivity Labels
Offline Access Lets Clients Like Outlook Work with Protected Content
The use of Microsoft Purview sensitivity labels to protect confidential files and messages seems to be more widespread. Although Microsoft doesn’t publish data to say how many Microsoft 365 tenants use sensitivity labels or the percentage of files stored in SharePoint Online and OneDrive for Business that are protected by sensitivity labels, my guess is that use has grown steadily over the last few years. Certainly, Microsoft is encouraging the use of sensitivity labels by extending their reach in different places: for example, implementing dynamic watermarking, preventing Microsoft 365 Copilot from using content from documents with specific sensitivity labels in AI-generated responses, and removing the requirement to pay to use the Graph API to assign sensitivity labels programmatically. These are all good signs that the sensitivity label framework is developing and building out nicely.
Offline Access to Protected Content
Protecting files with encryption applied by assigning a sensitivity label is a core piece of functionality. Encryption is managed by the Azure Rights Management service, which controls the interpretation and enforcement of the access rights assigned to users through sensitivity label settings.
When an authenticated user attempts to access a protected item, they obtain a use license from the Azure Rights Management service. The use license is a certificate containing the access rights for the item (like whether the user can print the item), the encryption key used to encrypt the content, and if access expires at any point. Importantly, the validity of the use license is limited.
If access to the item is not date-limited, the service issues a use license with a validity period based on the offline access setting contained in the sensitivity label (by default, 30 days). The validity period controls when the user must next authenticate to continue to have access to the item. In practical terms, during the validity period, the existence of the use license means that the user doesn’t need to prove their right to access the content. This is the basis for offline access to protected content by clients such as Outlook. The use license is available on the workstation and can be used to access the protected item even when a network connection is unavailable.
Once the validity period expires, the user is prompted to reauthenticate. During the reauthentication process, the service checks the label settings and evaluates group membership (if used to grant access rights) to establish precisely what rights the user has to the item before it issues a new use license.
Setting the Access Period for a Sensitivity Label
You can restrict the maximum period for offline access on a per-label or tenant-wide basis. To change the validity period for a label, edit the Allow offline access setting (Figure 1) and select the number of days for offline access. Always means that the label uses the maximum validity period for the tenant. Never means that items protected by the label cannot be accessed offline.

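The same per-label setting can be adjusted with PowerShell through the Set-Label cmdlet after connecting to the compliance endpoint. A sketch (‘Confidential’ is a hypothetical label name; verify the encryption parameter names against the Set-Label documentation before relying on them):

```powershell
# Sketch: set offline access for a sensitivity label to 14 days
# ('Confidential' is a hypothetical label name)
Connect-IPPSSession
Set-Label -Identity 'Confidential' -EncryptionOfflineAccessDays 14
```

Run Get-Label with Format-List afterwards to confirm that the label settings updated as expected.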
Changing the Maximum Validity Period for a Tenant
A sensitivity label cannot have a longer offline access period than the tenant maximum validity period. While 30 days is a good balance between frequent user reauthorization and maintaining security for offline content, some believe that a shorter period is better because it limits the ability of people who leave the organization to access sensitive information. A use license is bound to the device where access occurred, so to continue to have access to the protected content, the person who left must have access to the device.
In any case, a tenant administrator can change the validity period setting for the tenant with PowerShell using the Set-AipServiceMaxUseLicenseValidityTime cmdlet from the AIPService module. The AIPService module only supports Windows PowerShell (5.1). Don’t bother trying to run it on PowerShell 7. Here’s an example of setting the period to 14 days:
Import-Module AIPService
Connect-AipService
Set-AipServiceMaxUseLicenseValidityTime 14

WARNING: The MaxUseLicenseValidityTime will be updated by this operation.

Confirm
Are you sure you want to perform this action?
Performing the operation "Set-AipServiceMaxUseLicenseValidityTime" on target "current organization".
[Y] Yes [A] Yes to All [N] No [L] No to All [S] Suspend [?] Help (default is "Y"): y
The MaxUseLicenseValidityTime for the Azure Information Protection service has been successfully set to 14.

Get-AipServiceMaxUseLicenseValidityTime
14
The adjusted validity period only applies to newly-issued use licenses. The new value can be anything from 0 to 65535 days (which should be enough for anyone).
Test Before Deployment
As always, it’s best to make changes to settings like the maximum validity period in a test tenant to assess if the change breaks anything. I don’t think it will, but it’s always best to test, assess, and then deploy.
Learn about managing sensitivity labels and the rest of Microsoft Purview Information Protection by subscribing to the Office 365 for IT Pros eBook. Use our experience to understand what’s important and how best to protect your tenant.
ChatGPT Enterprise Connects to SharePoint Online
SharePoint Connector Throws Down the Gauntlet to Microsoft 365 Copilot

An October 8 LinkedIn post announced that OpenAI business customers can “centrally deploy SharePoint for their entire workspace.” The move throws down the gauntlet to Microsoft 365 Copilot by delivering the same kind of ability to reason over files stored in SharePoint Online and OneDrive for Business. While Microsoft 365 Copilot boasts more points of integration with Microsoft 365 apps, including SharePoint agents, the new Knowledge agent (in preview), and the ability to consume SharePoint content in custom agents built with Copilot Studio, I don’t think anyone in Microsoft will be happy to see OpenAI offer customers the opportunity to fully exploit the information stored in SharePoint Online.
Given that Microsoft 365 Copilot uses the OpenAI models, including GPT-5, it’s hard to know why companies opt for ChatGPT Enterprise, especially if those companies use SharePoint Online (which implies that they use Microsoft 365). List prices for the two offerings are comparable, but Microsoft 365 Copilot delivers more integrated functionality.
OpenAI and SharePoint Online
OpenAI has long offered the ability for individual users to connect to OneDrive for Business accounts and SharePoint Online sites. Access is granted through OAuth authentication against Entra ID and is limited to the information accessible to the user, just like any other app that uses the Graph API to interact with SharePoint Online and OneDrive for Business. Because the OpenAI connector is an app, the app can be blocked to prevent users from being able to upload information to OpenAI.
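Blocking the app comes down to disabling its service principal. A sketch with the Graph PowerShell SDK (the ‘ChatGPT’ display name used in the filter is an assumption; check the actual app name registered in your tenant):

```powershell
# Sketch: block an enterprise app by disabling its service principal
# ('ChatGPT' is an assumed display name - verify the app name in your tenant)
$SP = Get-MgServicePrincipal -Filter "displayName eq 'ChatGPT'"
If ($SP) {
    Update-MgServicePrincipal -ServicePrincipalId $SP.Id -AccountEnabled:$false
}
```

With AccountEnabled set to false, Entra ID refuses token requests for the app, so users cannot authenticate to grant it access to their data.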
The description of the ChatGPT SharePoint Connector says “The admin-managed sync connector lets an administrator authenticate once and deploy across the entire organization. Users don’t need to set up anything themselves—it just works.” To configure the connector, administrators must be both a SharePoint Online (or tenant) administrator and a ChatGPT administrator. During the configuration, the administrator can choose to synchronize all files or scope the connector to specific sites and folders, with the synchronized copies appearing in ChatGPT as “admin-managed” files. According to OpenAI, new files or updates made to SharePoint files are available to ChatGPT within an hour.
Access to files is governed by “strict email domain matching between SharePoint and ChatGPT. A user’s SharePoint account must match their ChatGPT account email.” I guess this means that user principal names must match the email addresses used to create ChatGPT accounts for ChatGPT to allow access to synchronized files. Of course, Microsoft 365 does not insist that user principal names match a user’s primary SMTP address, so there’s some opportunity for mismatches here.
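To gauge the potential for mismatches in a tenant, a quick sketch with the Graph PowerShell SDK compares user principal names against primary SMTP addresses:

```powershell
# Sketch: find accounts where the UPN differs from the primary SMTP address
# (requires a Graph connection with the User.Read.All permission)
[array]$Users = Get-MgUser -All -Property Id, DisplayName, UserPrincipalName, Mail
$Mismatches = $Users | Where-Object { $_.Mail -and ($_.Mail -ne $_.UserPrincipalName) }
$Mismatches | Format-Table DisplayName, UserPrincipalName, Mail
```

Any account that shows up in the output is a candidate for access problems if ChatGPT really does insist on strict matching between the two values.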
OpenAI notes that synchronized connectors are only available to customers based in the U.S. that enable data residency or international customers who don’t mind that their data is stored in the U.S. They note that “We don’t yet support in-region storage for non-US data residency configurations.”
The SharePoint Connector
Overall, it seems like the new version of the ChatGPT connector uses application permissions like Sites.Read.All and Files.Read.All to access SharePoint and OneDrive content and synchronize it to ChatGPT, while User.Read.All, Group.Read.All, and GroupMember.Read.All permissions are used for account matching. An example of an app using Graph permissions to read SharePoint is available here.
One thing that’s become painfully obvious since the introduction of Microsoft 365 Copilot is that Microsoft 365 tenants store some complete rubbish in SharePoint Online. Old files and misleading and inaccurate content are stored alongside interesting and useful information, but Copilot can’t tell the difference between the two. Add in some sensitive and confidential information that should never appear in AI-generated output, and you can understand why Microsoft has struggled to make Copilot work for SharePoint in the real world (rather than carefully curated demos). Solutions like Restricted Content Discovery and the DLP Policy for Copilot allow organizations to hide content from Copilot or stop Copilot using information in its responses. It’s taken time for these solutions to arrive, but things are much better now.
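For reference, Restricted Content Discovery is applied per site with the SharePoint Online management module. A sketch (the admin and site URLs are placeholders for your tenant):

```powershell
# Sketch: hide a site's content from Copilot and org-wide search with
# Restricted Content Discovery (URLs are placeholders)
Connect-SPOService -Url 'https://contoso-admin.sharepoint.com'
Set-SPOSite -Identity 'https://contoso.sharepoint.com/sites/Finance' `
    -RestrictContentOrgWideSearch $true
```

Users with access to the site can still open its files directly; the restriction only stops the content from surfacing through tenant-wide search and Copilot grounding.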
OpenAI has the advantage of learning from Microsoft’s toils. It seems like OpenAI uses scoping to restrict what SharePoint content ChatGPT can process, which is kind of like what Restricted Content Discovery does.
Why Use the OpenAI Connector?
Apart from avoiding having to buy Microsoft 365 Copilot licenses, I could never understand why Microsoft 365 tenants let people upload corporate information to ChatGPT for processing. The enterprise SharePoint connector is even worse in my eyes, even if OpenAI guarantees that the information loaded through the connector is never used to train its models.
The notion of synchronizing SharePoint files to ChatGPT so that people can use that content with ChatGPT seems a little crazy. As far as I can tell, OpenAI offers none of the compliance functionality that Microsoft has developed to protect and secure SharePoint Online. For instance, how does ChatGPT deal with files protected by sensitivity labels?
It seems like once the connector copies SharePoint Online sites to ChatGPT, a Microsoft 365 tenant runs some risk of losing control over information. It’s hard enough to persuade people to store important files in SharePoint Online rather than OneDrive for Business. Adding ChatGPT to the mix makes the task of managing corporate files even harder.
Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
Microsoft 365 Copilot Usage Report API General Availability
It’s Nice to be GA, but What Can You Do with the Copilot Usage Report API?
MC877369 first appeared in August 2024 to announce the availability of Microsoft 365 Copilot usage data through the Graph usage reports API (Microsoft 365 roadmap item 396562). The most recent update (6 Oct 2025) sets out a new timeline for general availability of the APIs, which is now expected to roll out in late October 2025 for worldwide completion in late November 2025. Microsoft doesn’t say why the latest delay occurred or why it’s taken so long to move the API from preview to GA.
Still at the Beta Endpoint
Although the Copilot usage report API is heading for general availability, it’s still only accessible through the beta endpoint. There’s nothing wrong with that, providing the API works. Normally, Microsoft Graph APIs accessible through the beta endpoint are under active development to solve performance or reliability problems, or to complete the features necessary to move to production (V1.0) status.
Using the Copilot Usage Report API
I first looked at the API in September 2024 and concluded that most value can be derived from the Copilot user activity detail API. Knowing what apps people use Copilot in is valuable information if you want to do things like:
- Know which departments Copilot is being used in and which need a little help to get going. By including user data from Entra ID with Copilot usage data, we can slice and dice the usage data to generate additional insights (Figure 1).

- Look for user accounts with expensive ($360/year) Microsoft 365 Copilot licenses and automatically remove underused licenses so that the licenses can be reallocated to people who might use them more. The folks who lose the Microsoft 365 Copilot licenses might be happy with the no-charge Microsoft Copilot chat capability. Or they might be the folks in the company who are using ChatGPT and other AI tools instead of Copilot.
- A variation on the theme is to integrate Microsoft 365 audit data with Copilot usage report data to drill down into what people are doing with Copilot. The intention once again is to weed out underused Microsoft 365 Copilot licenses so that others might be assigned those licenses.
- I have a script to create a composite picture of user activity across multiple workloads. It would be easy to add the Copilot usage data to the mix.
Example PowerShell scripts are available to demonstrate the principles explored in each scenario. The point is that usage data is interesting in its own right, but it becomes more powerful when combined with other easily-accessible Microsoft 365 data sources about user activity.
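To illustrate the kind of processing involved, here’s a minimal PowerShell sketch that fetches Copilot user activity detail through the beta reports API and flags accounts with no recorded activity in the reporting period. The cmdlet and endpoint are real; treat the JSON property names as assumptions to verify against the output in your tenant, and note that the sketch ignores pagination for brevity:

```powershell
# Connect with the Reports.Read.All permission (assumes consent is already granted)
Connect-MgGraph -Scopes Reports.Read.All

# Fetch Copilot user activity detail for the last 90 days in JSON format
$Uri = "https://graph.microsoft.com/beta/reports/getMicrosoft365CopilotUsageUserDetail(period='D90')?`$format=application/json"
[array]$Report = (Invoke-MgGraphRequest -Method GET -Uri $Uri).value

# Accounts with a Copilot license but no activity date are candidates for license review
$Report | Where-Object { [string]::IsNullOrEmpty($_.lastActivityDate) } |
    Select-Object userPrincipalName, lastActivityDate
```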
Remember to allow full display of usernames and other information for the report data. If you don’t, the usage data will be obfuscated (concealed) and can’t be matched up with data from other Microsoft 365 sources.
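Unconcealing report data is done in the Microsoft 365 admin center (Settings > Org settings > Reports) or through the Graph admin report settings. A sketch using the beta cmdlets, assuming the Microsoft.Graph.Beta.Reports module is installed and the signed-in account holds the ReportSettings.ReadWrite.All permission:

```powershell
# Requires the ReportSettings.ReadWrite.All permission
Connect-MgGraph -Scopes ReportSettings.ReadWrite.All

# Turn off obfuscation so usage reports show real user and site names
Update-MgBetaAdminReportSetting -BodyParameter @{ displayConcealedNames = $false }

# Confirm the setting took effect
(Get-MgBetaAdminReportSetting).DisplayConcealedNames
```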
Other Usage Report APIs
Microsoft 365 supports a bunch of other usage reports APIs for different workloads. Some workloads featured in the Microsoft 365 admin center (like Forms, Project, Visio, and Viva Learning) are not available through a Graph API, and the same is true for some sub-reports (like Copilot agents). However, there’s enough data available to build a good picture of how people use Microsoft 365 across the board.
The issue with reporting SharePoint URLs (first reported in September 2023) persists. Some security issue is apparently cramping Microsoft’s ability to include site URLs in the site activity report (powered by the getSharePointSiteUsageDetail API), which means that the usage data returned for a site looks like this:
Report Refresh Date      : 2025-10-07
Site Id                  : 66bbf297-2f09-43ec-ab94-9333deacf769
Site URL                 :
Owner Display Name       : Project Haycock Owners
Is Deleted               : False
Last Activity Date       : 2025-05-23
File Count               : 375
Active File Count        : 131
Page View Count          : 0
Visited Page Count       : 0
Storage Used (Byte)      : 110786012
Storage Allocated (Byte) : 27487790694400
Root Web Template        : Group
Owner Principal Name     : projecthaycock@office365itpros.com
Report Period            : 180
The Site Id can be used to find the website URL:
(Get-MgSite -SiteId '66bbf297-2f09-43ec-ab94-9333deacf769').WebUrl

https://office365itpros.sharepoint.com/sites/projecthaycock
It’s a mystery why Microsoft won’t or can’t fix this irritating issue. Just one of those cloud mysteries…
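Until Microsoft fixes the problem, a workaround is to backfill the blank URLs by resolving each site identifier with Get-MgSite. A sketch, assuming the parsed report data is in an array called $Report with SiteId and SiteUrl properties (hypothetical names for the parsed report columns):

```powershell
# Backfill blank Site URL values in the report data by resolving each site identifier
ForEach ($Site in $Report) {
    If ([string]::IsNullOrEmpty($Site.SiteUrl)) {
        Try {
            $Site.SiteUrl = (Get-MgSite -SiteId $Site.SiteId).WebUrl
        } Catch {
            Write-Host ("Unable to resolve site {0}" -f $Site.SiteId)
        }
    }
}
```

Resolving sites one by one is slow for large tenants, but it only needs to run once per report extract.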