Blocking Microsoft 365 Copilot From Making Inferences in Teams Meetings
Copilot Inference and Evaluation Policy Blocks Copilot in Teams from Interpreting Participant Emotions During Meetings
One of the interesting things about using Microsoft 365 Copilot in Teams meetings (or rather Copilot in Teams, for it is only one of the many Copilots licensed through Microsoft 365 Copilot) is that it attempts to evaluate participant sentiment based on their contributions to meetings. Copilot does this by analyzing the words spoken by participants to “infer emotions, make evaluations, discuss personal traits, and use context to deduce answers.”
Evaluating how happy someone is during a meeting sounds a bit too much like Big Brother oversight to many, especially in countries where personal privacy is more highly prized than in others. If your organization is in this situation, tenant administrators can restrict “Copilot’s ability to make inferences or evaluations about people or groups when prompted to do so by users” by updating the Copilot inference and evaluation policy.
A Simple Graph Query
Microsoft explains how to update the policy in message center notification MC916990 (last updated 16 December 2024, Microsoft 365 roadmap item 411568). Deployment to tenants with Microsoft 365 Copilot licenses is now complete.
MC916990 describes how to use the Graph Explorer to update the policy. Much as I like the Graph Explorer, the description given isn’t very clear and lacks some essential detail, like how to format the JSON input payload (Figure 1) and the required permission.
The Copilot Admin Limited Mode Resource Type
Here’s some of that detail together with instructions about how to do the job with the Microsoft Graph PowerShell SDK. The first thing to know is that the Copilot inference and evaluation policy is represented in the Graph by the copilotAdminLimitedMode resource type. This is important to know, because we can then reference the documentation to discover that the CopilotSettings-LimitedMode.ReadWrite permission is needed to update the policy. This is a delegated permission, so it works in the context of the signed-in user. Accounts holding the Global administrator or Global reader roles can read the policy settings, but only a Global administrator can update them.
The documentation for the Get and Update operations doesn’t include any Microsoft Graph PowerShell SDK cmdlets to get and update the policy, but we can use the HTTP URI to interact with the policy through the Invoke-MgGraphRequest cmdlet.
To begin, let’s sign into a Microsoft Graph PowerShell SDK interactive session and request the necessary permission.
Connect-MgGraph -Scopes CopilotSettings-LimitedMode.ReadWrite
If CopilotSettings-LimitedMode.ReadWrite is not in the static list of permissions held by the service principal for the Microsoft Graph Command Line Tools app, you’ll be prompted to grant consent (Figure 2):
Next, let’s fetch the current policy values by running Invoke-MgGraphRequest with a Get request to the URI for the policy. The values shown below are the defaults:
$Uri = "https://graph.microsoft.com/beta/copilot/admin/settings/limitedMode"
$Data = Invoke-MgGraphRequest -Uri $Uri -Method GET
$Data

Name              Value
----              -----
isEnabledForGroup False
groupId
@odata.context    https://graph.microsoft.com/beta/$metadata#copilot/admin/settings/limitedMode/$entity
To update the policy and block Copilot from evaluating sentiment for some users, you must create a group and populate its membership with the user accounts to block. Then find the object identifier for the group and copy it for reuse (Figure 3).
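If a suitable group doesn’t already exist, one can be created and populated with the Microsoft Graph PowerShell SDK. This is a sketch rather than a prescribed method: the group name, mail nickname, and member address are invented for illustration, and creating groups needs an additional permission such as Group.ReadWrite.All.

```powershell
# Create a security group to hold the accounts to block (illustrative names)
$Group = New-MgGroup -DisplayName "Copilot Limited Mode Users" `
    -MailEnabled:$false -MailNickname "CopilotLimitedMode" -SecurityEnabled

# Add a member (substitute a real user principal name from your tenant)
$User = Get-MgUser -UserId 'Kim.Akers@office365itpros.com'
New-MgGroupMember -GroupId $Group.Id -DirectoryObjectId $User.Id

# The object identifier needed for the policy update
$Group.Id
```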
You can then use the group identifier to update the policy settings. This code creates a hash table to hold the two settings to update. The first setting contains the group identifier (stored in a PowerShell variable). The second sets the value of the isEnabledForGroup setting to true for the members of the group. The effect is to instruct Teams to use limited mode for the members of the group when they are meeting participants. When the hash table is ready, run Invoke-MgGraphRequest again to patch the policy with the settings in the hash table.
$GroupId = 'f805d711-c4f4-4663-9993-b08b4be52cb5'
$Parameters = @{}
$Parameters.Add("groupId", $GroupId)
$Parameters.Add("isEnabledForGroup", $true)
Invoke-MgGraphRequest -Uri $Uri -Method Patch -Body ($Parameters | ConvertTo-Json)
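To confirm that the patch was applied, repeat the GET request against the same URI. The properties should now reflect the new values:

```powershell
# Re-read the policy to check the updated settings
$Data = Invoke-MgGraphRequest -Uri "https://graph.microsoft.com/beta/copilot/admin/settings/limitedMode" -Method GET
$Data.groupId            # should return the group identifier
$Data.isEnabledForGroup  # should return True
```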
The policy update can take up to a day to become effective. When it does, Copilot will decline to answer questions about someone’s performance, emotions, or personal traits. Figure 4 shows two examples. The first is the same example used in Microsoft’s announcement: ask if someone is happy based on their contributions to a meeting. The second asks who made the most positive contribution to the conversation.
In contrast, Figure 5 shows how Copilot responds to the same question asked by another meeting participant who isn’t restricted by policy.
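If the organization later decides to lift the restriction, the same PATCH mechanism should work in reverse. This sketch assumes that setting isEnabledForGroup back to false restores normal Copilot behavior for the group’s members; it simply mirrors the update shown earlier:

```powershell
# Turn limited mode off again for the members of the group
$Uri = "https://graph.microsoft.com/beta/copilot/admin/settings/limitedMode"
$Body = @{ isEnabledForGroup = $false } | ConvertTo-Json
Invoke-MgGraphRequest -Uri $Uri -Method Patch -Body $Body
```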
Stop Big Brother Oversight
I’m not sure that many people ask questions about the feelings or emotions of other meeting participants. It seems like a weird thing to do, and I can appreciate that some would find the prospect of AI measuring their emotions to be on the wrong side of Big Brother observation. With that thought in mind, this is a good update that organizations with Copilot should consider implementing.
Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.