Category: News
Running Teams PowerShell Cmdlets in Azure Automation
Why Would You Need to Run Teams PowerShell Cmdlets in an Azure Automation Runbook
The new permissions requirement for Entra ID apps that want to use cmdlets from the Teams PowerShell module raises the question of when apps actually need to use the Teams cmdlets. After all, access to Teams structures and data is available through Graph APIs and the Microsoft Graph PowerShell SDK. The answer lies in the history of the policy-driven management framework used by Teams.
When Microsoft launched Teams in 2017, they borrowed from many existing technologies. Teams took its approach to management from Skype for Business Online and adapted many of the policies used by Skype for Business to control how meetings, messaging, and other aspects of the system worked. This decision made a lot of sense as it became clear that Teams would replace Skype for Business Online, a decision first revealed in September 2017 and eventually completed in July 2021.
The Teams PowerShell module appeared in late 2017. I didn’t like the module very much at the time, but have become accustomed to it over time as Microsoft removed the rough edges. One such irritation was the Skype for Business connector, the component used to allow PowerShell control over Teams management policies. In February 2021, Microsoft retired the Skype for Business connector and moved its cmdlets into the Teams module. The legacy of the Skype connector cmdlets still exists today with cmdlets like Get-CsOnlineUser and Get-CsTeamsMeetingPolicy.
In general, any cmdlet with a “Team” prefix (like Get-TeamChannel) is Graph-based and an equivalent cmdlet is in the Microsoft Graph PowerShell SDK (Get-MgTeamChannel in this instance). Any cmdlet with a “Cs” prefix (like Get-CsTeamsMessagingPolicy) uses the old Skype for Business framework and doesn’t have a Graph SDK equivalent. It’s therefore a reasonable conclusion that Entra ID apps need to use the Teams PowerShell module only when apps need to interact with Teams policies. The Graph handles Entra ID app access to teams, channels, chats, meetings, and so on.
Running Teams PowerShell in Azure Automation
Running PowerShell in Azure Automation runbooks is like running PowerShell through Entra ID apps. Both need to authenticate before cmdlets run, and both need permissions to access data. Permissions can be granted explicitly or via membership of a management role.
Unless the need exists to do widespread and extensive policy reassignment to accounts, I’m not sure that many runbooks use Teams PowerShell. The exception might be where a tenant develops a script based on Teams PowerShell and wants to run the script on a scheduled basis, or the script takes so long to run that interactive execution becomes a pain. In these cases, Azure Automation is a good answer, especially because the Connect-MicrosoftTeams cmdlet supports managed identities for authentication.
The caveat is that some of the cmdlets in the Teams PowerShell module are unsupported when application-based authentication is used. New-Team is the most notable cmdlet in this category, but that gap is easily worked around by creating a new team through Microsoft Graph PowerShell SDK cmdlets.
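As a sketch of that workaround (assuming the Microsoft Graph PowerShell SDK is loaded and the app holds the Team.Create application permission; the display name and description are illustrative), a new team can be created from the standard template:

```powershell
# Create a team through the Graph instead of the unsupported New-Team cmdlet
$NewTeam = @{
    "template@odata.bind" = "https://graph.microsoft.com/v1.0/teamsTemplates('standard')"
    displayName           = "Automation Test Team"
    description           = "Team created from an Azure Automation runbook"
}
New-MgTeam -BodyParameter $NewTeam
```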
Configuring Azure Automation for Teams PowerShell
Teams PowerShell uses much of the same setup as other Microsoft 365 PowerShell modules run with Azure Automation. An automation account must be available that's connected to an Azure subscription. Ideally, the automation account should support managed identities. The module must be loaded as a resource into the runtime environment. Teams PowerShell supports the PowerShell 7.4 runtime engine, so a custom runtime environment can be used.
Before attempting to authenticate, the service principal for the automation account must be assigned:
- The Teams administrator role. This is what allows the automation account to run the cmdlets in the module as a Teams administrator. The role can be assigned with PowerShell or via the Entra admin center (Figure 1).
- Any Graph API application permissions needed to access the Teams data processed by runbooks. For instance, because Teams management might be gated by administrative units, the RoleManagement.Read.Directory and GroupMember.Read.All permissions are needed to read details about administrative units and the members of those units. Other Graph permissions like Chat.Read.All might be needed to read information stored in teams.
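As a sketch of the role assignment with the Microsoft Graph PowerShell SDK (the automation account name is illustrative, and the Teams Administrator role must already be activated in the tenant):

```powershell
# Find the automation account's service principal and the Teams Administrator role
$Sp   = Get-MgServicePrincipal -Filter "displayName eq 'M365Automation'"
$Role = Get-MgDirectoryRole -Filter "displayName eq 'Teams Administrator'"

# Add the service principal as a member of the role
New-MgDirectoryRoleMemberByRef -DirectoryRoleId $Role.Id -BodyParameter @{
    "@odata.id" = "https://graph.microsoft.com/v1.0/directoryObjects/$($Sp.Id)"
}
```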

With these prerequisites in place, you should be able to create and execute runbooks using the Teams PowerShell module. Figure 2 shows the output from a very simple runbook that connects to Teams with a managed identity before listing the teams in a tenant.
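A minimal version of such a runbook might look like this (a sketch assuming a system-assigned managed identity with the prerequisites described above in place):

```powershell
# Authenticate to Teams with the automation account's managed identity
Connect-MicrosoftTeams -Identity

# List the teams in the tenant
Get-Team | Select-Object DisplayName, Visibility, GroupId | Sort-Object DisplayName
```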

Not the Normal Option
My normal course of action is to use the Microsoft Graph PowerShell SDK or Graph API requests to interact with Teams (like reporting details about Teams online meetings or analyzing chat messages with external users to figure out an allow list for external access) and I seldom use the Teams PowerShell module with Azure Automation. However, the option is there if needed, and that’s good to know.
Need some assistance to write and manage PowerShell scripts for Microsoft 365, including Azure Automation runbooks? Get a copy of the Automating Microsoft 365 with PowerShell eBook, available standalone or as part of the Office 365 for IT Pros eBook bundle.
Measuring Line Profiles in Images
I have an image, such as the one below, which has horizontal and vertical lines in the field of view. I want to segment the vertical line adjacent to the dark area (marked in red below) and the vertical line at the intersection of the horizontal line (marked in black). The full image has many such structures. I want to stack all similar segmented structures within the field of view (average measurement of red and black) and display the result. I also want to measure the average x and y, as shown below, where the red line is concave at the maximum concave location in y.
image segmentation, image processing MATLAB Answers — New Questions
Please provide me the MATLAB code for 8-bit integer arithmetic coding and decoding.
I am currently using 8-bit integer arithmetic encoding to compress a binary vector of approximately 7000 bits in MATLAB. However, after decoding the arithmetic encoding, the reconstructed binary vector does not match the original input. Could you please provide MATLAB code for 8-bit integer arithmetic encoding and decoding that ensures full recovery of the original data?
8-bit integer arithmetic coding MATLAB Answers — New Questions
Strange behaviour using double
I'm currently using MATLAB R2020b.
The documentation for the double function states that 'Converting a string that does not represent a single numeric value to double will produce a NaN result'. Based on this, I am using isnan(double(string('chars'))) to check whether a user-entered character vector represents a number or not. One particular case I wish to guard against is users entering a ',' instead of a '.'. This works very well, except if the vector in question has a comma followed by exactly three numerals, in which case double tells me it is a number. For example, the vector '1,23' will produce NaN, but '1,234' gives the number 1234. Any other number of characters after the comma gives NaN (as far as I can tell).
Does anybody understand why this is, and is there a way to get it to do what I actually want?
Thanks
double MATLAB Answers — New Questions
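The behavior described in the question is consistent with thousands-separator handling: a comma followed by exactly three digits parses as a grouped number ("1,234" becomes 1234). A minimal sketch of a stricter check, assuming only '.' should ever be accepted as a decimal separator:

```matlab
% Reject any input containing a comma before attempting numeric conversion.
% double(string(...)) treats "1,234" as the grouped number 1234 (as the
% question shows), so the comma test must come first.
isNumberStrict = @(s) ~contains(s, ',') && ~isnan(double(string(s)));

isNumberStrict('1.23')   % true  - plain decimal number
isNumberStrict('1,234')  % false - comma rejected outright
isNumberStrict('abc')    % false - not numeric
```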
Issues creating error bar for bar figure
Hi,
I’m trying to create error bars on my bar plot.
I get the error: "Input arguments must be numeric, datetime, duration, or categorical."
I’m not sure what I’m doing wrong. Even when I make err equal to two numbers it still doesn’t work.
AMean = 656631
BMean = 1130
ASTD = 237027
BSTD = 209
AHeight = 10
BHeight = 11
Names = ["A"; "B" ] ;
Averages = [AMean; BMean] ;
StandDev = [ASTD ; BSTD] ;
SampSize = [AHeight; BHeight] ;
NewTable = table(Names, Averages, StandDev, SampSize) ;
x = NewTable.Names ;
y = NewTable.Averages ;
err = StandDev ./ sqrt(SampSize) ;
bar(x, y)
errorbar(x,y,err)
error bar, graphing MATLAB Answers — New Questions
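A sketch of one likely fix, assuming the error comes from passing a string array as the x-values to errorbar: convert the names to categorical (which both bar and errorbar accept, per the error message) and use hold on so the error bars draw on top of the bars instead of replacing them.

```matlab
% Convert names to categorical so errorbar accepts them as x-values
x = categorical(Names);
y = Averages;
err = StandDev ./ sqrt(SampSize);   % standard error of the mean

bar(x, y)
hold on
errorbar(x, y, err, 'k', 'LineStyle', 'none')  % black bars, no connecting line
hold off
```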
Running the SharePoint Site Content and Policy Comparison Report
Use the SharePoint Site Content Report to Highlight Issues – If You Can Find Any
As quickly as they can and in as many ways as possible, Microsoft is attempting to address the problem of digital rot in SharePoint Online. Having old, obsolete, inaccurate, and just plain wrong information in SharePoint Online and OneDrive for Business doesn't matter so much until the eagle eyes of AI tools are deployed. At that point, all manner of misleading responses can appear because the AI is grounded on misleading or incorrect information.
To some extent, customers cannot be blamed for the digital debris that they accrue in SharePoint Online. For years, Microsoft has encouraged cloud storage (the latest tactic is a policy setting to block users from saving to non-cloud locations), so it’s natural that some rot accumulates along with valuable material. As I’ve said before, many of the problems customers have encountered with Microsoft 365 Copilot deployments are the legacy of previous Microsoft collaboration strategies. Not everyone goes through the process of auditing the files stored in SharePoint Online to remove dross (here’s how to create a report of files in a document library, and here’s how to do the same for a OneDrive for Business account).
The Site Content and Policy Comparison Report
All of which brings me to the “SharePoint AI-powered site content and policy comparison report” announced in MC1143287 (27 August 2025). The report is available to tenants with Microsoft 365 Copilot or SharePoint Advanced Management (SAM) licenses. It’s one of the SAM features made available to Copilot customers without the need for SAM licenses. The site content and policy comparison report is rolling out in general availability now with completion worldwide due in late December 2025.
According to Microsoft, the idea is that a tenant selects a reference site that they regard as a good example of a site with “up-to-date and relevant policies” to use as a benchmark to compare up to 10,000 other sites against (Figure 1). In addition to policies that apply to the site (like data lifecycle policies), the reference site should contain more than 10 files of the same kind of material that’s found in the target sites. This is because the comparison uses the 10 most recently used files from each site.

The report uses AI to examine the target sites. The target sites can be chosen by uploading a CSV containing their URLs or selected using several site properties, ranging from the simplest (examine everything as shown in Figure 2) to identifying sites based on the container management sensitivity label assigned to sites, the site type, creation date (for example, sites created within the last 30 days), sites with no owners, and sites where external sharing is disabled.

Nothing in My Reports
Behind the scenes, AI compares the target sites against the reference site to highlight inconsistent application of policies based on the similarity between the reference site and target sites (here’s the documentation). After pondering on any anomalies that it finds (a process that Microsoft warns could take up to 48 hours), the AI generates a report for administrators to consider and potentially act upon.
And that’s where my story ends because despite multiple attempts to select a good reference site to compare against the other sites in my tenant, the AI always came up with an empty report. I even purposely populated a site with content that I knew is similar to other sites and edited ten of the files added to the site to make sure that fresh material was available for the comparison. The site had the same sensitivity label and settings as the reference site, but the report still ignored it.
Maybe my SharePoint Deployment Has No Problems
I could take a positive view and conclude that the AI discovered no irregularities. For instance, all my team sites have container management labels and have assigned retention policies. It could also be the case that the selected reference sites are very dissimilar to the other sites in the organization, so much so that none of the other sites came close enough to be of interest.
However, I suspect that the AI comparison just doesn't work well for tenants where many similar sites exist but only a few of those sites are actively used. I also wonder why Microsoft insists on comparing the ten most recently used files, because if the intention is to help organizations prepare for Copilot, then perhaps sites that hold many files without sufficient recently modified files should be highlighted. After all, in the eyes of AI tools, a file is a file, and the information contained in a file that hasn't been modified in years could end up being cited in a Copilot response. Using old material often leads to poor responses.
Don’t assume that just because it didn’t work for me, the site content and policy comparison report is rubbish. It might work well in your tenant and highlight many areas that you should investigate to improve the tenant readiness for AI.
Learn about managing SharePoint Online and the rest of Microsoft 365 by subscribing to the Office 365 for IT Pros eBook. Use our experience to understand what’s important and how best to protect your tenant.
startRecording() not reliable in real-time app? New file logs not being created when expected
I used this Mathworks post to implement an intermittent file logging method in my real-time app running on Speedgoat.
I created a long-duration (tens of minutes) timer with a callback. The callback performs stopRecording() and then startRecording(), which should create a new log file.
The timer looks something like this:
timer('ExecutionMode', 'fixedRate', 'Period', 600, 'TimerFcn', @(~,~)app.createNewLogFile(app))
createNewLogFile() does the following:
stopRecording(app.tg)
short pause()
startRecording(app.tg)
Checks whether a new log file was created (it should be from stopping/restarting)
This works the majority of the time, but fails intermittently. I performed basic troubleshooting to see that it fails to log for a period of time when the callback fails. I can see that it stopped recording but did not restart within that failed callback.
Extra info:
There is no Enable Log File block in the model because its documentation page says that it can affect the usage of recording functions
I have tried giving the stop/restart calls multiple attempts within the same timer callback. It seems that if it fails the first attempt in a timer callback, it fails repeatedly within that callback
I want the timed log functionality to be in the App because I want the ability for an App user to toggle it. Meaning that in-model default timer solutions don’t work as an alternative
I would like to learn how to fish here. What can I do to truly understand why/how a startRecording() call would not work? Is there something about housing this behavior within a timer callback on the host side that makes it unreliable for sending commands to the target? It doesn’t seem like it should be since the timer is very long duration in comparison to the callback commands.
If I understand the documentation correctly, there is a target->host ACK built in to the stopRecording() and startRecording() functions? So I would expect the restart to fail the acknowledge step if it did not go through. But I don’t see a failure.
Thanks for any insight here.
app designer, timer, real-time MATLAB Answers — New Questions
unable to open timetables from workspace
Hello MATLAB Community,
I have just updated my MATLAB to R2025a and I cannot open the timetables from my workspace.
Other variables (numeric arrays, tables, timerange objects) open fine. With timetables, the workspace either becomes unresponsive or never shows the variable. No explicit error dialog appears. If I write the variable to the command window, I can see the content, but it is not what I need.
It seems like a strange error or unresponsiveness, and I do not know how to fix it.
Does anyone have an idea how to resolve this?
Thank you in advance for your time.
workspace, timetable MATLAB Answers — New Questions
Compare two row and select appropriate data
I have two columns. Let’s call them column a and column b.
I want to do a check where:
if row 1 of column a > row 1 of column b, use row 1 of column b. Else, use row 1 of column a.
I have tried
if Column a > Column b
Column c = column b
else
Column c = column a
end
However, when I check the data, I find that the comparison isn't working: it just pulls all the data from column a into column c.
Basically column b is the "cap," and no number in column c should be greater than that. If any number in column a is greater than column b, column b should be used.
comparison, matlab MATLAB Answers — New Questions
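Reading column b as the cap (per the question's last paragraph), the whole comparison collapses to an elementwise minimum. A sketch assuming a and b are numeric column vectors; note that an if statement applied to whole columns evaluates a single condition for the entire vector, which is why the original code copied everything from one column.

```matlab
% Cap column a at column b: c takes a where a <= b, and b otherwise
c = min(a, b);

% The equivalent spelled out with logical indexing:
c = a;
idx = a > b;       % rows where a exceeds the cap
c(idx) = b(idx);   % replace them with the capped value
```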
How can I solve a problem that is related to Fuzzy Logic Designer
Hi… I have designed a fuzzy controller that I want to tune. The dataset I currently have contains output from another program, recorded over time. Consider that I read two inputs and generate an output, and the resulting table provides data over a 60-second period. How can I import this dataset into MATLAB and use it to tune my fuzzy controller?
Fuzzy Controller designed in Fuzzy Logic Designer app:
Data sample:
fuzzy logic controller, fuzzy logic designer MATLAB Answers — New Questions
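One possible workflow, sketched under the assumption that the logged data can be exported to a CSV with two input columns and one output column (the file and column names below are illustrative): import the table with readtable, then let tunefis from Fuzzy Logic Toolbox fit the FIS parameters against the recorded input/output pairs.

```matlab
% Import the logged dataset (hypothetical file and column names)
T = readtable("logged_data.csv");
inData  = [T.Input1, T.Input2];   % the two recorded inputs
outData = T.Output;               % the recorded output

% Load the FIS exported from Fuzzy Logic Designer and tune it
fis = readfis("myController.fis");
[in, out, rule] = getTunableSettings(fis);
opt = tunefisOptions("Method", "ga");   % genetic-algorithm optimization
fisTuned = tunefis(fis, [in; out; rule], inData, outData, opt);

writefis(fisTuned, "myControllerTuned.fis");
```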
How to add two columns based on conditions
I was going to try to write a very basic loop (to familiarize myself with writing loops), otherwise not using a loop would probably cost less.
I have a large matrix, but for simplicity, I am trying to add 2 columns together and create the sum in another column in the same matrix.
I want to have two conditions to be met in order for the addition to take place, if conditions are not met, then do not combine values, but still generate the first value.
So I want to add columns 3 and 4 if column 1 is < 1.5 and column 2 == 0, and insert the sum into column 5. If the criteria are not met, then copy the value from column 3 into column 5.
1 0 1 4 5
2 0 1 5 1
1 1.5 2 1 2
if intrun(:,1)<1.50) & (intrun(:,2)==0;
intrun(:,5)= intrun(:,3)+intrun(:,4);
end
Thank You
for loop MATLAB Answers — New Questions
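A sketch of a loop-free version: an if statement tests the whole comparison vector at once (and takes the branch only if every element passes), so logical indexing is the idiomatic way to apply the conditions row by row. Checked against the three-row example in the question:

```matlab
intrun = [1 0   1 4 0
          2 0   1 5 0
          1 1.5 2 1 0];

idx = intrun(:,1) < 1.5 & intrun(:,2) == 0;    % rows meeting both conditions
intrun(:,5) = intrun(:,3);                     % default: copy column 3
intrun(idx,5) = intrun(idx,3) + intrun(idx,4); % add columns 3 and 4 where idx holds

intrun(:,5)   % -> [5; 1; 2], matching the expected fifth column
```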
Microsoft’s Effort to Develop a Broad People Platform
Build Out of a Complete Person Platform within Microsoft 365
After publishing the article about customizing the Microsoft 365 profile card through the Microsoft 365 admin center, I started to think about the work Microsoft is doing on the “people platform,” an effort to unify the different strands of people-related data that exist inside Microsoft 365, such as Entra ID, mailbox properties, skills defined for SharePoint Online users (now the people skills solution), people settings like name pronunciation and personal pronouns, and so on. Drawing together disparate sources of information to create a unified view of a person (tenant user, guest, or contact) seems like a good idea, especially if AI tools like Copilot use people information to enhance responses to user prompts.
Exploiting the Graph to Discover More About the People Platform
I’ve often argued that acquiring a knowledge of how the Microsoft Graph works and how to interact with Graph APIs is a good way to gain an insight into how Microsoft 365 works. In this case, the profile resource type is very revealing. A lot of the work in this area is still in beta and is therefore prone to change, but you can see where Microsoft is heading in the description of the Microsoft 365 person experience (described for a multi-geo tenant, but still valid):
“The Microsoft 365 Person encompasses the complete set of properties, attributes and associated people contacts that are representative of a user in the Microsoft 365 tenant.”
The sources used to build the complete picture of a person include:
- Basic administrative information about people stored in Entra ID, including information synchronized from on-premises Active Directory. This information includes the display name, employee identification number, employee type, physical location, and so on. It also includes the person’s photo.
- Data to build out the one-dimensional picture of a user as found in Entra ID by populating information about the person’s jobs (work positions or roles), skills, languages, education (including certifications), and so on. This information can be added programmatically through the Graph APIs or imported from an external system using a Copilot connector for people data.
Adding a Work Position
To demonstrate what kind of information the people platform uses to build out a complete picture about a user, I spent a little time playing with the Microsoft Graph PowerShell SDK cmdlets that add work position information to the people platform. A work position is a job or role undertaken by a person within a business. The role might be in the past or current, so a complete work history within an organization can be constructed from the work position data. Like the rest of the platform, these cmdlets are based on beta code, so some stuff doesn’t work. Nevertheless, the information defined for a work position is a good example of what Microsoft is doing.
This code adds a new work position, marking the role as current (isCurrent is true). The allowedAudiences property is set to organization, meaning that the information is available within the organization (it could also be set to “me,” meaning that it’s personal to the user). After building the properties to describe the role, running the New-MgBetaUserProfilePosition cmdlet adds the work position record to the people platform. This is equivalent to posting a request to the https://graph.microsoft.com/beta/users/{userId}/profile/positions endpoint.
$AddDate = ([datetime]::UtcNow.ToString("s")) + "Z"
[string]$AddDate = '2025-09-04T22:18:03.8608268Z'
[string]$EndDate = '2039-09-04T22:18:03.8608268Z'
$Parameters = @{}
$RoleDetail = @{}
$RoleDetail.Add("jobTitle", "Senior Technical Manager")
$RoleDetail.Add("role", "Technology Consulting")
$RoleDetail.Add("employeeId", "150847")
$RoleDetail.Add("employeeType", "Permanent")
$RoleDetail.Add("description", "Responsible for the management of a small team of technical consultants covering the NYC area")
$RoleDetail.Add("level", "MGR-02")
$RoleDetail.Add("startMonthYear", $AddDate)
$RoleDetail.Add("endMonthYear", $EndDate)
$Company = @{}
$Company.Add("displayName", "Office 365 for IT Pros")
$Company.Add("department", "Consulting")
$Company.Add("officeLocation", "New York City")
$Company.Add("costCenter", "100014")
$Company.Add("division", "Professional Services")
$Company.Add("webUrl", "https://office365itpros.com/")
$Address = @{}
$Address.Add("type", "business")
$Address.Add("street", "170 Avenue of the Americas")
$Address.Add("city", "New York City, New York")
$Address.Add("countryOrRegion", "United States")
$Address.Add("postalCode", "10036")
$Company.Add("address", $Address)
$RoleDetail.Add("company", $Company)
$Parameters.Add("detail", $RoleDetail)
$Parameters.Add("isCurrent", "true")
$Parameters.Add("allowedAudiences", "organization")
$Parameters.Add("createdBy", "Tony Redmond")
New-MgBetaUserProfilePosition -UserId $User.Id -BodyParameter $Parameters
After positions are added, the Get-MgBetaUserProfilePosition cmdlet can retrieve an array of all the work positions known for the user:
[array]$Positions = Get-MgBetaUserProfilePosition -UserId $User.Id
The information written by New-MgBetaUserProfilePosition is available in several composite properties, like Detail (Figure 1). It’s easy to see how this information might surface in a future form of the profile card.

You’ll always find a position registered for a user because the people platform synchronizes details from Entra ID to create a default position from information like a user’s job title held in the directory.
A Complex Undertaking
It’s obvious that building the people platform is a complex undertaking. Anything to do with personal information is sensitive. Personal privacy must be protected. I am sure that employee unions and worker councils will take a keen interest in how Microsoft 365 tenants decide to populate the Graph with people information and then, more importantly, how that data surfaces in places like the Microsoft 365 profile card.
It’s often said that we live in interesting times. Things might just get more interesting as the people platform develops. It’s certainly something that anyone involved in identities and user management should keep an eye on.
Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
Flexible work update
Amy Coleman, Executive Vice President and Chief People Officer, shared the below communication with Microsoft employees this morning.
How we work has forever changed. I remember starting at Microsoft in the late ‘90s, always in the office, no laptops, and primarily working with the people right down the hall. As technology evolved and our business expanded, we became more open, more global, and able to scale in ways we couldn’t have imagined. Then the pandemic reshaped everything. It pushed us to think differently about work, to connect like never before (thank you Teams!), reminded us of how much we value being together, and gave us focus and autonomy in the traditional workday. We’re not going back, and we shouldn’t. Instead, we should take the best of what we’ve learned and move forward.
In the AI era, we are moving faster than ever, building world-class technology that changes how people live and work, and how organizations everywhere operate. If you reflect on our history, the most meaningful breakthroughs happen when we build on each other’s ideas together, in real time.
We’ve looked at how our teams work best, and the data is clear: when people work together in person more often, they thrive — they are more energized, empowered, and they deliver stronger results. As we build the AI products that will define this era, we need the kind of energy and momentum that comes from smart people working side by side, solving challenging problems together.
With that in mind, we’re updating our flexible work expectations to three days a week in the office.
We’ll roll this out in three phases: 1) starting in Puget Sound at the end of February; 2) expanding to other US locations; 3) then launching outside the US.
Our goal with this change is to provide more clarity and consistency in how we come together, while maintaining the flexibility we know you value. We want you to continue to shape your schedule in ways that work best for you, making in-person time intentional and impactful. Importantly, this update is not about reducing headcount. It’s about working together in a way that enables us to meet our customers’ needs.
For some of you, this is not a change. For others this may be a bigger adjustment, which is exactly why we’re providing time to plan thoughtfully. As part of these updates, we’re also enhancing our workplace safety and security measures so we can continue to provide a workplace where every employee can do their best work.
What you need to know:
Puget Sound-area employees: If you live within 50 miles of a Microsoft office, you’ll be expected to work onsite three days a week by the end of February 2026. You’ll receive a personalized email today with more details. Please connect with your manager and team to understand your organization’s plans. If needed, you can request an exception by Friday, September 19.
Managers: You’ll find actions to take, and the resources to support both you and your team on the Managers@Microsoft SharePoint.
All employees: You’ll hear from your EVP or organizational leadership today with specific guidance. Each business will do what is best for their team, which means some groups will deviate from our company-wide expectations. If you are outside of the Puget Sound area, you do not need to take any action at this time unless your EVP communicates otherwise.
Timelines and details for additional US office locations will be announced soon. For employees outside the United States, we will begin planning in 2026. More information is available on the Flexible Work at Microsoft SharePoint.
As always, we’ll keep learning together to ensure Microsoft is the best place for you to grow and have a great career. Let’s keep moving forward together.
Thank you,
Amy
The post Flexible work update appeared first on The Official Microsoft Blog.
Identify exact jittered point from swarmchart plot
I created a horizontal swarmchart in app designer. The input data have numeric values and grouping tags (intGroups) associated with each point. Each point’s color is associated with its group. I also defined a ButtonDownFcn to get the point that the user clicks on (using code from a previous Matlab Answer).
The issue is that when I click a point, swarmchart’s XData and YData give the undithered coordinates (i.e., the y-values are all the same). So if I click a point that is far from center (due to a lot of points having that same value), the code below may or may not identify that point I clicked, so I may not get the correct group.
Is there a way to ensure that I get the exact point I clicked (or its index in the vector), not others that are in the same vicinity?
for zz = 1:length(intVarsUnique)
    hPlot = swarmchart(axPlot, xData(:, zz), intVars(:, zz), [], categorical(strGroupColors), 'filled', 'o', ...
        'YJitter', 'density', ...
        'HitTest', 'on', 'ButtonDownFcn', @(src, eventData) fcnPlot_ButtonDown(app, src, eventData));
    if ~ishold(axPlot)
        hold(axPlot, 'on');
    end
end % zz
hold(axPlot, 'off');
function fcnPlot_ButtonDown(app, hPlot, eventData)
% Based on code from Yair Altman in https://www.mathworks.com/matlabcentral/answers/1903190-get-data-point-that-was-clicked-on-in-graph
% Get the click coordinates from the click event data
x = eventData.IntersectionPoint(1);
y = eventData.IntersectionPoint(2);
line_xs = hPlot.XData;
line_ys = hPlot.YData;
dist2 = (x-line_xs).^2 + (y-line_ys).^2;
[~,min_idx] = min(dist2);
closest_x = line_xs(min_idx);
closest_y = line_ys(min_idx);
msgbox(sprintf('Clicked on: (%g,%g)\nClosest pt: (%g,%g)', x, y, closest_x, closest_y));
% Code to locate point in vector and ID group…
end
matlab, swarmchart, appdesigner, undocumented MATLAB Answers — New Questions
Default button of “questdlg” not working in 2025a
answer = questdlg(quest,dlgtitle,btn1,btn2,btn3,defbtn)
When the dialogue box is opened, the default button is selected, but pressing ENTER won’t click the button, and pressing ESC won’t close the dialogue box.
matlab gui, nevermind – works now MATLAB Answers — New Questions
Microsoft’s Push to Save Office Files in the Cloud
New Policy Setting to Force Office to Save to Cloud Locations
Announced in Microsoft 365 message notification MC1137593 on 18 August 2025, Microsoft is introducing a new policy setting to force the creation (saving) of Office files in cloud locations. The setting only applies to Word, Excel, and PowerPoint files saved using the Microsoft 365 enterprise apps (subscription version of Office). No other Office version is affected.
Cloud locations means SharePoint Online, OneDrive for Business, and third-party cloud storage services like Dropbox or ShareFile (if the feature is enabled for the Microsoft 365 tenant).
The new enablecloudonlysaveasmode policy setting can be deployed to clients using Group Policy or registry updates. The setting is available in the Administrative Template files for Microsoft Office (version 5516.1000 or newer) and applies to the Windows versions of the Microsoft 365 enterprise apps version 2506 (build 19029.2000) or later. I checked the feature using version 2508 (build 19127.20192) (Current Channel Preview).
MC1137593 says that public preview started to roll out in late August 2025 and is scheduled to complete by mid-September 2025. General availability for all clouds will start in mid-September 2025 and should be available everywhere by the end of September 2025. If you don’t take steps to apply the policy setting, nothing changes.
Higher Cloud Usage
Microsoft says that the new policy is “part of a new set of tools for IT Administrators to move their organizations towards higher Cloud usage and Increase security and compliance.” Increasing cloud usage adds stickiness to the Microsoft cloud because the more data that users store in SharePoint Online and OneDrive for Business, the harder it is to move to any other platform.
The point about increasing security and compliance is justified by the fact that SharePoint Online and OneDrive for Business are subject to the tenant’s security and compliance policies. It’s true that cloud files are more resilient than files stored on a local drive, and it is very convenient to be able to move from PC to PC while keeping files available. However, everything depends on the network, and if the network’s not available or you don’t want to use a potentially insecure network like free wi-fi, losing the ability to save files to the local drive can be a real pain. I often save copies of Office files as PDFs to share with other people, and I don’t really want those PDFs cluttering up OneDrive.
Microsoft sometimes goes overboard in its enthusiasm to save files, like PowerShell modules, in OneDrive. Some might consider this step to be in that category.
What Users See
The default situation is shown in Figure 1. No policy is configured, and the user can save to SharePoint Online, OneDrive for Business, the local hard drive, and other cloud services (configured through Add a Place).

With the policy setting enabled, Office applications are limited to cloud locations to save new files or save existing files as a new file (Figure 2). Note that the “This PC” and “Browse” (for folder) options are missing.

Updating Office with the Microsoft 365 Apps Policy
Microsoft 365 tenants can apply the setting to restrict saving to the cloud via a cloud policy configured in the Microsoft 365 Apps center. Search for the Restrict saving on non-cloud locations setting and change the value from not configured to enabled (Figure 3).

After saving the policy, its settings are applied by the Click-to-Run service and the new setting should be active within a day.
Like most Office settings, the change can be made manually by updating the system registry on a PC. In this case, the feature appears to be controlled by two settings. The first enables the cloud-only save mode by creating a new DWORD value called EnableCloudOnlySaveAsMode at HKEY_CURRENT_USER\SOFTWARE\Microsoft\Office\16.0\Common\FileIO. The second apparently removes the UI options for non-cloud locations through another DWORD value called PreferCloudSaveLocations at HKEY_CURRENT_USER\Software\Microsoft\Office\16.0\Common\General. Both values are set to 1 to enable or 0 to disable. When I tested, the settings worked on one PC and not on another. It took too long to figure out that the PC where things worked ran Current Channel (Preview) while the one where the feature didn’t work ran Current Channel.
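For anyone who prefers to script the change, the two values can be captured in a .reg file. This is a sketch built from the key paths and DWORD values described above, not a Microsoft-supplied file, and as noted the behavior may depend on the Office update channel:

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\SOFTWARE\Microsoft\Office\16.0\Common\FileIO]
"EnableCloudOnlySaveAsMode"=dword:00000001

[HKEY_CURRENT_USER\Software\Microsoft\Office\16.0\Common\General]
"PreferCloudSaveLocations"=dword:00000001
```

Setting both values to 0 (or deleting them) reverses the change.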
A Change That Might Annoy Some
Some will hate this change and ask what’s the point of having a local drive if it’s inaccessible. Others might not notice that all files are stored in the cloud because they do that as the norm. And some will only notice the change when they go to save a file locally. I save most of what I do with Word, Excel, and PowerPoint in the cloud, so I guess that I’m in the last category.
If I was forced to live with storing all Office files in the cloud, I could adapt my workflow without much difficulty. Until a network outage occurs…

Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365. Only humans contribute to our work!
start app by doubleclicking mat-file with custom extension
An application which was created using GUIDE can save *.mat files with a custom extension, like *.MyExtension. What I want is that when I double-click such a file in Windows Explorer, it will:
1. open my application (probably some settings in Windows)
2. pass the file path to the application, so that the data can be loaded. Can someone help me on this?
load file in application MATLAB Answers — New Questions
Performing 2D convolution
Hello,
I am new to image processing and had a conceptual question. I am currently trying to perform a 2D convolution of an image and a kernel (filter) in Simulink using the 2-D Convolution block. I have code that transposes the image, then flips it, then performs the 2D convolution with the kernel. However, from my (very limited) understanding, convolution already flips the filter that I want to apply. I was told that the output of transposing and flipping the image and then convolving with the kernel looks correct. However, as I understand it, I would be flipping the image once and then convolving, which involves another flip, so I am not actually convolving? I have read that this would instead produce a correlation?
Hoping someone who knows more may be able to help give me a better understanding.
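The flip relationship is easy to check in MATLAB itself. A quick sketch (not code from the question) using the built-in conv2 and filter2 functions:

```matlab
A = magic(4);      % sample image
K = [0 1; 2 3];    % sample kernel, deliberately not symmetric
C1 = conv2(A, K, 'valid');           % true convolution: conv2 flips K internally
C2 = filter2(K, A, 'valid');         % correlation: K is slid over A with no flip
C3 = conv2(A, rot90(K, 2), 'valid'); % correlation expressed as convolution with a pre-flipped kernel
% C2 and C3 are equal; C1 differs unless K equals rot90(K,2).
```

In other words, flipping the kernel (or image) once before calling a convolution routine cancels the routine's internal flip and produces a correlation, which matches what the poster has read.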
Thank you!
convolution, simulink, image processing MATLAB Answers — New Questions
Problem on startup. MacMini M4, Sequoia 15.6.1
On startup I get this message:
Unable to communicate with required MathWorks services (error 201)
Any ideas?
jon’s q MATLAB Answers — New Questions
Microsoft Bolts Copilot License Check onto ExRCA
Hard to Find Logic for Copilot License Check in Exchange Connectivity Tool
Unless you’re a keen reader of the Microsoft blogs posted to the Microsoft Technical community, you might have missed the August 25 article about a new diagnostic tool for a “Copilot License Details Check.” According to the text, it’s “a powerful tool designed to streamline and validate your license readiness by confirming whether Copilot for Microsoft 365 licenses are properly assigned to users.” In reality, it’s some Graph API requests cobbled together to report details of a Microsoft 365 Copilot license assignment to a user account that’s been bolted onto the side of the Exchange Remote Connectivity Analyzer (ExRCA).
As I explain in another article, ExRCA started as a troubleshooting tool to help Exchange on-premises administrators debug connectivity problems with protocols like Autodiscover and ActiveSync (check out the YouTube video from that time). Later, Microsoft upgraded ExRCA to support Exchange Online and Teams. At this point, it’s fair to say that ExRCA is an invaluable tool for Microsoft 365 tenant administrators.
However, having a valuable support tool is no reason to bolt on a license checker. I’m sure Microsoft will point to the inclusion of the message header analyzer tool in ExRCA as evidence that ExRCA has become a toolbox, but that’s missing the point that the message header tool is at least associated with a messaging protocol (SMTP) whereas the Copilot license check is a barefaced attempt to help people use more Copilot features.
Running a Copilot License Check
Running a Copilot license check is very simple. Input the user principal name or primary SMTP address of a user account, sign in as an administrator with permissions to access user account details, and the code runs to verify that the account has a Microsoft 365 Copilot license with all its service plans intact (Figure 1).

A Simple Check
Stripping everything away, the license check is very simple and the results that it generates are equally simple (no expensive CPU cycles for AI analysis are burned here). Figure 2 shows that the user being checked is good to go. I’m sure that this is deemed to be a successful test.

But some issues exist. First, the test doesn’t distinguish between direct-assigned licenses and group-assigned licenses, which is valuable information for an administrator to know if they want to address a problem highlighted by the test. Second, the test only considers a “full” Microsoft 365 Copilot license to be valid. Trial licenses are not considered valid. Third, disabling some Copilot features is a perfectly reasonable thing to do. Not everyone needs to create new agents through Copilot Studio, for example.
PowerShell Code for the Copilot License Check
To show what the ExRCA Copilot check does, I recreated the check using the Microsoft Graph PowerShell SDK. The code is simple and took about an hour to write (including testing):
- Sign into the Graph with Connect-MgGraph using an administrator account.
- Prompt for a user account and validate that the account exists.
- Check that the account has a valid Microsoft 365 Copilot license (including trial licenses). License and service plan information is available online.
- Run the Get-MgUserLicenseDetail cmdlet to retrieve service plan information for the Copilot SKU.
- Check each of the service plans defined in the license to report if it is enabled or disabled.
Figure 3 shows some sample output.

You can download the script from the Office 365 for IT Pros GitHub repository.
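For anyone who wants to see the shape of such a check before downloading the full script, here is a minimal sketch of the steps listed above. It is not the downloadable script: it assumes the Microsoft 365 Copilot SKU part number contains “Copilot” (adjust the filter for your tenant’s product names) and skips the trial-license handling:

```powershell
# Minimal sketch of a Copilot license check - not the full Office 365 for IT Pros script.
# Assumption: the Copilot SKU part number contains "Copilot".
Connect-MgGraph -Scopes "User.Read.All" -NoWelcome
$Upn = Read-Host "Enter the user principal name to check"
$User = Get-MgUser -UserId $Upn -ErrorAction Stop
[array]$CopilotLicenses = Get-MgUserLicenseDetail -UserId $User.Id |
    Where-Object {$_.SkuPartNumber -like "*Copilot*"}
If (-not $CopilotLicenses) {
    Write-Host ("No Microsoft 365 Copilot license is assigned to {0}" -f $User.DisplayName)
} Else {
    # Report each service plan in the Copilot SKU as enabled (Success) or Disabled
    ForEach ($Plan in $CopilotLicenses[0].ServicePlans) {
        Write-Host ("{0,-40} {1}" -f $Plan.ServicePlanName, $Plan.ProvisioningStatus)
    }
}
```

Unlike the ExRCA check, a script like this can be extended to distinguish direct from group-based assignments (via the user’s licenseAssignmentStates) or to accept trial licenses as valid.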
No Reason to Use ExRCA to Check License Details
I don’t know how many Microsoft 365 tenant administrators will seek out ExRCA to answer questions like “I wonder if the Copilot license assigned to John Doe is good to go?” It seems like an unnatural reaction to consider ExRCA in that light when it’s straightforward to build your own tools to measure user readiness for Copilot or analyze and report licenses assigned to user accounts (including disabled service plans).
The idea might be a good one, but I fear it’s implemented in the wrong place and is too limited to deliver much value.
Need some assistance to write and manage PowerShell scripts for Microsoft 365, including Azure Automation runbooks? Get a copy of the Automating Microsoft 365 with PowerShell eBook, available standalone or as part of the Office 365 for IT Pros eBook bundle.