Month: January 2026
Interact with the Test Browser GUI programmatically?
I have a MATLAB project that contains a folder of test classes at
test_path = "<project_root>/tests/"
This folder is automatically added to the project search path by a startup script. I like the Test Browser panel in the GUI; however, each session I have to manually add my tests to the browser by clicking the "plus" icon and selecting my test path in the selection window.
Is there a way to modify my project settings or the startup script to automatically populate the Test Browser panel with my tests?
Thanks for the responses. Please note that I am NOT asking how to run tests or create test suites programmatically here – I know how to do that using testsuite and run – I’m specifically interested in making the Test Browser panel populate automatically with my tests.
test browser, tests, test MATLAB Answers — New Questions
How do I force a Simulink app setting during startup to affect the target instead of just the GUI?
A simple Simulink model converted to an app via the App Generator & App Designer.
I am trying to load the last values (from a .mat file) for a given item via a callback on the Start button (app.StartStopButton), PreStartFcn.
I check if the file is available, load it, then set the app value (code below). This results in the GUI correctly displaying the value; however, the underlying component/value (referred to hereafter as the "model value") that the GUI is connected to doesn’t reflect the state of the GUI and stays at the constant value defined in the model. I confirmed the model value is not loaded during startup by saving its state to a file via File Log blocks and observing it in SDI. I can also verify this via a calculation, displayed on the GUI, that uses the model value.
The GUI is definitely connected to the model value, because when the GUI is updated by the user (after the Start button is pressed), the model value syncs to the GUI state… it is just this initial load that only updates the GUI.
function StartStopButtonPreStart(app, event)
    % Callback called on Start (pre or post, doesn't matter).
    if isfile('DST.mat')
        loadedData = load('DST.mat');
        app.DST_Enabled_Value.Value = loadedData.value; % This updates the GUI, but not the underlying model value.
    end
end
% Generic code created by default by the App Generator & App Designer
function startupFcn(app)
    % … other setup …
    hInst = slrealtime.Instrument();
    DST_Enabled_Value_values = [0 1];
    DST_Enabled_Value_labels = [0 1];
    targetSelector = app.TargetSelector;
    slrtcomp = slrealtime.ui.tool.ParameterTuner(app.UIFigure, 'TargetSource', targetSelector);
    slrtcomp.Component = app.DST_Enabled_Value;
    slrtcomp.BlockPath = 'DST_TimeZone_Test/Subsystem Reference/DST_Enabled';
    slrtcomp.ParameterName = 'Value';
    slrtcomp.ConvertToComponent = @(val)app.convertValueToValue(val, DST_Enabled_Value_values, DST_Enabled_Value_labels);
    slrtcomp.ConvertToTarget = @(val)app.convertValueToValue(val, DST_Enabled_Value_labels, DST_Enabled_Value_values);
    slrtInstrumentsComponent = slrealtime.ui.tool.InstrumentManager(app.UIFigure, 'TargetSource', targetSelector);
    slrtInstrumentsComponent.Instruments = hInst;
    % … other setup …
end
simulink, startup loading, gui but not target MATLAB Answers — New Questions
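One possible workaround (a sketch only; I have not verified it against the generated app, and it assumes the real-time application is already loaded on the default target): in addition to setting the component's Value, write the parameter directly to the target with setparam, so both the GUI and the model value reflect the loaded state. The block path is copied from the startupFcn above; the default-target call is an assumption.

```matlab
function StartStopButtonPreStart(app, event)
    % Sketch: push the loaded value to the target as well as the GUI.
    if isfile('DST.mat')
        loadedData = load('DST.mat');
        % Update the GUI component (as before).
        app.DST_Enabled_Value.Value = loadedData.value;
        % Also write the parameter directly to the real-time target.
        % Assumes the default target; block path copied from startupFcn.
        tg = slrealtime;
        setparam(tg, 'DST_TimeZone_Test/Subsystem Reference/DST_Enabled', ...
            'Value', loadedData.value);
    end
end
```

If writing before the model starts fails, the same setparam call could instead go in a post-start callback.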
Printing values to the command window
How does one print the value of a variable to the command window?
command window, print value MATLAB Answers — New Questions
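For reference, a few standard ways (the variable x here is just an illustrative example):

```matlab
x = 3.14159;
disp(x)                    % prints just the value
fprintf('x = %.3f\n', x)   % formatted output with a label
x                          % no semicolon: echoes "x = 3.1416"
```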
copygraphics has extra padding when copying with “ContentType” as “vector”.
Hi,
I am using copygraphics to copy my figures, but I am running into a problem.
figure;
plot(randn(5,1));
copygraphics(gcf, "ContentType","vector")
This copies a plot that looks like this when pasted into Microsoft Word:
Notice the extra padding at the bottom and on the right.
If I do the same code but with the image content type:
figure;
plot(randn(5,1));
copygraphics(gcf, "ContentType","image")
Then the resulting pasted image has no padding:
Note: the "Padding" option works, but even with a "Padding" of 60 I still have extra padding on the right and bottom of the image.
I tried pasting it into another program, Inkscape, and it didn’t have the padding problem, so it must be something to do with Microsoft Word (and OneNote as well). Copying the axes from the toolstrip produces the same padding problem.
This problem only occurs in MATLAB R2025b. I also tried R2022a, where the problem is gone, so it must be something in newer versions of MATLAB that MathWorks could fix.
If anyone has any idea what is going on, that would be great.
copygraphics, export MATLAB Answers — New Questions
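As a possible workaround (untested against the Word paste path; the filenames are placeholders): export to a vector file with exportgraphics, which crops tightly by default, and insert that file into Word instead of going through the clipboard:

```matlab
figure;
plot(randn(5,1));
% Write tightly-cropped vector files, then insert one into Word via
% Insert > Pictures. On Windows, EMF usually behaves well in Word.
exportgraphics(gcf, 'myplot.emf', 'ContentType', 'vector');
exportgraphics(gcf, 'myplot.pdf', 'ContentType', 'vector');
```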
Microsoft 365 Exceeds 450 Million Commercial Paid Seats
Microsoft FY26 Q2 Results Focus on Copilot

Microsoft released their FY26 Q2 results on January 28, 2026. Two items stood out from a Microsoft 365 perspective. First, Microsoft reported a number for Microsoft 365 Copilot licenses (15 million), the first time that Microsoft has given a number for paid Microsoft 365 Copilot seats. Second, Microsoft said that the number of paid commercial Microsoft 365 seats now exceeds 450 million, a small increase from the 446 million reported last quarter. Microsoft reported a 6% year-over-year growth for Microsoft 365 seats. Given these numbers, it seems like Microsoft 365 Copilot is used by 3.33% of the installed base.
Quarterly revenues for the Microsoft Cloud hit $51.5 billion, or $206 billion ARR. Microsoft said that “Microsoft Cloud gross margin percentage was slightly better than expected at 67%,” underlining the highly profitable nature of its Microsoft 365, LinkedIn, and Azure solutions which constitute the majority of Microsoft Cloud income.
Microsoft 365 Copilot Revenue
Microsoft 365 revenue increased 17% (14% in constant currency). As is their habit when discussing quarterly results with analysts, Microsoft attributed the growth in average revenue per user (ARPU) to customer purchases of E5 and Microsoft 365 Copilot licenses.
At list price, the 15 million paid licenses for Microsoft 365 Copilot represent $5.4 billion in annual revenue. Microsoft said that the 15 million represents 160% year-over-year growth in seats. $5.4 billion is a good number, but it pales against the ongoing capital investment Microsoft makes to support AI. CFO Amy Hood said that “capital expenditures were $37.5 billion, and this quarter, roughly two thirds of our capex was on short-lived assets, primarily GPUs and CPUs.”
Microsoft also said that Copilot “is becoming a true daily habit, with daily active users increasing 10X year-over-year.” They threw in a meaningless statistic when saying that “24 billion Copilot interactions were audited by Purview this quarter, up 9X year-over-year.” Given that Purview captures audit records for all Copilot interactions, this kind of statistic does not give an accurate insight into the activity levels of Microsoft 365 Copilot users. I assume that people use Copilot more as they become more familiar with its capabilities, together with the availability of new agents like Researcher.
The investment in Copilot across Microsoft 365 reduces engineering budget in other areas. One example is the announcement to partners that Microsoft plans to retire the standalone SharePoint Online and OneDrive for Business plan 1 and 2 SKUs. The reasons cited are “low customer demand for standalone offerings, increased instances of unintended or nonstandard usage, and higher operational costs.” Given the increased integration of SharePoint Online and OneDrive for Business with the rest of the Microsoft 365 suite, selling a standalone offering doesn’t make as much sense in 2026 as it did when Office 365 launched in 2011.
GitHub Copilot
I like GitHub Copilot and recommend it to every Microsoft 365 administrator who works with PowerShell. Microsoft says that there are now 4.7 million paid GitHub Copilot subscribers, up 75% year-over-year as people come to appreciate the advantages of AI-assisted development. Like any assistant, the suggestions made by Copilot are not perfect, but it does accelerate progress and that’s all that counts.
Satya Nadella referenced the GitHub Copilot SDK and its ability to bring Microsoft 365 content into the development cycle. For example, while working in Visual Studio Code, a developer can retrieve relevant messages and files to allow them to check information like program requirements. It’s an example of what Microsoft calls WorkIQ.
License Increases Will Drive Higher Microsoft 365 Revenues
Things look good for Microsoft 365 revenues over the coming months. Revenue growth through an increase in overall commercial seat count (solidly 6% annually on average over the last few years) will be turbocharged by license increases coming in June 2026. If 450 million Microsoft 365 users pay an average $2 extra per month, that’s $10.8 billion extra revenue. Which is nice.
Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365. Only humans contribute to our work!
How does the Positive-Displacement Compressor (2P) compute the fluid power?
What is the energy balance equation of the Positive-Displacement Compressor (2P) block for calculating the fluid power? The documentation says the fluid power equals Wc_dot in the polytropic case, yet Wc_dot does not equal the energy flow or the enthalpy change of the refrigerant through the block. Based on my calculation using a test model, Wc_dot actually equals mechanical power * mechanical efficiency. And the fluid power equals the enthalpy change of the refrigerant in my test model, which corresponds to the isentropic thermodynamics model in the documentation. However, I am using the polytropic thermodynamics model in my test model.
simscape, simulink MATLAB Answers — New Questions
MATLAB workers in a parfor loop are not using the path order
Hello, I’m using this code
parfor k=1:10
[..]=pca(..)
end
I have two definitions of pca (the MATLAB one and one from another package). I’ve set the search path to use the MATLAB version by default, but inside the parfor loop the other function is used instead. I didn’t find a way to fix this without removing the other package.
Thanks
parfor, path MATLAB Answers — New Questions
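One approach that may help (a sketch; the folder path is a placeholder): each worker has its own search path, so push the desired path order to all workers in the pool before the parfor loop, for example with parfevalOnAll.

```matlab
pool = gcp;  % current (or new default) parallel pool
% Placeholder: folder of the package whose pca shadows the built-in one.
otherPkgDir = '/path/to/other/package';
% Move the conflicting folder to the end of every worker's path so the
% MATLAB (Statistics Toolbox) pca is found first.
parfevalOnAll(pool, @() addpath(otherPkgDir, '-end'), 0);
parfor k = 1:10
    coeff = pca(randn(20, 5));  % should now resolve to the intended pca
end
```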
How to install ‘FMU Builder’ in Docker Container?
Hello, I’m trying to export an FMU from a Simulink model in a Docker container.
This worked without any problem up to MATLAB/Simulink R2022b, but when I tried with R2024b I got an error:
"FMU export requires FMU Builder for Simulink Support Package"
I realized that from R2023b onward, MATLAB/Simulink requires an additional add-on (FMU Builder).
Installing FMU Builder is easy on MS Windows (using the Add-On Explorer), but it’s a totally different story in a Docker container environment.
I want to build a Docker Image which supports FMU creation under the conditions below:
(Of course) I have ‘Simulink Compiler’ license.
I want to build a Docker image from scratch (using a ubi image, not the image provided by MathWorks, because I have some dependencies to be installed)
I want to use Matlab/Simulink installation file (*.iso) downloaded from Mathworks website (Not using MPM, because of the network constraints)
I wonder how to install FMU Builder while satisfying all the conditions above, because the only way I have found so far is to install MATLAB/Simulink with MPM when building the Docker image, which violates one of the conditions.
I also found the help page:
https://www.mathworks.com/help/matlab/ref/matlab.addons.install.html
This help doc explains how to install an add-on from a .mltbx file,
but when I download FMU Builder:
https://www.mathworks.com/products/fmubuilder.html
I only got ‘fmubuilder.mlpkginstall’, not a .mltbx file, and I have no idea where to get a .mltbx file for FMU Builder.
Can anyone help with this issue?
Thank you!
docker, add-on, installation MATLAB Answers — New Questions
Microsoft Delays Retirement of Basic Authentication for SMTP AUTH
New Date for Retirement Will be Announced Sometime in 2027

In the latest twist in Microsoft’s effort to retire basic authentication for the SMTP AUTH client submission protocol in Exchange Online, new guidance appeared on January 27 setting out what Microsoft must hope are the last steps in the process. Previously, Microsoft wanted to close off basic authentication for SMTP AUTH in September 2025, but following customer pushback, Microsoft adjusted those dates and in June 2025 announced its intention to begin rejecting a small percentage of SMTP AUTH submissions on March 1, 2026. The percentage of rejections would gradually increase, reaching 100% on April 30, 2026. That won’t happen now.
In a nutshell, Microsoft 365 tenants can use basic authentication with SMTP AUTH to submit messages to Exchange Online for processing until the end of December 2026. At that time, Microsoft will disable basic authentication for SMTP AUTH. However, tenant administrators can reverse the block to allow apps and devices to resume using basic authentication to submit messages. In the second half of 2027, Microsoft will announce the final drop-dead removal date when basic authentication for SMTP AUTH will no longer be available.
New tenants created from January 2027 (or the end of December 2026) will be unable to use basic authentication with SMTP AUTH. If these tenants need to send messages from apps or devices, they must use OAuth.
I Need More Time
Microsoft’s logic for pushing the date out is that customers “face real challenges modernizing legacy email workflows.” Extra time is required to update code to remove basic authentication (username and password credentials) and replace it with OAuth, a task that Microsoft’s documentation takes many words to explain.
Updating PowerShell scripts is relatively easy because of the availability of off-the-shelf OAuth authentication in the form of the Send-MgUserMail and Send-MgUserMessage cmdlets from the Microsoft Graph PowerShell SDK. When Microsoft started on the journey to retire basic authentication for SMTP AUTH, examples of using these cmdlets were hard to find. That’s not the case now, and it’s easy to find many scripts that use the cmdlets in different ways, like attaching multiple files to messages.
The Device Issue
Updating devices like multi-function printers and scanners is more challenging. Only the device vendors can upgrade code, and I hear of many blank looks when customers ask vendors about their plans to upgrade devices to support OAuth for client submissions. If an update isn’t available to allow an application or device to support OAuth, the usual course of action is to remove Exchange Online from the equation and replace it with a different SMTP server (like SMTP2Go or even an on-premises Exchange Server).
Another solution proposes the translation of basic authentication commands to OAuth using a local proxy. I have not tested the effectiveness of the solution and cannot attest to how well it works in the wild. However, I’ve heard good things about it, so the local proxy approach could be worth investigating.
Microsoft in a Bind
Microsoft very badly wants to retire basic authentication for SMTP AUTH client submission, but they don’t want to make customers unhappy. SMTP AUTH is the last hurdle for a project to remove basic authentication for all email protocols that began with protocols like IMAP4, POP3, EWS, and MAPI in October 2022. At the time, Microsoft said “We are not touching SMTP AUTH and are done turning it off for now.” Wise heads realized then that removing basic authentication from SMTP AUTH would be a struggle, and they were right.
Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
How Microsoft is empowering Frontier Transformation with Intelligence + Trust
At Microsoft Ignite in November, we introduced Frontier Transformation — a holistic reimagining of business aligning AI with human ambition to help organizations achieve their highest aspirations and growth potential. While AI Transformation centered on efficiency and productivity, Frontier Transformation challenges us to do more for humanity by democratizing intelligence to unlock creativity and innovation for organizations and people around the world.
Across industries, our customers are leading the way to becoming Frontier, sharing three common traits anchored in a foundation of Intelligence + Trust. The first is AI in the flow of human ambition, putting Copilots and agents directly in the tools people use; the second is ubiquitous innovation, empowering the maker in every role. These capabilities are served through Microsoft’s new intelligence layer: Work IQ, which understands how people work; Fabric IQ, which provides a trusted semantic layer for reasoning over an organization’s data; and Foundry IQ, the world’s leading AI app server powering safe, scalable agent experiences. Together, these capabilities put the “I” back in AI by grounding Copilots and agents in an organization’s own data, logic and workflows to fully understand operations and drive decisions that matter most. The third trait is observability at every layer of the stack, ensuring trust, safety and reliable outcomes. As the control plane to observe, govern and secure all AI artifacts, Agent 365 provides a unified view of every AI agent running in an organization’s environment — whether built on Microsoft’s platforms or others.
Our customers and partners are showcasing what can be achieved with Frontier Transformation and human ambition paired with Copilots + agents, and I am pleased to share their stories — including many onstage with us at Ignite. Their journeys demonstrate what is possible for organizations everywhere when AI-first innovation is built upon Intelligence + Trust.
Putting AI in the flow of human ambition so people can achieve more in every role, across every industry
Using a secure Azure foundation, Epic embedded AI directly into clinical workflows, enabling hundreds of thousands of clinicians worldwide to work faster and deliver higher quality care. Epic AI generates documentation in the flow of work, reducing time spent on prior authorization questions by over 40% and surfacing critical insights that could be missed during manual review. In one month alone, Epic AI automatically generated more than 16 million patient record summaries, helping clinicians reduce administrative workload and speed time to treatment. AI-driven imaging follow-up also boosted early cancer detection at the Christ Hospital to 69%, far above the national 46% average. By delivering real improvements like these today, Epic is building confidence and familiarity that will accelerate adoption of tomorrow’s AI-enabled breakthroughs in precision medicine, drug discovery and the understanding of disease.
To create a consistent experience across its entire workforce, heritage brand Levi Strauss & Co. standardized on Windows 11, Copilot+ PCs, Intune, Microsoft 365 Copilot and Microsoft Foundry to give every team — from designers to retail associates to distribution centers — a modern, AI-powered workplace. With Copilot and agents accelerating workflows and eliminating fragmentation across legacy systems, teams can model demand faster, bring products to market with greater precision and spend more time on creative and commercial work that strengthens the brand. They are also reducing operational noise, strengthening security and scaling insights across design, merchandising, retail and supply chain. With a unified, secure Microsoft platform, Levi’s is enriching the employee experience, driving sharper execution and building durable advantage in an increasingly dynamic market.
London Stock Exchange Group (LSEG) is unifying the data foundation of global finance by modernizing its platform on Microsoft Fabric and bringing trusted financial intelligence directly into Microsoft 365 Copilot. The company has consolidated 30 legacy data systems, 1,200 datasets and more than 33 petabytes of financial content into a single, governed environment. This unified foundation is now delivering faster, cleaner insights to 44,000 customers in over 170 countries and cutting product development timelines from years to months. With Fabric and Copilot working together, financial professionals can access LSEG’s expansive data and analytics directly in the flow of work — helping them make decisions with greater speed and confidence while reducing friction across risk modeling, regulatory compliance and investment workflows. By simplifying the data estate first, LSEG is safely surfacing insights through Microsoft 365 Copilot and empowering teams across the organization to innovate with consistency, compliance and at global scale.
The University of Manchester is the first higher education institution in the world to provide Microsoft 365 Copilot access and training to all 65,000 students and staff. Learners and researchers will gain equitable access to Copilot-powered tools to strengthen teaching, accelerate interdisciplinary discovery and build future-ready skills. For students, this is an essential aid for revision, translation and academic success; while university leadership can ensure responsible use policies and training so every student can use AI ethically and confidently. Researchers can synthesize vast volumes of information across fields from photonic materials to biomedical science, enabling faster progress on challenges from cancer treatment to sustainable manufacturing; while operationally, Copilot helps administrative staff free their time for higher value work. The University of Manchester is defining a new model for modern higher education by pairing its decades of AI innovation with equitable access to cutting edge AI tools that prepare the next generation of citizens, innovators and creators.
Inspiring the maker in every one of us with ubiquitous innovation that amplifies creativity and accelerates impact
Adobe is redefining creativity, productivity and customer experience by infusing AI deeply into its product ecosystem, powered by Azure, Copilot and Microsoft Foundry. By supporting third-party models directly inside Adobe Firefly, creators can choose the best model for the job while unlocking new agentic capabilities across Photoshop, Acrobat and Adobe’s Customer Experience Orchestration solutions; resulting in significant acceleration in workflows through AI-driven agents. With daily use of GitHub Copilot, its engineering organization is boosting developer productivity and speed to innovation. The company is also focused on enterprise-grade governance and data provenance to help customers trust and verify content as AI adoption grows — further reinforced by Adobe Marketing Agent for Microsoft 365 Copilot as part of the Agent 365 preview. By combining open model choice with responsible AI infrastructure, Adobe is giving customers creative choice and operational confidence, while unlocking faster innovation, without compromising security, trust or brand integrity.
In an industry facing unprecedented pressure — from rising costs to shrinking margins — Land O’Lakes is accelerating AI innovation across American agriculture by developing a new digital assistant called Oz. Built on models within Microsoft Foundry, the digital assistant turns an 800-page crop protection guide into instant, data-rich insights delivered through intuitive workflows. The Copilot solution provides agronomic expertise throughout the growing season tailored to each grower’s soil, crop and environmental conditions — with personalized recommendations that address unique farm-level challenges. This AI-enhanced solution streamlines access to critical information so experts can help growers make faster, more confident decisions that help control input costs, boost crop yields and drive long-term success. The work Land O’Lakes is doing shows how AI can democratize intelligence — empowering farmers to grow their businesses while feeding their communities and building a more resilient agricultural ecosystem.
Mercedes Benz is transforming every layer of its enterprise — from headquarters to the factory floor to the driving experience. Built on Azure and powered by Copilot, the company is moving toward making Copilot available broadly, with over 50 business areas already using Copilot Studio to build and automate their own workflows and agents. GitHub Copilot has driven a 70% increase in engagement with software developers, shifting teams from routine coding to higher value innovation. On the factory floor, Mercedes’ MO360 data platform connects 30 passenger plants, and its Digital Factory Chat multiagent system is cutting issue diagnosis time from days to minutes. With the next generation of Hey Mercedes — powered by Azure OpenAI, Bing, Microsoft Teams, Intune and Microsoft 365 Copilot — the vehicle becomes a “third workspace,” enabling productivity through natural voice. Investing in AI skills, tools and platform breadth is helping Mercedes Benz build enterprise capability and bend the curve on innovation; with efficiency gains that help teams innovate and drive operational impact internally and across customer experiences. We also recently announced our partnership with the Mercedes-AMG PETRONAS F1 Team to drive innovation across its racing operations. With Microsoft’s cloud and enterprise AI stack — including Azure AI, Microsoft 365 and GitHub — it is turning data into real-time intelligence that powers faster decisions, smarter strategies and sustained competitive advantage on and off the track.
Pantone is transforming decades of color expertise into a next-generation AI offering with the launch of its Pantone Palette Generator, built entirely on Microsoft Foundry and Azure AI. By applying a multi-agent architecture powered by Azure AI Search, Azure Cosmos DB and Azure OpenAI, Pantone is bringing instant, trend-backed color guidance directly into creative workflows. What once required weeks of research across physical color books and expert archives can now be achieved in seconds, enabling designers, brands and product teams to move from inspiration to production with greater speed and accuracy. Using GitHub Copilot, its engineering team accelerated development of initial proof of concept by more than 200 hours, allowing the company to focus on enhancing agent orchestration and color science logic. As Pantone expands its AI-native platform, it is also helping creators build new skills — learning how to integrate agentic workflows, prompt engineering and trend-driven insights into the design process. The platform modernizes Pantone’s iconic color system and positions the company to scale new digital services as it evolves its multiagent capabilities and reshapes business processes.
Westpac is bringing Copilot to more than 35,000 employees across its global workforce — the largest Microsoft 365 Copilot rollout to date in Australia and the largest deployment in financial services within Asia Pacific. This comes after a successful pilot with 15,000 employees that delivered strong business outcomes and freed up significant time for users each month. The company is now deploying AI to accelerate work, reduce friction and reinvent how employees engage with their customers. The bank is pairing its Copilot implementation with AI education programs and Microsoft Copilot Studio to build custom agents for HR and IT, while creating a new Azure-based innovation sandbox to enable teams to quickly experiment with AI-enabled workflows and solutions. Westpac’s move to embed AI at scale is a strategic investment in people and a catalyst for more efficient, higher value work—underscoring how responsible, enterprise grade AI can drive meaningful value for employees, customers and shareholders.
Bringing observability to every layer of the stack to ensure outcomes are reliable, safe and aligned with the business
ServiceNow is helping its customers accelerate AI adoption safely by integrating with Microsoft Agent 365 — Microsoft’s control plane for securing and governing agents at scale. By enabling them to bring their agentic workflows into a unified governance environment, ServiceNow can help them gain visibility, access controls and ensure compliance across AI systems. Companies like AstraZeneca are already using ServiceNow AI Control Tower together with Agent 365 to manage lab and operational workflows, saving 90,000 hours that researchers can redirect toward discovering lifesaving drugs. The ability to see, trust and scale AI agents gives organizations confidence to move quickly without losing control. ServiceNow is demonstrating how advanced workflow AI delivers its greatest value when paired with enterprise-wide governance — giving organizations the speed and efficiency they want while maintaining the control required for mission critical operations.
To help organizations safely accelerate agentic AI adoption, Workday is building solutions that work with Agent 365 for a unified way to govern its agents. The company is helping businesses address the shift from shadow IT to shadow AI as employees begin incorporating AI agents into the way they work. Workday’s Agent System of Record helps customers establish governance and oversight around the work AI agents are doing, allowing them to scale intelligent workflows with confidence. Workday highlights that responsible AI acceleration requires combining powerful automation with a shared control plan — making the secure, compliant path the easiest one, and enabling organizations to scale without losing oversight.
As organizations move to operationalize agentic AI at scale, Genspark is integrating with Agent 365 to provide a governed, enterprise grade path for deploying its rapidly growing ecosystem of Super Agents. As employees increasingly experiment with personal AI creation tools, companies are looking for ways to shift from unmanaged shadow AI to secure, outcome-driven agent workflows. Through Agent 365, the platform enables organizations to register its agents alongside those from Microsoft and other partners, apply unified governance policies, maintain consistent identity and permission controls, and ensure all agent generated outputs align with corporate, regulatory and data residency requirements. This governance layer extends the value of its own agent registry — where more than 80 specialized agents and millions of user generated prompts are already driving productivity — allowing customers to safely scale agentic creation across roles, teams and industries. Genspark demonstrates that responsible acceleration requires pairing powerful, outcome first agent experiences with a shared control plane like Agent 365, enabling AI to scale without losing oversight.
At Ignite, we also introduced Agent Factory — a new way for organizations to build and scale AI agents with confidence — bringing together Work IQ, Fabric IQ and Foundry IQ under a single, ROI-driven model. Agent Factory enables companies to take complex workflows — from claims processing to freight forwarding to supply chain management — and turn them into measurable, production-ready agentic systems supported by Microsoft’s forward deployed engineers, partner ecosystem and built-in governance. As agents move from experimentation to mission critical automation, companies need a standardized, governed path to build and scale them, and Agent Factory is the solution — tying innovation directly to measurable ROI.
Our ambition with Frontier Transformation is to ensure that the maker in every one of us is empowered by everything we build and deliver. As we enter the second half of the fiscal year, one thing is clear: our customers and partners are redefining what can be achieved as Frontier Firms. Built on a foundation of Intelligence + Trust, and with the full breadth of Microsoft’s cloud and AI solutions, we are committed to helping every organization scale AI-first innovation. Our model diverse, open and heterogenous platform unifies your IQ assets — and the human ambition that lives inside of your company — to deliver outcomes that help you achieve your highest aspirations. Thank you for your continued partnership and trust as we continue shaping what is possible together.
The post How Microsoft is empowering Frontier Transformation with Intelligence + Trust appeared first on The Official Microsoft Blog.
Is MATLAB supported on Windows on ARM devices?
Windows devices using ARM hardware, such as the Snapdragon X Elite, have two ways to run applications. Applications built for Windows on ARM run natively, while apps built for x86-64 processors run in the Prism emulator. Is MATLAB supported on Windows ARM devices? MATLAB Answers — New Questions
How to build a mechanical arm dynamics simulation model
The mechanical arm grabs the object motion to calculate the joint torque. dynamics simulation model MATLAB Answers — New Questions
Need coordinates of endpoints on scatter plot
I have a scatterplot that forms multiple rays and I need to know the coordinates of the points at the ends of each ray.
My code is as follows:
data1 = readtable('600psi_15k_FT.txt');
data2 = readtable('1000psi_10k_FT.txt');
x1 = data1.Var1;
y1 = data1.Var2;
x2 = data2.Var1;
y2 = data2.Var2;
[peaks,locs] = findpeaks(y2,MinPeakHeight=0.003,MinPeakProminence=0.0006)
figure;
hold on;
plot(x1, y1, 'b-', 'DisplayName', '600psi');
plot(x2, y2, 'r-', 'DisplayName', '1000psi');
plot(x2(locs), peaks, 'ro', 'MarkerFaceColor', 'r', 'DisplayName', 'Peaks');
legend show;
hold off;
other_peaks = y1(locs);
scatter(peaks,other_peaks);
plot(peaks,other_peaks,'-o')
scatterplot, coordinates MATLAB Answers — New Questions
Does isAlways Make an Unwarranted Assumption that a Variable is real?
Define a sym variable
syms v
isAlways can’t prove that v is real. Makes sense.
isAlways(in(v,'real'))
Now make an assumption
assume(v > 0); % (1)
Is that assumption a sufficient condition to imply that v is real?
isAlways(in(v,'real')) %(2)
Apparently it does.
But we don’t see that v is real in the assumptions
assumptions(v)
as we would if stated explicitly
assumeAlso(v,'real');
assumptions(v)
Now define v as a complex number, which clears all of the assumptions
v = sym(1+1i);
assumptions(v)
Here, v satisfies assumption (1) because symbolic gt only compares the real parts of both sides (though the doc page does not state that explicitly)
isAlways(v > 0)
But satisfying assumption (1) in this case does not imply the truth of condition (2) (thankfully)
isAlways(in(v,'real'))
Seems like the correct way for the software to interpret (1) would be Re(v) > 0, in accordance with the de facto definition of gt, which would provide no information for evaluating (2). assumption, isalways MATLAB Answers — New Questions
Strange behavior of “isAlways” with respect to the number of symbolic variables
Dear all,
I am trying some tests with "isAlways" in order to check that a rather long algebraic expression is always positive, assuming that all its variables are positive.
One of the tests is the following
syms a b c d e f g real
assume(a >=0 & b>=0 & c>=0 & d>=0 & e>=0 & f>=0 & g>=0)
assumeAlso(a + b + c + d + e + f + g <=1)
isAlways(a+b<=1)
ans =
logical
1
which correctly concludes that if a, b, c, d, e, f, g are nonnegative real numbers whose sum is less than or equal to 1, then the sum of the first two is also less than or equal to 1.
But when I try the same test with one additional variable, then
syms a b c d e f g m real
assume(a >=0 & b>=0 & c>=0 & d>=0 & e>=0 & f>=0 & g>=0 & m>=0)
assumeAlso(a + b + c + d + e + f + g +m <=1)
isAlways(a+b<=1)
Warning: Unable to prove ‘a + b <= 1’.
> In mupadengine/evalin2logical
In mupadengine/feval2logical
In sym/isAlways (line 39)
ans =
logical
0
How is this possible? isalways, positivity, assume, assumealso MATLAB Answers — New Questions
How to Control Access to Entra Multi-Tenant Apps
Use App Properties to Restrict Sign-In Audiences for Applications
Entra applications come in two forms: single- and multi-tenant. Most applications created within a tenant, such as those created to allow the use of Graph application permissions, are single-tenant, meaning that Entra allows the application only to run in its home tenant. To facilitate their use by the widest possible audience, applications created by Microsoft and ISVs are usually multi-tenant. In this case, the application (aka a registered app) is controlled by the home tenant, and a service principal represents the application in host tenants.
Lack of knowledge about what application settings do is one of the reasons why it’s a terrible idea to let users create applications. It’s all too easy for someone to create an application that could end up being the entry point for hackers into a tenant. The number of attacks on Microsoft 365 tenants through apps means that keeping an eye on the apps in a tenant and the permissions assigned to those apps has become a critical management task.
Restricting Access to Multi-Tenant Applications
Until recently, multi-tenant applications were open to use by any tenant. If you know the application identifier, you can create a service principal in your tenant, assign whatever permissions are necessary to the service principal, and run the application. Sometimes a multi-tenant application should be restricted to a set of host tenants, and that’s what a new feature called sign-in audience restrictions does by allowing application owners to define the set of tenants permitted to run the application. The feature is in beta and is currently managed through Graph APIs rather than the Entra admin center.
An Example Application
Let’s take the example of the application I created to find large items in Exchange Online mailboxes. I purposely created the application to be multi-tenant (Figure 1).

If you have a single-tenant application that you want to be multi-tenant, you can amend the setting in the Authentication section of the application properties or in PowerShell. Running the Update-MgApplication cmdlet like this amends an application’s sign-in audience to be multi-organization:
Update-MgApplication -ApplicationId '2493c2e6-e0ac-4c5b-9d46-a191914fa54b' -SignInAudience "AzureADMultipleOrgs"
Updating Applications to Restrict Sign-In Audiences
You need to know several pieces of information before you can restrict the sign-in audience for an application:
- The application identifier (to create a service principal in the host tenants).
- The application object identifier (to update the application settings).
- The tenant identifier for each tenant permitted to run the application.
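If you don’t have these values to hand, they can be retrieved with the Microsoft Graph PowerShell SDK. A minimal sketch (the display name matches the example app used in this article; substitute your own, and note that the app identifier and the object identifier are different properties of the same registration):

```powershell
# Look up the app registration by display name
$App = Get-MgApplication -Filter "displayName eq 'FindLargeMailboxItems'"
$AppId    = $App.AppId   # application (client) identifier - used to create service principals in host tenants
$ObjectId = $App.Id      # application object identifier - used to update the application settings

# Tenant identifiers come from each permitted tenant, for example by running
# (Get-MgOrganization).Id while connected to that tenant, or from the tenant
# overview page in the Entra admin center.
```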
First, update the application settings to create the list of permitted tenants. In this example, I define an array of permitted tenant identifiers and then create a hash table to hold the request body required to update the settings with the tenant list. When everything is ready, run the Update-MgBetaApplication cmdlet to update the application:
# Array of Entra ID tenants allowed to access the app
[array]$AllowedTenantIds = "22e90715-3da6-4a78-9ec6-b3282389492b", "72f988bf-86f1-41af-91ab-2d7cd011db47","91c369b5-1c9e-439c-989c-1867ec606603","fe3af2c1-e762-4b62-ad3b-e3f1ed5dd0a0", "68583590-5a71-4642-a292-9ce7980bcdd3"
$SignInAudienceRestrictions = @{}
$SignInAudienceRestrictions.Add("@odata.type","#microsoft.graph.allowedTenantsAudience")
$SignInAudienceRestrictions.Add("isHomeTenantAllowed", "true")
$SignInAudienceRestrictions.Add("allowedTenantIds", $AllowedTenantIds)
$Body = @{}
$Body.Add("signInAudience", "AzureADMultipleOrgs")
$Body.Add("signInAudienceRestrictions", $SignInAudienceRestrictions)
# Use ObjectId not clientId to update the app
$ObjectId = '8868296e-81d1-4ca4-b63a-635c8d42998d'
Update-MgBetaApplication -ApplicationId $ObjectId -BodyParameter $Body
I can’t find a documented limit for the number of tenant identifiers supported by the allowedTenantAudience resource type. Given the usual Graph limitations, I suspect that the value is relatively low. As shown above, we know that at least five tenants work! For large-scale ISV applications, protection against unauthorized access is likely delivered by other means.
To check that the SignInAudienceRestrictions property holds the correct value, retrieve it using the Invoke-MgGraphRequest cmdlet.
$Uri = ("https://graph.microsoft.com/beta/applications/{0}?`$Select=SignInAudienceRestrictions,SignInAudience,displayName" -f $ObjectId)
$Data = Invoke-MgGraphRequest -Uri $Uri -Method Get
$Data.SignInAudienceRestrictions
Name Value
---- -----
allowedTenantIds {22e90715-3da6-4a78-9ec6-b3282389492b, 72f988bf-86f1-41af-91ab-2d7cd011db47, 91c369b5-1c9e-439c-989c-1867ec606603, fe3af2c1-e762-4b62-ad3b-e3f1ed…
@odata.type #microsoft.graph.allowedTenantsAudience
kind allowedTenants
isHomeTenantAllowed True
Creating a Service Principal in a Host Tenant
Next, go to each of the host tenants permitted to run the application and create a service principal using the application identifier:
New-MgServicePrincipal -AppId 'd224fa41-4c2a-4380-b688-31625f90207a'

DisplayName           Id                                   AppId                                SignInAudience
-----------           --                                   -----                                --------------
FindLargeMailboxItems 2f5aed8b-e121-4d94-9876-d41c300db3da d224fa41-4c2a-4380-b688-31625f90207a AzureADMultipleOrgs
If you make a mistake, remove the service principal as follows and rerun the command to create the service principal:
Remove-MgServicePrincipal -ServicePrincipalId 2f5aed8b-e121-4d94-9876-d41c300db3da
Resetting Sign-In Audience Restriction
To reset the allowed tenants to just the home tenant, populate the array of permitted tenants with the identifier for the home tenant and create the hash table for the request body, and rerun the Update-MgBetaApplication cmdlet shown above. You cannot use an empty array as the update fails if at least one tenant identifier is not present.
$TenantId = (Get-MgOrganization).Id
[array]$AllowedTenantIds = $TenantId
Attempts to create or use the service principal for the app in a host tenant fail after the update because the tenant identifier is not in the allowed tenants list. In effect, the application now behaves like a single-tenant application because the home tenant is the only entry in the permitted list.
New-MgServicePrincipal -AppId $AppId

New-MgServicePrincipal_CreateExpanded: Application d224fa41-4c2a-4380-b688-31625f90207a settings does not allow ServicePrincipal creation in 22e90715-3da6-4a78-9ec6-b3282389492b tenant due to SignInAudienceRestrictions configuration.
To revert to unrestricted access, change the @odata.type setting in the request body to “#microsoft.graph.unrestrictedAudience” and update the application settings as shown below:
$SignInAudienceRestrictions = @{}
$SignInAudienceRestrictions.Add("@odata.type","#microsoft.graph.unrestrictedAudience")
$Body = @{}
$Body.Add("signInAudience", "AzureADMultipleOrgs")
$Body.Add("signInAudienceRestrictions", $SignInAudienceRestrictions)
Update-MgBetaApplication -ApplicationId $ObjectId -BodyParameter $Body
Any of the operations to update application properties require the Application.ReadWrite.All permission.
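For an interactive session, that permission can be requested as a delegated scope when connecting. A sketch, assuming the Microsoft Graph PowerShell SDK is installed (the beta cmdlets used above additionally require the Microsoft.Graph.Beta modules):

```powershell
# Connect with the scope needed to update application properties
Connect-MgGraph -Scopes "Application.ReadWrite.All"
```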
Sign-in Audience Restriction for Applications is a Good Thing
Given the current threat landscape, any additional control over Entra applications is a good thing. The application property lock feature is another example. Being able to restrict the sign-in audience for an application to specific tenants makes a heap of sense.
Need help to write and manage PowerShell scripts for Microsoft 365, including Azure Automation runbooks? Get a copy of the Automating Microsoft 365 with PowerShell eBook, available standalone or as part of the Office 365 for IT Pros eBook bundle.
What are redundant servers, how do I generate a license file for one, and how do I set it up?
I would like to know what redundant servers are, and how to generate and install a redundant server license file for MATLAB and associated products. MATLAB Answers — New Questions
How to use RandStreams appropriately with Parallel Computing?
I am currently working to update an existing set of code for reproducibility.
Currently, the code is structured as follows:
nlabs = 6;
seed = 1; % User-choice
[globalstream, labstreams{1:nlabs}] = RandStream.create('mrg32k3a','NumStreams',nlabs+1,'Seed',seed);
RandStream.setGlobalStream( globalstream );
parallelpool=parpool(nlabs);
spmd
RandStream.setGlobalStream( labstreams{spmdIndex} );
end
parfor i=1:nlabs
Calculations here
end
However, I need the code to be fully reproducible. I understand that to achieve reproducibility with parallel computing I need to use substreams ( https://www.mathworks.com/help/stats/reproducibility-in-parallel-statistical-computations.html ). However I am not confident of how to distinguish the global stream and worker stream.
I’ve seen an example in which the user used only a single global stream by storing and retreiving the stream state before and after the parfor loop ( https://www.mathworks.com/matlabcentral/answers/1670009-reproducible-and-independent-random-stream-generation-in-parfor-loop ) but it seems like it would be simpler to setup two independent streams.
I’ve outlined a two-stream setup below. Does this seem reasonable? I want globalstream and each substream of labstream to be independent.
nlabs = 6;
seed = 1; % User-choice
[globalstream, labstream] = RandStream.create(‘mrg32k3a’,’NumStreams’,2,’Seed’,seed);
RandStream.setGlobalStream( globalstream );
<Some Calculations>
parallelpool=parpool(nlabs);
parallel.pool.Constant(RandStream.setGlobalStream(labstream)) % Not sure of the syntax here
parfor i=1:nlabs
set(labstream,’Substream’,i)
<Some Calculations>
end
RandStream.setGlobalStream( globalstream );
<Some Calculations>
randstream, parfor, substream MATLAB Answers — New Questions
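For reference, the pattern in the MathWorks reproducibility article linked above avoids managing a separate worker stream object altogether: each parfor iteration constructs the same base stream and selects its own substream, so the numbers drawn in iteration i are fixed regardless of which worker (or how many workers) executes it. A minimal, self-contained sketch of that pattern (variable names are illustrative; requires Parallel Computing Toolbox, though parfor falls back to serial execution without it):

```matlab
% Substream-per-iteration pattern: every iteration builds the same
% 'mrg32k3a' base stream, then selects substream i, so results are
% independent of worker scheduling and pool size.
nlabs = 6;
seed = 1;
r = zeros(1, nlabs);
parfor i = 1:nlabs
    s = RandStream('mrg32k3a', 'Seed', seed);  % same base stream everywhere
    s.Substream = i;                           % independent substream per iteration
    RandStream.setGlobalStream(s);             % affects only the worker process
    r(i) = rand;                               % reproducible across runs
end
```

A side effect worth noting: because setGlobalStream runs inside the parfor body, it changes only each worker's global stream, leaving the client's global stream untouched.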
addpath and genpath duplicating the first part of chosen path
I’m trying to set up file access for a script, but MATLAB’s addpath and genpath functions keep duplicating the first part of the file path. This produces the warning below, and I can’t for the life of me understand why. Has anyone else experienced a similar problem?
fullfile(pth{2}, fpth{1})
ans =
"Users/David/Library/CloudStorage/GoogleDrive-dblair@gsu.edu/My Drive/Calhoun/Functions"
addpath(fullfile(pth{2}, fpth{1}))
Warning: Name is nonexistent or not a directory: /Users/David/Library/CloudStorage/GoogleDrive-dblair@gsu.edu/My Drive/Calhoun/Users/David/Library/CloudStorage/GoogleDrive-dblair@gsu.edu/My Drive/Calhoun/Functions
> In path (line 109)
In addpath (line 96)
file path MATLAB Answers — New Questions
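One detail that stands out in the output above (a guess, since pth and fpth are not shown): the string returned by fullfile begins with "Users/…" rather than "/Users/…". Without the leading slash, addpath treats it as a path relative to the current folder and resolves it against pwd, which is exactly the duplicated prefix visible in the warning. A small self-contained illustration of the check and fix:

```matlab
% A path without a leading '/' is relative on macOS/Linux, so addpath
% resolves it against the current folder, producing a doubled prefix.
p = "Users/David/Library/CloudStorage/GoogleDrive-dblair@gsu.edu/My Drive/Calhoun/Functions";
if ~startsWith(p, "/")   % absolute macOS/Linux paths start with '/'
    p = "/" + p;         % illustrative fix: anchor the path at the filesystem root
end
addpath(p)               % now warns only if the folder genuinely does not exist
```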
Issue with Blockproc when using PadPartialBlocks
I have an image that I want to break up into regions and calculate the standard deviation of (using std2). I only want the result to contain calculations for COMPLETE blocks. Walter has kindly suggested using "PadMethod" as NaN; the resulting data would then have NaNs in the partial columns/rows, which I can process out.
However, using PadPartialBlocks doesn’t seem to be working as it should:
bss = [500,500];
fh = @(bs) std2(bs.data);
J = blockproc(IM2, bss, fh, 'UseParallel', true, 'PadPartialBlocks', true, 'PadMethod', NaN);
The last column (column 21), which is the "partial" column, doesn’t have the values I was expecting. Surely they should all be NaN? It’s as though the NaN padding isn’t actually applied in the last partial column. What am I doing wrong?
blockproc, padpartialblocks, nan MATLAB Answers — New Questions
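As a small self-contained way to probe the behavior described above (illustrative only; requires the Image Processing Toolbox), a tiny matrix makes the partial blocks easy to inspect. With a 3-by-3 block size on a 5-by-5 input, the last block row and column are partial; with the padding options set they should be filled with NaN, and std2 over any block containing NaN should itself be NaN:

```matlab
% 5x5 input, 3x3 blocks: blocks touching the right/bottom edge are partial.
% With PadPartialBlocks=true and PadMethod=NaN, those blocks contain NaN,
% so std2 over them should return NaN in the corresponding output cells.
IM2 = double(magic(5));
fh  = @(bs) std2(bs.data);
J   = blockproc(IM2, [3 3], fh, ...
                'PadPartialBlocks', true, 'PadMethod', NaN)
```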









