Month: July 2024
Azure Policy Support is Generally Available for PostgreSQL Flexible Server
What is Azure Policy?
Azure Policy is a service within Microsoft Azure that allows organizations to create, assign, and manage policies. These policies define rules and effects over resources, identities, and groups, in an effort to ensure compliance and uphold security. Enforcement comes in two forms – flagging noncompliance so your team can remediate the concern or simply blocking deployment.
Core Concepts of Azure Policy
At the heart of Azure Policy are two core components: policies and initiatives. Policies in Azure are the specific rules or guidelines, while initiatives are collections of policies that help achieve a broader compliance goal. Let’s break down the components of policies below.
A policy definition expresses what to evaluate and what action to take. Each policy definition in Azure Policy has a set of conditions under which it’s enforced and an accompanying effect that takes place if the conditions are met.
A policy effect is what happens when the conditions are met. Common effects include Deny, Audit, Append, Disabled, and DeployIfNotExists.
Policy parameters are used to provide flexibility and reduce policy definition redundancy. They allow you to reuse the policy definition for different scenarios. Think of them as fields on a form to fill out – name, city, birthdate, address, etc. They remain, but how you fill them out can change.
Policy assignments are the application of a policy or initiative to a specific scope (subscription, management group, etc.).
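Putting these pieces together, a policy definition is expressed as JSON. The sketch below is a minimal, illustrative example (the display name and parameter are hypothetical, though the shape follows the Azure Policy definition schema): it denies any resource deployed outside a parameterized list of allowed locations.

```json
{
  "properties": {
    "displayName": "Allowed locations (example)",
    "policyType": "Custom",
    "parameters": {
      "allowedLocations": {
        "type": "Array",
        "metadata": { "description": "Locations that resources may be deployed to." }
      }
    },
    "policyRule": {
      "if": {
        "not": { "field": "location", "in": "[parameters('allowedLocations')]" }
      },
      "then": { "effect": "deny" }
    }
  }
}
```

Assigning this definition at, say, a subscription scope with allowedLocations set to ["eastus", "westus"] would block deployments to any other region.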
Pic 1. Structure of Azure Policy (credit Sonrai Security)
Advantages of Azure Policy
Main benefits of using Azure Policy include consistent governance across all resources, streamlined management of policy enforcement, improved security and compliance, and increased visibility and control over cloud resources.
Azure Policy vs. Azure Role Based Access Control (RBAC)
Azure Policy and Azure Role-Based Access Control (RBAC) differ significantly. While Azure Policy focuses on resource properties, RBAC concentrates on user actions. Azure Policy enforces properties at the time of resource creation or update, whereas RBAC controls what users can do with existing resources.
Announcing General Availability of Pre-defined Azure Policy Definitions for PostgreSQL Flexible Server
Built-in policies are developed and tested by Microsoft, ensuring they meet common standards and best practices. They can be deployed quickly without additional configuration, making them ideal for standard compliance requirements. We are happy to announce the general availability of built-in policy support. This document has a full list of supported pre-built policy definitions.
Custom policy definitions
A custom policy definition allows customers to define their own rules for using Azure. These rules often enforce:
Security practices
Cost management
Organization-specific rules (like naming or locations)
An example of creating a custom policy definition can be found in this document.
Resources
For more information on Azure Policy and its support with Azure PostgreSQL Flexible Server:
Security – Azure Database for PostgreSQL – Flexible Server | Microsoft Learn
List of built-in policy definitions – Azure Policy | Microsoft Learn
Azure Policy Regulatory Compliance controls for Azure Database for PostgreSQL | Microsoft Learn
Azure/azure-policy: Repository for Azure Resource Policy built-in definitions and samples (github.com)
To learn more about our Flexible Server managed service, see the Azure Database for PostgreSQL service page. We’re always eager to hear customer feedback, so please reach out to us at Ask Azure DB for PostgreSQL.
Creating simulink neural network from my own weights and bias
Hello everyone,
I have created my own neural network in a MATLAB script (weights, biases, input, hidden, and output layers) and trained it until I got good results, but I don’t know how to assemble this scripted network into a Simulink block so I can use it in my simulation.
NOTE: I know about the function gensim, but it seems to be intended for MATLAB ANNs created with nntool, not for a manually created network.
gensim / neural network / simulink MATLAB Answers — New Questions
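One hedged route (a sketch, not a verified answer) is to pour the manually trained weights into a Deep Learning Toolbox network object, which gensim does accept, rather than using the script directly. The variable names W1, b1, W2, b2 and the single hidden layer are assumptions about the network described above:

```matlab
% Sketch: wrap manually trained weights in a shallow network object,
% then generate a Simulink block with gensim.
% Assumes one hidden layer: W1 (hidden x inputs), b1, W2 (outputs x hidden), b2.
net = feedforwardnet(size(W1, 1));   % hidden layer size taken from W1
net = configure(net, X, Y);          % set input/output dimensions from the data
net.IW{1,1} = W1;                    % input-to-hidden weights
net.b{1}    = b1;                    % hidden-layer biases
net.LW{2,1} = W2;                    % hidden-to-output weights
net.b{2}    = b2;                    % output biases
% Make sure the transfer functions match the ones used in the script, e.g.
% net.layers{1}.transferFcn = 'tansig'; net.layers{2}.transferFcn = 'purelin';
gensim(net);                         % generate the Simulink block, no retraining needed
```

One caveat: configure also sets up input/output normalization (mapminmax by default), so if the script’s weights assume raw inputs, the processing functions may need to be cleared first (net.inputs{1}.processFcns = {};).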
How do I fit multiple curves at once, sharing some fitting parameters and floating others?
Hi,
I am new to using MATLAB and SimBiology. I am trying to fit 6 binding curves where the ligand is the same throughout the experiment and only the analyte concentrations differ. I want to globally fit all 6 curves to one model, sharing the rate parameters but not the concentration parameters. Is there a way to do this in SimBiology? When setting up the Data Map, I have my independent variable as Time, since binding is measured over time, and I have 6 responses, each corresponding to the binding response generated for one analyte concentration. I get the error "The same model component appears in the left-hand side of multiple elements of the responseMap input argument. The responseMap cannot contain any duplicates." Any help would be great! Attached is the data I am trying to fit.
curve fitting, simbiology, binding, multiple curves
New initial starting point (input and output) of already trained LSTM Network
I have input data X and output data Y.
I am training a LSTM network using:
net = trainNetwork(X(1:500), Y(1:500), layers, options);
This trains and initializes the network.
However, is there a way to initialize the network with, for example, X(1:600) and Y(1:600), not by retraining but by using the previously trained network and starting any new predictions from that point on (601 and up)?
lstm initiate deep learning
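If the goal is only to move the starting point forward rather than retrain, one hedged sketch (assuming a sequence network trained with trainNetwork, and keeping the X indexing notation from the question) is to reset the network state and replay the known samples with predictAndUpdateState, so its internal state reflects samples 1..600 before predicting sample 601:

```matlab
% Sketch: advance an already-trained LSTM's hidden state without retraining.
net = resetState(net);                               % start from a clean state
[net, ~]     = predictAndUpdateState(net, X(1:600)); % replay history; state now covers 1..600
[net, yNext] = predictAndUpdateState(net, X(601));   % first prediction in the new region
```

Each call to predictAndUpdateState returns the updated network, so the loop can continue sample by sample from 601 upward.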
Connecting PLC to Azure IoT Hub
We have a client with an Automation Direct Productivity 3000 PLC. It looks like it has an MQTT client configuration that may allow us to connect directly to Azure IoT Hub, but we are struggling. We’re waiting for the client to get a vendor resource to assist. Meanwhile, we’re trying to figure out whether that will work directly, or whether we need something like IoT Edge to act as a protocol translator. We’re unsure if the device is able to use TLS 1.2 or not. The config screen looks like this (please ignore the current settings):
Do we need to use something like IoT Edge to convert the MQTT to use TLS? Any other advice for getting this up and running?
Thank you,
-Peter
Created Column Format
Hey everyone, I’m building a document library in SharePoint and having some issues with the Created column. I’m trying to get the column to show the year, but the only thing I can get it to show is a date like this (January 29th). I would be okay with January 29th, 2024 or MM,DD,YYYY. I just can’t get the year to show up at all.
Any help would be greatly appreciated!
Thanks!
INTRO TO MICROSOFT COPILOT FOR SECURITY
All you need to know to deploy your own Copilot for Security Instance
Copilot for Security is a generative AI security product that empowers security and IT professionals to respond to cyber threats, process signals, and assess risk exposure at the speed and scale of AI.
Minimum requirements
Subscription
In order to purchase security compute units, you need to have an Azure subscription. For more information, see Create your Azure free account.
Security compute units
Security compute units (SCUs) are the units of resource capacity required for dependable, consistent performance of Microsoft Copilot for Security.
Copilot for Security is sold in a provisioned capacity model and is billed by the hour. You can provision Security Compute Units (SCUs) and increase or decrease them at any time. Billing is calculated on an hourly basis with a minimum of one hour.
For more information, see Microsoft Copilot for Security pricing.
Capacity
Capacity, in the context of Copilot for Security, is an Azure resource that contains SCUs. SCUs are provisioned for Copilot for Security. You can easily manage capacity by increasing or decreasing provisioned SCUs within the Azure portal or the Copilot for Security portal. Copilot for Security provides a usage monitoring dashboard for Copilot owners, allowing them to track usage over time and make informed decisions about capacity provisioning. For more information, see Managing usage.
Provisioning
There are two options for provisioning Security Compute Units: directly from the Copilot for Security portal, or from your Azure subscription. The portal is the recommended option: it walks you through a wizard-style activation and lets you later manage access and view your provisioned SCUs. The alternative is to head over to Azure, search for Copilot for Security, and create the resource, which in fact represents the billable SCUs for the directory the subscription is associated with.
Configure
Once we complete the wizard and press Finish, we are ready to start working with Copilot for Security. Note the information on the Home screen of the portal, with links to Training Prompts and Documentation.
Authentication & Roles
It is important to have a good understanding of the roles and permissions that apply to Copilot for Security.
Copilot for Security roles
Copilot for Security introduces two roles that function like access groups but aren’t Microsoft Entra ID roles. Instead, they only control access to the capabilities of the Copilot for Security platform.
Copilot owner
Copilot contributor
By default, all users in the Microsoft Entra tenant are given Copilot contributor access.
Microsoft Entra roles
The following Microsoft Entra roles automatically inherit Copilot owner access.
Security Administrator
Global Administrator
Have a look at the relevant documentation page explaining everything about Roles & permissions:
Understand authentication in Microsoft Copilot for Security | Microsoft Learn.
Copilot in action
Once we have a good understanding and have built our team, we can start working with Copilot for Security within the Defender dashboards at https://security.microsoft.com.
Most dashboards offer an interactive experience that helps us understand different signals, take potential actions, and get explanatory suggestions from Copilot.
“Copilot for Threat Analytics is designed to assist users in understanding and responding to security threats. It provides evidence-based, objective, and actionable insights derived from security data. The purpose is to help users make informed decisions about their security posture and response strategies. It does this by analyzing data from various sources, identifying potential threats, and providing detailed information about those threats. This includes information about the nature of the threat, its potential impact, and possible mitigation strategies. The goal is to provide users with the information they need to effectively manage and respond to security threats.” (generated from Copilot for Security)
Especially in Advanced Hunting, Copilot offers a preset of KQL Queries that we can run directly or load them into our Editor for further editing.
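For context, a typical Advanced Hunting query of the kind Copilot can suggest looks like the following; the table and columns are from the standard Microsoft Defender schema, though the exact queries Copilot offers will differ:

```kusto
// Top accounts by failed logons on monitored devices over the last 7 days
DeviceLogonEvents
| where Timestamp > ago(7d)
| where ActionType == "LogonFailed"
| summarize FailedLogons = count() by AccountName
| top 10 by FailedLogons
```

Queries like this can be run directly from the Copilot pane or loaded into the editor for further refinement.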
Another powerful capability lies within the incidents we get from Defender for Endpoint. Just click on an incident and Copilot will provide investigation details, along with recommendations if available.
Intune – Copilot for Endpoints
Yes, you read that correctly! Once your Copilot platform is ready, you are in for a nice surprise: in Endpoint Management (Intune) you will find Copilot ready to assist with your endpoint management tasks! The integration is in preview, and I believe it is going to be a great addition for endpoint administrators.
Here is an example where we are getting a summary of our Windows policy in Intune:
The main points of the functionality are:
Input Processing: When you ask Copilot a question in Intune, it sends the query to Copilot for Security.
Data Sources: Copilot for Security uses data from your tenant and authoritative Microsoft documentation sources.
Response Generation: It processes the input and generates a response, which is then displayed in Intune.
Session Tracking: You can review all interactions in Copilot for Security by checking your sessions.
Privacy and Verification: Always double-check Copilot’s responses, as it may not always be accurate.
Partial Information: In some cases, Copilot might provide partial information due to large data volumes.
It is quite important to pay attention to the Responsible use of AI. A Frequently Asked Questions page is available for everyone as well.
Copilot for Security is a natural language, AI-powered security analysis tool that assists security professionals in responding to threats quickly, processing signals at machine speed, and assessing risk exposure in minutes. It draws context from plugins and data to answer security-related prompts so that security professionals can help keep their organizations secure. Users can collect the responses that they find useful from Copilot for Security and pin them to the pinboard for future reference. (source: Microsoft Responsible use of AI FAQ)
But that’s not all. Apart from the Defender portal, Copilot for Security is also put to good use within the Copilot for Security platform itself. We can find a wide range of prompts, and we can use plugins or even build our own. We can also upload files that provide guidance to Copilot; examples include your organization’s policy and compliance documents, investigation and response procedures, and templates. Integrating this wealth of knowledge allows Copilot to reason over the knowledge base or documents and generate responses that are more relevant, specific, and customized to your operational needs (source: Microsoft documentation).
The current library of plugins is quite extensive, but a key capability is that we can create our own.
You can create new plugins to extend what Copilot can do by following the steps in Create new plugins. To add and manage your custom plugins to Copilot for Security, follow the steps in Manage custom plugins. (source Microsoft Documentation).
Final thoughts
Microsoft has significantly impacted the cybersecurity landscape with Copilot for Security. This powerful tool provides an instant upgrade for organizations, enabling IT and security teams to work more efficiently, prioritize findings, and take action without exhausting investigation efforts. Copilot for Security serves as a valuable AI expert assistant, guiding the security landscape in the right direction. It also acts as an upskilling platform, presenting a positive challenge for all involved to embrace the AI era through the lens of cybersecurity excellence. My personal testimony from the experience so far is that the product team did an excellent job building an AI security platform that makes a difference and combines the best of the technology at hand with our needs for secure environments, while keeping the usual “learn while doing” pattern. Don’t forget to download the Security Copilot diagram for a high-level overview of the architecture.
When sending email in Outlook using an email template, the PDF attachments are corrupted
When sending email in Outlook using an email template, the PDF attachments are corrupted and cannot be opened by the recipient. Safe mode doesn’t help, and the SaRA logs don’t show any errors.
How to get alerted on pending items in the Action Center
Good morning all!
Part of my daily duties is to ensure that items in the Action Center are acted upon in a timely manner. I have been trying to find ways to be alerted on new items, but there is nothing in the Microsoft documentation, or anything obvious. I have scoured the internet, where I stumbled upon an old post about having to use a PowerShell script, but surely there is some sort of notification Microsoft can send out for these items?! Since these items are time-sensitive, I am having to check constantly for any new soft/hard-delete emails.
Redirection when converting from Provider Hosted SharePoint App (ACS) to Azure App Registration
Hi Everyone,
I have a SharePoint Provider Hosted App that, when you navigate to ‘Site Contents’ and then click on the App Name, redirects (links) to a web page on our Provider Hosted IIS site.
How do I replicate this functionality when converting to Azure App Registration from SharePoint App Registration/Azure Access Control Service (ACS)?
Thank you in advance!
New Blog | June 2024 update on Azure AD Graph API retirement
One year ago, we shared an update on the completion of a three-year notice period for the deprecation of the Azure AD Graph API service. This service is now in the retirement cycle and retirement (shut down) will occur in incremental stages. In the first stage of this retirement cycle, newly created applications will receive an error (HTTP 403) for any requests to Azure AD Graph APIs. We’re revising the date for this first stage from June 30 to August 31, and only applications created after August 31, 2024 will be impacted. After January 31, 2025, all applications – both new and existing – will receive an error when making requests to Azure AD Graph APIs, unless they’re configured to allow extended Azure AD Graph access.
We understand that some apps may not have fully completed migration to Microsoft Graph. We’re providing an optional configuration through the authenticationBehaviors property, which will allow an application to use Azure AD Graph APIs through June 30, 2025. Azure AD Graph will be fully retired after June 30, 2025, and no API requests will function at this point, regardless of the application’s configuration.
If you develop or distribute software that still uses Azure AD Graph APIs, you must act now to avoid interruption. You’ll either need to migrate your applications to Microsoft Graph (highly recommended) or configure the application for an extension, as described below, and ensure that your customers are prepared for the change. If you’re using applications supplied by a vendor that use Azure AD Graph APIs, work with the software vendor to update to a version that has migrated to Microsoft Graph APIs.
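As a hedged illustration of the extension route, the authenticationBehaviors property can be set on the application object through Microsoft Graph (in the beta endpoint at the time of writing). The application object ID below is a placeholder, and the exact property name should be verified against the current Microsoft Graph documentation:

```http
PATCH https://graph.microsoft.com/beta/applications/{application-object-id}/authenticationBehaviors
Content-Type: application/json

{
    "blockAzureADGraphAccess": false
}
```

Setting blockAzureADGraphAccess to false opts the application into the extended Azure AD Graph access window described above.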
How do I find Applications in my tenant using Azure AD Graph APIs?
The Microsoft Entra recommendations feature provides recommendations to ensure your tenant is in a secure and healthy state, while also helping you maximize the value of the features available in Entra ID.
We’ve provided two Entra recommendations that show information about applications and service principals that are actively using Azure AD Graph APIs in your tenant. These new recommendations can support your efforts to identify and migrate the impacted applications and service principals to Microsoft Graph.
Read the full post here: June 2024 update on Azure AD Graph API retirement
By Kristopher Bash
Figure 1: Microsoft Entra Recommendations for Azure AD Graph migration
Read More
Are there tips or habits for Outlook performance
Sometimes when I have Outlook empty a folder, it says “22 minutes …” and locks me up until it finishes. In a database, one would get such bad performance by joining large tables without benefit of proper indexes. Is there something not happening that could let simple message delete happen at “click speed” instead of this?
Like so many people nowadays, I get hundreds and hundreds of emails, mostly from people I don’t know. Generally, I make rules to intercept new mail and send it to an appropriate folder, and then at end of day scan the hundreds of emails (or worse!) still left in the inbox, in case something new-ish has appeared, before deleting them all.
Deleting the undesirables from the inbox – both focused and other – often takes two to four minutes. After a week away, the wait may be several times that! Normally, ugly data delays are from missing or out-of-date index structures. Is there something I could do differently and thus “work” the system better?
Read More
sorting binary data
I have some pump runtimes that I want to condense into readable, presentable data. I have a snippet of a sample table that shows the data I’m working with on a larger scale. I have some previous programming experience but cannot connect the dots right now to get where I want to be. Any help at all would be much appreciated. 0 signifies not running; 1 signifies running. The larger tables are minute by minute; like I said, this is a snippet. Ideally my end goal is an output that says both pumps off for x minutes, then pump 1 or 2 on for y minutes, then both off again for x, then on for y, repeated until the entire table has been gone through. The reports that have the raw data are generated in Excel, and I’d like to be able to apply some formulas to them to get my desired results. Any help is appreciated. Thank you.
Bennett’s LS PUMP #1 (Value) | Bennett’s LS PUMP #2 (Value)
0 | 0
0 | 0
0 | 0
0 | 0
1 | 0
1 | 0
0 | 0
0 | 0
0 | 0
0 | 0
0 | 0
0 | 1
0 | 1
0 | 1
0 | 0
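As a sketch of the logic (in Python rather than Excel formulas, and using hypothetical sample values patterned on the snippet above), the condensing step is a run-length encoding of the two pump columns:

```python
# Condense minute-by-minute pump on/off samples into run-length segments.
# Sample values are hypothetical, patterned on the snippet; 0 = not running, 1 = running.
pump1 = [0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
pump2 = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0]

def run_lengths(p1, p2):
    """Collapse per-minute states into (state, minutes) segments,
    where state is 'both off', 'pump 1 on', 'pump 2 on', or 'both on'."""
    labels = {(0, 0): "both off", (1, 0): "pump 1 on",
              (0, 1): "pump 2 on", (1, 1): "both on"}
    segments = []
    for pair in zip(p1, p2):
        state = labels[pair]
        if segments and segments[-1][0] == state:
            # Same state as the previous minute: extend the current run.
            segments[-1] = (state, segments[-1][1] + 1)
        else:
            # State changed: start a new run of length 1.
            segments.append((state, 1))
    return segments

for state, minutes in run_lengths(pump1, pump2):
    print(f"{state} for {minutes} min")
```

In Excel, one common way to express the same idea is a helper column that flags rows where the pump-state pair differs from the row above, then measuring the row distance between flags to get each run’s duration.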
Read More
What’s new in Microsoft Entra – June 2024
Have you explored the What’s New in Microsoft Entra hub in the Microsoft Entra admin center? It’s a centralized view of our roadmap and change announcements across the Microsoft Entra identity and network access portfolio so you can stay informed with the latest updates and actionable insights to strengthen your security posture.
Here in the Microsoft Entra blog, we share feature release information and change announcements every quarter. Today’s post covers April – June 2024. It’s organized by Microsoft Entra products, so you can quickly scan what’s relevant for your deployment.
Microsoft Entra ID
Microsoft Entra ID Governance
Microsoft Entra External ID
Microsoft Entra Permissions Management
Microsoft Entra Verified ID
New releases
Microsoft Entra ID Protection: remediate risks and unblock users
On-premises password reset remediates user risk
Multiple Passwordless Phone Sign-in for Android Devices
Windows Account extension is now Microsoft Single Sign On
Custom Claims Providers enable token claim augmentation from external data sources
Granular Certificate-Based Authentication Configuration in Conditional Access
New role: Organizational Branding Administrator
Microsoft Graph activity logs
Last successful sign-in date for users
Self-service password reset Admin policy expansion to include additional roles
Dynamic Groups quota increased to 15,000
Streamline your ADAL migration with updated sign-ins workbook
Change announcements
Security update to Entra ID affecting clients which are running old, unpatched builds of Windows
[Action may be required]
We’re making a security update to Entra ID such that older, unpatched versions of Windows that still use the less secure Key Derivation Function v1 (KDFv1) will no longer be supported. Once the update is rolled out, unsupported and unpatched Windows 10 and 11 clients will no longer be able to sign in to Entra ID. Globally, more than 99% of Windows clients signing in to Entra ID have the required security patches.
Action required:
If your Windows devices have security patches from after July 2021, no action is required.
If your Windows devices do not have security updates after July 2021, update Windows to the latest build of your currently supported Windows version to maintain access to Entra ID.
All currently supported versions of Windows have the required patch.
We recommend you keep Windows up to date with Security Updates.
Background:
A Security Update to Windows CVE-2021-33781 was issued in July 2021 to address a vulnerability where Primary Refresh Tokens were not stored sufficiently securely in the client. Once patched, Windows clients used the stronger KDFv2 algorithm. All versions of Windows released since that time have the update and handle the token securely.
A small percentage of Windows devices have not yet been updated and are still using the older v1 key derivation function. To improve security of the system, unpatched devices using the KDFv1 algorithm will no longer be able to sign in to Entra ID using Primary Refresh Tokens.
What is the user experience on unsupported Windows devices when this change is rolled out?
Users of Windows devices which haven’t been updated with patches since July 2021 may experience sign-in failures with their Entra ID user accounts on joined or hybrid-joined Windows devices.
How do I diagnose this situation?
The error code, which will show in sign-in logs, is ‘AADSTS5000611: Symmetric Key Derivation Function version ‘KDFV1’ is invalid. Update the device for the latest updates.’
Enhancing the security of Apple devices in the enterprise with hardware bound device identity – 2-year notice
[Action may be required]
Device identity is one of the fundamental Entra ID concepts that enables multiple Entra ID and MDM/MAM security features like device compliance policies, app protection policies, or PRT-based SSO. To enhance security, Entra ID now supports binding device identity keys to Apple’s Secure Enclave hardware, which will replace the previous Keychain-based mechanism.
Starting in June 2026, all new Entra ID registrations will be bound to the Secure Enclave. As a result, all customers will need to adopt the Microsoft Enterprise SSO plug-in and some of the apps may need to make code changes to adopt the new Secure Enclave based device identity.
Opt-in, provide feedback
Before Entra enables Secure Enclave by default for all new registrations, we encourage tenants to perform early testing using the documentation provided on learn.microsoft.com. This will help to identify any compatibility issues, where you may need to request code changes from app or MDM vendors.
To report issues, raise questions, or voice concerns please open a support ticket or reach out to your Microsoft account team.
Upgrade to the latest version of Microsoft Entra Connect by September 23, 2024
[Action may be required]
Since September 2023, we have been auto-upgrading Microsoft Entra Connect Sync and Microsoft Entra Connect Health to an updated build as part of a precautionary security-related service change. For customers who have previously opted out of auto-upgrade or for whom auto-upgrade failed, we strongly recommend that you upgrade to the latest versions by September 23, 2024.
When you upgrade to the latest versions by that date, you ensure that when the service changes take effect, you avoid disruption for the following capabilities:
Service / Recommended Version / Features Impacted by Service Change

Service: Microsoft Entra Connect Sync
Features impacted by service change: Auto-upgrade will stop working. Synchronization isn’t impacted.

Service: Microsoft Entra Connect Health agent for Sync
Features impacted by service change: A subset of alerts will be impacted:
· Connection to Microsoft Entra ID failed due to authentication failure
· High CPU usage detected
· High memory consumption detected
· Password Hash Synchronization has stopped working
· Export to Microsoft Entra ID was stopped; accidental delete threshold was reached
· Password Hash Synchronization heartbeat was skipped in the last 120 minutes
· Microsoft Entra Sync service cannot start due to invalid encryption keys
· Microsoft Entra Sync service not running: Windows service account credentials expired

Service: Microsoft Entra Connect Health agent for ADDS
Service: Microsoft Entra Connect Health agent for ADFS
Note: If you cannot upgrade by September 23, 2024, you can still regain full functionality for the above features after that date. You would do so by manually upgrading to the recommended builds at your earliest convenience.
For upgrade-related guidance, please refer to our docs.
Important Update: Azure AD Graph Retirement
[Action may be required]
As of June 2023, the Azure AD Graph API service is in a retirement cycle and will be retired (shut down) in incremental stages. In the first stage of this retirement cycle, newly created applications will receive an error (HTTP 403) for any requests to Azure AD Graph APIs (https://graph.windows.net). We are revising the date for this first stage from June 30 to August 31, so only applications created after August 31, 2024, will be impacted. The second stage of the Azure AD Graph service retirement cycle will begin after January 31, 2025. At this point, all applications that are using Azure AD Graph APIs will receive an error when making requests to the AAD Graph service. Azure AD Graph will be completely retired (and stop working) after June 30, 2025.
We understand that some apps may not have fully completed migration to Microsoft Graph. We are providing an optional configuration (through the authenticationBehaviors setting) that will allow an application to continue use of Azure AD Graph APIs through March 30, 2025. If you develop or distribute software that still uses Azure AD Graph APIs, you must act now to avoid interruption. You will either need to migrate your applications to Microsoft Graph (highly recommended) or configure the application for an extension, and ensure that your customers are prepared for the change.
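As an illustration, the extension is configured by setting the application’s authenticationBehaviors property through Microsoft Graph. The endpoint and property below reflect Microsoft’s published migration guidance at the time of writing; verify them against the linked references before use, since preview endpoints can change:

```http
PATCH https://graph.microsoft.com/beta/applications/{applicationObjectId}/authenticationBehaviors
Content-Type: application/json

{
    "blockAzureADGraphAccess": false
}
```

Here {applicationObjectId} is a placeholder for the application’s object ID; setting blockAzureADGraphAccess to false opts the application into extended Azure AD Graph access for the grace period.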
To identify applications that are using Azure AD Graph APIs, we have provided two Entra recommendations with information about applications and service principals that are actively using Azure AD Graph APIs in your tenant.
For more information, see the following references:
June 2024 update on Azure AD Graph API retirement
Migrate from Azure Active Directory (Azure AD) Graph to Microsoft Graph
Azure AD Graph app migration planning checklist
Azure AD Graph to Microsoft Graph migration FAQ
Important Update: AzureAD and MSOnline PowerShell retirement
[Action may be required]
As of March 30, 2024, the legacy Azure AD PowerShell, Azure AD PowerShell Preview, and MS Online modules are deprecated. These modules will continue to function through March 30, 2025, when they are retired and stop functioning. Microsoft Graph PowerShell SDK is the replacement for these modules and you should migrate your scripts to Microsoft Graph PowerShell SDK as soon as possible.
Note: as indicated in our April update, MS Online with “Legacy Auth” will stop functioning in the weeks after June 30, 2024. Legacy Auth is typically associated with versions before 1.1.166.0, and involves use of MS Online PowerShell with the Microsoft Online Sign-In Assistant package installed. If you are using MS Online versions before 1.1.166.0 or MS Online with Legacy Auth, you should immediately migrate to Microsoft Graph PowerShell SDK or update the MS Online version to the latest version (1.1.183.81).
To help you identify usage of Azure AD PowerShell in your tenant, you can use the Entra Recommendation titled Migrate Service Principals from the retiring Azure AD Graph APIs to Microsoft Graph. This recommendation will show vendor applications that are using Azure AD Graph APIs in your tenant, including AzureAD PowerShell.
We are making substantial new and future investments in the PowerShell experience for managing Entra, with the recent Public Preview launch of the Microsoft Entra PowerShell module. This new module builds upon and is part of the Microsoft Graph PowerShell SDK. It’s fully interoperable with all cmdlets in the Microsoft Graph PowerShell SDK, enabling you to perform complex operations with simple, well-documented commands. The module also offers a backward compatibility option to simplify migration from the deprecated AzureAD module. Additionally, we are aware that some of our customers were unable to fully migrate to scripts that managed per-user MFA from MSOnline to Microsoft Graph PowerShell. Microsoft Graph APIs were recently made available to read and configure per-user MFA settings for users, and availability in Microsoft Graph PowerShell SDK cmdlets is soon to follow.
Private Preview – QR code sign-in, a new authentication method for Frontline Workers
[Action may be required]
We are introducing a new simple way for Frontline Workers to authenticate in Microsoft Entra ID with a QR code and PIN, eliminating the need to enter long UPNs and alphanumeric passwords multiple times during their shift.
With the private preview release of this feature in August 2024, all users in your tenant will see a new link, ‘Sign in with QR code’, when navigating to https://login.microsoftonline.com > ‘Sign-in options’ > ‘Sign in to an organization’. This new link will be visible only on mobile devices (Android/iOS/iPadOS). If you are not participating in the private preview, users in your tenant will not be able to sign in through this method while we are still in private preview; they will receive an error message if they try.
The feature will have a ‘preview’ tag until it is generally available. Your organization needs to be enabled to test this feature. Broad testing will be available in public preview, which we will announce later.
While the feature is in private preview, no technical support will be provided. Please learn more about support during previews here Microsoft Entra ID preview program information – Microsoft Entra | Microsoft Learn.
Changes to phone call settings: custom greetings and caller ID
[Action may be required]
Starting September 2024, phone call settings (custom greetings and caller ID) under Entra’s multifactor authentication blade will be moved under the voice authentication method in the authentication method policy. Instead of accessing these settings through the Entra ID or Azure portal, they will be accessible through MS Graph API. If your organization is using custom greetings and/or caller ID, please make sure to check the public documentation once we release the new experience to learn how to manage these settings through MS Graph.
MS Graph API support for per-user MFA
[Action may be required]
Starting June 2024, we are releasing the capability to manage user status (Enforced, Enabled, Disabled) for per-user MFA through MS Graph API. This will replace the legacy MS Online PowerShell module that is being retired. Please be aware that the recommended approach to protect users with Microsoft Entra MFA is Conditional Access (for licensed organizations) and security defaults (for unlicensed organizations). The public documentation will be updated once we release the new experience.
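To illustrate, per-user MFA state is exposed through the authentication requirements resource in Microsoft Graph. The request below reflects the documented shape at the time of writing; treat the endpoint and property names as something to verify against the official documentation once the experience is released:

```http
PATCH https://graph.microsoft.com/beta/users/{userId}/authentication/requirements
Content-Type: application/json

{
    "perUserMfaState": "disabled"
}
```

Here {userId} is a placeholder for the user’s ID or UPN, and the valid states are the Enforced, Enabled, and Disabled values named in the announcement above.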
Azure Multi-Factor Authentication Server – 3-month notice
[Action may be required]
Beginning September 30, 2024, Azure Multi-Factor Authentication Server deployments will no longer service MFA requests, which could cause authentications to fail for your organization. MFA Server will have a limited SLA, and the MFA Activity Report in the Azure portal will no longer be available. To ensure uninterrupted authentication services and to remain in a supported state, organizations should migrate their users’ authentication data to the cloud-based Azure MFA service using the latest Migration Utility included in the most recent Azure MFA Server update. Learn more at Azure MFA Server Migration.
Decommissioning of Group Writeback V2 (Public Preview) in Entra Connect Sync – Reminder
[Action may be required]
The public preview of Group Writeback V2 (GWB) in Entra Connect Sync is no longer available and Connect Sync will no longer support provisioning cloud security groups to Active Directory.
Similar functionality, called “Group Provision to AD”, is offered in Entra Cloud Sync and may be used instead of GWB V2 for provisioning cloud security groups to AD. Enhanced functionality in Cloud Sync, along with other new features, is being developed.
Customers who use this preview feature in Connect Sync should switch their configuration from Connect Sync to Cloud Sync. Customers can choose to move all their hybrid sync to Cloud Sync (if it supports their needs), or run Cloud Sync side by side with Connect Sync and move only cloud security group provisioning to AD onto Cloud Sync. Customers who provision Microsoft 365 groups to AD can continue using GWB V1 for this capability.
Visual enhancements to the per-user MFA admin configuration experience
[No action is required]
As part of ongoing service improvements, we are making updates to the per-user MFA admin configuration experience to align with the look and feel of Entra ID. This change does not include any changes to the core functionality and will only include visual improvements. Starting in August 2024, you will be redirected to the new experience both from the Entra admin center and Azure portal. There will be a banner presented for the first 30 days to switch back to the old experience, after which you can only use the new experience. The public documentation will be updated once we release the new experience.
Updates to “Target resources” in Microsoft Entra Conditional Access
[No action is required]
Starting in September 2024, the Microsoft Entra Conditional Access ‘Target resources’ assignment will consolidate the “Cloud apps” and “Global Secure Access” options under a new name, “Resources”.
Customers will be able to target “All internet resources with Global Secure Access”, “All resources” (formerly “All cloud apps”), or select specific resources (formerly “Select apps”). Some of the Global Secure Access attributes in the Conditional Access API will be deprecated.
This change will start in September 2024 and will occur automatically; admins won’t need to take any action. There are no changes in the behavior of existing Conditional Access policies. To learn more, click here.
Upcoming Improvements to Entra ID device code flow
[No action is required]
As part of our ongoing commitment to security, we are announcing upcoming enhancements to the Entra ID device code flow. These improvements aim to provide a more secure and efficient authentication experience.
We’ve refined the messaging and included app details within the device code flow to ensure a more secure and precise user experience. Specifically, we’ve adjusted headers and calls to action to help your users recognize and respond to security threats more effectively. These changes are designed to help your users make more informed decisions and prevent phishing attacks.
These changes will be gradually introduced starting in July 2024 and are expected to be fully implemented by August 30, 2024. No action is required from you.
Microsoft Entra ID Governance
New releases
Microsoft Entra ID multi-tenant organization
Security group provisioning to Active Directory using cloud sync
Support for PIM approvals and activations on the Azure mobile app (iOS and Android)
Lifecycle Workflows: Export workflow history data to CSV files
B2B Sponsors as an Attribute and Approvers in Entitlement Management
Maximum workflows limit in Lifecycle workflows is now 100
New provisioning connectors in the Microsoft Entra Application Gallery
Microsoft Entra External ID
New releases
Microsoft Entra External ID
Configure redemption order for B2B collaboration
Microsoft Entra Permissions Management
New releases
Support for PIM enabled Groups in Microsoft Entra Permissions Management
Microsoft Entra Verified ID
New releases
Quick Microsoft Entra Verified ID setup
Add to Favorites: What’s New in Microsoft Entra
Stay informed about Entra product updates and actionable insights with What’s New in Microsoft Entra. This new hub in the Microsoft Entra admin center offers you a centralized view of our roadmap and change announcements across the Microsoft Entra identity and network access portfolio.
Learn more about Microsoft Entra
Prevent identity attacks, ensure least privilege access, unify access controls, and improve the experience for users with comprehensive identity and network access solutions across on-premises and clouds.
Microsoft Entra News and Insights | Microsoft Security Blog
Microsoft Entra blog | Tech Community
Microsoft Entra documentation | Microsoft Learn
Microsoft Entra discussions | Microsoft Community
Microsoft Tech Community – Latest Blogs – Read More
Hello, how do you simulate a real-time load profile in Simulink using lookup tables? My daily load profile is hourly kW and I want to simulate the load in a solar PV system.
My load profile is hourly kW data over a period of 24 hrs. lookup tables, load, variable load MATLAB Answers — New Questions
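One common pattern is to drive a controlled load from a 1-D Lookup Table whose breakpoints are the hours 0–23 and whose table data are the hourly kW values, fed by the simulation clock (wrapped modulo 24). The Python sketch below, with made-up kW numbers, shows the linear interpolation such a lookup table performs:

```python
import numpy as np

# Hourly load profile (kW) over 24 h -- illustrative numbers, not real data.
hours = np.arange(24)                      # breakpoints, as in a 1-D Lookup Table
load_kw = np.array([2, 2, 2, 2, 3, 4, 6, 8, 9, 9, 8, 7,
                    7, 7, 8, 8, 9, 10, 10, 9, 6, 4, 3, 2], dtype=float)

def load_at(t_hours):
    """Linearly interpolate the profile at simulation time t (hours),
    mirroring a Simulink 1-D Lookup Table with linear interpolation."""
    return np.interp(t_hours, hours, load_kw)

print(load_at(6.5))   # halfway between the 6 h and 7 h breakpoints
```

In the Simulink model itself, the interpolated kW value would then set the power of a controlled load block in the PV system.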
Is there a way to rename a file to one that contains certain strings?
For example, we have the following variables stored:
Altitude = 100ft
Mach=0.1
And suppose the script writes a file called output.txt.
Is there a way to, at the end of the run, rename the output.txt file to be called Output2000ft0.7m.txt so that it doesn’t get overwritten on the next run?

matlab code, matlab function MATLAB Answers — New Questions
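A sketch of the renaming step, shown in Python with hypothetical variable values (in MATLAB itself, movefile together with sprintf accomplishes the same thing):

```python
import os

# Hypothetical values standing in for the script's stored variables.
altitude = "2000ft"
mach = 0.7

# Build the new name from the run parameters so each run's output survives.
new_name = f"Output{altitude}{mach}m.txt"

# Create a stand-in output.txt, as the script would.
with open("output.txt", "w") as f:
    f.write("run results\n")

# Rename it; os.replace overwrites an existing target of the same name.
os.replace("output.txt", new_name)
print(new_name)
```

The key idea is simply to compose the destination filename from the run’s parameter values before renaming, so successive runs never collide.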
Solve a system of two variable inequalities with symbolic toolbox
Hello,
is it possible to symbolically solve a system of inequalities with variables x and y and plot the result?
lets say i have to solve x+y-25>0 and x*y+y+15>0, what function should i use?
i have tried fimplicit plus assumptions but it didn’t work.
thank you

symbolic math toolbox, inequalities, plotting, implicit MATLAB Answers — New Questions
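For the plotting half, a common workaround when symbolic solving is awkward is to evaluate the inequalities numerically on a grid and shade the points where both hold. A Python/NumPy sketch of that idea, with arbitrarily chosen grid ranges:

```python
import numpy as np

# Evaluate the two inequalities on a grid and keep points where both hold.
# The plotting ranges here are arbitrary choices for illustration.
x, y = np.meshgrid(np.linspace(-50, 50, 201), np.linspace(-50, 50, 201))
region = (x + y - 25 > 0) & (x * y + y + 15 > 0)

# region is a boolean mask over the grid; with matplotlib,
# plt.contourf(x, y, region) would shade the feasible set.
print(region.sum(), "grid points satisfy both inequalities")
```

The same grid-and-mask approach works directly in MATLAB with meshgrid, logical operators, and contourf, without the Symbolic Math Toolbox.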
How to initialize instance of object in Simulink Global Workspace?
Hello,
I am aiming to create an object (instance of class) in the Simulink Global Workspace via code in the Preload Fcn Callback.
This Simulink model uses a MATLAB Function block to input the parameters of an object to an S-Function block that is tied to a larger external simulation; the output of this simulation is passed back to another MATLAB Function block that re-initializes the data with the results, and this repeats until finished. Initially, I used Interpreted MATLAB Function blocks to pass this data back and forth, but I am now forced to switch away from them and decided on MATLAB Function blocks. I have made all the underlying code code-generation compatible, but I am having trouble initializing an object via the global workspace in Simulink.
In my preload function callback, I simply just write:
var_name = ObjClass();
I am then aiming to call methods/functions to manipulate the properties of this object in a Matlab Function block. For instance, in my initial Matlab Function block (which has no inputs), the code is like:
function output = fcn()
output = var_name.func_name();
end
However, whenever I run the simulation, I get the error that var_name.func_name() is an undefined function or variable.
Can’t seem to figure out why the instance of the class is not being created in the preload function, as the Simulink model is unchanged from successful use with Interpreted MATLAB Function blocks. Is there some sort of difference underlying the MATLAB Function block that I am missing? Global declarations?
Thanks!

simulink, object oriented programming, matlab function, s-function MATLAB Answers — New Questions
TSI Partner Community Update | June 2024
Hello Partners,
Get the latest insights in our TSI June Community Update with updates on Nonprofit Azure Office Hours, MCAPS registration, Partner of the Year Awards, NCE M2M promotion details plus important NIS2 and DORA for our EMEA partners and a link to the Donorfy case study, a Digital Natives Partner Program participant.
Download the June 2024 TSI Community Update
Read More