Month: August 2024
Pass parameter by URL with Default values on booleans
I have a parameter field (type Boolean) with a default value of =True.
When I pass the parameter in the URL (=0), the report still shows it as True. How can I prevent that behaviour?
(Default value is needed)
https://learn.microsoft.com/en-us/sql/reporting-services/pass-a-report-parameter-within-a-url?view=sql-server-ver16
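For context, a hedged sketch of the URL access syntax (server, folder, and parameter names here are hypothetical). Per the linked documentation, Boolean parameters are passed on the URL as 0 or 1, and a URL-supplied value should override the report default:

```
https://myserver/reportserver?/Sales/YearlySummary&rs:Command=Render&ShowDetail=0
```

If the default keeps winning, it is worth double-checking that the name in the URL matches the report parameter's name (not its prompt text) exactly, since parameter names in URL access are case-sensitive.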
📢 Announcement!! Templates for Azure Logic Apps Standard are now in Public Preview 🎉
We are very excited to announce the Public Preview of Templates Support in Azure Logic Apps Standard! This release marks a significant milestone in our ongoing commitment to simplify and accelerate the development of enterprise integration solutions.
Templates in Azure Logic Apps are pre-built workflow solutions designed to address common integration scenarios. They cover a wide range of use cases, from simple data transfers to complex, multi-step automations, and event-driven processes. Templates provide a solid foundation, allowing users to quickly set up and deploy workflows without starting from scratch.
With Templates Support now in Public Preview, developers can leverage a growing library of pre-built templates to kickstart their Logic Apps projects. These templates are designed to cover a wide range of scenarios, from common workflows to complex integrations, making it easier than ever to build, deploy, and manage your applications.
This is just the beginning, and we are committed to evolving this feature by adding more templates based on your feedback and contributions.
Get started with Templates today
To get started with the new templates, simply create workflows using the available templates within the Logic Apps Designer. Each template includes detailed information and a read-only view of the workflow, allowing you to review the entire workflow, prerequisites, connectors used, and more, to determine whether it meets your requirements. These workflows are easily customizable and can be further extended to meet your specific business needs. For detailed guidance, check out our updated documentation and tutorials:
Create Standard workflows from prebuilt templates – Azure Logic Apps | Microsoft Learn
Here is a brief introduction to the capabilities:
To access the templates, navigate to your workflows and use the dropdown menu to add a workflow from a template.
The template gallery shows the list of templates as well as dropdowns and free text search box to filter and search templates.
When you select a specific template, a side panel will display additional information, including a read-only view of the workflow, a detailed description, connectors used, prerequisites, and more.
When you choose a template, a wizard will guide you through the various stages of configuration, including setting up connections used in the workflow, configuring parameters referenced by different actions, and defining the name and state of the workflow.
Note: Before using templates, please be aware that many templates have prerequisites for them to work as expected. As we currently don’t have a way to enforce these prerequisites automatically, it’s essential to review and address them before using the template.
Don’t see the template you need?
If you are interested in templates for specific scenarios or patterns and do not see them in the gallery, there are a few options.
The recommended approach is to publish your workflows as templates by following a few simple steps. This not only empowers you to tailor solutions to your specific needs but also benefits others in the community. By doing so, you won’t be dependent on Microsoft for urgent requirements. However, please note that all templates must go through a PR process, where they are reviewed by our engineering and content experts. The detailed steps are outlined here
Create and publish workflow templates – Azure Logic Apps | Microsoft Learn
If this is not an option, then please submit your request for templates here to add them to our backlog of templates. https://aka.ms/survey/templates
Microsoft Tech Community – Latest Blogs –Read More
Azure Database for MySQL – July 2024 updates and latest feature roadmap
We’re excited to share a summary of the Azure Database for MySQL – Flexible Server announcements from last month, as well as the latest roadmap of upcoming features!
August 2024 Live webinar
These updates and the latest roadmap are also covered in our Monthly Live Webinar on YouTube (Click here to subscribe to our YouTube channel!), which streams the second Wednesday of every month, at 7:30 AM Pacific time. Below is a link to the session recording of the live webinar we delivered last week:
July 2024 updates and announcements
Major Version Upgrades for Azure Database for MySQL flexible servers in the Burstable service tier (General Availability)
You can now perform major version upgrades directly on flexible servers based on the Burstable service tier by using the Azure portal, and with just a few clicks! This feature release removes a previous limitation – the need to manually upgrade to General Purpose or Business Critical tiers, perform the version upgrade, and then downgrade the servers. The key benefits of this functionality include a seamless upgrade process, enhanced reliability, and effective cost management.
Learn more: Announcement Blog | Documentation | Demo video
Azure Key Vault Managed HSM (Hardware Security Module) support for customer managed keys (CMK) (General Availability)
Azure Key Vault Managed HSM (Hardware Security Module) is a fully managed, highly available, single-tenant, standards-compliant cloud service that enables you to safeguard cryptographic keys for your cloud applications, using FIPS 140-2 Level 3 validated HSMs. It ensures your data is stored and processed only within the region that hosts the HSM, ensuring data residency. With the latest feature update, you can now use your own HSM-backed encryption keys to protect your data at rest in MySQL – Flexible Server instances. You can generate HSM-backed keys and import the encryption keys from a physical on-premises HSM using CMK’s bring your own key (BYOK) feature while maintaining full control over the keys.
Learn more: Announcement Blog | Documentation | Demo video
Support for up to 8TB in a single table file (General Availability)
We are excited to announce that Azure Database for MySQL – Flexible Server now supports up to 8TB in a single table file! Previously, the service supported up to 4TB in a single table file. This increase in storage capacity reduces the need for table partitioning and simplifies database management. Additionally, the increased storage limit allows customers to scale their tables more efficiently without compromising performance.
Learn more: Documentation
Latest feature roadmap
Feature: On-demand backup and export
Description: This feature provides the ability to export an at-the-moment physical backup of the server to an Azure storage account (Azure Blob Storage) using an Azure CLI command. After export, these backups can be used for data recovery, migration, data redundancy and availability, or auditing. Learn more.
Release status: Public Preview (General Availability coming in Q3 CY24, tentative*)

Feature: Flexible maintenance options
Description: Building upon our existing system-managed and custom-managed maintenance windows, the following new flexible maintenance options aim to elevate user convenience and operational flexibility in server maintenance:
Reschedule window: Tailor maintenance schedules to suit your business rhythm.
On-demand maintenance: Instantly initiate maintenance activities using the Reschedule now option.
Release status: Public Preview (General Availability coming in Q3 CY24, tentative*)

Feature: Virtual Canary
Description: The Virtual Canary feature is an exciting solution for Azure MySQL users who prioritize staying at the forefront of technology by making sure that their servers always run the most current version. Servers opted in to Virtual Canary receive maintenance updates earlier. You can also use the feature as an opportunity to perform an additional layer of update testing on your dev, test, or staging servers to help avoid workload-specific issues such as application-level compatibility problems. The feature thus offers an efficient way to manage updates, align testing and production environments, and maintain operational stability with minimal disruption.
Release status: Not yet released (Public Preview coming in Q3 CY24, tentative*)

Feature: Near-zero downtime maintenance for HA servers
Description: This feature is designed to substantially reduce maintenance downtime for HA-enabled servers, ensuring that in most cases maintenance downtime is between 40 and 60 seconds. This capability is pivotal for businesses that demand high availability and minimal interruption in their database operations. Learn more.
Release status: Public Preview (General Availability coming in Q4 CY24, tentative*)

Feature: MySQL Discovery & Assessment in Azure Migrate
Description: With this functionality, you can use Azure Migrate to discover MySQL servers in your environment, assess them by identifying their compatibility for moving to Azure Database for MySQL, and receive compute and storage SKU recommendations along with their costs. Learn more.
Release status: Private Preview (Public Preview coming in Q4 CY24, tentative*)

Feature: Long Term Retention of Backups
Description: Previously, Azure Database for MySQL let you retain automated backups and on-demand backups for up to 35 days. With Long Term Retention, you can now retain backups for up to 10 years, further accommodating your audit and compliance needs. Learn more.
Release status: Public Preview (General Availability coming in Q4 CY24, tentative*)

Feature: Error Logs (in Server Logs)
Description: This feature allows you to maintain MySQL error log files under Server logs and download them for up to seven days. These error logs can help you efficiently identify and troubleshoot performance and reliability issues, and proactively detect and respond to unauthorized access attempts, failed login attempts, and other security-related events. Learn more.
Release status: Public Preview (General Availability coming in Q4 CY24, tentative*)

Feature: CMK-enabled support for Accelerated Logs
Description: Accelerated Logs, available with the Business Critical service tier and designed for mission-critical workloads, is a feature that provides up to a two-times (2x) increase in throughput for your applications at no additional cost. The feature will soon be supported on servers that have customer managed keys (CMK) enabled.
Release status: Not yet released (General Availability coming in Q4 CY24, tentative*)
*The roadmap features and dates are tentative and subject to change. Please stay tuned for continuous updates.
Conclusion
As we continue to work on new features and functionality, your feedback is critical to helping us improve. If you wish to enroll in a Private Preview for any of the above features, or if you have any suggestions or queries about our service, email us at AskAzureDBforMySQL@service.microsoft.com.
To learn more about what’s new with Flexible Server, see What’s new in Azure Database for MySQL – Flexible Server. Stay tuned for more updates and announcements by following us on social media: YouTube | LinkedIn | X.
MATLAB code to design and analyze a racetrack resonator
How do I design a racetrack resonator in MATLAB for biosensing applications? integrated photonics MATLAB Answers — New Questions
Help Index exceeds the number of array elements?
The error I am receiving is:
Index exceeds the number of array elements. Index must not exceed 0.
Error Line 39: disp(['Book 1: ', num2str(x(1))]);
I believe the issue might be occurring in:
Line 34: options = optimoptions('linprog', 'Display', 'off'); or
Line 35: [x, fval] = linprog(f, A, b, [], [], lb, [], options);
But I can't find what is causing the error. optimization, index MATLAB Answers — New Questions
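Although this is a MATLAB question, the failure mode is easy to see in a short sketch. When MATLAB's linprog cannot find a solution (for example, an infeasible or unbounded problem), it returns an empty x, and x(1) then fails with exactly this "Index must not exceed 0" message; the fix is to request and check the exitflag output ([x, fval, exitflag] = linprog(...), where exitflag == 1 means a converged solution) before indexing x. A minimal Python sketch of the same guard, with fake_linprog as a hypothetical stand-in for the solver:

```python
def fake_linprog(feasible):
    """Hypothetical stand-in for linprog: returns (x, fval) on success,
    or ([], None) on failure, mirroring MATLAB's behavior of returning
    an empty x when no solution is found."""
    if feasible:
        return [3.0, 2.0], 12.0
    return [], None

x, fval = fake_linprog(feasible=False)

# Guard before indexing: an empty x means the solver failed, and
# x[0] (MATLAB's x(1)) would raise an index error.
if x:
    print("Book 1:", x[0])
else:
    print("No solution returned; check exitflag and the constraints.")
```

In the question's case, the constraints built from A, b, and lb are most likely infeasible, so inspecting exitflag (and the optional output message from linprog) will point at the offending constraint set.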
How to resolve an error showing 'Unexpected MATLAB expression'?
I am getting an "Unexpected MATLAB expression" error, but I am able neither to find out which expression is wrong nor to resolve the error. Please help me. unexpected matlab expression MATLAB Answers — New Questions
Using system composer, it is possible to create a report with the sequence diagrams?
I'm using System Composer, and I want to create a report with the views and sequence diagrams that I have. I have this code to generate the report and add the views, but I don't know if there is a way to also add the sequence diagrams, because there is nothing like "SequenceDiagramFinder".
Here is the code for the views:
model = systemcomposer.loadModel(model_name);
rpt = slreportgen.report.Report(output="ViewFinderReport", ...
    CompileModelBeforeReporting=false);
viewFinder = ViewFinder(model_name);
chapter = Chapter("Title","Views");
while hasNext(viewFinder)
    view = next(viewFinder);
    sect = Section("Title",view.Name);
    add(sect,view);
    add(chapter,sect);
end
system composer, views, sequence diagrams MATLAB Answers — New Questions
How to use Individual License file in MATLAB Docker?
Hello, I am trying to use my Individual License file in a MATLAB Docker container, but I get the error 'hostid of your computer does not match the hostid of your license file'.
I can log in to the container by using my email and password, but this is not ideal, since the login session expires after some time and I need to log in again. I want to run this headless, so if I can pass in a license file then I don't have to keep logging in manually.
I downloaded the Individual License file from the Install and Activate tab of the online License Center portal. This downloaded a license.lic file, which I pass into the Docker container.
My Docker run command:
docker run -it --rm -p 5901:5901 -p 6080:6080 -v ~/ivy:/ivy -e MLM_LICENSE_FILE=/ivy/license.lic mathworks/matlab:r2021b -vnc
License error: docker, license MATLAB Answers — New Questions
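A note on why the hostid mismatch happens (stated as an assumption from how FlexNet licensing generally works): an Individual license file is locked to the host ID (MAC address) of the machine it was activated for, and a container's virtual network interface has a different MAC, so the file cannot validate inside the container. MLM_LICENSE_FILE is primarily intended to point at a network (concurrent) license server, which is referenced in port@hostname form rather than as a file path; the server address below is hypothetical:

```
# Hypothetical license-server address; a network license served by a
# FlexNet license manager is referenced as port@hostname:
docker run ... -e MLM_LICENSE_FILE=27000@licenses.example.com mathworks/matlab:r2021b -vnc
```

With an Individual license, the usual alternatives are to activate the license against a MAC address that the container is started with (Docker's --mac-address flag lets you pin one) or to keep using online licensing.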
How do I replace the difference between two columns
Hello, I was wondering if someone can help me with an issue I'm encountering. I need to replace the greyed-out property address with the mailing address: I want to delete the greyed-out value and replace it with the mailing address wherever the two differ. I have hundreds of them, so I want to know if it is possible to replace only the ones that are actually different. I selected the two columns I have to fix, then used Find & Select > Go To Special... > Row Differences > OK. That showed me which rows differ, and now I just want to replace those cells with the correct values shown in the mailing address column.
Word- Horizontal Line in footer with different page orientations problem.
Hello,
I have been using the same template for my documents for about 3 years. I use a horizontal line in the header and footer; thanks to it, I didn't need to uncheck Link to Previous when transitioning from portrait to landscape. I think a recent update broke this: the document still looks fine in Word, but after creating a PDF the lines are shortened.
I checked older documents, which have no problem in Word, but if I create a PDF from those documents I get the same problem, which started recently.
I am using Office 365 and Windows 11, and I create PDFs with Adobe PDF (but I checked with MS Edge too). I also tried 3 or 4 other computers, where the same problem exists.
In short, I think the problem comes from a new update of Office 365. Any solution will be appreciated. Sample files are added at the link below.
Looking for a macro for automatic sorting by due date and emailing oneself when item is due
Hi everyone,
I am trying to see if someone could help me with a macro that will automatically sort by the closest due dates. My table has 2 columns, titled Assessment Date and Due Date. The Assessment Date column contains the dates when paperwork was done. The Due Date column contains the day before that date, but 2 years out (when the next round of paperwork is due). The Due Date column has conditional formatting: if today's date is past the due date, the cell is highlighted red; if the due date is within 2 weeks, it is highlighted yellow; and if it is further out than 2 weeks, it is highlighted green. I wanted to see if there is a macro that will automatically sort by what is past due and, if something is past due, email me to let me know so I can contact the client to fill out the paperwork. The closest macro I could find just sorts by date, not by when an item is due. If anyone could help, that would be great.
Thank you in advance!
How to build the Microsoft Purview extended report experience
This is a step-by-step guided walkthrough of the extended report experience.
Prerequisites
License requirements for Microsoft Purview Information Protection depend on the scenarios and features you use. To understand your licensing requirements and options for Microsoft Purview Information Protection, see the Information Protection sections from Microsoft 365 guidance for security & compliance and the related PDF download for feature-level licensing requirements.
Before you start: with Endpoint DLP enabled, all endpoint interaction with sensitive content is already included in the audit log. For Microsoft 365 SharePoint, OneDrive, Exchange, and Teams, you can enable policies that generate events, but not incidents, for important sensitive information types.
Install Power BI Desktop to make use of the templates Downloads | Microsoft Power BI
Step-by-step guided walkthrough
In this guide, we will provide high-level steps to get started using the new tooling.
Get the latest version of the report that you are interested in from here. In this case we will show the Board report.
Open the report if Power BI Desktop is installed it should look like this.
You may have to approve the use of ArcGIS Maps if that has not been done before.
You must authenticate with https://api.security.microsoft.com, select Organizational account, and sign in. Then click Connect.
You will also have to authenticate with https://api.security.microsoft.com/api/advancedhunting; select Organizational account, and sign in. Then click Connect.
The system will start to collect the information from the built-in queries. Please note that this can take quite some time in larger environments.
When the load completes you should see something like this, in the Legal and Compliance tab. The report provides details on all content that is matching, built-in, and custom Sensitivity types, or any that have been touched by any of the compromised User accounts or Devices in the red box. The report needs to be updated.
7.1 All the reports include diagrams with KPIs that measure the progress of improvement projects. The sample above, in the grey box, measures how much sensitive content is accessed by compromised users or devices. This should be adjusted to whatever resonates with your key objectives.
7.2 The green boxes used for the KPI measurements come from MaxDataSensitiveRisk, MaxDataDevice, MaxDataUser. You can either add a new value or update the current value.
7.2.1 To update the current value, select Transform data.
7.2.2 Select Goals, click on the flywheel for Source.
7.2.3 You can now update the values that are stored in the template. If you want to use a different value, you can click the + sign to add additional columns.
7.2.4 When you have made the modifications click Close & Apply.
7.3 Update the blue box high-level description to match the content or replace it with something automatically generated by Copilot, https://learn.microsoft.com/en-us/power-bi/create-reports/copilot-introduction.
7.4 Based on the organization’s requirements filter to only the required Sensitive information types.
7.5 The last part that you may want to update is the incident diagrams. By default, they show the severity and type of attack for incidents linked to access to sensitive data. You may want to map this to incident Tags or other fields based on your requirements.
The Trust & Reputation report is built similarly to the Legal and Compliance scorecard. Update it based on the requirements for your use case. The initial idea for this report is to show privacy-related data: the impact of leaked customer data is devastating for the trust customers have in the organization. Other reputational data points should be added as needed.
The Company & Shareholder Value contains some more information. The goal is to customize this to be bound to the organization’s secrets. Secret drawings, source code, internal financial results dashboards, supply chains, product development and other sensitive information. You may want to filter down to EDM, Fingerprint type SITs and specific trainable classifiers for this report.
9.1 To receive the accurate mapping of the labelled content you need to update the MIPLabel table with your label names and GUIDs.
9.1.2 Select Transform data.
9.1.3 Select MIPLabel, click on the flywheel for Source.
9.1.4 Connect to Security & Compliance PowerShell (Connect-IPPSSession), then:
- Run: Get-Label | Select-Object ImmutableId, DisplayName
- Copy the output
9.1.5 You can now update the values that are stored in the template. This ensures that the name mapping of labels works as expected.
9.1.6 The next step is to update the Access to mission-critical systems from compromised devices. Select the SensitiveSystems query. Then click Advanced Editor
9.1.7 Update the list of URLs that contain a system that has high business impact if an attacker has been accessing it. It is important to only use single quotes. Right now, there is no straightforward way to capture the URLs, so we need to do it manually. Once complete click Done.
9.1.8 When completed, click Close & Apply
If the previous steps have been completed, the Operational Scope tab should be ready. This view shows the organization where sensitive information is processed, which can help identify from where content is processed, and by which legal entity and function. Failing at this may directly impact whether an organization is allowed to operate in a specific market. Not knowing this also has an impact on restructuring the company and on other actions to keep the company competitive.
10.1 We have one additional tab that does this based on Sensitivity labels. Called Operational Scope Classified Content.
11. The KPI tabs are more condensed and should be customized to fit with the context of the organization and the leaders to which the information is presented. The key thing is to communicate the information in a context that resonates.
11.1 You will want to update the incident view highlighted in red; switch it to something that works for your audience, which may be one of the Tags or another detail. You also want to be very deliberate about which incidents generate the data shown in this dashboard. One way is to use tags: you may elect, for example, to only show incidents tagged PossibleBoard. This can enhance communication between security teams and the board by making analysts aware of the importance of their work and its direct correlation with organizational leadership.
11.2 In this sample we have Credit Card in Focus and End User Identifiable; you should replace these with regulator names and the associated sensitive information types (such as SEC, FDA, FCC, NTIA, or FCA): change the name and update the sensitive information filter.
Additional reports that come with this package
We are shipping a few additional reports that can be used to gain further insights. The Project sample provides this view for label usage. You can modify the targets similarly to how you did for the board report.
One additional tip for this report is that you can:
Configure the "Maximum value" to be your target value; create the value in the Goals table.
Set the "Target value" to the value you had over the past period, 275 in the case above.
The incident sample provides views like this. The incident reporting and progress view offers insights into the analyst process: overall efficiency metrics and measures to gauge performance, and incident operations over time by different criteria, such as severity, mean time to triage, mean time to resolve, DLP policy, and more. You should customize this view to work with your practices.
The Incident view defaults to 6 months, while the event data covers the past 30 days. To extend event data beyond 30 days, you can use Microsoft Sentinel. If, on the other hand, you want to reduce the Incident window, follow these steps.
1. Go to Transform data.
2. Select the Incident table and view its Source setting; by default you will see the standard query.
3. Update the window to 30 days by changing the value, as in this example:
4. = OData.Feed("https://api.security.microsoft.com/api/incidents?$filter=lastUpdateTime gt " & Date.ToText(Date.AddDays(Date.From(DateTime.LocalNow()),-30), "yyyy-MM-dd"), null, [Implementation="2.0"])
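To sanity-check the window you configure, the same date arithmetic as the M expression above can be reproduced in Python (the endpoint URL is taken from the query; days_back is the value you would edit):

```python
from datetime import datetime, timedelta

days_back = 30  # the "-30" in the M expression's Date.AddDays call
cutoff = (datetime.now() - timedelta(days=days_back)).strftime("%Y-%m-%d")

# Same $filter the M query sends: incidents updated after the cutoff.
url = ("https://api.security.microsoft.com/api/incidents"
       f"?$filter=lastUpdateTime gt {cutoff}")
print(url)
```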
The report also has a per workload detailed view like this sample for Exchange Online. The report contains Exchange, SharePoint, OneDrive for Business, Endpoint, Teams and OCR.
Additional configuration to be made
This is required to capture sensitive information that is transferred in Exchange Online or SharePoint Online. The setup creates DLP policies that take no action and raise no alerts; they exist only to capture events. This is also important for the Copilot for Security functionality to work correctly.
Create a custom policy.
Name the policy based on your naming standard and provide a description of the policy.
Select the workloads from which you want to capture sensitive data usage. For devices there is no need, as devices capture all sensitive data processing by default.
Click next.
Click Create rule.
Provide a rule name, click Add condition, and then click Content contains.
Then click Sensitive info types and select all the relevant sensitive information types that you would like to capture for both internal and external processing. Note: focus on the sensitive information types that are key to your operations (max 125 per rule). Then click Add; you can add your own custom SITs or make use of the built-in SITs.
If you want any other conditions, such as external communications, to be required for generating signals, add those conditions. Next, ensure that no Action, User notifications, Incident reports, or Use email incident reports… options are turned on. They should all be turned off.
Setup the Power BI online view
Providing an online view of the data has several benefits. You can delegate access to the dashboard without delegating permissions to the underlying data set. You can also create queries that only show information for a specific division or market and only present that information to that specific market. You can set up a scheduled refresh to refresh the data without having to upload it again.
Follow these steps to set up the integration https://learn.microsoft.com/en-us/azure/sentinel/powerbi#create-a-power-bi-online-workspace.
Posts part of this series
Cyber Security in a context that allows your organization to achieve more
https://techcommunity.microsoft.com/t5/security-compliance-and-identity/cyber-security-in-a-context-that-allows-your-organization-to/ba-p/4120041
Security for Copilot Data Security Analyst plugin https://techcommunity.microsoft.com/t5/security-compliance-and-identity/learn-how-to-customize-and-optimize-copilot-for-security-with/ba-p/4120147
Guided walkthrough of the Microsoft Purview extended report experience https://techcommunity.microsoft.com/t5/security-compliance-and-identity/guided-walkthrough-of-the-microsoft-purview-extended-report/ba-p/4121083
Microsoft Tech Community – Latest Blogs –Read More
Guided walkthrough of the Microsoft Purview extended report experience
This is a step-by-step guided walkthrough of the Microsoft Purview extended report experience and how it can empower your organization to understand cyber security risks in a context that allows it to achieve more, by focusing on the information and organizational context to reflect the real impact and value of investments and incidents in cyber security.
Prerequisites
License requirements for Microsoft Purview Information Protection depend on the scenarios and features you use. To understand your licensing requirements and options for Microsoft Purview Information Protection, see the Information Protection sections from Microsoft 365 guidance for security & compliance and the related PDF download for feature-level licensing requirements. For the best experience, all Microsoft Defender products should be enabled.
Follow the step-by-step guide to set up the reporting found here.
The DLP incident management documentation can be found here.
Install Power BI Desktop to make use of the templates Downloads | Microsoft Power BI
Overview and vision
The vision with this package is that it will allow for faster and more integrated communication between leaders and the cyber operations teams in a context that allows for effective collaboration. The structure can help present the positive result of attacks prevented by measuring distance to corporate secrets. It can also help you provide a view of the impact of an incident by listing the sensitive systems and content the attackers have accessed.
Based on the information you may also identify patterns where you need to improve your security posture based on sensitive content and systems. This makes improvement projects more connected to company value. Cybersecurity is fast-paced, so being able to understand the future is just as important as understanding the current state. With this data available, you should be able to input details about future threats and project their impact. As part of this, we are also creating Security Copilot skills to help identify future risks.
Step-by-step guided walkthrough
Principles for the dashboards
When opening the Power BI view, whether from the web-based version or from Power BI Desktop, you will find unique users and unique devices. These are user accounts and devices that have had at least one security incident flagged in the Microsoft Defender portal and have accessed sensitive information. Organizations may select to filter these based on incident flags, the type of incident, etc.; how to achieve this is outlined in the implementation guide.
Let us have a look at the base elements in the CISO, CCO view.
These are the default KPI views; you define a target for how much sensitive data can acceptably be touched by compromised devices or users.
This is the view of the incidents showing the classification and type of attack. This view may be changed to be based on tags or other fields that indicate what can be done to mitigate future attacks.
The number of compromised users and devices that have accessed sensitive content.
The count and types of sensitive content accessed by the compromised systems.
The core rule for what is shown is that sensitive content has been touched by a compromised system or account. A compromised system or account that has not accessed any sensitive content will not be shown. The only exception is the Operational scope pages; more detail on those later.
Board level sample data.
The first version has four risk dimensions:
Legal Compliance: you should tweak this view to be centered around your regulatory obligations. The base report shows credit card and end-user identifiable information as an example. A suggestion is that you select the applicable sensitive information types and group them under a regulator name (like SEC, FDA, FCC, NTIA, FCA, etc.). How to achieve this is outlined in the implementation guide. You may also update the KPI graph to align better with the objectives you have as an organization. A click on the department will filter the content across the page.
Trust Reputation: the standard setup of this report is to show privacy-related data. The impact of having customer data leak is devastating to the trust customers have in the organization. You can configure the report to be centered around the privacy data that is most applicable to your business.
Company and Shareholder Value is centered around the organization’s own secrets: secret drawings, source code, internal financial results dashboards, supply chain information, product development, and other sensitive information. The dashboard is built on a few core components.
Access to content labeled as Sensitive from compromised accounts.
Update this diagram to only reflect the sensitivity labels with high impact to the business; we will only show access made by compromised accounts.
Access to mission-critical systems from compromised accounts.
This is based on connections to URLs or IP addresses that host business-sensitive systems. This should come from the asset classification already made for critical systems.
Access to sensitive content from compromised accounts.
This should be the core sensitive information types, fingerprints, and exact data matches that can directly impact the valuation of the organization.
The KPI diagram should be updated to a target that makes sense to the core security projects run by the organization.
Operational scope provides your organization with information about where Sensitive information is processed. Failing to process at the appropriate location may directly impact whether an organization is allowed to operate in specific markets or not. This report can also be used for restructuring the company and other actions to keep the company competitive while still staying in compliance with regulations.
With Security Copilot you can get this type of detail as well; it will help you with the contextual detail. Here is one example of a custom sensitive information type; the sub-bullets are departments.
There is also a view included for the use of Sensitivity labels.
The CISO view contains more detail than the Board reports as outlined initially in this post. This is the Company & Shareholder Value view. Based on the implementation guide this view can be customized to meet the needs of your organization. But based on this you may feel that more detail is needed. This leads to the detail view.
Account Detailed Data view provides the next level of detail.
In the green box you will find all the users with incidents, where you can learn more about threat actors, threat families, etc. As part of the implementation guide, you can learn how to add additional fields such as tags and type.
In the red box you will find information about the actual documents and information that the user has been accessing.
Let’s use this sample where we pair the usage with Copilot for Security. Let us say that one of the object names is listall.json, and I want to get all the information surrounding that file.
Or you may have an e-mail subject that you are concerned about.
The information shared is to provide you with an idea of how to get started. Consider adding actual monetized impact on events across the system. Both those that were avoided and those that had a negative impact.
Improvement Project reporting
For data-driven feedback on the impact of improvement projects, we have a few sample dashboards to get you started. They are there to allow you to see the art of the possible. The rich data that is available from the system will in many cases allow you to build your own data-driven dashboards to show progress. The samples that are available are Document KPI, Oversharing SharePoint, Email KPI, Content upload, Operational Scope, and Operational scope classified content.
Below is a sample dashboard that displays the number of protected versus unprotected document operations across the organization, i.e., which documents are sensitivity labeled and which are not. Follow the technical guidance for setting this up properly.
This example provides an overview of the suppliers being used to access sensitive content. This is based on the processes; you may select to do something similar based on IP tags and ranges and access to sensitive content and systems.
This example contains details about how credential data is being processed across the organization. To capture All Credential Types, you need to enable a policy for all workloads, including endpoint.
Incident reporting and progress
The incident reporting and progress view provides insights into the analyst process. It provides the overall efficiency metrics and measures to gauge performance. It shows incident operations over time by different criteria, like severity, mean time to triage, mean time to resolve, DLP policy, and more. You should customize this view to work with your practices.
The package also comes with optimization suggestions per workload. Exchange, SharePoint, OneDrive for Business, Endpoint, Teams, and OCR.
You may select to use Copilot to summarize your incidents and provide next steps. This is a sample of output from Copilot summarizing an incident. The steps for implementing and tuning Security Copilot can be found in the Guidance Playbook for Security Copilot.
Events
As part of the technical documentation, there is guidance to set up additional event collection. If you are a decision-maker, consider if you want to set up alerts based on the views you have in Power BI. It is highly likely that a rule can be set up to trigger flows where you need to be involved. Here is the documentation for Microsoft Defender XDR Create and manage custom detection rules in Microsoft Defender XDR | Microsoft Learn.
Copilot for Security can be used to draw conclusions from all relevant events associated with an incident and provide suggestions for next steps. This is a sample where it uses the corporate policy document from Microsoft Azure AI as well as Microsoft Defender incidents to suggest next steps. You can also use the upload feature Upload a file | Microsoft Learn.
Here is another example where you may want to confirm if content has been touched by a compromised account.
Posts part of this series.
Cyber Security in a context that allows your organization to achieve more
https://techcommunity.microsoft.com/t5/security-compliance-and-identity/cyber-security-in-a-context-that-allows-your-organization-to/ba-p/4120041
Security for Copilot Data Security Analyst plugin https://techcommunity.microsoft.com/t5/security-compliance-and-identity/learn-how-to-customize-and-optimize-copilot-for-security-with/ba-p/4120147
How to build the Microsoft Purview extended report experience https://techcommunity.microsoft.com/t5/security-compliance-and-identity/how-to-build-the-microsoft-purview-extended-report-experience/ba-p/4122028
Microsoft Tech Community – Latest Blogs –Read More
Library mblibv1 not found in R2020a
I am using MATLAB R2020a where Simscape libraries are also installed, but I couldn’t find the mbliv1 library. Can you please let me know where I can find it? Is it deprecated in R2020a? simscape MATLAB Answers — New Questions
How can I use a previous version of MATLAB?
I am using MATLAB 2022b and I would like to run some scripts in a previous version, 2021a.
Thanks matlab version MATLAB Answers — New Questions
How can I export Roboflow annotation to work in Matlab
Hello! I’m using annotations to create bounding boxes on my images to train a model. A dataset created on Roboflow can be exported in different ways: COCO segmentation JSON files, TXT YOLO oriented bounding boxes, or CSV for TensorFlow/Keras. The downloaded files come with the images and labels split into train, test, and validation. So I have the images and the labels with coordinates for each image I am annotating, but I don’t know how to work with those files in MATLAB. Can anyone help me with this problem? matlab, deep learning, neural network, neural networks, machine learning, image processing MATLAB Answers — New Questions
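No answer is recorded in this digest, but the COCO export is plain JSON, so the annotations can be parsed in any language before being brought into MATLAB. A minimal sketch (shown in Python; the function name and sample structure are illustrative) of mapping image file names to their [x, y, w, h] bounding boxes:

```python
import json

def coco_boxes(coco: dict) -> dict:
    """Map each image file name to its list of [x, y, w, h] boxes."""
    names = {img["id"]: img["file_name"] for img in coco["images"]}
    boxes = {}
    for ann in coco["annotations"]:
        boxes.setdefault(names[ann["image_id"]], []).append(ann["bbox"])
    return boxes

# In practice the dict would come from json.load() on the exported
# annotations file (path is hypothetical): json.load(open("train/_annotations.coco.json"))
sample = {"images": [{"id": 1, "file_name": "a.jpg"}],
          "annotations": [{"image_id": 1, "bbox": [10, 20, 30, 40]}]}
print(coco_boxes(sample))  # {'a.jpg': [[10, 20, 30, 40]]}
```

The resulting per-image box lists can then be written to a format MATLAB reads easily, such as CSV.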
How can I encrypt an image by XORing?
How can I encrypt an image by XORing? af MATLAB Answers — New Questions
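The digest records no answer, but the operation asked about is simple: XOR is its own inverse, so applying the same key stream twice restores the original image bytes. A minimal pure-Python sketch (function name and toy data are illustrative; in MATLAB the equivalent operation is bitxor on a uint8 image):

```python
def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR every byte of the image data with a repeating key stream.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

pixels = bytes([0, 128, 255, 64])       # toy 2x2 grayscale image
key = b"\x5a\xa5"
encrypted = xor_bytes(pixels, key)
assert xor_bytes(encrypted, key) == pixels  # XOR twice restores the image
```

Note that simple XOR with a short repeating key is easy to break and is only suitable as an exercise, not as real encryption.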
Difference between local time and “created_at”
Hello
The time returned by "created_at" does not correspond to the local time
e.g. 18:54:33.401 -> Message:{"channel_id":2499901, "created_at": "2024-08-21T16:54:31Z"….}
although my profile is correctly filled in, the time indicated on the plot is correct
Should I add the term: ?timezone = Europe%2FParis
If so, where? I do not have a field in "channel settings" to add it
Thank you in advance. timezone MATLAB Answers — New Questions
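The timestamp in the message is UTC (the trailing "Z"), so the two-hour gap to the 18:54 local log line is expected for Europe/Paris in August (CEST, UTC+2). A minimal Python sketch (the helper name is ours) of converting such a timestamp to local time:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

def to_local(created_at: str, tz: str = "Europe/Paris") -> datetime:
    # Timestamps like "2024-08-21T16:54:31Z" are UTC; replace the "Z"
    # suffix so fromisoformat parses it on Python versions before 3.11.
    utc = datetime.fromisoformat(created_at.replace("Z", "+00:00"))
    return utc.astimezone(ZoneInfo(tz))

print(to_local("2024-08-21T16:54:31Z"))  # 2024-08-21 18:54:31+02:00
```

This confirms the two timestamps in the question describe the same instant, just in different zones.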
Microsoft Defender Dynamic Tagging not running
I created several tags in the Asset Rule Management on Aug13th and 14th that have not run yet. How often do the UI tags get applied?
Read More
IF Function doubt
Dear Experts,
I have data like below (attached sheet as well):
Column “B” so (Segment Offset) is calculated as the sum of Column “F” and “G” only when Column “A”
is either MID or LAST. I tried the below formula in H2, but got a #VALUE error. What am I doing wrong? Could you please share?
Thanks in Advance,
Br,
Anupam
Read More
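The formula the question refers to is not reproduced in this digest, but the described condition is straightforward. A minimal sketch (in Python, with a hypothetical Excel equivalent noted in a comment) of computing the segment offset only for MID or LAST rows:

```python
def segment_offset(seg_type: str, f: float, g: float):
    # Sum columns F and G only when column A is MID or LAST, else blank.
    # A hypothetical Excel equivalent for H2: =IF(OR(A2="MID",A2="LAST"),F2+G2,"")
    return f + g if seg_type in ("MID", "LAST") else None

print(segment_offset("MID", 10, 5))    # 15
print(segment_offset("FIRST", 10, 5))  # None
```

A #VALUE error in such a formula often comes from arithmetic on non-numeric cells, which is worth checking in columns F and G.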