Category: Microsoft
LISTS IS UNUSABLE. I NEED HELP.
Our organization has built a foundation on using SPO lists, and the recent changes to SharePoint have wreaked havoc on our system. I need Microsoft support to get involved to repair everything that has been broken in order for us to continue operating as a business.
PLEASE – I AM BEGGING THE COMMUNITY.
Please provide me with some information about whom I need to contact to resolve this disaster.
Closing and scoring OKRs
Hi Viva Goals community,
I have a question related to closing and scoring OKRs.
When I go to close an Objective (blue bullseye icon) or Key Result (purple speedometer), I am given the option to adjust the score if desired. However, when I go to close an Initiative (green calendar), the Close dialog box does not have the Score line that allows me to adjust the score – I am forced to take the score that is auto-calculated by Viva Goals.
1) Can people confirm that this is the case for them as well?
2) If this is indeed the case, can anyone from Microsoft or otherwise help me understand why the score adjustment feature is not available on Initiatives?
3) Is there any plan to change this for Initiatives in the future?
Thanks everyone!
Evan
Bookings N:N meetings removes all participants from the meeting whenever someone else signs up
I have set up our Bookings calendar for our orientation appointments as N:N (this is what I normally do). There are 10 spots, and 3 of us take on these meetings. But for some reason, this time, whenever someone signs up for one of the 10 spots, it removes everybody else who has already registered. Even if we sign them up here in the office, it removes the other registered folks. Help! Any thoughts?
Pass parameter by URL with Default values on booleans
I have Parameter Field (type boolean) and a default value =True.
When I pass the parameter by URL (= 0), it still shows as True. How can I prevent that behaviour?
(Default value is needed)
https://learn.microsoft.com/en-us/sql/reporting-services/pass-a-report-parameter-within-a-url?view=sql-server-ver16
📢 Announcement!! Templates for Azure Logic Apps Standard are now in Public Preview 🎉
We are very excited to announce the Public Preview of Templates Support in Azure Logic Apps Standard! This release marks a significant milestone in our ongoing commitment to simplify and accelerate the development of enterprise integration solutions.
Templates in Azure Logic Apps are pre-built workflow solutions designed to address common integration scenarios. They cover a wide range of use cases, from simple data transfers to complex, multi-step automations, and event-driven processes. Templates provide a solid foundation, allowing users to quickly set up and deploy workflows without starting from scratch.
With Templates Support now in Public Preview, developers can leverage a growing library of pre-built templates to kickstart their Logic Apps projects, making it easier than ever to build, deploy, and manage your applications.
This is just the beginning, and we are committed to evolving this feature by adding more templates based on your feedback and contributions.
Get started with Templates today
To get started with the new templates, simply create workflows using the available templates within the Logic Apps designer. Each template includes detailed information and a read-only view of the workflow, allowing you to review the entire workflow, prerequisites, connectors used, and more to determine whether it meets your requirements. These workflows are easily customizable and can be further extended to meet your specific business needs. For detailed guidance, check out our updated documentation and tutorials:
Create Standard workflows from prebuilt templates – Azure Logic Apps | Microsoft Learn
Here is a brief introduction to the capabilities:
To access the templates, navigate to your workflows and use the dropdown menu to add a workflow from a template.
The template gallery shows the list of templates, as well as dropdowns and a free-text search box to filter and search templates.
When you select a specific template, a side panel will display additional information, including a read-only view of the workflow, a detailed description, connectors used, prerequisites, and more.
When you choose a template, a wizard will guide you through the various stages of configuration, including setting up connections used in the workflow, configuring parameters referenced by different actions, and defining the name and state of the workflow.
Note: Before using templates, please be aware that many templates have prerequisites for them to work as expected. As we currently don’t have a way to enforce these prerequisites automatically, it’s essential to review and address them before using the template.
Don’t see the template you need?
If you are interested in templates for specific scenarios or patterns and do not see them in the gallery, there are a few options.
The recommended approach is to publish your workflows as templates by following a few simple steps. This not only empowers you to tailor solutions to your specific needs but also benefits others in the community. By doing so, you won’t be dependent on Microsoft for urgent requirements. However, please note that all templates must go through a PR process, where they are reviewed by our engineering and content experts. The detailed steps are outlined here
Create and publish workflow templates – Azure Logic Apps | Microsoft Learn
If this is not an option, then please submit your request for templates here to add them to our backlog: https://aka.ms/survey/templates
Microsoft Tech Community – Latest Blogs –Read More
Azure Database for MySQL – July 2024 updates and latest feature roadmap
We’re excited to share a summary of the Azure Database for MySQL – Flexible Server announcements from last month, as well as the latest roadmap of upcoming features!
August 2024 Live webinar
These updates and the latest roadmap are also covered in our Monthly Live Webinar on YouTube (Click here to subscribe to our YouTube channel!), which streams the second Wednesday of every month, at 7:30 AM Pacific time. Below is a link to the session recording of the live webinar we delivered last week:
July 2024 updates and announcements
Major Version Upgrades for Azure Database for MySQL flexible servers in the Burstable service tier (General Availability)
You can now perform major version upgrades directly on flexible servers based on the Burstable service tier by using the Azure portal, and with just a few clicks! This feature release removes a previous limitation – the need to manually upgrade to General Purpose or Business Critical tiers, perform the version upgrade, and then downgrade the servers. The key benefits of this functionality include a seamless upgrade process, enhanced reliability, and effective cost management.
Learn more: Announcement Blog | Documentation | Demo video
Azure Key Vault Managed HSM (Hardware Security Module) support for customer managed keys (CMK) (General Availability)
Azure Key Vault Managed HSM (Hardware Security Module) is a fully managed, highly available, single-tenant, standards-compliant cloud service that enables you to safeguard cryptographic keys for your cloud applications, using FIPS 140-2 Level 3 validated HSMs. It guarantees that your data is stored and processed only within the region that hosts the HSM, ensuring data residency. With the latest feature update, you can now use your own HSM-backed encryption keys to protect your data at rest in MySQL – Flexible Server instances. You can generate HSM-backed keys and import the encryption keys from a physical on-premises HSM using CMK’s bring your own key (BYOK) feature, while maintaining full control over the keys.
Learn more: Announcement Blog | Documentation | Demo video
Support for up to 8TB in a single table file (General Availability)
We are excited to announce that Azure Database for MySQL – Flexible Server now supports up to 8TB in a single table file! Previously, the service supported up to 4TB in a single table file. This increase in storage capacity reduces the need for table partitioning and simplifies database management. Additionally, the increased storage limit allows customers to scale their tables more efficiently without compromising performance.
Learn more: Documentation
Latest feature roadmap
Feature | Description | Current release status | Coming soon (tentative*)
On-demand backup and export
This feature lets you export an on-demand physical backup of the server to an Azure storage account (Azure Blob Storage) using an Azure CLI command. After export, these backups can be used for data recovery, migration, data redundancy and availability, or auditing. Learn more.
Current release status: Public Preview. Coming soon: General Availability in Q3 CY24.
Flexible maintenance options
Building upon our existing system-managed and custom-managed maintenance windows, the following new flexible maintenance options aim to elevate user convenience and operational flexibility in server maintenance:
Reschedule window: Tailor maintenance schedules to suit your business rhythm.
On-demand maintenance: Instantly initiate maintenance activities using the Reschedule now option.
Current release status: Public Preview. Coming soon: General Availability in Q3 CY24.
Virtual Canary
The Virtual Canary feature is an exciting solution for Azure Database for MySQL users who prioritize staying at the forefront of technology by making sure that their servers always run the most current version. Servers opted in to Virtual Canary receive maintenance updates earlier than the standard rollout. You can also take advantage of the feature as an opportunity to perform an additional layer of update testing on your dev, test, or staging servers, to help avoid workload-specific issues such as application-level compatibility problems. The feature thus offers an efficient way to manage updates, align testing and production environments, and maintain operational stability with minimal disruption.
Current release status: – (not yet released). Coming soon: Public Preview in Q3 CY24.
Near-zero downtime maintenance for HA servers
This feature is designed to substantially reduce maintenance downtime for HA-enabled servers, ensuring that in most cases, maintenance downtime is expected to be between 40 to 60 seconds. This capability is pivotal for businesses that demand high availability and minimal interruption in their database operations. Learn more.
Current release status: Public Preview. Coming soon: General Availability in Q4 CY24.
MySQL Discovery & Assessment in Azure Migrate
With this functionality, you can use Azure Migrate to discover MySQL servers in your environment, assess them by identifying their compatibility for moving to Azure Database for MySQL, and receive compute and storage SKU recommendations along with their costs. Learn more.
Current release status: Private Preview. Coming soon: Public Preview in Q4 CY24.
Long Term Retention of Backups
Previously with Azure Database for MySQL, you could retain automated backups and on-demand backups for up to 35 days. With Long Term Retention, you can now retain the backups up to 10 years, further accommodating your audit and compliance needs. Learn more.
Current release status: Public Preview. Coming soon: General Availability in Q4 CY24.
Error Logs (in Server Logs)
This feature allows you to maintain MySQL error log files under Server logs and download them for up to seven days. These error logs can help you efficiently identify and troubleshoot performance and reliability issues, and proactively detect and respond to unauthorized access attempts, failed login attempts, and other security-related events. Learn more.
Current release status: Public Preview. Coming soon: General Availability in Q4 CY24.
CMK-enabled support for Accelerated Logs
Accelerated Logs, available with the Business Critical service tier and designed for mission-critical workloads, is a feature that provides an increase in throughput of up to two times (2x) for your applications at no additional cost. The feature will soon be supported on servers that have Customer Managed Keys (CMK) enabled.
Current release status: – (not yet released). Coming soon: General Availability in Q4 CY24.
*The roadmap features and dates are tentative and subject to change. Please stay tuned for continuous updates.
Conclusion
As we continue to work on new features and functionality, your feedback is critical to our improvement. If you wish to enroll in a Private Preview for any of the above features, or if you have any suggestions for or queries about our service, email us at AskAzureDBforMySQL@service.microsoft.com.
To learn more about what’s new with Flexible Server, see What’s new in Azure Database for MySQL – Flexible Server. Stay tuned for more updates and announcements by following us on social media: YouTube | LinkedIn | X.
How do I replace the difference between two columns
Hello, I was wondering if someone can help me with an issue I’m encountering. I need to replace the property address, which is greyed out, with the mailing address. In other words, I want to delete the greyed-out value and replace it with the mailing address wherever the two differ. I have hundreds of these, so I want to know if there is a way to replace only the ones that are actually different. I selected the two columns I have to fix, then went to Find & Select > Go To Special… > Row Differences > OK. That highlighted the differences, and now I just want to replace them with the correct values shown in the mailing address column.
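The selective replacement described above can be sketched in code (Python, purely for illustration; the addresses are made up, and in Excel itself the same effect comes from pasting the mailing values over the cells that Row Differences highlights):

```python
# Hypothetical columns (made-up data): the mailing addresses are correct,
# and property addresses should be overwritten only where they differ.
mailing = ["1 Main St", "2 Oak Ave", "3 Pine Rd"]
prop = ["1 Main St", "2 Oak Avenue", "3 Pine Rd"]

# Find the rows that Go To Special > Row Differences would highlight...
changed = [i for i, (m, p) in enumerate(zip(mailing, prop)) if m != p]

# ...and overwrite only those property cells with the mailing value.
for i in changed:
    prop[i] = mailing[i]

print(changed)  # → [1]
```

Only the differing row is touched; rows that already match are left alone, which is the behaviour the question asks for.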
Word- Horizontal Line in footer with different page orientations problem.
Hello,
I have been using the same template for my documents for about 3 years. I use a horizontal line in the header and footer; because of it, I didn’t need to uncheck Link to Previous when transitioning from portrait to landscape. I think a recent update broke this: the lines still display fine in the Word document, but after creating a PDF they are shortened.
Older documents also look fine in Word, but if I create a PDF from them now, the same recently introduced problem appears.
I am using Office 365 on Windows 11, and I create PDFs with Adobe PDF (I checked with MS Edge too). I also tried 3 or 4 other computers, where the same problem exists.
In short, I think the problem comes from a new update of Office 365. Any solution would be appreciated. Sample files are linked below.
Looking for a macro for automatic sorting by due date and emailing oneself when item is due
Hi everyone,
I am trying to see if someone could help me with a macro that will automatically sort by the closest due dates. My table has two columns, one titled Assessment Date and one titled Due Date. The Assessment Date column contains the dates when paperwork was done. The Due Date column contains the day before that date, but two years out (when the next round of paperwork is due). The Due Date column has conditional formatting: if today’s date is past the due date, it is highlighted red; if the due date is within two weeks, it is highlighted yellow; and if it is further out than two weeks, it is highlighted green. I wanted to see if there is a macro that will automatically sort by what is past due and, when something is past due, email me so I can contact the client to fill out the paperwork. The closest macro I could find just sorts by date, not by when an item is due. If anyone could help, that would be great.
Thank you in advance!
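For illustration only, the sorting and reminder logic described in the question can be sketched as follows (Python here; in Excel itself this would live in a VBA macro, and the client names and dates are made up):

```python
from datetime import date, timedelta

def classify(due: date, today: date) -> str:
    """Mirror the conditional formatting: red = past due,
    yellow = due within 2 weeks, green = further out."""
    if due < today:
        return "red"
    if due <= today + timedelta(weeks=2):
        return "yellow"
    return "green"

# Hypothetical rows: (client, assessment_date, due_date)
rows = [
    ("Client A", date(2022, 5, 10), date(2024, 5, 9)),
    ("Client B", date(2022, 9, 1), date(2024, 8, 31)),
    ("Client C", date(2023, 1, 15), date(2025, 1, 14)),
]
today = date(2024, 8, 20)

# Sort by closest due date, then pick the past-due rows to email about.
rows.sort(key=lambda r: r[2])
past_due = [r[0] for r in rows if classify(r[2], today) == "red"]
print(past_due)  # the clients a reminder email would mention
```

The same three-way test drives the sort and the email trigger, so both stay consistent with the existing colour coding.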
How to build the Microsoft Purview extended report experience
This is a step-by-step guided walkthrough of the extended report experience.
Prerequisites
License requirements for Microsoft Purview Information Protection depend on the scenarios and features you use. To understand your licensing requirements and options for Microsoft Purview Information Protection, see the Information Protection sections from Microsoft 365 guidance for security & compliance and the related PDF download for feature-level licensing requirements.
Before you start, note that with Endpoint DLP enabled, all endpoint interaction with sensitive content is already included in the audit logging. For Microsoft 365 workloads (SharePoint, OneDrive, Exchange, and Teams), you can enable policies that generate events, but not incidents, for important sensitive information types.
Install Power BI Desktop to make use of the templates: Downloads | Microsoft Power BI
Step-by-step guided walkthrough
In this guide, we will provide high-level steps to get started using the new tooling.
Get the latest version of the report that you are interested in from here. In this case we will show the Board report.
Open the report. If Power BI Desktop is installed, it should look like this.
You may have to approve the use of ArcGIS Maps if that has not been done before.
You must authenticate with https://api.security.microsoft.com, select Organizational account, and sign in. Then click Connect.
You will also have to authenticate with https://api.security.microsoft.com/api/advancedhunting, select Organizational account, and sign in. Then click Connect.
The system will start to collect the information from the built-in queries. Please note that this can take quite some time in larger environments.
When the load completes, you should see something like this in the Legal and Compliance tab. The report provides details on all content that matches built-in and custom sensitive information types, or that has been touched by any of the compromised user accounts or devices in the red box. The report needs to be updated.
7.1 All the reports have diagrams with KPIs that measure the progress of improvement projects. The sample above is in the grey box, where progress is measured based on how much sensitive content is accessed by compromised users or devices. This should be adjusted to whatever resonates with your key objectives.
7.2 The green boxes used for the KPI measurements come from MaxDataSensitiveRisk, MaxDataDevice, MaxDataUser. You can either add a new value or update the current value.
7.2.1 To update the current value, select Transform data.
7.2.2 Select Goals, click on the flywheel for Source.
7.2.3 You can now update the values that are stored in the template. If you want to use a different value, you can click the + sign to add additional columns.
7.2.4 When you have made the modifications click Close & Apply.
7.3 Update the blue box high-level description to match the content or replace it with something automatically generated by Copilot, https://learn.microsoft.com/en-us/power-bi/create-reports/copilot-introduction.
7.4 Based on the organization’s requirements filter to only the required Sensitive information types.
7.5 The last part that you may want to update is the incident diagrams. By default, they show the severity and type of attack for incidents linked to access to sensitive data. You may want to map this to incident Tags or other fields based on your requirements.
The Trust & Reputation report has a similar build to the Legal and Compliance scorecard. Update it based on the requirements for your use case. The initial idea for this report is to show privacy-related data: the impact of leaked customer data is devastating to the trust customers place in the organization. Other reputational data points should be added as needed.
The Company & Shareholder Value report contains some more information. The goal is to customize it to be bound to the organization’s secrets: secret drawings, source code, internal financial results dashboards, supply chains, product development, and other sensitive information. You may want to filter down to EDM and fingerprint-type SITs and specific trainable classifiers for this report.
9.1 To receive the accurate mapping of the labelled content you need to update the MIPLabel table with your label names and GUIDs.
9.1.2 Select Transform data.
9.1.3 Select MIPLabel, click on the flywheel for Source.
9.1.4 Connect to Security & Compliance PowerShell (Connect-IPPSSession)
- Run Get-Label | Select-Object ImmutableId, DisplayName
- Copy the output
9.1.5 You can now update the values that are stored in the template. This ensures that the name mapping of labels works as expected.
9.1.6 The next step is to update the Access to mission-critical systems from compromised devices. Select the SensitiveSystems query. Then click Advanced Editor
9.1.7 Update the list of URLs that contain a system that has high business impact if an attacker has been accessing it. It is important to only use single quotes. Right now, there is no straightforward way to capture the URLs, so we need to do it manually. Once complete click Done.
9.1.8 When completed, click Close & Apply
If the previous steps have been completed, the Operational Scope tab should be OK. This view provides the organization with information about where sensitive information is processed. It can help the organization identify where content is being processed, and by which legal entity and function. Failing to know this may directly impact whether an organization is allowed to operate in a specific market. Not knowing it also has an impact on restructuring the company and other actions to keep the company competitive.
10.1 We have one additional tab that does this based on sensitivity labels, called Operational Scope Classified Content.
11. The KPI tabs are more condensed and should be customized to fit with the context of the organization and the leaders to which the information is presented. The key thing is to communicate the information in a context that resonates.
11.1 You will want to update the incident view highlighted in red and switch it to something that works for the audience; it may be one of the tags or another detail. You also want to be very deliberate about which incidents generate the data shown in this dashboard. One way is to use tags: you may elect to only show incidents tagged PossibleBoard, for example. This may enhance communication between security teams and the board by making analysts aware of the importance of their work and its direct correlation with organizational leadership.
11.2 In this sample we have Credit Card in Focus and End User Identifiable; you should replace these with regulator names and the associated sensitive information types, such as SEC, FDA, FCC, NTIA, or FCA. Change the name and update the sensitive information type filter.
Additional reports that come with this package
We are shipping a few additional reports that can be used to gain further insights. The Project sample provides this view for label usage. You can modify the targets similarly to how you did for the board report.
One additional tip for this report is that you can:
Configure the “Maximum value” to be your target value; create the value in the Goals table.
Set the “Target value” to the value you had over the past period (275 in the case above).
The incident sample provides views like this. The incident reporting and progress view provides insights into the analyst process: overall efficiency metrics and measures to gauge performance, plus incident operations over time by different criteria, such as severity, mean time to triage, mean time to resolve, DLP policy, and more. You should customize this view to work with your practices.
The incident view covers 6 months by default, while the event data is from the past 30 days. To extend event data beyond 30 days, you can use Microsoft Sentinel. If, on the other hand, you want to reduce the incident window, follow these steps.
Go to Transform data.
Select the Incident table and view its Source setting; by default you will see the following.
Update this to 30 days by changing the value, as in the following example.
= OData.Feed("https://api.security.microsoft.com/api/incidents?$filter=lastUpdateTime gt " & Date.ToText(Date.AddDays(Date.From(DateTime.LocalNow()), -30), "yyyy-MM-dd"), null, [Implementation="2.0"])
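For reference, the date arithmetic in that Power Query expression can be sketched in Python (same yyyy-MM-dd cutoff format; the helper name incidents_url is ours, not part of the API):

```python
from datetime import date, timedelta

def incidents_url(days_back: int) -> str:
    """Build the same lastUpdateTime filter that the Power Query
    Source step uses, for a window of `days_back` days."""
    cutoff = date.today() - timedelta(days=days_back)
    return (
        "https://api.security.microsoft.com/api/incidents"
        f"?$filter=lastUpdateTime gt {cutoff.isoformat()}"
    )

print(incidents_url(30))
```

Changing `days_back` here corresponds to changing the `-30` argument of `Date.AddDays` in the M expression above.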
The report also has a per workload detailed view like this sample for Exchange Online. The report contains Exchange, SharePoint, OneDrive for Business, Endpoint, Teams and OCR.
Additional configuration to be made
This is required to capture sensitive information that is transferred in Exchange Online or SharePoint Online. The setup uses DLP policies that do not apply any action or raise any alerts. This is also important for the Copilot for Security functionality to work correctly.
Create a custom policy.
Name the policy based on your naming standard and provide a description of the policy.
Select the workloads from which you want to capture sensitive data usage. For devices there is no need, as devices capture all sensitive data processing by default.
Click next.
Click Create rule.
Provide a rule name, click Add condition, and then click Content contains.
Then click Sensitive info types and select all the relevant sensitive information types that you would like to capture for both internal and external processing. Note: focus on the sensitive information types that are key to your operations (max 125 per rule). Then click Add; you can add your own custom SITs or make use of the built-in SITs.
If you want any other conditions, such as external communications, to be required for generating signals, add those conditions. Next, ensure that no Action, User notifications, Incident reports, or Use email incident reports options are turned on; they should all be turned off.
Setup the Power BI online view
Providing an online view of the data has several benefits. You can delegate access to the dashboard without delegating permissions to the underlying data set. You can also create queries that only show information for a specific division or market and only present that information to that specific market. You can set up a scheduled refresh to refresh the data without having to upload it again.
Follow these steps to set up the integration https://learn.microsoft.com/en-us/azure/sentinel/powerbi#create-a-power-bi-online-workspace.
Posts part of this series
Cyber Security in a context that allows your organization to achieve more
https://techcommunity.microsoft.com/t5/security-compliance-and-identity/cyber-security-in-a-context-that-allows-your-organization-to/ba-p/4120041
Security for Copilot Data Security Analyst plugin https://techcommunity.microsoft.com/t5/security-compliance-and-identity/learn-how-to-customize-and-optimize-copilot-for-security-with/ba-p/4120147
Guided walkthrough of the Microsoft Purview extended report experience https://techcommunity.microsoft.com/t5/security-compliance-and-identity/guided-walkthrough-of-the-microsoft-purview-extended-report/ba-p/4121083
Guided walkthrough of the Microsoft Purview extended report experience
This is a step-by-step guided walkthrough of the Microsoft Purview extended report experience and how it can empower your organization to understand cyber security risks in a context that allows it to achieve more, by focusing on the information and organizational context to reflect the real impact and value of investments and incidents in cyber security.
Prerequisites
License requirements for Microsoft Purview Information Protection depend on the scenarios and features you use. To understand your licensing requirements and options for Microsoft Purview Information Protection, see the Information Protection sections from Microsoft 365 guidance for security & compliance and the related PDF download for feature-level licensing requirements. For the best experience, all Microsoft Defender products should be enabled.
Follow the step-by-step guide to set up the reporting found here.
The DLP incident management documentation can be found here.
Install Power BI Desktop to make use of the templates Downloads | Microsoft Power BI
Overview and vision
The vision with this package is that it will allow for faster and more integrated communication between leaders and the cyber operations teams in a context that allows for effective collaboration. The structure can help present the positive result of attacks prevented by measuring distance to corporate secrets. It can also help you provide a view of the impact of an incident by listing the sensitive systems and content the attackers have accessed.
Based on this information you may also identify patterns where you need to improve your security posture around sensitive content and systems. This makes improvement projects more connected to company value. Cybersecurity is fast-paced, so understanding the future is just as important as understanding the current state. With this data available, you should be able to input details about future threats and project their impact. As part of this, we are also creating Security Copilot skills to help identify future risks.
Step-by-step guided walkthrough
Principles for the dashboards
When opening the Power BI view, whether from the web-based version or from Power BI Desktop, you will find unique users and unique devices. These are user accounts and devices that have had at least one security incident flagged in the Microsoft Defender portal and have accessed sensitive information. Organizations may choose to filter these based on incident flags, the type of incident, and so on; how to achieve this is outlined in the implementation guide.
Let us have a look at the base elements in the CISO, CCO view.
These are the default KPI views; you define a target for how much sensitive data you can accept being touched by compromised devices or users.
This is the view of the incidents showing the classification and type of attack. This view may be changed to be based on tags or other fields that instruct on what can be done to mitigate future attacks.
The number of compromised users and devices that have accessed sensitive content.
The count and types of sensitive content accessed by the compromised systems.
The core rule for what is shown is that sensitive content has been touched by a compromised system or account. A compromised system or account that has not accessed any sensitive content will not be shown. The only exception is the Operational scope pages; more detail on those later.
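The core rule above can be sketched as a simple set intersection. This is an illustrative sketch only, not the actual Power BI query; the asset names are hypothetical sample data.

```python
# The dashboards show only the intersection of two event sets: assets that
# are BOTH flagged in a security incident AND have accessed sensitive content.

def assets_to_show(compromised, sensitive_access):
    """Return the assets that satisfy the core dashboard rule."""
    return sorted(set(compromised) & set(sensitive_access))

# Hypothetical sample data
compromised = ["device-07", "user-alice", "device-12"]
sensitive_access = ["user-alice", "device-12", "user-bob"]

print(assets_to_show(compromised, sensitive_access))
# device-07 is compromised but touched nothing sensitive, so it is hidden;
# user-bob touched sensitive content but has no incident, so it is hidden too.
```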
Board level sample data.
The first version has four risk dimensions:
Legal Compliance: you should tweak this view to center on your regulatory obligations. The base report shows credit card and end-user identifiable information as an example. A suggestion is to select the applicable sensitive information types and group them under a regulator name (like SEC, FDA, FCC, NTIA, FCA, etc.). How to achieve this is outlined in the implementation guide. You may also update the KPI graph to align better with your organization's objectives. Clicking a department filters the content across the page.
Trust Reputation: the standard setup of this report shows privacy-related data. The impact of leaked customer data is devastating to the trust customers have in the organization. You can configure the report to center on the privacy data most applicable to your business.
Company and Shareholder Value is centered around the organization's own secrets: secret drawings, source code, internal financial results dashboards, supply chain information, product development, and other sensitive information. The dashboard is built on a few core components.
Access to content labeled as Sensitive from compromised accounts.
Update this diagram to reflect only the sensitivity labels with high impact to the business; we will only show access made by compromised accounts.
Access to mission-critical systems from compromised accounts.
This is based on connections to URLs or IP addresses that host business-sensitive systems. This should come from the asset classification already made for critical systems.
Access to sensitive content from compromised accounts.
This should cover the core sensitive information types, document fingerprints, and exact data matches that can directly impact the valuation of the organization.
The KPI diagram should be updated to a target that makes sense to the core security projects run by the organization.
Operational scope provides your organization with information about where Sensitive information is processed. Failing to process at the appropriate location may directly impact whether an organization is allowed to operate in specific markets or not. This report can also be used for restructuring the company and other actions to keep the company competitive while still staying in compliance with regulations.
With Security Copilot you can get this type of detail as well. It will help you with the contextual detail. Here is one example of a custom sensitive information type. The sub bullets are departments.
There is also a view included for the use of Sensitivity labels.
The CISO view contains more detail than the Board reports as outlined initially in this post. This is the Company & Shareholder Value view. Based on the implementation guide this view can be customized to meet the needs of your organization. But based on this you may feel that more detail is needed. This leads to the detail view.
Account Detailed Data view provides the next level of detail.
In the green box you will find all the users with incidents, where you can learn more about threat actors, threat families, and so on. As part of the implementation guide you can learn how to add additional fields such as tags and type.
In the red box you will find information about the actual documents and information that the user has been accessing.
Let's use a sample where we pair this usage with Copilot for Security. Say one of the object names is listall.json, and we want to get all the information surrounding that file.
Or you may have an e-mail subject that you are concerned about.
The information shared is to provide you with an idea of how to get started. Consider adding actual monetized impact on events across the system. Both those that were avoided and those that had a negative impact.
Improvement Project reporting
For data-driven feedback on the impact of improvement projects, we have a few sample dashboards to get you started. They are there to show the art of the possible. The rich data available from the system will in many cases allow you to build your own data-driven dashboards to show progress. The samples available are Document KPI, Oversharing SharePoint, Email KPI, Content upload, Operational Scope, and Operational scope classified content.
Below is a sample dashboard that displays the number of protected versus unprotected document operations across the organization, i.e., which are sensitivity-labeled and which are not. Follow the technical guidance for setting this up properly.
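The split the dashboard visualizes boils down to counting labeled versus unlabeled operations. This is a hedged sketch; the field names are illustrative, not the actual Purview export schema.

```python
# Count document operations with and without a sensitivity label,
# as the sample dashboard does at a high level.
from collections import Counter

# Hypothetical sample of document operations
operations = [
    {"doc": "q1-results.xlsx", "label": "Confidential"},
    {"doc": "notes.docx", "label": None},          # no sensitivity label
    {"doc": "design.vsdx", "label": "Internal"},
    {"doc": "todo.txt", "label": None},
]

def protection_split(ops):
    """Return counts of protected (labeled) vs unprotected operations."""
    return dict(Counter("protected" if op["label"] else "unprotected" for op in ops))

print(protection_split(operations))  # {'protected': 2, 'unprotected': 2}
```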
This example provides an overview of the suppliers being used to access sensitive content. This is based on the processes; you may choose to do something similar based on IP tags and ranges and on access to sensitive content and systems.
This example contains details about how credential data is being processed across the organization. To capture all credential types you need to enable a policy for all workloads, including endpoint.
Incident reporting and progress
The incident reporting and progress view provides insight into the analyst process. It provides overall efficiency metrics and measures to gauge performance, and shows incident operations over time by different criteria, such as severity, mean time to triage, mean time to resolve, DLP policy, and more. You should customize this view to match your practices.
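Two of the efficiency metrics mentioned above can be computed directly from incident timestamps. This is an illustrative sketch with hypothetical sample data, not the product's actual calculation.

```python
# Mean time to triage / mean time to resolve from incident timestamps.
from datetime import datetime

# Hypothetical incident records
incidents = [
    {"created": "2024-05-01T08:00", "triaged": "2024-05-01T09:00", "resolved": "2024-05-01T12:00"},
    {"created": "2024-05-02T10:00", "triaged": "2024-05-02T10:30", "resolved": "2024-05-02T16:30"},
]

def _hours(start, end):
    fmt = "%Y-%m-%dT%H:%M"
    return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 3600

def mean_time_to_triage(items):
    return sum(_hours(i["created"], i["triaged"]) for i in items) / len(items)

def mean_time_to_resolve(items):
    return sum(_hours(i["created"], i["resolved"]) for i in items) / len(items)

print(mean_time_to_triage(incidents))   # 0.75 (hours)
print(mean_time_to_resolve(incidents))  # 5.25 (hours)
```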
The package also comes with optimization suggestions per workload. Exchange, SharePoint, OneDrive for Business, Endpoint, Teams, and OCR.
You may select to use Copilot to summarize your incidents and provide next steps. This is a sample of output from Copilot summarizing an incident. The steps for implementing and tuning Security Copilot can be found in the Guidance Playbook for Security Copilot.
Events
As part of the technical documentation, there is guidance to set up additional event collection. If you are a decision-maker, consider whether you want to set up alerts based on the views you have in Power BI. It is highly likely that a rule can be set up to trigger flows where you need to be involved. Here is the documentation: Create and manage custom detection rules in Microsoft Defender XDR | Microsoft Learn.
Copilot for Security can be used to draw conclusions from all relevant events associated with an incident and provide suggestions for next steps. This is a sample where it uses the corporate policy document from Microsoft Azure AI as well as Microsoft Defender incidents to suggest next steps. You can also use the upload feature: Upload a file | Microsoft Learn.
Here is another example where you may want to confirm if content has been touched by a compromised account.
Posts part of this series.
Cyber Security in a context that allows your organization to achieve more
https://techcommunity.microsoft.com/t5/security-compliance-and-identity/cyber-security-in-a-context-that-allows-your-organization-to/ba-p/4120041
Security for Copilot Data Security Analyst plugin https://techcommunity.microsoft.com/t5/security-compliance-and-identity/learn-how-to-customize-and-optimize-copilot-for-security-with/ba-p/4120147
How to build the Microsoft Purview extended report experience https://techcommunity.microsoft.com/t5/security-compliance-and-identity/how-to-build-the-microsoft-purview-extended-report-experience/ba-p/4122028
Microsoft Defender Dynamic Tagging not running
I created several tags in Asset Rule Management on Aug 13th and 14th that have not run yet. How often are the UI tags applied?
IF Function doubt
Dear Experts,
I have data like the sample below (attached sheet as well):
Column "B", "so" (Segment Offset), is calculated as the sum of Columns "F" and "G" only when Column "A" is either MID or LAST. I tried the formula below in H2 but got a #VALUE! error. What am I doing wrong? Could you please share?
Thanks in Advance,
Br,
Anupam
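The formula the poster tried is not shown here, so this is only a hedged guess: a common cause of #VALUE! in this pattern is performing arithmetic on cells that contain text, or a malformed OR condition. A formula along these lines, with column references assumed from the description, returns the sum only for MID or LAST rows:

```
=IF(OR(A2="MID", A2="LAST"), F2+G2, "")
```

If F or G may contain text, using SUM(F2:G2) instead of F2+G2 avoids the #VALUE! error, since SUM ignores text values.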
Dynamic membership rule to include Description attribute
Hello Everyone,
I want to create an Entra Dynamic Group that will that will look for the Department and the Description attribute. I have sync’d the “description (group)” and “description (user)” Directory extensions via our AAD Connect but the rule builder still won’t pick up on it.
After syncing the attributes I go into Entra -> Groups -> Create New -> select Dynamic. While building the rule I click the Get Custom Extension Properties button and enter the App ID of my Tenant Schema app. After this I can see the "extension_<AppID>_description" property to select from.
However, when I put in a matching value, the rule validator does not pick it up.
Microsoft Entra Connect Sync: Directory extensions – Microsoft Entra ID | Microsoft Learn
Is there anything I am missing?
Ultimately I would like to simply create an EXO dynamic group that can filter on these, but I cannot find a way.
Thanks for any input.
I will advise my client to sue Microsoft
I manage the digital presence of a business, a hip orthopedics clinic, meaning 90% of its patients are elderly with limited or no mobility. Bing Places created a profile for the business using data from Google Maps, but the clinic moved and the old address remained. We only discovered this profile when several patients started going to the wrong location. We then saw that when using the Bing search engine instead of Google, the old address appeared. We created an account and claimed ownership of the business, but to complete the process the account must be verified. We went through the process via phone and email and, although the notice says verification was completed, the account soon shows as unverified, preventing us from updating the data. No other verification method is offered. We have submitted countless user comments pointing out that the address is wrong, and we have sent several emails to the address listed in support, but never received a reply. I sent a message on X and was given a number that also does not work. This is my last attempt; after this I see no other path, and we will have to take the matter to court. It is absurd that a big tech company cannot resolve something so simple.
Learn how to customize and optimize Copilot for Security with the custom Data Security plugin
This is a step-by-step guided walkthrough of how to use the custom Copilot for Security pack for Microsoft Data Security and how it can empower your organization to understand cyber security risks in a context that allows it to achieve more, by focusing on information and organizational context to reflect the real impact and value of cyber investments and incidents. We are working to add this to our native toolset as well, and will update once ready.
Prerequisites
License requirements for Microsoft Purview Information Protection depend on the scenarios and features you use. To understand your licensing requirements and options for Microsoft Purview Information Protection, see the Information Protection sections from Microsoft 365 guidance for security & compliance and the related PDF download for feature-level licensing requirements. You also need to be licensed for Microsoft Copilot for Security, more information here.
Consider setting up Azure AI Search to ingest policy documents, so that they can be part of the process.
Step-by-step guided walkthrough
In this guide we will provide high-level steps to get started using the new tooling. We will start by adding the custom plugin.
Go to securitycopilot.microsoft.com
Download the DataSecurityAnalyst.yml file from here.
Select the plugins icon down in the left corner.
Under Custom upload, select upload plugin.
Select the Copilot for Security plugin and upload the DataSecurityAnalyst.yml file.
Click Add.
Under Custom you will now see the plugin.
The custom package contains the following prompts
Under DLP you will find this if you type /DLP
Under Sensitive you will find this if you type sensitive
Let us get started using this together with the Copilot for Security capabilities
Anomalies detection sample.
Access to sensitive information by compromised accounts.
Document accessed by possible compromised accounts.
CVE or proximity to ISP/IPTags.
Tune Exchange DLP policies sample.
Purview unlabelled operations.
Applications accessing sensitive content.
Hosts that are internet accessible accessing sensitive content
Exchange incident sample prompt book.
SharePoint sample prompt book.
Anomalies detection sample
The DLP anomaly detection checks data from the past 30 days and inspects 30-minute intervals for possible anomalies, using a timeseries decomposition model.
The sensitive content anomaly detection uses a slightly different model due to the amount of data. It is based on the diffpatterns function, which compares weeks 3-4 with weeks 1-2.
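The two detection ideas above can be illustrated in plain Python. This is only a hedged stand-in for the actual KQL functions the plugin relies on (timeseries decomposition and diffpatterns); the data and thresholds are invented for the example.

```python
# Two simplified anomaly-detection stand-ins.

def zscore_anomalies(series, threshold=2.5):
    """Flag points far from the mean: a crude stand-in for timeseries
    decomposition over the 30-minute DLP event buckets."""
    mean = sum(series) / len(series)
    var = sum((x - mean) ** 2 for x in series) / len(series)
    std = var ** 0.5 or 1.0
    return [i for i, x in enumerate(series) if abs(x - mean) / std > threshold]

def week_pair_diff(counts_wk12, counts_wk34):
    """Compare weeks 3-4 against weeks 1-2 per category, in the spirit of
    diffpatterns: report categories whose volume changed."""
    cats = set(counts_wk12) | set(counts_wk34)
    return {c: counts_wk34.get(c, 0) - counts_wk12.get(c, 0)
            for c in cats if counts_wk34.get(c, 0) != counts_wk12.get(c, 0)}

events = [4, 5, 4, 6, 5, 4, 5, 60, 5, 4]   # one 30-minute bucket spikes
print(zscore_anomalies(events))            # [7]
print(week_pair_diff({"pdf": 10, "docx": 8}, {"pdf": 10, "docx": 30}))
```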
Access to sensitive information by compromised accounts.
This example checks reported alerts against users and the sensitive information they have accessed.
Who has accessed a Sensitive e-mail and from where?
Organizations can input a message subject or message ID to identify who has opened a message. Note that this only works for internal recipients.
You can also ask the plugin to list any emails classified as Sensitive that were accessed from a specific network or affected by a specific CVE.
Document accessed by possible compromised accounts.
You can use the plugin to check if compromised accounts have been accessing a specific document.
CVE or proximity to ISP/IPTags
This is a sample where you can check, for example, how much sensitive information is exposed to a CVE. You can pivot this by ISP as well.
Tune Exchange DLP policies sample.
If you want to tune your Exchange, Teams, SharePoint, Endpoint or OCR rules and policies you can ask Copilot for Security for suggestions.
How many of the operations in your different departments are unlabelled? Are any of the departments standing out?
In this context you can also use Copilot for Security to deliver recommendations and highlight the benefits sensitivity labels bring.
Applications accessing sensitive content.
What applications have been used to access sensitive content? The plugin supports asking for the applications used to access sensitive content. This can be a fairly long list of applications; you can add filters in the code to filter out common applications.
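The filtering idea mentioned above can be sketched as follows. The process names and event fields are illustrative only, not the plugin's actual schema.

```python
# Drop well-known, ubiquitous applications from the results so that the
# long list of applications accessing sensitive content becomes reviewable.

COMMON_APPS = {"explorer.exe", "outlook.exe", "msedge.exe"}

def filter_uncommon(app_events):
    """Keep only events from applications not in the common-apps allowlist."""
    return [e for e in app_events if e["process"].lower() not in COMMON_APPS]

# Hypothetical sample events
events = [
    {"process": "Explorer.EXE", "file": "budget.xlsx"},
    {"process": "winscp.exe", "file": "customer-list.csv"},
]
print([e["process"] for e in filter_uncommon(events)])  # ['winscp.exe']
```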
You can also zoom into what type of content a specific application is accessing.
What type of network connectivity has been made from this application?
Or what if you get concerned about the process that has been used and want to validate the SHA256?
Hosts that are internet accessible accessing sensitive content
Another threat vector could be devices that are accessible from the Internet while sensitive content is being processed on them. Check for processing of secrets and other sensitive information.
Promptbooks are a valuable resource for accomplishing specific security-related tasks. Consider them as a way to practically implement your standard operating procedure (SOP) for certain incidents. By following the SOP, you can identify the various dimensions in an incident in a standardized way and summarize the outcome. For more information on prompt books please see this documentation.
Exchange incident sample prompt book
Note: The above detail is currently only available using Sentinel, we are working on Defender integration.
Posts part of this series
Cyber Security in a context that allows your organization to achieve more
https://techcommunity.microsoft.com/t5/security-compliance-and-identity/cyber-security-in-a-context-that-allows-your-organization-to/ba-p/4120041
Guided walkthrough of the Microsoft Purview extended report experience https://techcommunity.microsoft.com/t5/security-compliance-and-identity/guided-walkthrough-of-the-microsoft-purview-extended-report/ba-p/4121083
How to build the Microsoft Purview extended report experience https://techcommunity.microsoft.com/t5/security-compliance-and-identity/how-to-build-the-microsoft-purview-extended-report-experience/ba-p/4122028
Accelerate Cloud Potential for Your SAP Workloads on Azure with these Learning Paths
Accelerate Cloud Potential for Your SAP Workloads on Azure with these Learning Paths
In today’s rapidly evolving digital landscape, businesses need to stay competitive by leveraging the latest tools and services. Together, SAP and Microsoft are not just providing these tools but also creating ecosystems that foster innovation and transformation. This collaboration enables businesses to unlock new potential for their SAP workloads on Azure.
Explore Azure for SAP Workloads
Streamline your SAP operations and maximize ROI with our comprehensive Azure training. Empower your team to seamlessly migrate, manage, and optimize SAP workloads on Azure, leveraging its robust infrastructure and specialized tools. This training will enhance your SAP performance, drive efficiency, and unlock innovation within your existing environment.
Highlight: New RISE SAP Learn Module
We are excited to introduce the new RISE SAP learn module, “Explore Azure networking for SAP RISE.” This module shows you how to use your Azure networks to connect to your SAP RISE architecture running in SAP’s Azure subscription. After completing this module, you will be able to differentiate the responsibilities of the SAP RISE team, Azure support, and the customer. You will also learn how to connect to SAP RISE with Azure virtual private network (VPN) peering, VNet-to-VNet, and with an on-premises network.
Learn from the pros with live, interactive Virtual Training Days
Virtual Training Days are instructor-led classes designed to equip individuals and teams with in-demand skills related to cloud migration, AI, and other cutting-edge technologies. We offer Virtual Training Days to help you migrate SAP to Azure, optimizing your performance, reliability, and scalability while reducing costs. In this session, Migrate and Modernize SAP on the Microsoft Cloud, you’ll find out how to secure and monitor SAP workloads on Azure. Come explore how this move enhances productivity, fosters secure collaboration, and gives you AI-powered insights for greater efficiency. Register for our next session here.
To help you and your team better take advantage of these benefits, we’ve created an array of learning materials and interactive events—from self-guided courses to Virtual Training Days, certifications to conferences—that build your cloud expertise. Our Microsoft Learn Learning Paths are curated collections of free, online modules and resources designed to help you build specific skills or gain knowledge in a particular technology or subject area.
By leveraging these resources, learning paths and the new RISE SAP learn module, you can ensure that your team is well-equipped to handle the complexities of SAP workloads on Azure. Whether you are looking to migrate, manage, or optimize your SAP environment, these resources will provide you with the knowledge and skills needed to succeed.
Join us on this journey to unlock new potential for your SAP workloads on Azure. Start exploring our learning resources today and take the next step towards transforming your business.
Azure Disaster recovery question !
Hello,
Our customer requested that we provide a solution for Azure DR. I am not sure how to accomplish this.
Anyone, please help.
Scenario:
Primary site
West Europe – Qatar Central
Secondary site
West Europe to North Europe
All are IaaS VMs.
Excel Formatting Help
Hello, all. I have a table that looks like this (below) with three columns merged and centered. I want to format the entire thing as a table and to include the merged cells. I also want the table to have the model number/description merged into those three as one table column. Is there any way to do this?
Critical Cloud Assets: Identifying and Protecting the Crown Jewels of your Cloud
Cloud computing has revolutionized the way businesses operate, with many organizations shifting their business-critical services and workloads to the cloud. This transition, and the massive growth of cloud environments, has led to a surge in security issues in need of addressing. Consequently, the need for contextual and differentiated security strategies is becoming a necessity. Organizations need solutions that allow them to detect, prioritize, and address security issues, based on their business-criticality and overall importance to the organization. Identifying an organization’s business-critical assets serves as the foundation to these solutions.
Microsoft is pleased to announce the release of a new set of critical cloud asset classification capabilities in the critical asset management and protection experience, as part of the Microsoft Security Exposure Management solution and Cloud Security Posture Management (CSPM) in Microsoft Defender for Cloud (MDC). This capability enables organizations to identify additional business-critical assets in the cloud, allowing security administrators and security operations center (SOC) teams to efficiently, accurately, and proactively prioritize and address the security issues affecting critical assets that arise within their cloud environments.
Learn more about how to get started with Critical Asset Management and Protection in Exposure Management and Microsoft Defender for Cloud: Critical Asset Protection with Microsoft Security Exposure Management; Critical assets protection (Preview) – Microsoft Defender for Cloud.
Criticality classification methodology
Over the past few months, we, at Microsoft, have conducted extensive research with several key objectives:
Understand and identify the factors that signify a cloud asset’s importance relative to others.
Analyze how the structure and design of a cloud environment can aid in detecting its most critical assets.
Accurately and comprehensively identify a broad spectrum of critical assets, including cloud identities and resources.
As a result, we are announcing the release of a new set of pre-defined classifications for critical cloud assets, encompassing a wide range of asset types, from cloud resources, to identities with privileged permissions on cloud resources. With this release, the total number of business-critical classifications has expanded to 49 for cloud identities and 8 for cloud resources, further empowering users to focus on what matters most in their cloud environments.
In the following sections, we will briefly discuss some of these new classifications, both for cloud-based identities and cloud-based resources, their integration into our products, their objectives, and unique features.
Identities
In cloud environments, it is essential to distinguish between the various role-based access control (RBAC) services, such as Microsoft Entra ID and Azure RBAC. Each service has unique permissions and scopes, necessitating a tailored approach to business-criticality classification.
We will go through examples of new business-critical rules classifying identities with assigned roles both in Microsoft Entra and Azure RBAC:
Microsoft Entra
The Microsoft Entra service is an identity and access management solution in which administrators or non-administrators can be assigned a wide range of built-in or custom roles to allow management of Microsoft Entra resources.
Examples of new business-criticality rules classifying identities assigned with a specific Microsoft Entra built-in role:
Classification: “Exchange Administrator”
Default Criticality Level: “High”
This rule applies to identities assigned with the Microsoft Entra Exchange Administrator built-in role.
Identities assigned this role have strong capabilities and control over the Exchange product, with access to sensitive information through the Exchange Admin Center, and more.
Classification: “Conditional Access Administrator”
Default Criticality Level: “High”
This rule applies to identities assigned with the Microsoft Entra Conditional Access Administrator built-in role.
Identities assigned this role are deemed to be of high importance, as it grants the ability to manage Microsoft Entra Conditional Access settings.
Azure RBAC
Azure role-based access control (Azure RBAC) is a system that provides fine-grained access management of Azure resources that helps you manage who has access to Azure resources, what they can do with those resources, and what areas they have access to. The way you control access to resources using Azure RBAC is to assign Azure roles.
Example of a new criticality rule classifying identities assigned with specific Azure RBAC roles:
Classification: “Identities with Privileged Azure Role”
Default Criticality Level: “High”
This rule applies to identities assigned with an Azure privileged built-in or custom role.
Assets criticality classification within the Azure RBAC system necessitates consideration of different parameters, such as the role assigned to the identity, the scope in which the role takes effect, and the contextual business-criticality that lies within this scope.
Thus, this rule classifies identities which have a privileged action-permission assigned over an Azure subscription scope, in which a critical asset resides, thereby utilizing contextual and differential security measures. This provides the customer with a cutting-edge criticality classification technique for both Azure built-in roles, and custom roles, in which the classification accurately adapts to dynamic changes inside the customer environment, ensuring a more accurate reflection of criticality.
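The contextual classification described above can be sketched as a predicate: an identity is flagged when it holds a privileged role scoped to a subscription that contains at least one critical asset. The role names, scopes, and assets below are illustrative, not the product's actual rule logic or schema.

```python
# Sketch of the 'Identities with Privileged Azure Role' classification.

# Hypothetical set of privileged built-in roles
PRIVILEGED_ROLES = {"Owner", "Contributor", "User Access Administrator"}

def classify_identities(assignments, critical_assets_by_subscription):
    """Flag identities holding a privileged role over a subscription
    in which a critical asset resides."""
    flagged = set()
    for a in assignments:
        if (a["role"] in PRIVILEGED_ROLES
                and critical_assets_by_subscription.get(a["subscription"])):
            flagged.add(a["identity"])
    return sorted(flagged)

assignments = [
    {"identity": "svc-deploy", "role": "Contributor", "subscription": "sub-prod"},
    {"identity": "reader-bot", "role": "Reader", "subscription": "sub-prod"},
    {"identity": "dev-owner", "role": "Owner", "subscription": "sub-sandbox"},
]
critical = {"sub-prod": ["sql-finance-01"], "sub-sandbox": []}

print(classify_identities(assignments, critical))  # ['svc-deploy']
# dev-owner holds a privileged role, but the sandbox subscription contains
# no critical assets, so the contextual rule does not flag it.
```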
List of pre-defined criticality classifications for identities in Microsoft Security Exposure Management
Cloud resources
A cloud environment is a complex network of interconnected and isolated assets, allowing a remarkable amount of environment structure possibilities, asset configurations, and resource-identity interconnections. This flexibility provides users with significant value, particularly when designing environments around business-critical assets and configuring them to meet specific requirements.
We will present three examples of the new predefined criticality classifications in this release, illustrating innovative approaches to identifying business-critical assets.
Azure Virtual Machines
Examples of new criticality rules classifying Azure Virtual Machines:
Classification: “Azure Virtual Machine with High Availability and Performance”
Default Criticality Level: “Low”
Compute resources are the cornerstone of cloud environments, supporting production services, business-critical workloads, and more. These assets are created for a specific purpose, and at creation the user is presented with several configuration options that allow the asset to meet its requirements and performance thresholds.
An Azure Virtual Machine configured with an availability set is designed to withstand faults and outages, while a machine equipped with premium Azure storage is expected to handle heavy workloads that require low latency and high performance. Machines equipped with both are often business-critical.
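A sketch of how this configuration-based rule could be expressed, assuming simplified inputs (a boolean for availability-set membership and the OS disk's storage SKU name); this is an illustration, not the product's actual logic:

```python
from typing import Optional

def classify_vm(has_availability_set: bool, storage_sku: str) -> Optional[str]:
    """Hypothetical sketch of the 'High Availability and Performance' rule:
    a VM receives this classification only when it is both fault-tolerant
    (placed in an availability set) and backed by premium storage
    (built for low-latency, high-performance workloads)."""
    if has_availability_set and storage_sku.startswith("Premium"):
        return "Low"  # the classification's default criticality level
    return None  # the rule does not apply
```

Note that both signals must be present: either one alone is a weaker indicator of business criticality.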
Classification: “Azure Virtual Machine with a Critical User Signed In”
Default Criticality Level: “High”
Resource-user interconnections within a cloud environment enable the creation of efficient, well-maintained, and least privilege-based systems. These connections can be established to facilitate interaction between resources, enabling single sign-on (SSO) for associated identities and workstations, and more.
When a user with a high or very high criticality level has an active session in the resource, the resource can perform tasks within the user’s scoped permissions. However, if an attacker compromises the machine, they could assume the identity of the signed-in user and execute malicious operations.
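The session-based rule above can be sketched as follows, with hypothetical inputs (a list of users with active sessions on the resource, and a map of user criticality levels); the resource inherits criticality from its most critical signed-in user:

```python
from typing import Optional

def classify_by_signed_in_users(active_sessions, user_criticality) -> Optional[str]:
    """Hypothetical sketch: a resource is classified 'High' when any active
    session belongs to a user rated High or Very High, since an attacker who
    compromises the machine could act within that user's scoped permissions."""
    elevated = {"High", "Very High"}
    if any(user_criticality.get(user) in elevated for user in active_sessions):
        return "High"
    return None  # no critical user is signed in
```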
Azure Key Vault
Example of a new criticality rule classifying Azure Key Vaults:
Classification: “Azure Key Vaults with Many Connected Identities”
Default Criticality Level: “High”
Underpinning the complex environments of cloud computing, where many kinds of assets interact and perform different tasks, are authentication and authorization, both supported by the invaluable currency of secrets. Studying the structure of the environment, and how the key management solutions inside it are built, is therefore essential to detecting business-critical assets.
Azure Key Vault is an indispensable solution for key, secret, and certificate management. It is widely used by both business-critical and non-critical processes, where it plays an integral role in keeping those processes smooth and robust.
An Azure Key Vault that plays a critical role within a business-critical workload, such as a production service, is often used by a far larger number of identities than other key vaults in the organization; its disruption or compromise could therefore have adverse effects on the integrity of the service.
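One plausible heuristic for "many connected identities" (an illustration only, not the product's actual algorithm) is an outlier test against the organizational baseline: flag vaults whose distinct-identity count sits far above the norm for the tenant.

```python
from statistics import mean, stdev

def flag_busy_key_vaults(vault_identity_counts, z_threshold=2.0):
    """Hypothetical heuristic: flag key vaults whose count of distinct
    connected identities is a statistical outlier (z-score above threshold)
    relative to the other vaults in the organization, suggesting they serve
    a business-critical workload such as a production service."""
    counts = list(vault_identity_counts.values())
    if len(counts) < 2:
        return set()  # no baseline to compare against
    mu, sigma = mean(counts), stdev(counts)
    if sigma == 0:
        return set()  # all vaults look alike; nothing stands out
    return {vault for vault, c in vault_identity_counts.items()
            if (c - mu) / sigma > z_threshold}
```

A relative threshold like this adapts as the environment grows, which matches the document's emphasis on classifications that track dynamic changes rather than fixed cutoffs.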
List of pre-defined criticality classifications for cloud resources in Exposure Management
Protecting the crown jewels of your cloud environment
Critical asset protection, identification, and management lie at the heart of the Microsoft Security Exposure Management and Defender Cloud Security Posture Management (CSPM) products, which enrich the experience by letting customers create their own custom business-criticality classifications alongside Microsoft's predefined ones.
Protecting your cloud crown jewels is of utmost importance, so staying on top of best practices is crucial. Some of our best practice recommendations:
Thoroughly enabling protections in business-critical cloud environments.
Detecting, monitoring, and auditing critical assets inside the environments, by utilizing both pre-defined and custom classifications.
Prioritizing and executing the remediation and mitigation of active attack paths, security issues, and security incidents relating to existing critical assets.
Following the principle of least privilege by removing unnecessary permissions from overprivileged identities; such identities can be identified in the critical asset management experience in Microsoft Security Exposure Management.
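The last recommendation amounts to a simple audit: compare what each identity is granted with what it actually uses. A minimal sketch, assuming hypothetical permission-usage data (this is not an Exposure Management API):

```python
def find_overprivileged(granted_by_identity, used_by_identity):
    """Least-privilege audit sketch: for each identity, list granted
    permissions that were never exercised -- candidates for removal.
    Identities with no unused permissions are omitted from the result."""
    result = {}
    for ident, granted in granted_by_identity.items():
        unused = granted - used_by_identity.get(ident, set())
        if unused:
            result[ident] = sorted(unused)
    return result
```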
Conclusion
In the rapidly growing and evolving world of cloud computing, the increasing volume of security issues underscores the need for contextual, differentiated security solutions that allow customers to effectively identify, prioritize, and address those issues. The capability to identify an organization's critical assets is therefore of utmost importance.
Not all assets are created equal. An asset of importance could be a highly privileged user, an Azure Key Vault facilitating authentication for many identities, or a virtual machine created with high availability and performance requirements for production services.
Protecting customers’ most valuable assets is one of Microsoft’s top priorities. We are pleased to announce a new set of business-critical cloud asset classifications, as part of Microsoft Defender for Cloud and Microsoft Security Exposure Management solutions.
Learn more
Microsoft Security Exposure Management
Start with Exposure Management Documentation, Product website, blogs
Critical Asset Management documentation
Critical Asset Protection and how to get started in Microsoft Security Exposure Management blog post
List of Microsoft’s predefined criticality classifications: Link
Microsoft Security Exposure Management what’s new page
Microsoft Defender for Cloud
Microsoft Defender for Cloud (MDC) plans
Microsoft’s Cloud Security Posture Management (CSPM) documentation
Critical Asset Protection in Microsoft Defender for Cloud (MDC) documentation