New Blog | Secure and Govern Your Custom-Built AI Apps with Microsoft Purview
By Liz Willets
The rise of generative AI unlocks new opportunities for developers to create groundbreaking applications. Studies show that 75% of organizations are more likely to adopt AI apps when they come with assurance mechanisms for secure and compliant use. This underscores the importance of building apps that can handle and govern sensitive data appropriately. But despite this growing demand for secure and compliant AI applications, developers often lack the security expertise and tools to build these controls into custom-built applications. What developers need are easy-to-use APIs that enable them to build data security and compliance controls into their applications by design.
As consumers of AI applications, enterprises are concerned about data oversharing, data leakage, and non-compliant use of AI apps. Ensuring that your application meets enterprise needs for safeguarding against data risks is critical for enterprise adoption. Once deployed, security teams want visibility into which GenAI applications are being used, how often, by whom, and what kind of sensitive data is being shared with those applications.
On top of that, end users want clear visibility into the confidentiality of data referenced by AI applications. Ensuring that end users can clearly see the sensitivity label of any files referenced by your GenAI app is imperative. This visual cue informs the user that the application is interacting with a sensitive document, which is critical to maintain data integrity and compliance with their organization’s data handling obligations.
Today, we are excited to announce new innovations from Microsoft Purview to help developers build enterprise-grade security and compliance controls into their custom-built AI apps:
Microsoft Purview integration in Copilot Studio (public preview) and Azure AI Studio (coming soon) offers data security and compliance features to developers using Copilot Studio and Azure AI Studio. This integration provides visibility into when an application accesses sensitive data by recognizing and honoring sensitivity labels of the data being accessed. It also protects sensitive data generated by the app through label inheritance, and honors label permissions, limiting data access to authorized users only. Additionally, it facilitates governance of app development by providing audit logging for developer activities.
Purview SDK (coming soon) offers a set of easy-to-integrate APIs for pro-code developers, enabling them to build enterprise-grade data security, compliance, and governance controls with just a few lines of code.
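Since the Purview SDK has not yet shipped, its API surface is not public. The following minimal Python sketch only illustrates the *kind* of sensitive-data check such APIs are described as enabling; the pattern names, regexes, and function here are our own illustrative assumptions, not the SDK's actual interface.

```python
import re

# Illustrative stand-in for a sensitive-information-type check; the real
# Purview SDK APIs are not yet published, so these patterns are assumptions.
SENSITIVE_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def find_sensitive_data(text: str) -> list[str]:
    """Return the names of sensitive-data patterns detected in `text`."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(text)]

# An app could run a check like this before returning a generated response.
print(find_sensitive_data("Card: 4111 1111 1111 1111"))  # ['credit_card']
```

In a real integration, a check like this would be replaced by the SDK's own classification call, so detection stays consistent with the sensitive information types the enterprise has already defined in Purview.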
Microsoft Purview integration in Copilot Studio (public preview) and Azure AI Studio (coming soon)
For developers looking to get started today, we are thrilled to announce the integration of Microsoft Purview capabilities in Copilot Studio (public preview) and Azure AI Studio (coming soon). With this integration, Microsoft Purview capabilities come built-in so that when you build your custom apps in Copilot Studio or Azure AI Studio, your enterprise customers and end users get best-in-class security and governance features, including:
Discover data risks in AI interactions: Enhance end user confidence by providing visibility into the sensitivity label of the data referenced from SharePoint in responses from your custom-built Copilots and GenAI apps.
Protect sensitive data with encryption: Ensure that app-generated responses inherit the sensitivity label of the files referenced and are encrypted accordingly. Additionally, ensure that your AI applications respect user permissions and sensitivity labels, limiting access to sensitive data to authorized users only. This builds trust with your customers, as they know their data is handled according to their security policies.
Capture AI activities: Log developer activities during the creation of custom-built applications to understand which data sources were enabled, whether GenAI answers were enabled on those sources, and more. This ensures comprehensive oversight and transparency, enabling enterprises purchasing your application to maintain control over their data and the applications interacting with it.
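The label-inheritance behavior described above can be sketched in a few lines. This is an illustrative model, not Purview's implementation: the label names and their priority ordering are assumptions, since sensitivity labels and their ranking are defined per tenant.

```python
from dataclasses import dataclass

# Assumed label ranking from least to most restrictive; real sensitivity
# labels and their priorities are configured by each tenant in Purview.
LABEL_PRIORITY = {"Public": 0, "General": 1, "Confidential": 2, "Highly Confidential": 3}

@dataclass
class ReferencedFile:
    name: str
    label: str

def inherit_label(files: list[ReferencedFile]) -> str:
    """Return the most restrictive label among the referenced files.

    Models the inheritance behavior described above: an app-generated
    response takes on the highest-priority label of its sources.
    """
    if not files:
        return "General"  # assumed default label for unlabeled output
    return max((f.label for f in files), key=lambda label: LABEL_PRIORITY[label])

files = [ReferencedFile("roadmap.docx", "Confidential"),
         ReferencedFile("faq.docx", "General")]
print(inherit_label(files))  # Confidential
```

Taking the maximum-priority label is the conservative choice: a response drawing on even one Confidential file is treated as Confidential, and the corresponding encryption and access restrictions follow from that label.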
Figure 1: Copilot Studio can inherit labels from the referenced files, honor permission controls associated with the label, and enhance users’ awareness on the sensitivity of the content.

Read the full post here: Secure and Govern Your Custom-Built AI Apps with Microsoft Purview