Watermarks in preview in Azure OpenAI Service
Microsoft is proud to announce the rollout of a new built-in feature in Azure OpenAI Service: 'Watermarks', which adds invisible watermarks to all images generated with DALL·E, OpenAI's flagship generative AI image model. This watermarking technology is designed to provide an additional layer of transparency and disclosure for AI-generated content.
The Importance of Watermarking AI-Generated Content
To address the risk of bad actors using AI and deepfakes to deceive the public, Microsoft recognizes that we need to take a whole-of-society approach. And as a technology company and AI leader, we have a special responsibility to lead and collaborate with others. Microsoft’s approach to combating abusive AI-generated content includes robust collaboration across industry, governments, and civil society while building strong safety architecture for our AI platform, model, and application levels and developing durable provenance and watermarking technologies.
By the end of 2023, Microsoft was automatically attaching provenance metadata to images generated with OpenAI’s DALL-E 3 model in our Azure OpenAI Service, Microsoft Designer, and Microsoft Paint, using cryptographic methods to mark and sign content. Provenance metadata includes important information such as when the content was created and which organization certified the credentials. While this was considerable progress, challenges remain: no disclosure method is perfect, and all are subject to adversarial attack, including removal.
With today’s announcement, we add another layer of protection to reinforce provenance techniques and help customers understand the source and history of a piece of digital content by embedding invisible watermarks in images generated by the DALL·E 3 model in our Azure OpenAI Service. These watermarks are invisible to the human eye and do not degrade image quality, but they can be identified by specialized tools.
How the Technology Works
Microsoft’s watermarking feature embeds signals within the pixels of AI-generated images. These signals are imperceptible to the human eye but detectable by AI-based verification tools.
Watermarks embed GUIDs that are traceable to offline provenance manifests. The manifest contains several key pieces of information (a minimal illustrative example follows this list):

- "description": has a value of "AI Generated Image" for all DALL-E model-generated images, attesting to the AI-generated nature of the image.
- "softwareAgent": has a value of "Azure OpenAI DALL-E" for all images generated by DALL-E series models in Azure OpenAI Service.
- "when": the timestamp of when the Content Credentials were created.
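To make this concrete, here is a minimal sketch of what such a manifest payload might look like and how an application could inspect it. The JSON structure and timestamp below are illustrative assumptions built from the fields listed above, not the exact wire format used by the service:

```python
import json

# Illustrative manifest payload; the exact format used by
# Azure OpenAI Service may differ (assumption for demonstration).
manifest_json = """
{
  "description": "AI Generated Image",
  "softwareAgent": "Azure OpenAI DALL-E",
  "when": "2024-09-12T08:30:00Z"
}
"""

manifest = json.loads(manifest_json)

# Treat the image as AI-generated if the manifest attests to it.
if manifest.get("description") == "AI Generated Image":
    print(f"Generated by: {manifest['softwareAgent']}")
    print(f"Credentials created: {manifest['when']}")
```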
The watermarking process is resilient to common modifications such as resizing and cropping, so the watermark remains intact even when images are altered.
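As a rough sketch of how one might exercise this resilience, the following applies common transformations to an image with Pillow and passes each variant to a verification routine. The `detect_watermark` function is a hypothetical placeholder for the specialized verification tooling, which is not a public Python API:

```python
from PIL import Image  # pip install pillow

def detect_watermark(image: Image.Image) -> bool:
    """Hypothetical stand-in for a specialized watermark detector.

    Microsoft's actual verification tooling is not exposed as a
    Python API; this stub keeps the workflow below runnable.
    """
    return True  # assume detection succeeds for demonstration

original = Image.open("generated.png")

# Common modifications the watermark is designed to survive.
variants = {
    "original": original,
    "resized": original.resize((original.width // 2, original.height // 2)),
    "cropped": original.crop((0, 0, original.width // 2, original.height // 2)),
}

for name, img in variants.items():
    print(f"{name}: watermark detected = {detect_watermark(img)}")
```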
Watermarks in other Azure AI services
The Azure AI services team is also embedding watermarks into other generative AI content. Last year, our team announced that watermarks are added to voices created with the Azure AI Speech personal voice feature. Watermarks allow customers and users to identify whether speech is synthesized using Azure AI Speech, and specifically, which voice was used.
Future Vision and Industry Collaboration
Microsoft’s watermarking launch is part of a broader initiative to create industry-wide standards around the detection of AI-generated content. The company is actively collaborating with other leading AI companies and stakeholders, including Adobe, Truepic, and the BBC, to ensure that watermarking, cryptographic metadata, and other disclosure and transparency mechanisms can be scaled and integrated across platforms.
Get started with DALL-E in Azure OpenAI Service
Use DALL·E in Azure OpenAI Studio
Use DALL·E in Azure AI Studio
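As a starting point, the sketch below calls the image generation API using the official OpenAI Python SDK configured for Azure. The API version and the "dalle3" deployment name are placeholder assumptions; substitute the values from your own Azure OpenAI resource:

```python
import os

from openai import AzureOpenAI  # pip install openai

# Endpoint and key come from your Azure OpenAI resource;
# the API version shown is an assumption, check the current docs.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# "dalle3" is an assumed name for your DALL-E 3 model deployment.
result = client.images.generate(
    model="dalle3",
    prompt="A watercolor painting of a lighthouse at dawn",
    n=1,
)

# The returned image carries the invisible watermark and Content Credentials.
print(result.data[0].url)
```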
Learn more about watermarks
For more information on how to responsibly build solutions with Azure OpenAI Service image-generation models, visit the Azure OpenAI transparency note.