Category: Microsoft
Token Protection – getting “unbound” for admin user
Hello,
I’ve been looking at the Conditional Access policy for Token Protection, but before implementing it I checked the Azure sign-in logs. What I found is that when I use my admin credentials (different from my user credentials) to access, for example, the Azure portal, the Token Protection sign-in session value is “Unbound”. When I use my standard user, all sign-ins are logged with the token value “Bound”. We use hybrid-joined devices. I have checked with my colleague and his Token Protection values are always “Bound”, no matter whether he uses his standard or admin account. What can be the reason? I cannot find much information about how to troubleshoot it. I’m worried that when I enable the CA policy, I will cut myself off.
Read aloud function
My Read Aloud function does not work. I am using Word (on a monthly subscription as part of Office 2011). I am also working on a Mac. Can anyone help?
Can we report an idea connected to Copilot for M365?
Can we report an idea connected to Copilot for M365?
I can see it is possible for Copilot for Service here Microsoft Copilot for Service Ideas – Microsoft Community Hub
Can I update checklist in Planner by API
Hi All
I would like to update the checklist in a task via the API – could you please help with how to do it?
Thanks
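For context, one common approach is the Microsoft Graph Planner API: a task’s checklist lives on the task’s details resource, and updates must send the details object’s current ETag in an If-Match header. The sketch below is a hedged illustration only – the task ID, token variable, and item title are placeholders, not values from this thread.

```powershell
# Sketch: add a checklist item to a Planner task via Microsoft Graph.
# Assumes $token holds a valid Graph access token with Tasks.ReadWrite,
# and <taskId> is replaced with the real Planner task ID.
$headers = @{ Authorization = "Bearer $token" }

# Read the task details first to capture the current ETag.
$details = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/planner/tasks/<taskId>/details" -Headers $headers

# Each checklist entry is keyed by a client-generated GUID.
$body = @{
    checklist = @{
        ([guid]::NewGuid().Guid) = @{
            "@odata.type" = "#microsoft.graph.plannerChecklistItem"
            title         = "New checklist item"
            isChecked     = $false
        }
    }
} | ConvertTo-Json -Depth 5

# PATCH requires the details object's ETag in the If-Match header.
Invoke-RestMethod -Method Patch `
    -Uri "https://graph.microsoft.com/v1.0/planner/tasks/<taskId>/details" `
    -Headers ($headers + @{ "If-Match" = $details.'@odata.etag' }) `
    -ContentType "application/json" -Body $body
```

To update an existing item instead of adding one, send the same key with new values (for example, isChecked = $true).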
Troubleshooting Cross-Tenant Mailbox Migrations
In Part 1 of this series, we talked about cross-tenant (sometimes referred to as tenant-to-tenant, or T2T) mailbox migrations. In Part 2, we’ll cover how to troubleshoot issues you may encounter during cross-tenant mailbox migrations. There are several tools we want to mention that can be useful when troubleshooting.
Here is a table with the key elements:
We cannot stress enough how important it is to ensure the configuration is correct. Ensuring this saves time!
T2T migration errors and fixes
Below are the most common errors in T2T migrations and how to address them. The errors are surfaced by Test-MigrationServerAvailability or Get-MigrationUserStatistics / Get-MoveRequestStatistics.
Test-MigrationServerAvailability
This simulates a cross-tenant move request and will identify configuration issues at the user level. When you get the same error for multiple users, it is likely that the configuration is incorrect at the organization level.
Get-MigrationUserStatistics / Get-MoveRequestStatistics
Powerful commands to troubleshoot your migrations and verify progress. The -IncludeReport output is essentially a move report showing everything that happened in the migration. -DiagnosticInfo provides information on durations, throttling, and possible underlying causes for stalls.
Cross tenant mailbox migration validation script
The script checks user-level and organization-level configuration for T2T migration so that you are prepared for migrations.
Permissions related cmdlets:
Know what permissions are migrated in Exchange Online and if they are broken.
Data Consistency Score in migrations
Sometimes, we cannot migrate all the data from source to target, so we skip it. There are 4 main categories of skipped items:
Bad / corrupt items,
Large items (MaxReceiveSize up to 150MB in Exchange Online),
Missing items in the target (which normally doesn’t happen in move requests, where the target mailbox is locked until completion and the end user or other processes can’t access the target mailbox at all),
Other category issues.
You can view what has been skipped (for example permissions that couldn’t be mapped to a user are considered bad items) in move reports. Sometimes, if there is significant data loss (not migrated), the admin will need to approve to complete migration.
Duration estimates for T2T migrations based on data retrieved for 50% (P50) and 90% (P90) of the statistics seen for that migration type. If you exceed P90, your migration may be slow.
Test-MigrationServerAvailability
A very useful tool when troubleshooting cross-tenant mailbox migration is Test-MigrationServerAvailability. You run this command in the target tenant after the migration endpoint has been created. You can run Get-MigrationEndpoint to view the identity of the endpoint.
Test-MigrationServerAvailability -TestMailbox <user@contoso.com> -Endpoint <T2T Migration Endpoint Name>
If the Result is Success, you can proceed with the migration of the user.
Note that a successful test doesn’t guarantee you will be able to migrate the user without any issues, but it is a good starting point to ensure the minimum prerequisites are met.
For a list of common failures, please see the Migration Failures section here: Cross-tenant mailbox migration
Here is an example: a situation where the source archive mailbox is a few bytes over 100GB.
Test-MigrationServerAvailability in target tenant:
Mailbox statistics for archive in the source tenant:
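As a quick check of your own, you can compare the source archive’s size against its configured quota. A minimal sketch, run in the source tenant (the mailbox identity is a placeholder):

```powershell
# Source tenant: check the archive mailbox size.
Get-MailboxStatistics -Identity <user@contoso.com> -Archive |
    Format-List DisplayName, TotalItemSize, TotalDeletedItemSize

# And the quotas / auto-expanding state configured on the mailbox itself.
Get-Mailbox -Identity <user@contoso.com> |
    Format-List ArchiveQuota, ArchiveStatus, AutoExpandingArchiveEnabled
```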
In situations where a source primary mailbox, archive, or dumpster is larger than the target tenant’s user quotas, you will see errors like ArchiveExceedsTargetQuotaPermanentException, MailboxExceedsTargetQuotaPermanentException, or ArchiveDumpsterExceedsTargetQuotaPermanentException. In this case, you can contact Microsoft Support for recommendations and see which options are best for you. Support may be able to provide an exception for an individual mailbox (we cannot process bulk mailboxes), provided you plan on having auto-expanding archives on the target tenant side and allocate room to grow in the target tenant. Depending on the situation, if you have enabled the auto-expanding archive feature on the source tenant, you can wait until the AuxArchive is provisioned and the MainArchive / primary mailbox size drops below the 100GB quota (this can take up to 30 days, but is generally 7 days on average). Or, if auto-expanding is not an option in the target tenant, you can ask the user to clean up unnecessary data in the source mailboxes. Sometimes, source mailboxes that are or were on hold at some point have their Recoverable Items quota set to 100GB instead of the default 30GB; in that case, you might want to enable litigation hold on the target tenant mail user to increase the quotas there as well.
Coming back to Test-MigrationServerAvailability, if this fails for multiple users, then likely there is something wrong at the organization level and the cross-tenant mailbox migration validation script discussed in the next section can be helpful. For example, if you get an ‘Access is denied’ error (screenshot below) for some or all users, here are some possible causes at organization level:
Erroneous application registration;
Wrong credentials or expired credentials used in the migration endpoint; or
Source tenant organization relationship’s OAuthApplicationId does not match target migration endpoint’s ApplicationId, or the org relationship is not using tenant ID in DomainNames
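A quick way to compare the two values from the last bullet is to look at both objects side by side; a minimal sketch (adjust identities for your tenants):

```powershell
# Target tenant: note the ApplicationId on the migration endpoint.
Get-MigrationEndpoint | Format-List Identity, RemoteTenant, ApplicationId

# Source tenant: the organization relationship's OAuthApplicationId must
# match that ApplicationId, and DomainNames should contain the target tenant ID.
Get-OrganizationRelationship |
    Format-List Name, DomainNames, OAuthApplicationId, MailboxMoveCapability
```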
CrossTenantMailboxMigrationValidation script
When it comes to validating a large cross-tenant mailbox migration (CTMM), a better tool you can rely on is the Cross-Tenant Mailbox Migration validation script written by Alberto Pascual Montoya, published in the official CSS-Exchange GitHub repo and referenced by our official CTMM documentation.
This script will check and validate various related user and organization settings. It is very useful to run it before CTMM takes place to ensure the mailbox(es) can be migrated, and if they can, that they won’t face any validation issues during the move.
You can run the script to validate the configuration between both organizations by running:
.\CrossTenantMailboxMigrationValidation.ps1 -CheckOrgs -LogPath <LogFileLocation>
You can also run it to validate the objects on the target tenant by comparing them with the objects in the source tenant by running:
.\CrossTenantMailboxMigrationValidation.ps1 -CheckObjects -CSV <CSVFileLocation> -LogPath <LogFileLocation>
As you can see below, it will highlight any discrepancies found, and if the target object is not synced from any AD, it will ask if you would like to correct the discrepancy on the go:
There are other scenarios covered by the script too. For example, running it only from the source tenant side and collecting the needed data that can be sent to the target tenant admins so they can run the script against their tenant; or simply collecting the data for support purposes.
Analyzing migration reports
Move request and migration user Exchange Online PowerShell commands are other tools that can help us troubleshoot cross-tenant migration.
Let’s do a quick walkthrough of these commands:
When we create a new migration batch, a MigrationUser object (viewable with Get-MigrationUser) is created in the background for each user specified in the CSV file. If things go well during partial validation, a move request (viewable with Get-MoveRequest) is created for each MigrationUser.
In the example below, we can see that we have a migration batch (Get-MigrationBatch) with one user (one user was specified in the CSV file), and a migration user object (Get-MigrationUser) was created for the user; but because we don’t have a mail user with this identity, the migration failed at this early stage. The move request was not created (Get-MoveRequest fails).
This user is not found at all on the target tenant (wrong identity of the user in the CSV):
And if I had run Test-MigrationServerAvailability on the EmailAddress specified in the CSV, before creating the batch, the error would have been relatively obvious:
Also, you’d want to ensure that the users specified in the migration CSV file don’t already have a move request (and are not part of another batch).
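A quick pre-check for that might look like this (the identity is a placeholder):

```powershell
# Fail fast if the user already has a move request or belongs to a batch.
Get-MoveRequest -Identity <user@contoso.com> -ErrorAction SilentlyContinue

Get-MigrationUser -Identity <user@contoso.com> -ErrorAction SilentlyContinue |
    Format-List Identity, BatchId, Status
```

If either command returns an object, remove the existing move request or batch membership before adding the user to a new batch.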
There might be situations where migration appears stuck at creating the move request, that is, in the injection workflow stage:
If State: Waiting takes too long, you can look at DiagnosticInfo on Get-MigrationUserStatistics (check the Durations section in this blog post for more info) to see if you have more information on where migration is stuck.
In this case, we can see that it waited 19 minutes to inject the move request:
And 0 move requests injected successfully since we have 0 InjectedRequestCount and no timestamp at LastSuccessfulInjectionTime or LastInjectionTime.
After waiting 19 minutes for the injection of the move request, we finally got a permanent failure: ErrorCrossTenantSourceUserIsInHoldOrRetentionPolicyAppliedPermanentException, and the service gave up trying to inject the move request. The lesson is to run Test-MigrationServerAvailability before creating the batch (a prerequisite was missed) and avoid this waiting time and a failure that could have been easily detected.
You might wonder when we use Get-MigrationUserStatistics versus Get-MoveRequestStatistics. It mostly depends on preference. I am more used to Get-MoveRequestStatistics, but in the situation where we don’t have a move request created (like in the example above), we are forced to use Get-MigrationUserStatistics. Normally I append -DiagnosticInfo "verbose,showtimeslots" to make sure I get the most details. Also, if you are looking to check skipped items, append -IncludeSkippedItems to Get-MigrationUserStatistics. If using Get-MoveRequestStatistics, it is enough to use -IncludeReport.
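Putting that together, the two calls typically look like this (the user identity is a placeholder):

```powershell
# No move request yet (or checking skipped items / injection diagnostics):
Get-MigrationUserStatistics -Identity <user@contoso.com> `
    -IncludeSkippedItems -DiagnosticInfo "verbose,showtimeslots"

# A move request exists; capture the full move report for analysis:
$stats = Get-MoveRequestStatistics -Identity <user@contoso.com> -IncludeReport
```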
Let’s take another example where a move request was injected successfully by the service (New-MoveRequest) but this one failed. We will troubleshoot using the Get-MoveRequestStatistics command (since we have a Get-MoveRequest in place):
Move requests in cross-tenant migrations will have Onboarding_CrossTenant WorkloadType and the SourceServer and TargetServer are Exchange Online servers, usually in different forests.
In this case, the move request failed because we don’t have the Cross-tenant User Data Migration license on source or target tenant.
Get-MoveRequestStatistics EXOMailbox1 | FL MailboxIdentity, WorkloadType, SourceServer, TargetServer, RemoteHostName, FailureType, Message, FailureTimestamp
If I want to check the full details of the failure, I can run a command like this:
$stats = Get-MoveRequestStatistics <User> -IncludeReport
$stats.Report.Failures
If you’d want to have a quick overview of failures encountered, you would run:
$stats.Report.Failures | group FailureType | FT -a
To check if any items would be skipped during the move, you can run the following command:
Get-MoveRequestStatistics <User> | FL DataConsistency*, BadItem*, LargeItem*, MissingItem*, Skipped*
For more information on DCS (Data Consistency Score) please see Improving Migrations Using Data Consistency Scoring and Track and prevent migration data loss. For troubleshooting DCS and approving skipped items, please see Migrations with Data Consistency Score (DCS) – more than you ever wanted to know! – Microsoft Community Hub.
If you need to look at the available properties of the source or target user before and after the move, you can use the IncludeReport switch in Get-MoveRequestStatistics.
Run the following in the target tenant after move is completed:
$stats = Get-MoveRequestStatistics <User> -IncludeReport
Then you would look into the properties like this:
$stats.report.SourceMailboxBeforeMove.Props
$stats.report.TargetMailUserBeforeMove.Props
$stats.report.SourceMailuserAfterMove.Props
$stats.report.TargetMailboxAfterMove.Props
Suppose that we want to see if the migration service stamped the ExternalEmailAddress on the source recipient after the move (according to the TargetDeliveryDomain set in the migration batch); we would run this command:
$stats.report.SourceMailUserAfterMove.Props | ? PropertyName -eq "ExternalEmailAddress" | FL PropertyName, Values
Troubleshooting migration of permissions
Next, let’s examine migration reports to see whether permissions were migrated correctly and whether there are any permission errors in the move reports. We also provide PowerShell commands to run in the target tenant to corroborate outputs with the migration report analysis.
We will first check mailbox Full Access permissions, which should move during the cross-tenant migration, and we check them by using the move reports and PowerShell commands.
In this example, we moved MyMailbox10 and MyMailbox19, from one tenant to another:
We can see that MyMailbox10 has maintained the FullAccess permission on MyMailbox19 after migration:
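You can corroborate this in the target tenant with Get-MailboxPermission; a minimal sketch:

```powershell
# Target tenant: list non-inherited Full Access grants on the migrated mailbox.
Get-MailboxPermission -Identity MyMailbox19 |
    Where-Object { $_.AccessRights -contains "FullAccess" -and -not $_.IsInherited } |
    Format-Table User, AccessRights, IsInherited -AutoSize
```

MyMailbox10 should appear in the User column if the permission survived the move.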
We can confirm this by looking at the property called ExchangeSecurityDescriptor (which is MsExchMailboxSecurityDescriptor in AD) and noticing the SID values in there.
Note: you first need to store the Get-MoveRequestStatistics <user> -IncludeReport into a variable, in this example $stats19.
$stats19.Report.SourceMailboxBeforeMove.Props | ? PropertyName -eq "ExchangeSecurityDescriptor"
$stats19.Report.TargetMailboxAfterMove.Props | ? PropertyName -eq "ExchangeSecurityDescriptor"
Source mailbox: before move the SID belongs to MyMailbox10 in the source tenant.
Target mailbox: after move the SID belongs to MyMailbox10 in the target tenant.
If you have multiple SID values listed here, you can use these commands to display them in a more readable format:
$valueAfterMove = ($stats19.Report.TargetMailboxAfterMove.Props | ? PropertyName -eq "ExchangeSecurityDescriptor").Values.StrValue
$sdobj = New-Object System.Security.AccessControl.RawSecurityDescriptor($valueAfterMove)
$sdObj.DiscretionaryAcl | select -Skip 1 | FT AceQualifier, AccessMask, SecurityIdentifier, AceFlags, IsInherited
The AccessMask 1 here means Full Access Permission:
The same information is also available in the DebugEntries in the move report when we look for "MailboxSecurityDescriptor".
$stats.report.DebugEntries | ? LocalizedString -match "MailboxSecurityDescriptor" | % {[string] $_}
Besides the Get-MailboxPermission cmdlet, we can look at the ExchangeSecurityDescriptor property with the Get-Mailbox command:
Get-Mailbox MyMailbox19 | select -ExpandProperty ExchangeSecurityDescriptor | select -ExpandProperty DiscretionaryAcl
A quick reminder that Send on Behalf permissions are NOT migrated, so we won’t see anything in move reports.
Send As permissions
Send As permissions are migrated. In the target tenant, after migration:
You can similarly check the permissions in the source tenant.
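In Exchange Online, Send As grants can be listed with Get-RecipientPermission; for example:

```powershell
# Target tenant: list Send As grants on the migrated mailbox.
Get-RecipientPermission -Identity MyMailbox19 |
    Where-Object { $_.AccessRights -contains "SendAs" } |
    Format-Table Identity, Trustee, AccessRights -AutoSize
```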
When checking the move reports for Send As permissions, you can put the DebugEntries in a Notepad++ file and search for the user’s SID (both the source user SID and target user SID). These permissions fall into the UserSecurityDescriptor category. You might see many AD permissions (screenshot below, where entries were separated onto new lines), so they can be quite hard to spot.
If you want to list all SIDs with Send-As, you can search specifically for ‘CR’ and the following GUID: ab721a54-1e2f-11d0-9819-00aa0040529b.
$stats.report.DebugEntries | ? LocalizedString -match "UserSecurityDescriptor" | % {[string] $_} | clip
Then paste from clipboard to Notepad++, for example:
To have them listed on new lines, you can do the following:
(OA; replace with \n(OA;
(A; replace with \n(A;
(OD; replace with \n(OD;
(Use Notepad++’s Replace with Extended search mode so \n is interpreted as a new line.)
These are called ACE types; for a full list, see the documentation.
Then, since Send-As is an Extended Right, we will see CR ab721a54-1e2f-11d0-9819-00aa0040529b, which is the SendAs Extended Right GUID.
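If you prefer not to eyeball the SDDL string, here is a small sketch that pulls out just the SIDs holding the Send As extended right, assuming you have captured the SDDL text from the DebugEntries output into $sddl:

```powershell
# Extract the trustee SIDs from object ACEs carrying the Send As
# extended right (GUID ab721a54-1e2f-11d0-9819-00aa0040529b).
$sendAsGuid = "ab721a54-1e2f-11d0-9819-00aa0040529b"

# An object ACE looks like (OA;;CR;<rightGuid>;;<SID>); the SID is the last field.
[regex]::Matches($sddl, "\(OA;[^)]*\)") |
    Where-Object { $_.Value -match "CR;$sendAsGuid" } |
    ForEach-Object { ($_.Value -split ";")[-1].TrimEnd(")") }
```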
Checking the first SID in the source tenant:
Checking the first SID in the target tenant:
On-premises, looking at the LDP dump of the security descriptor of the user, you would see something like this:
Mailbox folder permissions
We also migrate mailbox folder permissions.
In the target tenant, after migration, you will run Get-MailboxFolderPermission <user>:<Folder> to check permissions present. For example:
Based on this output alone, we would think that there are no permissions issues.
But expand the User property and notice that Cloud3 is of Unknown user type:
And, if we check the move report, we will see TargetPrincipal errors and FolderACL issues for cloud3.
In this case, the DataConsistencyScore is usually Good instead of Perfect, and we will have corresponding BadItems and Failures recorded if the principal user with permissions is not found in the target tenant.
Here is a quick command to see the alias of the principal user that is not found in the target tenant (Cloud3 in our example).
$stats.report.BadItems | select Kind, Folder, ScoringClassifications, {$_.Failure.DataContext} | ft -a
We can further look into DataContext with |FT or |FL for full details:
$stats.report.BadItems | select {$_.Failure.DataContext} | ft -a
Or look at the entire failure on specific bad items. BadItems[1] is the second CorruptFolderAcl in the screenshot above (indexing starts from 0, not 1).
$stats.report.BadItems[1].Failure
For migration of Calendar and FreeBusy data folder permissions entries, we can look at folder ACL in Report.DebugEntries:
$stats.report.DebugEntries | ? LocalizedString -match "FolderAcl" | % {[string] $_}
Checking migration durations
Now that you rock at troubleshooting migration of permissions (which was lengthy and a bit boring), we will get into another topic: duration of the cross-tenant migrations and when to recognize you have an issue.
To see how much time your migration took and how it is progressing, you can run these commands:
$stats = Get-MoveRequestStatistics <User> -IncludeReport -DiagnosticInfo "Verbose,showtimeslots"
$stats | FL *Duration*
In my example, I see that overall duration was about 30 minutes (it was a very small 35 MB mailbox) but I have 51 days of TotalFailedDuration:
If I check the $stats.DiagnosticInfo property, we will see these durations in a more detailed and accurate way:
Overall Move = 57 days (about 2 months) and 28 min, out of which:
In Progress = 6 min (actual copying of the data)
Suspended = 21 min (I chose to complete it at a certain point after initial sync, so it went into suspended mode)
Failed = 51 days and about 8 hours (it was in a failed state for 51 days)
Completed = 5 days and 16 hours (the move completed 5 days ago but I ran the Get-MoveRequest command now, 5 days after completion)
You can check the table from Microsoft 365 and Office 365 migration performance and best practices | Microsoft Learn for duration estimates during mailbox migrations in Exchange Online.
For example, a mailbox with a size of less than 10 GB is estimated to be migrated within 1 day. You can open a case with Microsoft Support if P90 is exceeded and it is because of our service (for example it is not a configuration issue on your side that wasn’t remediated fast enough).
In my situation above, I had a permanent failure and let the move sit there for 51 days in a Failed state. This doesn’t count against the P90, as I neglected the move. The InProgress duration was 6 minutes for my 35MB mailbox, which meets the P90 estimate (90% of mailbox migrations under 10GB complete within 1 day).
We would like to thank Anshul Dube, Roman Powell and Nino Bilic for contributing and reviewing this blog post.
Mirela Buruiana and Alberto Pascual Montoya
Microsoft Tech Community – Latest Blogs –Read More
Make your voice chatbots more engaging with new text to speech features
In our increasingly digital world, the importance of giving a voice and image to chatbots cannot be overstated. Transforming a chatbot from an impersonal, automated responder into a relatable and personable assistant significantly enhances user engagement.
Today we’re thrilled to announce Azure AI Speech’s latest updates, enhancing text to speech capabilities for a more engaging and lifelike chatbot experience. These updates include:
A wider range of multilingual voices for natural and authentic interactions;
More prebuilt avatar options, with latest sample codes for seamless GPT-4o integration; and
A new text stream API that significantly reduces latency for ChatGPT integration, ensuring smoother and faster responses.
Introducing new multilingual and IVR-styled voices
We’re excited to introduce our newest collection of voices, equipped with advanced multilingual features. These voices are crafted from a variety of source languages, bringing a rich diversity of personas to enhance your user experience. With their authentic and natural interactions, they promise to transform your chatbot engagement through our technology.
Discover the diverse range of our new voices:
Voice name | Main locale | Gender
en-GB-AdaMultilingualNeural | en-GB (English – United Kingdom) | Female
en-GB-OllieMultilingualNeural | en-GB (English – United Kingdom) | Male
pt-BR-ThalitaMultilingualNeural | pt-BR (Portuguese – Brazil) | Female
es-ES-IsidoraMultilingualNeural | es-ES (Spanish – Spain) | Female
es-ES-ArabellaMultilingualNeural | es-ES (Spanish – Spain) | Female
it-IT-IsabellaMultilingualNeural | it-IT (Italian – Italy) | Female
it-IT-MarcelloMultilingualNeural | it-IT (Italian – Italy) | Male
it-IT-AlessioMultilingualNeural | it-IT (Italian – Italy) | Male
We’re also delighted to present two new optimized en-US voices, specifically designed for call center scenarios – a prevalent application of text-to-speech technology.
They are:
Voice name | Main locale | Gender
en-US-LunaNeural | en-US (English – United States) | Female
en-US-KaiNeural | en-US (English – United States) | Male
These voices are currently available in public preview in three regions: East US, West Europe, and Southeast Asia. Discover more in our Voice Gallery and delve deeper into the details via our developer documentation.
Announcing advanced features for text to speech avatars
Text to speech avatar, previewed at Ignite 2023, enables users to create realistic videos of speaking avatars simply by giving text input and allows users to create real-time interactive bots with visual elements that are more engaging. Since its preview, we have received great feedback and appreciation from customers in various industries. Today, we are glad to share what’s been added to the avatar portfolio.
More prebuilt avatar options and more regions available
Our prebuilt text-to-speech avatars offer ready-to-deploy solutions for our customers. We’ve recently enriched our portfolio’s diversity by introducing five new avatars. They can be used for both batch synthesis and real-time conversational scenarios. We remain committed to expanding our avatar collections to encompass a broader range of cultures and visual identities.
These newly introduced avatars can be accessed in Speech Studio for video creation and live chats. Dive deeper into the process of synthesizing a text-to-speech avatar using the Speech SDK for real-time synthesis in chatbot interactions, or batch synthesis for generating creative videos.
Beyond the previously available service regions – West US 2, West Europe, and Southeast Asia – we are excited to announce the expansion of our avatar service to three additional regions: Sweden Central, North Europe, and South Central US. Learn more here.
Enhanced text to speech avatar chat experience with Azure OpenAI capabilities
Text-to-speech avatars are increasingly leveraged for live chatbots, with many of our customers utilizing Azure OpenAI to develop customer service bots, virtual assistants, AI educators, and virtual tourist guides, among others. These avatars, with their lifelike appearance and natural sounding neural TTS or custom voice, combined with the advanced natural language processing capabilities of the Azure OpenAI GPT model, provide an interaction experience that closely mirrors human conversation.
The Azure OpenAI GPT-4o model is now part of the live chat avatar application in Speech Studio. This allows users to see firsthand the collaborative functioning of the live chat avatar and Azure OpenAI GPT-4o. Additionally, we provide sample code to aid in integrating the text-to-speech avatar with the GPT-4o model. Learn more about how to create lifelike chatbots with real-time avatars and Azure OpenAI GPTs, or dive into the code samples here (JS code sample and Python code sample).
This update also includes sample codes to assist in customizing Azure OpenAI GPT on your data. Azure OpenAI On Your Data is a new feature that enables users to tailor the chatbot’s responses according to their unique data source. This proves especially beneficial for enterprise customers aiming to develop an avatar-based live chat application capable of addressing business-specific queries from clients. For guidance on creating a live chat app using Azure OpenAI On Your Data, please refer to this sample code (search “On Your Data”).
More Responsible AI support for avatars
Ensuring responsibility in both the development and delivery of AI products is a core value for us. In line with this, we’ve introduced two features to bolster the responsible AI support for text-to-speech avatars, supplementing our existing transparency note, code of conduct, and disclosure guidelines.
We’ve integrated Azure AI Content Safety into the batch synthesis process of text to speech avatars for video creation scenarios. This added layer of text moderation allows for the detection of offensive, risky, or undesirable text input, thereby preventing the avatar from producing harmful output. The text moderation feature spans multiple categories, including sexual, violent, hate, self-harm content, and more. It’s available for batch synthesis of text-to-speech avatars both in Speech Studio and via the batch synthesis API.
In our bid to provide audiences with clearer insights into the source and history of video content created by text to speech avatars, we’ve adopted the Coalition for Content Provenance and Authenticity (C2PA) Standard. This standard offers transparent information about AI-generation of video content. For more details on the integration of C2PA with text to speech avatars, refer to Content Credentials in Azure Text to Speech Avatar .
Unlocking real-time speech synthesis with the new text stream API
Our latest release introduces an innovative Text Stream API designed to harness the power of real-time text processing to generate speech with unprecedented speed. This new API is perfect for dynamic text vocalization, such as reading outputs from AI models like GPT in real-time.
The Text Stream API represents a significant leap forward from traditional non-text stream TTS technologies. By accepting input in chunks (as opposed to whole responses), it significantly reduces the latency that typically hinders seamless audio synthesis.
Comparison: Non-Text Stream vs. Text Stream
Input type | Non-text stream: whole GPT response | Text stream: each GPT output chunk
TTS first-byte latency | Non-text stream: high (total GPT response time + TTS time) | Text stream: low (a few GPT chunks’ time + TTS time)
The Text Stream API not only minimizes latency but also enhances the fluidity and responsiveness of real-time speech outputs, making it an ideal choice for interactive applications, live events, and responsive AI-driven dialogues.
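To make the comparison concrete, here is a toy back-of-the-envelope calculation. The numbers are purely illustrative assumptions (this is not the Azure SDK), but they show why starting synthesis on the first chunk cuts first-byte latency:

```powershell
# Suppose GPT emits 20 chunks at ~100 ms each and TTS adds ~300 ms.
$chunkMs = 100; $chunks = 20; $ttsMs = 300

# Non-text stream: wait for the whole GPT response before synthesis starts.
$nonStreamFirstByteMs = ($chunkMs * $chunks) + $ttsMs

# Text stream: synthesis can begin once the first chunk arrives.
$streamFirstByteMs = $chunkMs + $ttsMs

"First audio byte: non-stream {0} ms vs stream {1} ms" -f $nonStreamFirstByteMs, $streamFirstByteMs
```

With these assumed numbers, the streamed path delivers first audio roughly 2 seconds sooner, and the gap grows with longer GPT responses.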
Utilizing the Text Stream API is straightforward. Simply follow the steps provided with the Speech SDK. For detailed implementation, see the sample code on GitHub.
Get started
Microsoft provides access to more than 500 neural voices spanning more than 140 languages and locales, complemented by avatar add-ons. These text-to-speech capabilities, part of the Azure AI Speech service, allow you to swiftly imbue chatbots with a natural voice and realistic image, thereby enriching the conversational experience for users. Furthermore, the Custom Neural Voice and Custom Avatar features facilitate the creation of a distinctive brand voice and image for your chatbots. With a unique voice and image, a chatbot can seamlessly integrate into your brand’s identity, contributing to a cohesive and memorable brand experience.
For more information
Try our demo to listen to existing neural voices
Add Text-to-Speech to your apps today
Apply for access to Custom Avatar and Custom Neural Voice
Join Discord to collaborate and share feedback
Zheng Niu and Junwei Gan also contributed to this article.
Need Help Resolving Quick-Books Error 6010, 100 – Assistance Appreciated!
I encountered Quick-Books Error 6010, 100 while trying to open my company file. The error message suggests a network issue, but I’ve checked my connection, and everything seems fine. I’m unable to access any of my data. Can anyone in the community help with a solution?
Need Help with Quick-Books Error 193: How to Fix?
Hello,
I’m experiencing Quick-Books Error 193 when trying to open my company file. The error message says the file is already in use, but no one else is accessing it. I’ve tried restarting Quick-Books and my computer, but the issue persists. Any advice?
Thanks!
Ordering custom term templates when creating a new Glossary term in Microsoft Purview
Hello
I have created multiple custom term templates for use in the Glossary section of Microsoft Purview. I would like these term templates to appear against new Glossary Terms in a specific order, but despite multiple attempts, including adding an alphabetical prefix to the custom term template names, Purview seems to order the term templates at random.
Does anyone have a solution that would allow me to customise the ordering of the custom term templates in a Glossary Term?
Thanks
Purview Information Protection for internal and external emails
I’m working with an organisation that is starting to use sensitivity labels. They have Office 365 E3 licenses. The current plan is to set up a default label for documents and emails called “Internal Only”. This label will encrypt contents and grant co-author permissions to all staff.
The challenge will be when emails include external recipients. Ideally, the user will change from the default label to one that grants access to any recipients. However, I can imagine that there will be many cases where they forget to do this.
If we had Office 365 E5 licenses, we would have the option to create a DLP policy that shows a policy tip. I would expect this to reduce incidents of mislabeling.
I have seen recommendations to avoid encrypting by default and only use encryption where needed. However, this client is keen to use encryption to protect as much content as possible.
One suggestion could be to change the default email label to only grant access to the sender and recipients, regardless of whether they are internal or external.
I’m interested in any real-world feedback on how others have tackled this issue.
Is it possible to protect the Primary Refresh Token (PRT) if attacker has hands on keyboard
Hi everyone,
I want to ask if anyone knows whether it is possible to defend against a pass-the-PRT attack. We are about to embark on a journey to deploy privileged access workstations to all IT admins, with more or less no internet access. The idea is to have a clean source and heavily reduce the chance of an attacker getting hold of the credentials / PRT of an admin account. But because it is so heavily locked down, it is already causing issues for us.
So I want to find out how big of an issue it is if an attacker was able to get a foothold on a device which is used by a standard user account that has Microsoft Entra ID roles assigned via PIM.
So we have Defender for Endpoint installed on all devices, Tamper Protection is on, and the ASR rule “Block credential stealing from the Windows local security authority subsystem (lsass.exe)” is set to block. Further to that, we require a FIDO2 security key for all IT admins, and CA policies are set to require both MFA and a compliant device.
But as mentioned above, if an attacker gets a foothold on a device used by an IT admin who logs in with his or her standard account and elevates into an Entra admin role, is it game over by then?
If that is the case, it seems to me that the PRT is the weak link, and we would be better off not having the device used for admin privileges joined to Microsoft Entra.
Mac Book Air M2
Hello,
For some weeks now it has been impossible to open a link in my Outlook. It is possible to send or to receive emails, but impossible to open the links.
Thanks if somebody has a solution.
MailMessage Content-Type “multipart/mixed” to “multipart/report; report-type=feedback-report”
I’m updating our in-house security software to send XARF reports rather than having to use abuse portals (DigitalOcean). The only issue I’m having is that a requirement is to have the mail message Content-Type header be “multipart/report; report-type=feedback-report”. I can’t find any way to change the message header. It’s either the Content-Type of the body, or, if I add attachments/alternate views, it’s “multipart/mixed”.
if (mailMessage.Headers["Content-Type"] != null)
    mailMessage.Headers.Remove("Content-Type");
mailMessage.Headers.Add("Content-Type", "multipart/report; report-type=feedback-report");
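For context, System.Net.Mail treats Content-Type as a restricted header that the MailMessage computes itself, which is why writing it via the Headers collection does not stick; a library with full MIME control is typically needed on the .NET side. The target MIME structure itself can be illustrated with Python's standard email library, where extra keyword parameters on a multipart become Content-Type parameters (the addresses below are placeholders):

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

# Build a top-level "multipart/report; report-type=feedback-report" message,
# the container shape that ARF/XARF feedback reports require.
msg = MIMEMultipart("report", **{"report-type": "feedback-report"})
msg["From"] = "abuse@example.com"   # placeholder addresses
msg["To"] = "report@example.net"

# First part: the human-readable description of the report.
msg.attach(MIMEText("Human-readable description of the report.", "plain"))

print(msg["Content-Type"])  # multipart/report with the report-type parameter
```

The point of the sketch is that the report type lives as a parameter on the top-level multipart, not on any body part, so whatever mail library is used must let you set the root entity's Content-Type directly.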
Why my site Data is not updating in Bing Webmaster Tools
I run a site and I added it to Bing Webmaster Tools, where it started showing clicks and impressions. But after some time the data stopped updating and is stuck, so I can’t see any updates related to my site’s performance on Bing. My site name is menuspricesph.
Is there a reason why status checks applied to a commit don’t pull through to the PR status checks?
Hello there,
Is there a reason why status checks applied to a particular commit (from an external scanning tool, for example) as part of a PR don’t correctly carry up into the live PR for that branch? I can run ADO Pipeline builds and they appear as expected, and even POST a PR status check back so it appears in the checks section, but it seems very odd that I may have commit X failing external status checks without that being visible on the PR itself. In GitHub/Bitbucket, everything is shown in the PR view for related commits.
Any advice greatly appreciated!
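As the poster notes, Azure DevOps does let you POST a status directly against the pull request (rather than against a single commit) via the "Pull Request Statuses - Create" REST endpoint, and that is what surfaces in the PR's checks section. A minimal sketch of that request follows; the organization, project, repository, PR id, and scanner names are all placeholders.

```python
import json

# Placeholders: substitute your own org/project/repo and PR id.
organization = "myorg"
project = "myproject"
repository = "myrepo"
pull_request_id = 42

url = (
    f"https://dev.azure.com/{organization}/{project}/_apis/git/"
    f"repositories/{repository}/pullRequests/{pull_request_id}/"
    f"statuses?api-version=7.1-preview.1"
)

# GitPullRequestStatus payload; state is one of
# notSet | pending | succeeded | failed | error.
status = {
    "state": "failed",
    "description": "External scan found issues",
    "context": {"genre": "external-scanner", "name": "security-scan"},
    "targetUrl": "https://scanner.example.com/results/42",
}
body = json.dumps(status)
# POST `body` to `url`, authenticating with a PAT that has the
# Code (status) scope; the status then shows on the PR itself.
```

Posting per-PR (or re-posting the latest commit's result at the PR level, as above) is the workaround most teams use, since commit-level statuses are not rolled up into the PR view the way GitHub does it.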
Surface MVP showcase: Enabling commercial experiences on Surface
One of the best parts of my role is the opportunity to work with our MVP community. We have a tight knit group of folks worldwide who are passionate about Surface, Windows, and all things tech. They’re super active in their communities, showcasing their thought leadership and technical know-how, making a real impact with customers and enthusiasts alike.
With all the buzz around the latest Microsoft and Surface announcements, I wanted to get some insights from our MVPs. They’ll share their thoughts on everything from AI PCs to security, management, sustainability, and more. Let’s get started!
A quick recap
In case you missed it, we recently announced two new sets of devices built for business, leading the way for AI experiences: the Surface Laptop 6 for Business and Surface Pro 10 for Business. These AI PCs are powered by the latest Intel® Core™ Ultra processors and are ready to enhance productivity with Copilot, bringing AI-accelerated experiences right to your desk. With AI-enhanced cameras and mics that support Windows Studio Effects, accelerated by a neural processing unit (NPU), these devices are game changers for businesses.
For customers looking to push the envelope with AI, we also announced our Copilot+ PCs: Surface Pro (11th edition) and Surface Laptop (7th edition), coming later this year. These devices feature Qualcomm Snapdragon® X Series processors and on-device AI that helps find almost anything fast with the optional Recall feature and empowers collaboration with real-time translation of 40+ languages to English with Live Captions, and more.
Thoughts from our Surface MVPs
Chauncey Larsen
Barb Bowman has been active in supporting the Surface community, guiding our customers through several OS upgrades and other changes in the PC industry, and she has a wealth of experience helping IT pros deploy Surface and embrace modern device experiences. Barb, tell us a little bit about your history and what you think about these new AI PCs.
Barb Bowman
“I look back at my first personal computer, a Tandy 1000, with 128 KiB of RAM and no hard drive when I thought ‘Everything is going to change and every store, every office, every agency of every government will no longer run on punch cards, big central computers, and paper. This changes the world.’ And change it did.
I’ve seen an exponential growth in processors, RAM, and graphics chips. Like everyone else, I’ve used search engines and the cloud to find and process information. But it still takes hours of manual spreadsheet analysis to get any actionable insights from our own business information.
Until now. AI computing and the AI-powered PC are poised to change everything about how we work. Ultimately, the time saved by leveraging AI for our business data will free us up for other tasks, like growing our client bases.
Adding an NPU to our state-of-the-art devices is changing how computers operate and how we work with our data. Imagine, like my cousin, you’re a manufacturer’s representative and sell over 100 lines of clothing and accessories. Then what if you could customize a product catalog for each customer, from the data on your computer, using their purchase history to highlight new and related items? You could then present this information in a personalized document containing descriptions, images, pricing, delivery and shipping information.
Another thing I like is that AI PCs can locally process all your business data. There’s no need to sync with the cloud. Your sensitive business and personal data stays on your computer. This is just the beginning of a new age of computing.
The Surface Pro 10 for Business and Surface Laptop 6 for Business are the first Microsoft-built AI PCs that include Intel’s NPU along with the Intel Core Ultra processor, and they will begin to supercharge our everyday business tasks. I can’t wait to be part of the new computing age with one of these!”
Chauncey Larsen
I agree and love the potential real-world example. I think we’re just getting started with how AI PCs can revolutionize the way we work with our data and processes. It’s an exciting time to be in the industry.
Now I’d like to introduce Alexander Solaat Rødland, one of our Surface MVPs in Norway, who actively hosts seminars and events as well as a podcast. Alexander, we’ve discussed your passion for security. Can you expand on your views regarding security on the Surface platform?
Alexander Solaat Rødland
“As computers evolved from assisting tools to primary working tools, security became crucial. Early threats targeted the OS in the form of viruses, trojans, and rootkits. As the potential profits for a successful attack increased, modern threats became more sophisticated, targeting firmware and essential components. Surface devices were among the first to, by default, enable virtualization-based security, inspired by research and innovation included in the Xbox One, and built robust security measures into their design.
As Windows 10 security advanced, Microsoft announced the Secured-core PC together with their OEMs and partners – including Surface. This set the reference design for how to secure devices and the integrity of the platform from chip-to-cloud.
For Microsoft Surface, security is built into their entire design including the supply-chain, which they supervise and monitor to comply with security and inspection requirements. To ease the worry of security and compliance requirements for any organization, the Surface team built tools for key tasks like managing UEFI and firmware settings, performing image recovery, or securely erasing SSDs along with the ability to print a certificate of an erased disk.
Looking ahead, new AI powered devices could open a world of possibilities of high-performing security tasks without compromising user experience. With NPUs, we’re seeing more efficient and secure processing, keeping data on the device.”
Chauncey Larsen
I can imagine a future where AI enhances the security posture of our devices, and microprocessors like the NPU add efficiencies and minimize the impact of those security processes. We’re already seeing benefits with these recent devices where information inferred by the NPU remains entirely on the device, so that data doesn’t need to traverse the cloud. All great points.
Next up, we have Sung Ki Park, a Surface MVP from Korea who is continually active across social media and events where he represents Surface, Windows and Microsoft 365. Sung Ki, how do you think Surface impacts our remote and hybrid workforce?
Sung Ki Park
“Remote work is booming, the number of digital nomads are on the rise, and various forms of remote work, such as the use of base co-working spaces, work from home, and “workcation,” are becoming increasingly popular. Users need long battery life, stable performance, and features like background noise removal during remote meetings through Microsoft Teams. And these users all need to be protected from the increasing security threats that come with remote work.
All the latest Surface devices raise the bar on essential features for digital nomads and remote workers.
Windows Studio Effects features are available through the Intel® AI Boost NPU included in Intel® Core™ Ultra and the Qualcomm® Hexagon NPU included in the Qualcomm Snapdragon® X Elite and Plus. This allows you to use background removal, eye contact, and automatic framing functions. And since these tasks are handled by the NPU, the burden on the CPU is significantly reduced.
By taking over some of the CPU’s tasks, the NPU lowers the power usage and helps the battery last longer. Also, by decreasing the amount of heat production, it avoids performance loss due to throttling.
Additionally, the Surface Pro 10 and Surface Pro (11th edition) will be available in 5G models, enabling remote working in co-working spaces, cafes, or hotel lobbies without Wi-Fi.
On Surface Pro 10 and Surface Pro (11th edition), the 1440p Quad HD camera with ultrawide field of view allows you to capture not only yourself but also several attendees in a remote meeting during remote work or workcation.
Windows Hello face authentication and NFC authentication added to Surface Pro 10 and Surface Pro (11th edition) can prevent anyone else from unlocking your PC.
Hardware-based TPM 2.0 and the Secured-core PC design help mitigate hardware security threats, and Windows Update helps you respond to the latest security threats through regular definition updates and periodic firmware updates.
In addition, Dolby Vision™ IQ for display and Dolby® Atmos® support for audio can greatly enhance the movie and music listening experience at leisure and relaxation.
In this way, the AI PC-based or Copilot+ PC-based Microsoft Surface provides the essentials for remote workers and digital nomads where work and leisure coexist. So, if you need to purchase a new PC, it will be a necessity, not an option.”
Chauncey Larsen
Here in the US, I recently had the opportunity to travel to a few events on the West Coast, and it never ceases to amaze me how crystal clear my meetings are – both from my perspective, but also from the folks on the other end. Throw in background blur and automatic framing, and I had no issues taking critical calls in my hotel room. I agree that this is one of the areas Surface really shines – mobility.
Next up, let’s hear from Anand Narayanaswamy, one of our newest Surface MVPs based in India. He’s got some thoughts on the Copilot key and what that means for engaging with AI.
Anand Narayanaswamy
“It was a big day for tech enthusiasts on January 4, 2024, when Microsoft announced the introduction of a new Copilot key to Windows 11 PCs. The Copilot key is one of the biggest changes to the Windows PC keyboard in nearly three decades. No doubt, the Copilot key will help users be productive since you only need to tap on the key to work with Copilot in Windows 11. Just like Copilot in Microsoft Edge or on Copilot.Microsoft.com, Copilot combines powerful large language models with the intelligence of the Microsoft Graph to help you synthesize data from multiple sources, give you summaries on things you missed across the web and more. When you combine Copilot with Microsoft 365 data, Copilot in Windows enables you to query your meetings, emails, and files and do more with the data most important to you. Simply put, the Copilot key makes it super easy to utilize AI without distraction.”
Chauncey Larsen
When I first heard about the new key, to be honest, I was surprised, if not a bit skeptical. Since I started using Surface Pro 10, the Copilot key has become critical in my daily usage. To your points, I use it to make basic searches, but I’ve also just started to dabble in how Copilot in Microsoft 365 can make my job easier as a marketer – and it’s changed so many of my work habits!
To round out the insight in our first MVP co-written blog, I’m going to introduce Rob Quickenden, one of our Surface MVPs in the UK. I’d love to read your perspective on what makes AI PCs important in supporting our customer’s needs.
Rob Quickenden
“So, you might be asking ‘yeah ok, but why should I invest now rather than wait until my devices are naturally up for a refresh?’
Well, as explained in this post, this new class of AI PC stands out from the previous generation of devices. For businesses implementing an AI strategy, AI PCs can make a considerable difference. Here are my top three game changers:
Dedicated AI processors: AI PCs equipped with NPUs, executing AI models locally and in real time, enhance performance and efficiency. By doing this on an NPU, we also reduce the impact and load on the CPU, ensuring AI workloads don’t impact performance of non-AI workloads.
Enhanced AI capabilities: While some AI tasks can be executed by a conventional CPU in software (for example Windows Studio Effects), AI PCs can simply handle tasks like visual and audio inferencing, live transcription, and language processing much faster and more efficiently by offloading workloads to the device’s NPU.
Performance and battery life: The integration of NPUs leads to improved processor efficiency, extended battery life, and increased security, making AI PCs more effective for users.
AI-accelerated experiences: As the wave of AI increases in sophistication and development, developers WILL start to build apps and services that require an NPU (we already see this in some Windows AI features today). AI PCs enable new AI-powered experiences and features, which are quicker and unlock additional NPU-specific functionalities, providing a more responsive operating system and applications.
While Copilot in Windows, Edge, and Microsoft 365 may function adequately on a Surface Pro 9 or Surface Laptop 5 (or other older device), these AI PC models provide a significantly enhanced and tightly integrated experience with AI-driven applications. This enhancement can be transformative for businesses aiming to maintain a competitive edge. By investing now rather than waiting, organizations can promptly leverage these advantages. Considering that this represents the future of computing, opting for this hardware is a logical choice for any forward-thinking organization.
Surface Pro 10 and Surface Laptop 6 represent the current flagship AI PCs (using the latest Intel chipsets) and have been a game changer in the tech industry, bringing AI capabilities to Surface for Business devices. Equipped with advanced AI capabilities, these devices provide seamless multitasking, intelligent predictions, and personalized experiences, all enabled by new AI features in Windows 11.
We then have the newly announced Copilot+ PCs: Surface Laptop (7th edition) and Surface Pro (11th edition). These are powered by the latest Qualcomm Snapdragon X Series Arm-based processors and run Windows on Arm. These devices go beyond the capabilities of our AI PCs thanks to powerful new dedicated NPU processors designed to run the next generation of AI applications with local off-loading.
Ultimately, choosing between a Copilot+ PC and an AI PC depends on user needs. Some organizations still require an Intel-based device for x86/x64 app compatibility and management. Other organizations will look to the sheer AI horsepower in this new generation of Copilot+ PCs as the deciding factor. Both have their strengths, and the decision over which blend of devices to deploy ultimately comes down to what the organization and user value most in their device, the apps they use, and, potentially, the apps that leverage the new NPUs locally.
Above all, we have not seen this much innovation in PC hardware architecture for more than a decade, and with IDC predicting that 2024 will be a year of significant growth in end-user computing, there’s never been a better time to choose a Surface!”
Chauncey Larsen
Completely agreed – what an amazing time to be a PC owner. Having been with Surface for over a decade, I feel the excitement and opportunity that the AI PC represents. We are on the precipice of something major – a fundamental shift in the way that we interface with data and in the way that we are productive. I see our silicon partners joining us on this journey and building innovative technologies to bring these AI experiences to life.
Thank you so much for everyone’s thoughts in this blog. My goal is to continue this series and hear from our MVPs throughout the year, covering topics from AI to security to design. And don’t forget to follow our Surface MVPs—they’re always up to amazing things in our community.
Thanks for reading!
New on Microsoft AppSource: June 14-20, 2024
We continue to expand the Microsoft AppSource ecosystem. For this volume, 114 new offers successfully met the onboarding criteria and went live. See details of the new offers below:
Get it now in our marketplace
Audiencerate CDP (Customer Data Platform): Audiencerate CDP simplifies customer data management, enabling brands to leverage their first-party data effectively. Point-and-click modeling and no-code audience creation streamline complex data processes. Monitor campaigns and achieve cross-device reach within the EU in real time. Adhere to privacy standards while engaging with your audience. Ideal for marketing professionals and brands seeking to efficiently utilize their first-party customer data for targeted, impactful marketing campaigns.
Document360: Document360 is an AI-powered knowledge base platform for creating and hosting documentation. It offers a self-service platform for building and managing knowledge bases, product documentation, manuals, SOPs, and wikis. It is useful for customer support teams, product owners, technical writers, and developers for various documentation needs.
iGTB Contextual Banking eXperience (CBX): CBX is a digital engagement banking platform that covers all domains of corporate banking. It provides over 400 user journeys as microservices and UI components, enabling banks to accelerate customer self-service and up-sell and cross-sell their services. CBX uses AI and machine learning to optimize transactions and deliver key insights to its bank customers.
Office Equipment: Office Equipment is an app that streamlines the ordering process for office supplies. It allows for comprehensive management of company resources, real-time notifications, and improved quality of reservation. Employees can choose the type and quantity of material they want to order, and the app centralizes the ordering process for convenience.
Office Reservation App: The Office Reservation app helps businesses manage hybrid work preferences by allowing employees to reserve seats in one or more locations. It optimizes space utilization, fosters in-person collaboration, and provides real-time views of who is attending an office on a given day. The app offers easy seat reservation, real-time data and analytics, transportation and parking management, customizable settings, enhanced administrative control, performance insights, and multi-office management. The benefits include cost efficiency, productivity enhancement, and operational efficiency.
Quick Cards Web Part for SharePoint (SaaS): The Quick Cards Web Part transforms list data into visually stunning and engaging cards with customizable content configuration and integration with SharePoint themes. It allows users to create visually impactful experiences without needing advanced technical knowledge.
Snow Atlas: Snow Atlas is a cloud-native platform that provides complete visibility of your IT landscape, including on-premises and cloud infrastructures, software-as-a-service applications, and more. It normalizes and augments data to present insights that help you make faster, more informed technology decisions.
Whizible: Whizible connects all the dots in an organization’s IT landscape and automates the entire value chain from efforts to cash. It streamlines businesses, saves costs, and makes existing managers more productive, accountable, data-driven, and free to generate more revenue. It covers engagement management, projects, financials, and more in a single tool.
Go further with workshops, proofs of concept, and implementations
ACP User Training for Copilot for Microsoft 365: 1-Day Workshop: Copilot for Microsoft 365 is easy to use, but training is necessary to fully utilize its potential. ACP’s training covers generative AI, prompting frameworks, and specific uses for Outlook, Word, Excel, PowerPoint, Teams, and Microsoft 365 Chat. The training is tailored to your needs and includes use cases.
App in a Day: Workshop: Intellias offers a hands-on training workshop for developing no-code/low-code applications using Microsoft Power Apps. The training covers process automation and integration with Microsoft Teams and Outlook. The workshop is suitable for companies struggling with rapid application development, IT talent shortages, and budget constraints. Intellias provides expert guidance and practical exercises. The training outcomes include reduced time to market, talent gap bridging, lowered development costs, and improved collaboration between business and IT.
Climb Document Analysis AI Solution: 6-Month Implementation: This AI-based document analysis solution automates and improves document management, reducing response times, increasing efficiency, and ensuring effective and transparent information management. It includes an AI engine, advanced algorithms for information extraction and categorization, a document management system, and a web application. Euro Informatica will integrate the solution with Microsoft Azure and Microsoft 365 for security, scalability, and efficient data and application management. This offer is available in Italian.
Copilot for Microsoft 365: 4-Month Implementation: Wavestone offers this implementation of Copilot for Microsoft 365, using generative AI to unlock its full potential. Wavestone provides technical, functional, and cybersecurity setup, identifies target groups and use cases, and offers comprehensive change management. Wavestone aims to empower businesses to reach new heights with Microsoft 365.
Dynamics 365 & the Power Platform: Implementation: Sirocco offers strategic road mapping and implementation services for Microsoft Dynamics 365 Sales. Sirocco assists with complex deployments, data migration, and integration, while also enabling citizen developers to use Microsoft Power Apps and Power Automate.
Infrastructure Optimization: 3-Week Proof of Concept: ELEKS’ Infrastructure Optimization service enhances your Microsoft Azure environment performance while minimizing costs. It utilizes Azure’s Autoscaling capabilities and integrates with Azure DevOps to streamline CI/CD pipelines. The service provides comprehensive analysis of resource utilization, cost, performance metrics, scaling patterns, and deployment efficiency. The final report includes an executive summary, resource optimization recommendations, cost savings opportunities, performance improvement plans, CI/CD pipeline enhancements, and autoscaling adjustments.
Kraft Kennedy Readiness for Microsoft Copilot: Workshop: Kraft Kennedy offers solutions for Microsoft AI to enhance productivity and security measures for businesses. Kraft Kennedy can conduct design and planning sessions to educate you on Microsoft Copilot for Microsoft 365 and GenAI, review existing policies, and configure your Microsoft 365 environment for successful implementation.
Microsoft 365 Copilot: 1-Week Workshop: Uni Systems offers specialized workshops to help organizations unlock the full potential of Microsoft 365 Copilot. They assess licensing, technical readiness, data protection, and showcase Copilot’s benefits to empower organizations to harness AI-driven capabilities effectively. The workshops also explore specific use cases where Copilot can provide significant benefits.
Microsoft Sentinel Log Management and Threat Detection: Implementation: AVASOFT offers consulting services to help organizations enhance their use of Microsoft 365 with Microsoft Sentinel for log management and threat detection. The methodology includes defining goals, designing architecture, developing and testing solutions, and deploying policies. The services include EDR inventory management, assessment, incident and alert detection policies, compliance and reporting, and around-the-clock support. The key benefits are enhanced threat detection, streamlined incident response, and improved visibility. The deliverables include seamless integration, enhanced incident response, and simplified compliance assurance.
Navtilus Extended Producer Responsibility for Packaging Setup: Workshop: Navtilus’s workshop provides comprehensive implementation and setup of the Navtilus Extended Producer Responsibility for Packaging app for Microsoft Dynamics 365, covering basic and advanced packaging details, data export/import, and maximized data utilization. By the end, attendees will be fully equipped to use the app independently and maximize its benefits for their business.
Netways Innov8 for Copilot Studio: Netways INNOV8 offers a six-step approach to help you maximize the value of Microsoft Copilot. It includes deep Copilot expertise, a tailored roadmap, proof of concept, and a clear blueprint for digital transformation.
OnActuate Public Sector Finance Accelerator: 15-Week Implementation: OnActuate will implement its Public Sector Finance Accelerator for Microsoft Dynamics 365 Finance. Designed for government finance departments, the accelerator includes automated workflows, custom security roles, personalized dashboards, and AI-powered features to enhance efficiency and compliance. With 230 automated processes, it modernizes public sector financial operations and improves reporting and transparency.
Power Apps and the Power Platform: 2-Day Workshop: Unlock your team’s potential with Netwise’s online workshop covering Microsoft Power Apps and the Power Platform. Experienced trainers provide a customized approach with practical exercises and real-life examples to ensure maximum benefits. Learn how to create Canvas and Model-Driven apps, utilize Microsoft Dataverse, and streamline workflows with Power Automate. Collaboratively define and plan business solutions tailored to your specific needs.
Power Automate and the Power Platform: 2-Day Workshop: Elevate your team’s understanding and utilization of Microsoft Power Automate and the Power Platform with Netwise’s dynamic online workshop. Led by seasoned experts, this customized approach combines engaging theory with hands-on exercises to ensure practical, real-world skills. Learn how to automate repetitive tasks, streamline processes, and establish governance best practices for secure and compliant automation.
Power Platform App: 1-Day Workshop: Allgeier provides hands-on training for creating custom business apps without coding. Participants learn to create apps that can be shared securely within their organization and work on mobile devices. The workshop covers connecting apps to various data sources and creating complex data relationships. The workshop is available in German in Switzerland.
Power Platform Automation: 1-Day Workshop: Allgeier’s training program for beginners teaches automation using Microsoft Power Automate and Power Automate Desktop. Participants will learn to develop automation based on digital process automation (DPA) and robotic process automation (RPA) in a single day using desktop and cloud flows. The program includes exercises that guide participants through building an end-to-end automation scenario for processing incoming invoices. The workshop is available in German in Switzerland.
Power Platform Power Pages: 1-Day Workshop: Allgeier’s training workshop introduces beginners to website and app creation using Microsoft Power Pages. Participants will learn to create a data-driven website with automated processes and identify when their organizations can benefit from using Power Pages. This workshop is available in German in Switzerland.
Protect and Manage Data on Personal Devices Using Intune: Implementation: AVASOFT offers consulting services to help businesses utilize Microsoft Intune for enhanced device security and simplified management. This offer includes defining business requirements, designing and developing solutions, and deploying them to the entire organization. AVASOFT provides expert assistance for troubleshooting, optimization, compliance, and proactive maintenance. Deliverables include thorough security assessments, proactive maintenance, performance optimization, and on-demand technical assistance.
Security Managed Service and Incident Response: 1-Month Consultation: IT Partner’s managed service enhances Microsoft 365 by offering comprehensive protection and response capabilities against various information security incidents. It provides monthly subscription-based support and updates to leverage the latest features and improvements in Microsoft 365.
Contact our partners
Access Charity CRM Add-in for Excel
Advania Identity and Cost Control for Microsoft 365
Basware ICreative API Plugin for Dynamics 365
Benchmark Mineral Intelligence Add-in for Excel
BriteBlue Wholesale365 Order to Cash
Call Sentiment Analysis App for Teams
Contract Invoicing and Management
Copilot for Microsoft 365: 4-Hour Readiness Assessment
Corporate Social Responsibility App by Embee
CYBERN – CoE Governance Solution for the Power Platform
DMS and ECM Interface for ELO Digital Office
Enterprise Internal AI Chat Services
Expense Reimbursement Template
Fortified WISdom Enterprise – Database Workload Intelligence Solution
HexaSync BigCommerce Integration for Dynamics 365 Business Central
HexaSync Shopify Integration for Microsoft Dynamics 365 Business Central
Industrial Predictive Asset Intelligence
Lasso Security for Model Protection
m+m Tool Management with Production Order Job Overview
Quick Cards Web Part for SharePoint
Vinea for Winegrowers and Vineyards
Zeta Alpha Neural Discovery Platform
This content was generated by Microsoft Azure OpenAI and then revised by human editors.
Microsoft Tech Community – Latest Blogs
Microsoft Partner Agreement (“MPA”) and Microsoft Cloud Solution Provider (“CSP”) notice of suspension
We just got this email from Microsoft:
Microsoft Partner Agreement (“MPA”) and Microsoft Cloud Solution Provider (“CSP”) notice of suspension and termination proceedings
This is regarding your company’s Microsoft CSP account with tenant ID XXXX
The problem is that behind this ID is our Azure subscription with all our services and data. There is no option to reply to the email to understand what happened.
We are a small company from the US working on SaaS products. We have CSP agreements from 2018 that are renewed each year.
I checked what could be wrong and went through all of the following:
The account seems ok and is verified
Security in Entra ID is ok; security defaults are enabled
We tried to ask for support, but the issue is unclear. They mention:
In the Microsoft AI Cloud Partner Program Agreement, both Microsoft and our partners reserve the right to walk away from the partner relationship by providing 30 days’ notice to the other. Neither party is required to offer an explanation for the decision to terminate the partner agreement. As Microsoft is exercising its rights under this section 4.b of the Microsoft AI Cloud Program Agreement, we are unable to share an explanation or further details.
Can anyone help here?
Cosmos DB for MongoDB help me
I am getting the error “Cannot create unique index when collection contains documents.” I am using Azure Cosmos DB for MongoDB with the Continuous (7 days) backup policy. How do I fix it?
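For context on this error: Cosmos DB for MongoDB only allows a unique index to be created while the collection is empty, and accounts using continuous backup generally must define unique indexes at collection-creation time. A common workaround is to create a new collection with the unique index first, then copy the data across after confirming no duplicate key values exist. The duplicate check below is a minimal sketch in plain Python for illustration (the field name `email` and the sample documents are hypothetical); in practice you would run the equivalent `$group`/`$match` aggregation in mongosh against the real collection.

```python
from collections import Counter

def find_duplicate_keys(documents, field):
    """Return the values of `field` that appear in more than one document.

    Any value returned here would make a unique index on `field` impossible
    to create, so such duplicates must be merged or removed before migrating
    the data into a new collection that carries the unique index.
    """
    counts = Counter(doc.get(field) for doc in documents)
    return [value for value, n in counts.items() if n > 1]

# Hypothetical sample data standing in for the existing collection.
docs = [
    {"_id": 1, "email": "a@example.com"},
    {"_id": 2, "email": "b@example.com"},
    {"_id": 3, "email": "a@example.com"},  # conflicting key value
]

print(find_duplicate_keys(docs, "email"))  # → ['a@example.com']
```

Once the data is clean, create the new collection with the unique index in place (for continuous-backup accounts this needs to happen at creation time), copy the documents over, and repoint the application or rename the collection.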
Libraries: Collapsed Column Categories as Default View Option – Help Needed
Hello,
In a library, is there a view setting that makes the default view display grouped (categorized) columns in the collapsed state?
If not, are there available programmed solutions you could recommend?
FYI: I am a site owner using the Modern Experience and found a (very) loosely related solution for Lists in the following thread: SharePoint list view
Thanks,
Clint