Category: Microsoft
My computer has crashed multiple times
My computer (Windows Insider program) has crashed multiple times, and I’ve lost count of how many times I’ve had to restart it. The crashes seem to occur randomly, and I’ve noticed that they often happen when I’m in the middle of using certain applications or browsing the web. Sometimes, the device will freeze and become unresponsive, and other times, it will simply shut down without warning.
I’m hoping someone can help me identify the cause of the issue and provide a solution.
Architecture of Azure Database Migration Service
Azure Database Migration Service (DMS) is a fully managed service designed to enable seamless migrations from multiple database sources to Azure data platforms with minimal downtime. It powers the “Azure SQL Migration” extension for Azure Data Studio, and can also be used via the Azure portal, PowerShell, and the Azure CLI.
This article explains the architecture of Azure Database Migration Service and how DMS performs migration in different scenarios.
How does a normal migration work?
There are two ways to migrate a database from a source to an Azure SQL target:
Physical migration: In a physical migration, backups from the source databases are used to perform a restore on the Azure SQL target databases. DMS uses physical migration when migrating to SQL Server on Azure VM and Azure SQL MI.
Physical Migration
Logical migration: In a logical migration, data rows are read from the source database tables and then inserted into the target Azure SQL database tables. DMS uses logical migration when migrating to Azure SQL DB.
Note:
For logical migrations (for example, to a target Azure SQL DB), schema migration is a prerequisite before proceeding with data migration.
Logical Migration
How does DMS architecture work?
Overview
Azure Database Migration Service (DMS) is an Azure service that orchestrates migration pipelines to perform data movement activities from an on-premises environment to Azure. When a customer creates a Database Migration Service instance (in the customer’s subscription), it associates itself with an Azure Data Factory pipeline (in Microsoft’s subscription).
Using this DMS instance, customers can start, monitor, and complete or cancel the migration. While DMS functions as an orchestrator, it uses Azure Data Factory’s self-hosted integration runtime (the migration agent) when backup files are placed on a local SMB file share, to copy them to an Azure blob container or to perform a logical migration to Azure SQL DB (bulk reading data from the source and writing it to the target). Customers can install the SHIR on a local machine (near the source), register it using DMS-provided authentication keys, and associate the SHIR with DMS.
Once backups are copied to, or are already present in, an Azure blob storage container, DMS communicates with the target’s (Azure SQL VM or Azure SQL MI) resource provider restore service to restore these backups from Azure blob storage to the respective target.
Architecture components:
The following list describes the various components of DMS architecture:
Source SQL Server: An on-premises instance of SQL Server in a private cloud, or an instance of SQL Server on a virtual machine in a public cloud. SQL Server 2008 and later versions on Windows or Linux are supported.
Target Azure SQL: Supported Azure SQL targets are Azure SQL Managed Instance, SQL Server on Azure Virtual Machines (registered with the SQL infrastructure as a service extension in full management mode), and Azure SQL Database.
SMB (backup) file share: A Server Message Block (SMB) network file share where backup files are stored for the databases to be migrated. Azure storage blob containers and Azure storage file shares are also supported.
Client Tools: Azure DMS can be used from different client tools, such as the “Azure SQL Migration” extension for Azure Data Studio, the Azure portal, PowerShell, and the Azure CLI.
Azure Storage Account: The blob storage account in the customer’s subscription is used to write the backup files from the source; the restore on the target then reads from this blob storage container.
Azure Data Factory: Database Migration Service is associated with an Azure Data Factory pipeline. The ADF pipeline acts as a placeholder for activities triggered by the DMS migration workflow and provides the capability to register and monitor the self-hosted integration runtime.
Self-hosted integration runtime (SHIR): Install a self-hosted integration runtime on a computer that can connect to the source SQL Server instance and the location of the backup files, along with connectivity to Azure blob storage and other Azure resources. Database Migration Service provides the authentication keys and registers the self-hosted integration runtime. The SHIR acts as the compute for the DMS migration’s data movement and can be scaled up to a maximum of 4 nodes.
Note:
A self-hosted integration runtime is needed only if customers have backup files on a local SMB file share or want to perform a logical migration to Azure SQL DB.
Target resource providers and Restore Service: A target resource provider is a collection of REST operations that enables functionality for an Azure service, such as the restore service on the target Azure SQL VM or MI. This restore service is responsible for scanning the backups in the Azure blob container, validating them, creating a restore plan, and then restoring these backups on the target. For Azure SQL VM, it is part of the SQL Server IaaS Agent extension; for Azure SQL MI, it is the Log Replay Service (LRS).
Architecture of Azure Database Migration Service
The following diagram illustrates the DMS architecture workflow:
DMS architecture
The number labels in the workflow define the migration sequence for a given scenario (as stated in the key at the bottom right).
Below is a detailed description of the architecture workflows for each migration scenario:
A. Migration to Azure SQL VM or SQL MI from Azure blob storage:
1. Customers can use different client tools – the “Azure SQL Migration” extension for Azure Data Studio, the Azure portal, PowerShell, or the Azure CLI – to create Azure Database Migration Service and migrate with it. They can also perform pre-migration activities, such as assessment and SKU recommendations, using the Azure SQL Migration extension, PowerShell, or the Azure CLI.
2. Once a customer creates the DMS instance in their subscription, they can use it to perform migrations. DMS also creates an ADF instance in the background, which runs in Microsoft’s subscription.
3a. Since in this scenario the customer has already placed the database backups (full, differential, and T-log) in Azure blob storage, DMS communicates these details to the target resource provider’s restore service.
4a. The restore service scans the Azure blob storage container, validates the backups, and creates a restore plan based on the available backup files.
Note:
It is recommended not to delete any backup from the blob container until the migration completes, as doing so impacts the restore plan.
Keep each database’s backups in a single folder, with the folder in the root directory of the blob container. Alternatively, if backups are stored directly in the root directory, they must correspond to one database only.
5a. Once the restore plan is created, the target’s restore service takes only the required backup files and skips those that are not relevant. For example, if there are two full backups of the same database, the restore service skips the older one.
6a. The backups are then restored to the target with the NORECOVERY option. For an online migration, once the customer initiates the cutover, the restore service looks for any remaining backups, restores the last backup with RECOVERY, and brings the target database online. For an offline migration, as soon as it finds the “last restore file” specified by the customer (during configuration of the DMS migration), it performs the restore with RECOVERY and brings the target database online.
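The restore-plan selection in steps 4a–6a can be sketched as follows. This is an illustrative simplification, not DMS’s actual implementation: the `BackupFile` type, file names, and the use of a simple sequence number in place of real LSNs are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class BackupFile:
    name: str
    kind: str   # "full", "diff", or "log"
    lsn: int    # position in the backup chain (higher = newer); stand-in for a real LSN

def build_restore_plan(backups):
    """Pick the newest full backup, then the newest differential based on it,
    then every log backup taken after that point. Older or irrelevant files
    are skipped, mirroring steps 4a-5a."""
    fulls = [b for b in backups if b.kind == "full"]
    if not fulls:
        raise ValueError("no full backup found")
    full = max(fulls, key=lambda b: b.lsn)                  # skip older full backups
    plan, start = [full], full.lsn
    diffs = [b for b in backups if b.kind == "diff" and b.lsn > full.lsn]
    if diffs:
        diff = max(diffs, key=lambda b: b.lsn)
        plan.append(diff)
        start = diff.lsn
    plan += sorted((b for b in backups if b.kind == "log" and b.lsn > start),
                   key=lambda b: b.lsn)                     # logs after the diff
    return plan

files = [BackupFile("full_old.bak", "full", 1),
         BackupFile("full_new.bak", "full", 5),
         BackupFile("diff1.bak", "diff", 7),
         BackupFile("log1.trn", "log", 8),
         BackupFile("log_stale.trn", "log", 3)]
print([b.name for b in build_restore_plan(files)])
# -> ['full_new.bak', 'diff1.bak', 'log1.trn']
```

Note how `full_old.bak` and `log_stale.trn` are dropped from the plan, just as the restore service skips an older full backup of the same database.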
B. Migration to Azure SQL VM or SQL MI from local SMB backup file share:
1. Customers can use different client tools – the “Azure SQL Migration” extension for Azure Data Studio, the Azure portal, PowerShell, or the Azure CLI – to create Azure Database Migration Service and migrate with it. They can also perform pre-migration activities, such as assessment and SKU recommendations, using the Azure SQL Migration extension, PowerShell, or the Azure CLI.
2. Once a customer creates the DMS instance in their subscription, they can use it to perform migrations. DMS also creates an ADF instance in the background which runs in Microsoft’s subscription.
3b. To copy the backup files from the local SMB backup file share to the Azure blob storage container, DMS informs the ADF pipeline and requests a copy activity.
4b. This copy activity is carried out by the SHIR, which acts as a migration agent running on a computer near the source. To do so, the SHIR node must be able to communicate with the source and with Azure resources.
Note:
The connectivity required for SHIR to function is mentioned here.
For SHIR related best practices, refer here.
5b. The SHIR then reads the metadata of the backup files from the local SMB backup file share, ensuring the backup files belong to the specified database. After copying the backup files, the SHIR keeps scanning the file share for new backup files at 1-minute intervals.
6b. These backup files are then transferred to an Azure blob storage container.
7b. The restore service scans the Azure blob storage container, validates the backups, and creates a restore plan based on the available backup files.
Note:
It is recommended not to delete any backup from the blob container until the migration completes, as doing so impacts the restore plan.
Keep each database’s backups in a single folder, with the folder in the root directory of the blob container. Alternatively, if backups are stored directly in the root directory, they must correspond to one database only.
8b. DMS informs the target resource provider’s restore service after copying finishes. As per the restore plan, the target’s restore service restores only the required backup files and skips those that are not relevant. For example, if there are two full backups of the same database, the restore service skips the older one.
9b. The backups are then restored to the target with the NORECOVERY option. For an online migration, once the customer initiates the cutover, the restore service looks for any remaining backups, restores the last backup with RECOVERY, and brings the target database online. For an offline migration, as soon as it finds the “last restore file” specified by the customer (during configuration of the DMS migration), it performs the restore with RECOVERY and brings the target database online.
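Steps 4b–6b amount to a copy loop that rescans the share for new backup files roughly every minute. A minimal sketch of that detection logic follows; the function names, file extensions, and injected callbacks are hypothetical stand-ins, not the SHIR’s real interface.

```python
import time

def find_new_backups(share_files, already_copied):
    """Return backup files present on the share that have not yet been
    copied to the blob container (mirrors the 1-minute rescan in step 5b)."""
    return sorted(f for f in share_files
                  if f.endswith((".bak", ".trn")) and f not in already_copied)

def copy_loop(list_share, copy_to_blob, stop, interval=60):
    """Poll the share and copy any new backup files until told to stop.
    list_share() lists files on the SMB share; copy_to_blob(f) uploads one
    file to the blob container; stop() signals cutover/completion."""
    copied = set()
    while not stop():
        for f in find_new_backups(list_share(), copied):
            copy_to_blob(f)          # carried out by the SHIR node
            copied.add(f)
        time.sleep(interval)         # rescan interval, per step 5b
    return copied
```

In practice the SHIR also validates that each file belongs to the database being migrated (step 5b), which this sketch omits.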
C. Migration to Azure SQL DB:
1. Customers can use different client tools – the “Azure SQL Migration” extension for Azure Data Studio, the Azure portal, PowerShell, or the Azure CLI – to create Azure Database Migration Service and migrate with it. They can also perform pre-migration activities, such as assessment and SKU recommendations, using the Azure SQL Migration extension, PowerShell, or the Azure CLI.
2. Once a customer creates the DMS instance in their subscription, they can use it to perform migrations. DMS also creates an ADF instance in the background which runs in Microsoft’s subscription.
3c. Since this is a logical migration, DMS informs the ADF pipeline to initiate the migrate-schema and/or copy-data-rows activity for the selected tables, based on the configuration provided by the customer.
4c. This activity is executed by the SHIR, acting as a migration agent and running on a local computer near the source. To do so, the SHIR node must be able to communicate with the source and with Azure resources.
Note:
The connectivity required for SHIR to function is mentioned here.
For SHIR related best practices, refer here.
5c. The SHIR reads the schema and data rows from the selected tables, using SQL bulk copy to read the data.
Note:
To improve the Azure SQL DB migration performance, refer here.
6c. The SHIR is also responsible for writing this data to the target Azure SQL DB using bulk insert. On the target Azure SQL DB, schema migration is performed first (if configured by the customer), and then the data row movement starts. Once all the data rows are inserted on the target, DMS performs post-copy activities, such as creating indexes, before finishing the migration.
Note:
For a target Azure SQL DB, schema migration is a prerequisite before proceeding with data migration.
If an error occurs while migrating the schema and the error does not belong to a selected table, the schema migration continues, followed by data row migration.
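The bulk read/insert in steps 5c–6c can be sketched as a batched copy. The helpers below are hypothetical stand-ins for SQL bulk copy on the source and bulk insert on the target; the batch size is an illustrative choice, not a DMS parameter.

```python
def copy_table(read_rows, write_batch, batch_size=1000):
    """Stream rows from the source table and write them to the target in
    batches, as in the bulk copy / bulk insert of steps 5c-6c."""
    batch, total = [], 0
    for row in read_rows():                # bulk read from the source table
        batch.append(row)
        if len(batch) == batch_size:
            write_batch(batch)             # bulk insert on the target Azure SQL DB
            total += len(batch)
            batch = []
    if batch:                              # flush the final partial batch
        write_batch(batch)
        total += len(batch)
    return total

# usage sketch with in-memory stand-ins for source and target tables
source = [(i, f"name{i}") for i in range(2500)]
target = []
rows_copied = copy_table(lambda: iter(source), target.extend, batch_size=1000)
print(rows_copied)   # -> 2500
```

Batching keeps memory bounded regardless of table size, which is why bulk copy pipelines work this way; index creation happens afterwards (step 6c) so inserts are not slowed by index maintenance.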
Supported migration modes
The supported migration modes – offline and online – depend on the Azure SQL target.
Azure SQL Database: Offline migration – Yes; Online migration – No
Azure SQL Managed Instance: Offline migration – Yes; Online migration – Yes
SQL Server on Azure VM: Offline migration – Yes; Online migration – Yes
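The mode matrix above can be captured as a small lookup table; the target names are as written in the article, and the helper function is purely illustrative.

```python
# Supported migration modes per Azure SQL target, from the matrix above.
SUPPORTED_MODES = {
    "Azure SQL Database":         {"offline": True, "online": False},
    "Azure SQL Managed Instance": {"offline": True, "online": True},
    "SQL Server on Azure VM":     {"offline": True, "online": True},
}

def is_supported(target, mode):
    """Return True if the given migration mode is supported for the target."""
    return SUPPORTED_MODES.get(target, {}).get(mode, False)

print(is_supported("Azure SQL Database", "online"))   # -> False
```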
Supported SQL server sources
Azure Database Migration Service supports the SQL Server sources listed below, for SQL Server 2008 and later versions.
On-prem SQL Server instance: Supported
SQL Server on private cloud: Supported
SQL Server on VM in public cloud: Supported
AWS RDS SQL Server: Supported
Resources
For more information about the extension and Azure Database Migration Service, see the following resources.
Azure Database Migration Service documentation
Migrate databases using the Azure SQL Migration extension
One-click SQL Migration PoC environment
Ports and Firewall considerations for SHIR
Why is my Windows 11 battery life dropping faster than before?
I’ve been using Windows 11 for a few weeks now, and I’ve noticed that my battery life is dropping significantly faster than it used to. I’ve tried to troubleshoot the issue, but I haven’t been able to find any solutions. I’m using the same laptop and same usage habits as before, but my battery life is now only lasting around 4-5 hours, compared to 6-7 hours on Windows 10. This is a big issue for me, as I need my laptop to last throughout the day.
I’d love to know if there are any specific settings or tweaks that might help improve my battery life. Any advice or guidance would be greatly appreciated!
How do coupon apps boost revenue, and how can businesses maximize their impact?
A coupon and discount app, available on Android, the App Store, and web platforms, holds significant potential for revenue generation. By providing users with access to exclusive deals and discounts from various businesses, it not only attracts more customers but also encourages repeat purchases and brand loyalty. Additionally, such apps can leverage targeted marketing techniques and data analytics to personalize offers, further enhancing their effectiveness in driving sales and revenue growth.
Any recommended UI/UX framework?
Hi,
I am new to .net. May I know if there is any UI/UX framework that is good for .net webpage with search criteria? I would like to have a textbox with the following input methods:
– Exact text match
– Wildcards (*)
– Copy and paste the list
– Enter number range
Thank you!
Azure Automation Managed Identity – No Enterprise App created
I have created an Azure Automation and configured system assigned managed identity.
To manage permissions etc., I thought, based on what I read from others, that I would get an Enterprise App application, but I can’t find it.
Azure SQL Managed Instance: Supercharged for 2024 and Beyond
Zone-Redundant Configuration (Preview): Disaster recovery just got easier. Currently in preview for general-purpose instances, zone redundancy adds an extra layer of protection against regional outages.
Exciting Previews on the Horizon
Azure SQL Database Copilot (Private Preview): Bid farewell to complex queries!
Cross-Subscription Database Mobility: This feature enables you to perform database operations across managed instances. It creates a new database on the destination instance either as a copy or by moving the source database. When copying, the source database remains online, but when moving, it gets dropped after completion.
Keep an Eye Out for Azure SQL Database Copilot (Private Preview): Say goodbye to complex queries! This innovative preview integrates natural language processing with SQL Server and Azure Copilot, offering an intuitive way to write and optimize your queries.
To learn more, see: Azure SQL Managed Instance, SqlManagement Service Tag, Copy or move a database.
Meeting Rooms in Teams User Activity Data (Meetings Organised vs Meetings Participated)
I tried searching for this question on the forum but haven’t been able to find anything related to Meeting Rooms (MMRS).
We have a few Meeting Rooms (MMRS) set up in the business, which help us collaborate with clients and attend meetings, and I wanted to check if we are getting enough usage out of them for the investment we made.
I see there are two data points in the Teams User Activity export which give information on Meetings Organised vs Meetings Participated. Usually, when you want to attend a meeting from these rooms, you need to book them, and they get recorded in Meetings Organised; but what about Meetings Participated? Usually, you can go in, plug in the cable, and start the meeting from the room, and I think those will show up in Meetings Participated. We have 2-3 boardrooms where you can’t just go in to attend a meeting, but they still show Meetings Participated > Meetings Organised.
Can someone tell me on what basis Meetings Participated is calculated, or what it actually means in this scenario?
Microphone not working due to Intel Smart Sound Technology
Hey everyone,
I’m encountering an issue with Windows 11 where the microphone stops working after some time while using my Logitech Zone wireless headset connected via Bluetooth. It’s frustrating: as a workaround to get it functioning again, I have to restart my computer. This isn’t an isolated incident either; some of my colleagues at work are experiencing the same problem.
I’ve come across a suggestion online to disable and enable the Intel Smart Sound Technology EOD driver as a temporary fix instead of resorting to a full restart. However, I’m looking for a more permanent solution to this problem.
Any ideas on how to resolve this issue without needing a workaround like this would be greatly appreciated.
Thanks in advance for your help!
How to achieve multi-versioning/backward compatibility for .NET 8 APIs?
What is an effective way to achieve multi-versioning, versioning, and backward compatibility for .NET 8.0 APIs? These APIs are going to be deployed to an AKS cluster as microservices.
Azure OpenAI Whisper From Power Automate
Hi all,
I’ve successfully created a working Whisper model in Azure OpenAI Service (tested at Speech Studio – Whisper Model in Azure OpenAI Service) to do audio transcriptions. I’ve had the exact model working via a Custom Connector to the regular OpenAI version of Whisper, but I need direct access via Azure to avoid needing to give guest users of my app the API key.
When I try to call it via HTTP in Power Automate, I get an ‘UnresolveableHostName’ error and the 502 error in the image below.
The HTTP output is:
{ “statusCode”: 404, “headers”: { “apim-request-id”: “1859e66c-6662-470e-8844-ec0f4053ac86”, “Strict-Transport-Security”: “max-age=31536000; includeSubDomains; preload”, “X-Content-Type-Options”: “nosniff”, “Date”: “Tue, 04 Jun 2024 04:35:01 GMT”, “Content-Length”: “56”, “Content-Type”: “application/json” }, “body”: { “error”: { “code”: “404”, “message”: “Resource not found” } } }
I’m wondering if it is my URI:
https://azureopenaispeechjuneeastus2.openai.azure.com/openai/deployments/wmcazureopenaiwhisper/audio/transcriptions?api-version=001
Speech Service is ‘azureopenaispeechjuneeastus2’
Location / Region is ‘eastus2’
Endpoint is ‘https://azureopenaispeechjuneeastus2.openai.azure.com/’
Deployment is ‘wmcazureopenaiwhisper’
Any thoughts would be appreciated.
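For what it’s worth, Azure OpenAI api-version values are date-based strings, so `api-version=001` in the request URI may be why the resource isn’t found. A sketch of building the transcription URL from the pieces in the post; the api-version value below is an assumption, so check the current Azure OpenAI documentation for a version that supports Whisper:

```python
def transcription_url(endpoint, deployment, api_version):
    """Build an Azure OpenAI audio transcription request URL from the
    resource endpoint, deployment name, and api-version."""
    return (f"{endpoint.rstrip('/')}/openai/deployments/"
            f"{deployment}/audio/transcriptions?api-version={api_version}")

# "2024-02-01" is illustrative; use an api-version documented for Whisper
url = transcription_url("https://azureopenaispeechjuneeastus2.openai.azure.com/",
                        "wmcazureopenaiwhisper",
                        "2024-02-01")
print(url)
```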
Converting MBOX to PST Using Automatic Solution
Want to convert MBOX to PST format? Take help from Advik MBOX to PST Converter. This software will help you import MBOX to Outlook; basically, it converts an MBOX file into an Outlook-importable format, i.e., PST. The software maintains the complete folder structure and emails during MBOX-to-PST conversion. Download the software and try it for free from the official website.
Steps to Convert MBOX to PST for free.
1. Launch Advik MBOX to PST Converter on your PC.
2. Click Add File and add the MBOX file.
3. Choose PST as the saving option.
4. Click the Convert button.
Done! This is how you can start converting .mbox to .pst format.
Convert OLM to PST Using Best Technique
If you want to convert OLM to PST format, you should download and install Advik OLM to PST Converter. This software is designed to convert OLM into PST format and will maintain the folder and subfolder hierarchy. The application supports OLM files exported from Outlook for Mac 2013, 2016, 2019, and 2021 editions.
Steps to Convert OLM to PST format
1. Run Advik OLM to PST Converter on your PC.
2. Click Add File and add the Mac OLM file.
3. Choose PST as the saving option.
4. Click the Convert button.
Finished! This is how the software converts OLM to PST.
Unable to Get Pre Installed Version of Teams
In the middle of class, I faced problems with this error message: “Please hold on…”. The pre-installed version wasn’t opening correctly, so I uninstalled it. After that, I installed it from the Microsoft Store and also from the website, and it installed successfully. But once I quit Teams or shut down the system, after powering the system back on, Teams shows an error message like “This version of Teams only supports Work or School”. Please help me get out of this. Attached are the screenshots:
Problem with the Store and website versions of Teams
Problem with the pre-installed version of Teams
Export Exchange Mailbox to PST Using Automated Method
Want to export an Exchange mailbox to a PST file? This can be done via a professional method, i.e., Advik Email Backup Wizard. With this tool you can export an Exchange mailbox to PST file format for Outlook. This file can easily be transferred into MS Outlook 2013, 2016, 2019, 2021, and other versions. The software will preserve all the mailbox folders and email attributes.
Steps to Export Exchange Mailbox to PST file
1. Run Advik Email Backup Tool on your PC.
2. Select Exchange as the email source.
3. Enter your Exchange account credentials and sign in.
4. Select the mailbox folders to export.
5. Choose PST as the saving option.
6. Click the Backup button.
The software will start exporting Exchange mailbox to PST file.
RE: help with powershell script please
Hi, I need help with a PowerShell script to get all the calendar permissions that a particular user has; it’s an on-prem Exchange. This is my code so far. I need AccessRights, Identity, and User, and I need this data exported to a CSV. It would be good if I could set the parameter to also cover the few users I’m after.
Get-Mailbox | ForEach-Object {
    Get-MailboxFolderPermission "$($_.PrimarySmtpAddress):\Calendar" -User 'user@contoso.com' -ErrorAction SilentlyContinue  # 'user@contoso.com' is a placeholder for the target user
} | Select-Object Identity, User, @{Label='AccessRights'; Expression={$_.AccessRights -join ';'}} | Export-Csv -Path C:\Temp\user.csv -NoTypeInformation
then was told to add this via reddit :-
Select-Object -Property @{Label='AccessRights'; Expression={$_.AccessRights -join ';'}}
Unfortunately they aren’t very helpful or willing to help, which I find rude, so I’m turning to this community. I just want the script; I need it today, please. I haven’t got time to go backwards and forwards with different things; it’s not hard, someone must know how to format what I want. I will work it out later as I learn more scripting. Right now I’m under a time constraint.
Login OneDrive developer … Help me…
Hi!
I have a OneDrive account and have set up two-factor authentication using the Microsoft Authenticator app on my phone for login verification. My phone is currently broken, and I cannot log into my existing account. Could you please assist me in logging in on my computer?
Thank you very much!
Are subsites really necessary in SharePoint Online?
Hello everyone! I hope everyone is well!
Perhaps this is not a technical question so much as one of need and concept.
What is the need for “subsites” in SharePoint Online today? I ask because, given the natural ways of creating and using sites now, they don't seem necessary.
But suppose, in good will, a user still wants a subsite: as the environment administrator, I cannot list that “site”, nor can I see where it consumes storage. So where does it consume space? And before that, what is a subsite really for? And how do you list the ones that exist in the environment, along with their usage, volume, and so on?
It may seem like endless nonsense, and I apologize for bothering you with something so seemingly trivial, but it has taken away my peace of mind over the last few hours.
Thank you very much for your attention
Good Regards
New Real-Time Intelligence in Microsoft Fabric | Event-based actions and insights
Quickly spot real-time indicators of issues as they unfold, without the need to poll or manually monitor changes in your data and without writing a single line of code. That’s what the new Real-Time Intelligence service in Microsoft Fabric is all about. It extends Microsoft Fabric to the world of streaming data across your IoT and operational systems. Whether you are a data analyst or business user, you can easily explore high-granularity, high-volume data and spot issues before they impact your business. And as a data engineer, you can more easily track system-level changes across your data estate to manage and improve your pipelines.
Courtney Berg, from the Microsoft Fabric product team, joins Jeremy Chapman to explore the updates, explain how Real-Time Intelligence builds on what was possible with Data Activator and Microsoft Synapse Real-Time Analytics, and demonstrate how it works to derive insights and take action automatically in a scenario with multiple live data streams across different data types.
Spot issues before they impact business.
Get insights on high-granularity, high-volume data with Real-Time Intelligence in Microsoft Fabric.
Use Copilot to stay updated.
Generate queries swiftly, enabling real-time insights to spot hidden issues and make informed decisions — all within a live and filterable dashboard interface. See it here.
Move from schedule-driven to event-driven.
Real-time alerts and analytics tailored to your business needs. Check out Real-Time Intelligence here.
Watch the full video here:
QUICK LINKS:
00:00 — Real-Time Intelligence
00:54 — How it’s different
02:07 — Eventstream and Real-Time Hub
03:27 — Synapse Real-Time Analytics
04:03 — See it in action
05:25 — Use Copilot to stay updated
06:28 — Filter data and set up alerts
08:25 — Sophisticated Logic and Data Integration
09:48 — Data integration to Real-Time Hub
11:23 — Workflow automation
12:18 — System events
13:30 — Additional areas of use for Real-Time Intelligence
14:02 — Wrap up
Link References
Check out Real-Time Intelligence at https://aka.ms/RealTimeIntelligence
For all things Microsoft Fabric, go to https://microsoft.com/fabric
Unfamiliar with Microsoft Mechanics?
As Microsoft’s official video series for IT, you can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft.
Subscribe to our YouTube: https://www.youtube.com/c/MicrosoftMechanicsSeries
Talk with other IT Pros, join us on the Microsoft Tech Community: https://techcommunity.microsoft.com/t5/microsoft-mechanics-blog/bg-p/MicrosoftMechanicsBlog
Watch or listen from anywhere, subscribe to our podcast: https://microsoftmechanics.libsyn.com/podcast
Keep getting this insider knowledge, join us on social:
Follow us on Twitter: https://twitter.com/MSFTMechanics
Share knowledge on LinkedIn: https://www.linkedin.com/company/microsoft-mechanics/
Enjoy us on Instagram: https://www.instagram.com/msftmechanics/
Loosen up with us on TikTok: https://www.tiktok.com/@msftmechanics
Video Transcript:
– What if I told you you can quickly spot real-time indicators of issues as they unfold, without the need to poll or manually monitor changes in your data and without writing a single line of code? That’s the goal of the new Real-Time Intelligence service, part of Microsoft Fabric’s platform. It extends Fabric to the world of streaming data across your IoT and operational systems. As a data analyst or a business user, you can easily explore high-granularity, high-volume data, and spot issues before they impact the business, and as a data engineer, you can more easily track system-level changes across your data estate to manage and improve your pipelines, and today I’m joined by Courtney Berg, who also helped build Microsoft Fabric. Welcome.
– Hey, Jeremy. Thanks for having me on the show.
– Thank you, and congrats on the availability of Real-Time Intelligence in Microsoft Fabric today. Now, most of us are familiar with event orchestration systems, but this is a lot different than that, so what makes this different?
– Well, this will give you an intelligent, unified, no-code way to listen and analyze real-time changes in your data wherever it lives. So, for example, you might have telemetry data collected from your business systems sitting in other clouds, along with streaming data of your IoT devices on the edge. You can pull from the sources you want and transform and combine the data in real time. And then with interactive real-time analysis, you can explore the data, spot emerging patterns, and isolate that single data point that could be the first indicator of an issue. We make query authoring easier with generative AI to help you quickly discover insights, and you can act faster by establishing rich conditions for active monitoring and defining what should happen next, so whether it’s notifying the right team or triggering automated workflows for system-level remediations. So, you can build your own custom integrated and automated systems to detect changes in data across your estate, analyze these events in context, and trigger early actions, all without writing a single line of code.
– Right, and if we compare that to most event-based systems out there today that are tied to specific data sources and those aren’t listening to changes in data at an aggregate level, this is changing things significantly, so what’s behind that?
– Well, as you mentioned, Real-Time Intelligence is part of Microsoft Fabric, and what it does is it orchestrates the process of being able to ingest streaming and event data in real time, analyze and transform it, and then act on it. The way that it does this is through a number of capabilities. So, first is Eventstream, which lets you bring in data, whether it’s from Microsoft Services, external sources, or even change data feeds from operational databases using available connectors, and a major thing we’re solving for here is how to capture real-time data in motion so that you could directly act on insights while they’re fresh. So here, underlying everything is the new Real-Time hub. This provides a single location for streaming data, as well as discrete event data across your organization happening at a system level. Now importantly, as a central location, it’s also cataloging the data, and it makes it easy to search for and discover real-time data: something that’s been historically difficult. And from here, you can take two paths to make a decision for your available data. So first, as data comes in, you can take immediate action using our Reflex capability, which is part of Data Activator, to look for rich conditions to trigger notifications or specific processes. Secondly, your data can go directly into our Eventhouse that provides a unified workspace to work with all of your data and is optimized for time series data. From there, you can easily query it with KQL and visualize it on our Real-Time Dashboard before acting on it with Reflex.
– So, just to pause you there for a second, so where do existing components for Fabric like Synapse Real-Time Analytics then fit into the picture?
– Yeah, so what we’re doing is we’re removing those tech silos so that we could better orchestrate the entire lifecycle, from capturing, analyzing, and acting on Real-Time Intelligence. We’ve built Real-Time Intelligence on top of proven technologies while adding new functionality. So, capabilities from Synapse Real-Time Analytics and Data Activator in Fabric have been unified, and there’s also a number of real-time streaming analytics and data exploration features from the Azure platform, along with Fabric’s strength in data visualization from Power BI that we bring in under the covers.
– Right, this is really going to provide a familiar experience, while expanding the approach to addressing the specific challenges of working with real-time data.
– That’s right, and all your Entra ID, information protection and governance policies, they all apply here.
– So, can you walk us through an example with all this running?
– Sure, I’m going to show you a scenario focused on business events where we take change data recorded from one system to another that causes a butterfly effect. In this case, we’re a direct-to-consumer food retailer. So, imagine it’s a hot spell and our sales and marketing team want to improve customer loyalty and satisfaction. They’ve come up with this idea of an aggressive discount on ice cream, which sounds like a great idea, but there’s a chain of dependencies with different teams and systems on point, and it’s not until you integrate these systems together that you can catch and react to what’s unfolding in real time. Plus, we want to catch that sweet spot of early indicators, and Real-Time Intelligence will do that for you with zero code. Here in our Real-Time Dashboard, you can see we’ve brought in all the relevant information across multiple systems into one view. Data is coming in from our sales and stock system, which gets updated hourly, I have real-time information from our IoT sensors with the refrigeration temperatures in our stores, and on the right here, I see data from Postgres backend of our mobile delivery app, showing orders ready to be picked up and available drivers, and in fact, we see average temperatures across our freezers are increasing over the last few hours.
– So, just because we have perishable goods here, and like you said, it’s a hot day, we’ve also got logistical dependencies, including different teams that are on point, there is a lot of things that could potentially go wrong here, so how do we get ahead of something like this?
– Yeah, listen closely, because the devil’s in the details. You’ll notice that our dashboard currently shows aggregate numbers across multiple freezers across different departments. So, the first gotcha is I don’t have a view of the individual freezer levels to be able to spot hidden issues with the freezers containing the ice cream, so let’s dig in a little bit more. Now, I can manually query to see what’s going on, but to save time, I’ll ask Copilot to do this for me. I’ll paste in my prompt, “Show the average temperature by department as column chart,” and it generates a KQL query that I can insert to get my chart. This looks pretty useful, so I’ll pin it to a dashboard to keep me updated in real time. Now, I’ll select the existing dashboard, give the tile a name, and add it. I’ll move it where I want it, resize it, and now I have visibility over freezers by each department.
– So, I can tell there’s a lot of different things to look into here, so from the report directly, can we slice and dice that data?
– Yes, everything is filterable live and can be queried on each tile. Let’s explore the freezer data some more. I’m going to drill into the aggregate freezer temperature. This should normally be pretty flat over time. I’ll start by removing the summarization to look at the data across all of the stores. There’s way too much here, so I can aggregate it in different ways. I’ll start with the average temperature, group it by timestamp, and also by department. Now, I can see the frozen dessert department is trending up over time. So we’ve found something interesting in the data. The temperatures in the frozen dessert freezers are moving up, which is not a good thing when you’re dealing with ice cream, so of course, I want to get notified of the issue from any of these tiles. I could set up an alert if a threshold is exceeded, but that wouldn’t be super useful in our case, because it would alert on a change in the aggregate temperature across multiple freezers.
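The drill-down described here is essentially a group-by aggregation. As a rough illustration only (plain Python with made-up readings, not the actual KQL the dashboard runs), averaging temperature grouped by timestamp and department might look like:

```python
from collections import defaultdict

# Hypothetical sensor readings: (timestamp, department, temperature)
readings = [
    ("10:00", "Frozen Dessert", 27.0),
    ("10:00", "Produce", 38.0),
    ("11:00", "Frozen Dessert", 29.5),
    ("11:00", "Produce", 38.2),
]

# Group by (timestamp, department), then average each group's temperatures,
# mirroring "average temperature, grouped by timestamp and by department".
groups = defaultdict(list)
for ts, dept, temp in readings:
    groups[(ts, dept)].append(temp)

averages = {key: sum(temps) / len(temps) for key, temps in groups.items()}
```

With per-department averages like these, a trend such as the frozen dessert freezers warming over time stands out instead of being washed out in the overall aggregate.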
– That makes sense. So, could we zero it in then on the freezers that we care about, maybe the frozen dessert ones that have our ice cream?
– Yeah, absolutely. You can get pretty granular here. To go into particular data streams, the Real-Time hub is the best option. Here, I can find and use all the streaming data in Fabric. I can filter with these options on top and I can search through these streams. I’ll start typing “IoT” to pull up all of my IoT sensors. This top row represents our freezer sensors. In the details, I can see here what other items are using the stream, and then over here on the right, I get to preview the actual events coming in with the details about each event, and from here I can also set an alert. I’ll do that, and at this time, I’ll set my condition to be on event grouped by. In the grouping field, I’ll select FreezerID, in when, I’ll choose temperature, for the condition, I’ll select it becomes greater than, and for the value, I’ll set it to 29. So now, if any freezer goes above that threshold, I can get alerted in Teams, and I’ll use the same workspace as before and the same item name, TemperatureAlerts.
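As a sketch of the condition being configured (not the product’s Reflex engine, just an illustration with hypothetical events; a real trigger would also track the "becomes" transition rather than re-firing on every event), the per-freezer threshold check amounts to:

```python
# Hypothetical events from the freezer sensor stream.
events = [
    {"FreezerID": "D1", "temperature": 28.5},
    {"FreezerID": "D2", "temperature": 29.4},
    {"FreezerID": "P7", "temperature": 27.9},
    {"FreezerID": "D1", "temperature": 30.1},
]

THRESHOLD = 29  # "it becomes greater than 29"

# Evaluate the condition per event, grouped by FreezerID: each freezer
# exceeding the threshold would raise its own alert.
alerts = [e["FreezerID"] for e in events if e["temperature"] > THRESHOLD]
```

Grouping by FreezerID is what turns one aggregate alert into a separate alert per freezer, which is exactly why the earlier dashboard-level threshold wasn’t useful.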
– So now, you can see all these alerts per freezer as they happen.
– Yeah, and the logic can get more sophisticated to look at business conditions and how they’re changing when the alert condition happens over a period of time, not just event by event. So, let’s look here. In fact, if I go into the reflex that was created, I can see the trigger and the individual streams that it’s monitoring. So, it looks like freezers D1 and D2 have gone over our threshold, so we’re seeing a few indicators of issues.
– So, that’s one sign, but you might also recall that we saw some orders and wait times that were trending up in the dashboard, so what can we tell from that?
– Yeah, that’s valuable data. That might give us an early warning from our mobile customer order app. So, if I go back into Real-Time hub and search for our orders data, I can’t find that information yet, so let’s bring that data in from our app’s database. I can add more data to integrate a complete view from Real-Time hub. I click Get Events. You can see all the data sources that you connect to, like Confluent, a few Azure services, Google Pub Sub, and Amazon Kinesis. In this list, there are a few marked CDC, which use the open-source Debezium framework that we host for you. Here, I’ll connect a Postgres database and listen for the changes. From there, you’ll add the connection details, the server, and then the database instance. I need to specify the Eventstream that is going to manage the connection stream. And finally, I’ll put in the name of the table that I want to monitor for changes. In this case, it’s delivery orders. Now, I just need to confirm it and that’s it.
– So, is that then going to streamify our CDC data feed in this case so that we can get real-time updates from our Postgres database?
– Exactly, in just a few steps. In the Eventstream, I can see and transform the events. This is a preview of the list of all the changes to the orders as they come in. If I go into Edit, I can do all sorts of transformations, like aggregate, expand, filter, group, or join the data to integrate multiple events to a cleaned up feed before I publish it back to Real-Time hub. I’ll choose Manage Fields, and now I can select the fields by reaching into that schema under the payload, and then order_id, customer_id, order_type, delivery_type, and waitTime. I’ll refresh, and now preview shows me just those fields. This looks pretty good. It’s the sort of output I need, so I can choose from the stream output to publish it back to the Real-Time hub for my orderWaitTime feed, and now it’s configured.
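As an illustration of the field projection being described (the event shape below is a hypothetical one, loosely modeled on a CDC record where the row data sits under a payload envelope; it is not Eventstream’s internal format), the “Manage Fields” step boils down to keeping a fixed set of fields:

```python
# A hypothetical change event: the row data sits under a "payload" envelope,
# as is common for CDC records.
event = {
    "payload": {
        "order_id": 1041,
        "customer_id": 77,
        "order_type": "delivery",
        "delivery_type": "driver",
        "waitTime": 11,
        "internal_ts": 1715000000,  # extra field we don't want downstream
    }
}

# "Manage Fields"-style projection: reach into the payload and keep only
# the fields of interest before publishing the cleaned-up feed.
FIELDS = ["order_id", "customer_id", "order_type", "delivery_type", "waitTime"]
cleaned = {f: event["payload"][f] for f in FIELDS}
```

The published orderWaitTime feed then carries only these fields, so downstream consumers and alerts don’t have to unwrap the raw change records themselves.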
– Okay, so now the data from the mobile delivery app is flowing into the Real-Time hub.
– Yeah, it just takes a moment for those events to come in. Now, I can try the same search as before in Real-Time hub. I’ll search again for those orders, and then there’s the orderWaitTime feed I just created. When I open it, I get a preview of the number of events that have been generated and flowing through the stream. I can see connected items, the event stream, and things that are subscribing to the events, and I can create alerts directly from here as well. To save time, since I showed you this before, I’ll fast forward and head right over to Data Activator, because I already have my trigger ready for when the wait time crosses nine minutes, and it will send out an email.
– And there’s nothing better than getting an email or a Teams notification, except for maybe a mobile app notification, right?
– It’s funny that you mention that. This is right up your alley: workflow automation. So, in this case, I might change the promotion to something that reduces the number of orders but balances it out increasing the average amount of each order. You’ve probably seen examples, where instead of getting 25% off a single item, the promotion might be something like when you spend $100, you save $25. So, instead of an email, I can actually have it start a Power Automate workflow. In fact, in this tab, I’ve created one here. It listens to when the reflex trigger fires, then it creates an approval, and an Adaptive Card is posted in the Teams channel to start a new campaign. This should reduce the number of smaller customer orders and drive higher value orders so that we don’t tie up our drivers. Back in my reflex item, I just need to change the action that I want to have take over to this custom action for the campaign approval, and that’s all I need to do.
– And that’s going to then kick off a Power Automate flow effectively every time that trigger fires, so does this only work then with business-related events?
– It also works with system events too. Every Fabric item generates a system event, and activity in Azure storage does too. We can use these events just like business events. You’ll remember that our inventory system only updates hourly, and we can make that closer to real time. Starting from the pipeline, I’m going to create a new trigger, and then ask it to listen for a particular event in Real-Time hub. Our inventory stock system writes all the recent transactions into an Azure storage account. I’ll connect to an existing account and choose the correct subscription. Now I’ll choose my storage account, ContosoStockOutput. In Eventstream name, I’ll paste in ContosoStorageEvents. Then I can choose the event types. In my case, I only want the create events, so I’ll deselect everything else. Then I just need to create, and after that, hit Save, and like before, I need to fill in details for the workspace and the new item. I’ll name it EventBasedDataLoad this time, and that’s it. If I head back over to the monitor, you’ll see that the batch sales load succeeded, so now Real-Time Intelligence is listening for events where files are added to my Azure storage account, and will kick off the pipeline automatically. The analytics over the stock system are event driven, so you’ll see updates faster than the hourly poll we had previously.
– And I can see a lot of cases where this would be really useful to Real-Time Intelligence, whether that’s for recommendation engines or for things like generative AI. It could also be used to ground large language models with up-to-date information for a lot more accurate responses.
– Yeah, and of course, you can use system events to start other Fabric jobs. For example, you could run a notebook to train an LLM using real-time streaming data routed into OneLake via Real-Time Intelligence. So, basically any data activity in Fabric can now be event driven rather than scheduled.
– So, it’s really great to see all the updates for Microsoft Fabric, so where can all the folks watching right now go to learn more?
– Yeah, Real-Time Intelligence is in public preview today, and you can learn more at aka.ms/RealTimeIntelligence, and for all things Microsoft Fabric, check out microsoft.com/fabric.
– Thanks so much for joining us today, Courtney, and of course, keep watching Microsoft Mechanics for all the latest tech updates. Be sure to subscribe if you haven’t already, and as always, thank you for watching.
Inconsistent Transcript File Formats in Microsoft Teams
I downloaded transcript files from two recorded meetings in Microsoft Teams.
However, the formats of the two files are different.
The first one is as follows:
00:00:00.000 --> 00:00:00.810
<v {SPEAKER_NAME}>Hello.</v>
The second one is as follows:
00:00:00.000 --> 00:00:00.810
Hello.
Why doesn’t the second file include the speaker’s name?
What could be the possible reasons for this?
The organizers of the meetings were different. Could it be related to the version of Teams that the meeting organizers were using?
Please let me know.
Thank you.
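For what it’s worth, both snippets are valid WebVTT cue text: the first wraps the line in a `<v>` voice tag that carries the speaker name (the `{SPEAKER_NAME}` placeholder in the example), while the second is plain cue text with no speaker. A small Python sketch, nothing Teams-specific, of how one might extract the speaker when the tag is present:

```python
import re

# A WebVTT cue line may wrap its text in a <v Speaker> voice tag.
VOICE_TAG = re.compile(r"<v\s+([^>]+)>(.*?)</v>")

def parse_cue_text(line):
    """Return (speaker, text); speaker is None for plain cue text."""
    m = VOICE_TAG.match(line.strip())
    if m:
        return m.group(1), m.group(2)
    return None, line.strip()

print(parse_cue_text("<v Alice>Hello.</v>"))  # ('Alice', 'Hello.')
print(parse_cue_text("Hello."))               # (None, 'Hello.')
```

A parser written this way handles both files you downloaded, falling back to an unknown speaker when the voice tag is missing.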