Category Archives: Microsoft
UAC during OOBE (after switching from Admin to Standard user in Windows Autopilot)
We switched settings in Windows Autopilot to make the user a standard user instead of an admin. Now, during OOBE I am asked multiple times to execute a PowerShell script as an admin.
What causes this behavior, and how can we prevent it?
Lesson Learned #491: Monitoring Blocking Issues in Azure SQL Database
Some time ago, we wrote the article Lesson Learned #22: How to identify blocking issues? Today,
I would like to enhance this topic by introducing a monitoring system that expands on that guide. This PowerShell script not only identifies blocking issues but also calculates the total, maximum, average, and minimum blocking times.
My idea is to run this PowerShell script, which executes T-SQL queries every 5 seconds to identify blocking issues, showing the impact of the blocking and the blocking chains. The script saves the details to a file for further review.
# Configure the connection string and folder for log file
$connectionString = "Server=tcp:servername.database.windows.net,1433;Database=dbname;User ID=username;Password=pwd!;Encrypt=true;Connection Timeout=30;"
$Folder = "C:\SQLData"
# Function to get and display blocking statistics
function Get-BlockingStatistics {
$query = "
select conn.session_id as blockerSession,
conn2.session_id as BlockedSession,
req.wait_time as Waiting_Time_ms,
cast((req.wait_time/1000.) as decimal(18,2)) as Waiting_Time_secs,
cast((req.wait_time/1000./60.) as decimal(18,2)) as Waiting_Time_mins,
t.text as BlockerQuery,
t2.text as BlockedQuery,
req.wait_type from sys.dm_exec_requests as req
inner join sys.dm_exec_connections as conn on req.blocking_session_id=conn.session_id
inner join sys.dm_exec_connections as conn2 on req.session_id=conn2.session_id
cross apply sys.dm_exec_sql_text(conn.most_recent_sql_handle) as t
cross apply sys.dm_exec_sql_text(conn2.most_recent_sql_handle) as t2
"
$connection = Connect-WithRetry -connectionString $connectionString -maxRetries 5 -initialDelaySeconds 2
if ($connection -ne $null)
{
$blockings = Execute-SqlQueryWithRetry -connection $connection -query $query -maxRetries 5 -initialDelaySeconds 2
$connection.Close()
}
if ($blockings.Count -gt 0) {
$totalBlockings = $blockings.Count
$maxWaitTime = $blockings | Measure-Object -Property WaitTimeSeconds -Maximum | Select-Object -ExpandProperty Maximum
$minWaitTime = $blockings | Measure-Object -Property WaitTimeSeconds -Minimum | Select-Object -ExpandProperty Minimum
$avgWaitTime = $blockings | Measure-Object -Property WaitTimeSeconds -Average | Select-Object -ExpandProperty Average
logMsg "Total blockings: $totalBlockings" (1)
logMsg "Maximum blocking time (seconds): $maxWaitTime" (2)
logMsg "Minimum blocking time (seconds): $minWaitTime" (2)
logMsg "Average blocking time (seconds): $avgWaitTime" (2)
logMsg "-------- Blocking chain details: --------" (1)
foreach ($blocking in $blockings)
{
logMsg "Blocked Session ID: $($blocking.SessionId)"
logMsg "Wait Time (seconds): $($blocking.WaitTimeSeconds)"
logMsg "Blocker Session ID: $($blocking.BlockingSessionId)"
logMsg "Blocked SQL Text: $($blocking.SqlText)"
logMsg "Blocker SQL Text: $($blocking.BlockingSqlText)"
logMsg "---------------------------------------------"
}
} else {
logMsg "No blockings found at this time."
}
}
# Function to execute a SQL query with retry logic
function Execute-SqlQueryWithRetry {
param (
[System.Data.SqlClient.SqlConnection]$connection,
[string]$query,
[int]$maxRetries = 5,
[int]$initialDelaySeconds = 2
)
$attempt = 0
$success = $false
$blockings = @()
while (-not $success -and $attempt -lt $maxRetries) {
try {
$command = $connection.CreateCommand()
$command.CommandText = $query
$reader = $command.ExecuteReader()
while ($reader.Read()) {
$blockingE = New-Object PSObject -Property @{
SessionId = $reader["BlockedSession"]
WaitTimeSeconds = $reader["Waiting_Time_secs"]
BlockingSessionId = $reader["BlockerSession"]
SqlText = $reader["BlockedQuery"]
BlockingSqlText = $reader["BlockerQuery"]
}
$blockings+=$blockingE
}
$success = $true
} catch {
$attempt++
if ($attempt -lt $maxRetries) {
logMsg "Query execution attempt $attempt failed. Retrying in $initialDelaySeconds seconds..." 2
Start-Sleep -Seconds $initialDelaySeconds
$initialDelaySeconds *= 2 # Exponential backoff
} else {
logMsg "Query execution attempt $attempt failed. No more retries." 2
throw $_
}
}
}
return ,($blockings)
}
#--------------------------------
#Log the operations
#--------------------------------
function logMsg
{
Param
(
[Parameter(Mandatory=$true, Position=0)]
[string] $msg,
[Parameter(Mandatory=$false, Position=1)]
[int] $Color,
[Parameter(Mandatory=$false, Position=2)]
[boolean] $Show=$true,
[Parameter(Mandatory=$false, Position=3)]
[string] $sFileName,
[Parameter(Mandatory=$false, Position=4)]
[boolean] $bShowDate=$true,
[Parameter(Mandatory=$false, Position=5)]
[boolean] $bSaveOnLogFile=$true
)
try
{
if($bShowDate -eq $true)
{
$Fecha = Get-Date -format "yyyy-MM-dd HH:mm:ss"
$msg = $Fecha + " " + $msg
}
If( TestEmpty($SFileName) )
{
Write-Output $msg | Out-File -FilePath $LogFile -Append
}
else
{
Write-Output $msg | Out-File -FilePath $sFileName -Append
}
$Colores="White"
If($Color -eq 1 )
{
$Colores ="Cyan"
}
If($Color -eq 3 )
{
$Colores ="Yellow"
}
if($Color -eq 2 -And $Show -eq $true)
{
Write-Host -ForegroundColor White -BackgroundColor Red $msg
}
else
{
if($Show -eq $true)
{
Write-Host -ForegroundColor $Colores $msg
}
}
}
catch
{
Write-Host $msg
}
}
#--------------------------------
#Validate Param
#--------------------------------
function TestEmpty($s)
{
if ([string]::IsNullOrWhitespace($s))
{
return $true;
}
else
{
return $false;
}
}
#--------------------------------------------------------------
#Create a folder
#--------------------------------------------------------------
Function CreateFolder
{
Param( [Parameter(Mandatory)]$Folder )
try
{
$FileExists = Test-Path $Folder
if($FileExists -eq $False)
{
$result = New-Item $Folder -type directory
if($result -eq $null)
{
logMsg("Impossible to create the folder " + $Folder) (2)
return $false
}
}
return $true
}
catch
{
return $false
}
}
function GiveMeFolderName([Parameter(Mandatory)]$FolderSalida)
{
try
{
$Pos = $FolderSalida.Substring($FolderSalida.Length-1,1)
If( $Pos -ne "\" )
{return $FolderSalida + "\"}
else
{return $FolderSalida}
}
catch
{
return $FolderSalida
}
}
#-------------------------------
#Delete a file
#-------------------------------
Function DeleteFile{
Param( [Parameter(Mandatory)]$FileName )
try
{
$FileExists = Test-Path $FileName
if($FileExists -eq $True)
{
Remove-Item -Path $FileName -Force
}
return $true
}
catch
{
return $false
}
}
# Function to connect to the database with retry logic
function Connect-WithRetry {
param (
[string]$connectionString,
[int]$maxRetries = 5,
[int]$initialDelaySeconds = 2
)
$attempt = 0
$connection = $null
while (-not $connection -and $attempt -lt $maxRetries) {
try {
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
} catch {
$attempt++
if ($attempt -lt $maxRetries) {
logMsg "Connection attempt $attempt failed. Retrying in $initialDelaySeconds seconds..." 2
Start-Sleep -Seconds $initialDelaySeconds
$initialDelaySeconds *= 2 # Exponential backoff
} else {
logMsg "Connection attempt $attempt failed. No more retries." 2
throw $_
}
}
}
return $connection
}
Clear-Host
$result = CreateFolder($Folder) #Create the folder that will store the results and the log file.
If( $result -eq $false)
{
write-host "It was not possible to create the folder"
exit;
}
$sFolderV = GiveMeFolderName($Folder) #Normalize the folder name, adding a trailing \ if missing.
$LogFile = $sFolderV + "Blockings.Log" #Log file for the operations.
logMsg("Deleting Operation Log file") (1)
$result = DeleteFile($LogFile) #Delete the previous log file.
logMsg("Deleted Operation Log file") (1)
# Loop to run the monitoring every 5 seconds
while ($true) {
Clear-Host
Get-BlockingStatistics
Start-Sleep -Seconds 5
}
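To exercise the monitor, you can provoke a blocking chain from a second PowerShell window while the loop above is running. This is a minimal sketch under stated assumptions: the connection string is the same placeholder as above, and dbo.TestTable is a hypothetical table with a column named Value and a row with Id = 1.
# Session 1: take an exclusive row lock and hold it by leaving the transaction open.
$cs = "Server=tcp:servername.database.windows.net,1433;Database=dbname;User ID=username;Password=pwd!;Encrypt=true;"
$blocker = New-Object System.Data.SqlClient.SqlConnection($cs)
$blocker.Open()
$cmdBlocker = $blocker.CreateCommand()
$cmdBlocker.CommandText = "BEGIN TRAN; UPDATE dbo.TestTable SET Value = Value WHERE Id = 1;"
$cmdBlocker.ExecuteNonQuery() | Out-Null
# Session 2 (background job): this UPDATE waits on the lock, creating the blocking chain.
$job = Start-Job -ArgumentList $cs -ScriptBlock {
param($cs)
$c = New-Object System.Data.SqlClient.SqlConnection($cs)
$c.Open()
$cmd = $c.CreateCommand()
$cmd.CommandTimeout = 120
$cmd.CommandText = "UPDATE dbo.TestTable SET Value = Value WHERE Id = 1;"
$cmd.ExecuteNonQuery()
$c.Close()
}
Start-Sleep -Seconds 30 # give the monitor a few 5-second cycles to report the blocking
$cmdBlocker.CommandText = "ROLLBACK TRAN;"
$cmdBlocker.ExecuteNonQuery() | Out-Null # release the lock
Receive-Job -Job $job -Wait | Out-Null
$blocker.Close()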
Please note that this script is provided as-is and without any warranty. Use it at your own risk. Always test scripts in a development environment before deploying them to production.
Introducing the Unified Azure Maps Experience
We are thrilled to announce the unification of Bing Maps for Enterprise (BME) with Azure Maps, marking a significant milestone in our geospatial services at Microsoft. Azure Maps now boasts a robust stack of geospatial offerings, leveraging the powerful capabilities of Microsoft Maps, which also drives Bing Maps (our consumer maps experience). Over the past year, our team has dedicated significant time and effort to combine the strengths of Bing Maps for Enterprise into Azure Maps, enhancing our global quality and coverage.
One of the major enhancements is the adoption of vector tiles in Azure Maps for a more responsive map experience. When utilizing Azure Maps in your solutions, you not only leverage the security and compliance advantages of Azure but also benefit from the extensive quality and coverage provided by Microsoft Maps.
This unification ensures that users of Azure Maps receive a comprehensive mapping solution backed by the unparalleled strengths of Azure’s infrastructure, Microsoft Maps’ data quality and coverage, and many of the same advanced geospatial capabilities that Bing Maps for Enterprise customers depend on. We are excited about the opportunities this integration presents and look forward to continuing to deliver innovative mapping solutions to our customers worldwide.
Azure Maps has many of the same features that BME customers have come to rely on. Nevertheless, this unification also introduces exciting new features to Azure Maps, such as weather APIs, private indoor maps, multiple authentication methods, geolocation service, and robust privacy and compliance benefits.
Ready to Make the Move?
For customers who are using Bing Maps for Enterprise and are migrating over to Azure Maps, some development will be needed. To help you in this transition period, we have written migration documents for our REST APIs as well as for the Azure Maps web control. Also, a good starting point is our Azure Maps samples site, where you can find not only samples for many scenarios but also their source code.
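For a sense of the shape of the work, here is a minimal geocoding sketch against the Azure Maps Search REST API from PowerShell; the subscription key is a placeholder.
# Geocode an address with the Azure Maps Search API; <YOUR-KEY> is a placeholder.
$key = "<YOUR-KEY>"
$query = [uri]::EscapeDataString("1 Microsoft Way, Redmond, WA")
$uri = "https://atlas.microsoft.com/search/address/json?api-version=1.0&subscription-key=$key&query=$query"
# The first result's position holds the latitude/longitude.
(Invoke-RestMethod -Uri $uri).results | Select-Object -First 1 -ExpandProperty position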
More resources about Azure Maps can be found here:
Azure Maps Documentation
Azure Maps Samples
Azure Maps Blog
Microsoft Q&A for Azure Maps
Creating policy for Defender for Servers
Hello,
Some time ago we enabled Defender for Servers for virtual machines in our tenant. Some users reported to me that DfS is using a lot of CPU on their machines and that it blocks some files and processes from being executed. I have two questions:
- Can we create a policy to set a maximum CPU usage for Defender for Servers for a specified subscription?
- Can we disable quarantine and any other detection for selected machines so that it ALERTS only but does not take any action?
I checked that we can set the CPU usage with a PowerShell command, but these machines are removed and added every week, so we would like to automate this process.
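For reference, the per-machine PowerShell setting the poster likely means is the Microsoft Defender Antivirus scan CPU throttle; a hedged sketch follows (the percentage and the ASR rule GUID are placeholders), which could be re-applied by an automation since the machines churn weekly.
# Cap average CPU usage during scans (percentage, 5-100; 30 is an example value).
Set-MpPreference -ScanAvgCPULoadFactor 30
# Move an attack surface reduction rule to audit (alert-only) mode.
# The GUID below is a placeholder for a specific ASR rule ID.
Add-MpPreference -AttackSurfaceReductionRules_Ids "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -AttackSurfaceReductionRules_Actions AuditMode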
Exchange Online and on-premises
I need help with Exchange on-premises not delivering email to a user whose account was created in Active Directory and whose mailbox is in Office 365 / Exchange Online. I was able to get the mailbox remote-enabled and set the GUID key, but if a copier sends a message through the server, the user does not get it. I tried changing the LegacyExchangeDN, but it did not work. The mail trace for everyone else shows up as copanyvl.mail.onmicrosoft.com, but this does not happen for the user in question, so I am looking for help. Any ideas would be greatly appreciated.
Exchange 2019 is the on-premises server.
Azure DevOps Configuration required on new work item types (Resolved)
** Business Requirement **
Create a new iteration backlog work item type 'Impediment'.
As part of the configuration, if a sprint board has column options other than (New | Active | Resolved | Closed), a "Configuration required" message will be displayed once the new work item type has been added to Azure DevOps.
Sprint board with a custom column:
Adding the new Task type:
Sprint board with a custom column after work item type has been added:
In most cases this can easily be resolved by going to 'Column Options' and clicking Save to update the sprint board with the correct status.
However, in our case we needed to do this across a big organization with a lot of boards, some with custom columns, and it is a disruptive action for users, who would need to go to their sprint boards to update the new status.
An automated method can be used through the API. Though the documentation is vague about the required JSON body, the engineers from Microsoft were able to provide us with the required structure. Source: https://learn.microsoft.com/en-us/rest/api/azure/devops/work/taskboard-columns/update?view=azure-devops-rest-7.1#taskboardcolumn
[
{
"mappings": [
{
"workItemType": "Task",
"state": "New"
},
{
"workItemType": "Bug",
"state": "New"
},
{
"state": "New",
"workItemType": "Impediment"
}
],
"order": 0,
"name": "New",
"id": ""
}
]
I created the following PowerShell script to add the new work item states to sprint boards with custom added columns.
# Define parameters for the script
Param(
[string]$organisation = "AzureDevOps-Organisation-Name",
[string]$project = "AzureDevOps-Project-Name",
[string]$user = "email address removed for privacy reasons",
[string]$token = "Your-PAT" # Personal Access Token
)
# Convert username and token to Base64 for Basic Authentication
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
# Define headers for the API request
$headers = @{Authorization=("Basic {0}" -f $base64AuthInfo)}
# Define the URL for the Teams API
$TeamUrl = "https://dev.azure.com/$($organisation)/_apis/projects/$project/teams?api-version=7.1-preview.3"
# Send a GET request to the Teams API
$TeamRequest = Invoke-RestMethod -Uri $TeamUrl -Method Get -ContentType "application/json" -Headers $headers
# Loop through each team in the response
foreach ($Team in $TeamRequest.value) {
# Define the URL for the Task Board Columns API for the current team
$TaskBoardUrl = "https://dev.azure.com/$($organisation)/$project/$($Team.id)/_apis/work/taskboardcolumns?api-version=7.1-preview.1"
# Send a GET request to the Task Board Columns API
$TaskBoardResult = Invoke-RestMethod -Uri $TaskBoardUrl -Method Get -ContentType "application/json" -Headers $headers
# Loop through each column in the response
foreach ($Column in $TaskBoardResult.columns)
{
# If the column name does not match 'New', 'Active', 'Resolved', or 'Closed'
if ($Column.name -notmatch 'New|Active|Resolved|Closed')
{
# Define an empty array for the columns
$columnsArray = @()
# Define valid states
$validStates = @("New", "Active", "Closed")
# Loop through each column in the response
$TaskBoardResult.columns | ForEach-Object {
# Create a new object for the column
$column = New-Object PSObject -Property @{
id = ""
name = $_.name
order = $_.order
mappings = $_.mappings
}
# Filter the mappings for the column
$column.mappings = $column.mappings | Where-Object { $_.workItemType -ne "Impediment" -or ($_.workItemType -eq "Impediment" -and $_.state -eq $column.name) }
# If the column name is in the valid states
if ($column.name -in $validStates) {
# Create a new mapping for the column
$newMapping = New-Object PSObject -Property @{
state = $column.name
workItemType = "Impediment"
}
# Add the new mapping to the column
$column.mappings += $newMapping
}
# Add the column to the array
$columnsArray += $column
}
# Convert the array to JSON
$jsonBody = $columnsArray | ConvertTo-Json -Depth 10
# Define the URL for the Task Board Columns API for updating
$TaskBoardUrlUpdate = "https://dev.azure.com/$($organisation)/$project/$($Team.id)/_apis/work/taskboardcolumns?api-version=7.1-preview.1"
# Send a PUT request to the Task Board Columns API to update the columns
$ResultCall = Invoke-RestMethod -Uri $TaskBoardUrlUpdate -Method PUT -Body $jsonBody -ContentType "application/json" -Headers $headers
# Print the validation message and columns from the response
$ResultCall.validationMesssage
$ResultCall.columns
}
}
}
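Assuming the script is saved to a file (the name below is only an example), it can be run once per project:
# Hypothetical file name; organisation, project, user and PAT values are placeholders.
.\Add-ImpedimentToTaskboards.ps1 -organisation "contoso" -project "ContosoProject" -user "user@contoso.com" -token "<PAT>"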
Result after running the script:
Azure DevOps Library Variables Audit
Hi,
This may be a difficult question to answer, but we are currently developing an ALM strategy with regards to D365 CE. Historically, we have had third-party suppliers deliver our customisations and support via DevOps pipelines. The third-party suppliers have used various variable groups, some of which we are unsure what they are used for.
Are there any tools which can determine where these variables are used? Or has anyone with experience of auditing variable groups used tools or processes which they can direct me towards?
Any help or advice is appreciated. TIA
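One hedged starting point, assuming the Azure DevOps CLI extension is available (the organisation and project below are placeholders): inventory the variable groups, then search the pipeline definitions for the group names.
# Requires the Azure CLI with the azure-devops extension (az extension add --name azure-devops).
az devops configure --defaults organization=https://dev.azure.com/contoso project=ContosoProject
# List every variable group in the project with its ID.
az pipelines variable-group list --query "[].{id:id, name:name}" --output table
# List pipeline definitions; their YAML can then be searched for the group names above.
az pipelines list --query "[].{id:id, name:name, path:path}" --output table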
No output for Invoke-MgGraphRequest for user presence
Hi All!
I am experiencing some odd behaviour with an Invoke-MgGraphRequest call in an Azure Runbook and could do with a nudge in the right direction.
I am trying to report on my Teams presence using the Graph API. When I use the following code, it works and the presence is returned:
Invoke-MgGraphRequest -Method GET -Uri "https://graph.microsoft.com/v1.0/communications/presences/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
But when I try to assign this output to a variable (so it can be passed to a SharePoint list), I don't get any output:
$returned = Invoke-MgGraphRequest -Method GET -Uri "https://graph.microsoft.com/v1.0/communications/presences/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
$returned.value | ForEach-Object {$_.availability}
Am I doing something wrong, or is this expected behaviour?
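A likely cause: a GET against a single resource returns the presence object itself (Invoke-MgGraphRequest deserializes it into a hashtable), not a collection, so there is no .value property to enumerate; .value only exists on collection responses. A minimal sketch of reading the properties directly:
$returned = Invoke-MgGraphRequest -Method GET -Uri "https://graph.microsoft.com/v1.0/communications/presences/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
# Single-resource responses have no 'value' wrapper; read the properties directly.
$returned.availability   # e.g. "Available"
$returned.activity       # e.g. "InACall"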
Searching Shared Mailboxes on MacOS is not showing same result as separate accounts
When I have 4 normal mail accounts in Outlook on macOS, I can search through all Inboxes ('Alle postvakken') and will find all related mails across all mailboxes.
When I change those mailboxes to Shared Mailboxes and add them to Outlook in that way, the search no longer works as before. From that moment on I only find some results in the currently selected mailbox.
How can we get Outlook (on macOS) to search through all delegated mailboxes that are visible in Outlook?
Big Change Coming in Authentication for Outlook Add-ins
On April 9, 2024, Microsoft announced a big change in authentication for Outlook add-ins. It’s likely that people don’t realize the kind of change that’s coming. The change removes legacy Exchange authentication methods and replaces them with Nested App Authentication (NAA). Time is running short for developers to upgrade and test their code and Microsoft 365 tenants to get ready for the changeover.
https://office365itpros.com/2024/05/21/outlook-add-in-authentication/
Calls saved as draft under Teams chat
Hello,
Please, I need your help with this issue.
Every call saves as a draft under the Teams chat.
When checking, nothing appeared in Teams on the web.
The issue is happening on incoming calls only.
The issue is happening for users' personal numbers, but it seems to be users with call queues.
We cleared the cache and they are still coming through.
How to come back to an old version, even with a synchronization issue
Hello,
I have been using OneNote Online for a while now.
Sometimes, I noticed that the version that appeared was an old version. But the correct version eventually reappeared.
But last time, I didn't notice there was a synchronization issue, and I made a modification on the wrong version, the poorly synchronized version.
This modification broke the page, and I had to create a new page (I gave it the same title) where I copied and pasted the content of the broken page.
But the broken page is the old version, not a recent one.
Obviously, I deleted the broken page afterwards.
Even if I can restore the deleted page, I can't retrieve the recent versions. What I mean is that I know OneNote saves pages regularly, but because of the synchronization issue, I no longer have access to the recent versions of the page. The most recent version I have access to is the one where I made the modification.
I would like to point out that I have already posted a request here : https://answers.microsoft.com/en-us/msoffice/forum/all/how-to-comeback-to-a-old-version-even-with-a/49a29505-46c6-48d8-8374-fb4842229da8
But, the support team asked me to submit an additional one here.
I will add some information that the support team has often asked me for.
In the 2 pictures, we can see how my OneNote files are stored (I have hidden some parts of the second picture).
Regarding the synchronization issue, the support team told me that it was a bug that will be fixed in the future.
My issue here is: can I retrieve my lost data, and if so, how?
Enabling OAuth in Azure DevOps Webhooks
Hello Azure Support Team,
I hope this message finds you well. Currently, we’re encountering a 401 Unauthorized error when attempting to authenticate our Azure Function App with Azure AD on Azure DevOps (ADO) Webhooks. This issue arises because ADO Webhooks only support Basic Authentication. Consequently, we are requesting the addition of a feature that enables the usage of OAuth2 in ADO Webhooks. This addition would resolve authentication issues, particularly when our Function App is integrated with Azure AD.
Currently, we are using Basic Authentication, but this does not align with our internal standard security measures.
We hope that this feature will be added to ADO Webhooks.
Recurring Appointment Calendar on Teams
Sometimes when scheduling recurring meetings on Teams, you may not receive the system emails, and the meeting may not have been scheduled. Does anyone know what the problem is?
Sensitivity labels
I am encountering a problem. When creating labels in the compliance portal, I am unable to see the custom classification I created under the schematized data assets tab. Instead, only the default classifications already existing in the system are displayed. Please see the attached screenshots for reference. Can anyone suggest ways to do this?
How to Change Table Structure
Hello,
How can I change the status of FixedLenNullInSource for a table in SQL Server?
In this table I want to set FixedLenNullInSource to 'No'.
I need help.
Regards
Arshad
sys.dm_io_cluster_valid_path_names returns no result
Hello,
in a series of "identically" configured SQL failover clusters, I have some where this query does not return a result:
SELECT path_name FROM sys.dm_io_cluster_valid_path_names;
The SQL failover clusters run without any problems. The backup tool we use can back up the data, but when restoring I get the error message that the "drive\directory…" of the database files on the active cluster node cannot be found. If I execute the statement on this cluster node, I also get no result.
These are not AlwaysOn clusters and also not clusters with Cluster Shared Volumes (CSV). The data volumes are provided as shared volumes mounted into a directory as hard-link attachments. They are therefore only ever active on the active cluster node of the respective SQL instance.
Has anyone ever had this problem and can tell me how it happens and how it can be solved?
Thank you
Jan
Automatic saving of documents
Hello to all,
I would like to talk about a capability that was taken away from the Office tools – automatic saving.
I work as a technician in an industrial company. My work includes collecting data and maintaining some databases about materials, etc. – about five Excel documents in daily use.
As these documents are internal, they cannot be stored in the cloud and have to stay on the company network.
However, some "clever" guy decided that if you do not use the cloud, you should not have the possibility to automatically save documents. Really? In the 21st century you have to save your documents manually?
So I first wrote about this to Microsoft support, and the answer was:
"I am so sorry, bla bla bla bla. There is not any possibility, bla bla bla bla, you have to do it manually, bla bla bla, or pay for the cloud."
So thank you, I see now it is about money. It is a real shame that this kind of company has to use the practices of mobile game developers.
Note: no surprise that your XBOX division is going to hell as well.
Asking for ideas: setting up automatic date-picking logic
I want to add a formula so that once I insert the "Submit Date", the "Go-live date (tentative)" date picker automatically calculates and selects a date, following this example for the whole year:
For example:
- If I select from 1 May to 10 May, then the Go-live date should be 20 Jun.
- If I select from 11 May to 10 Jun, then the Go-live date should be 20 Jul.
- If I select from 11 Jun to 10 Jul, then the Go-live date should be 20 Aug.
Please help me out with the correct JSON formatting code to use in a Microsoft SharePoint list.
Redirect (user OWA rule) does not work
I need to set up a redirect on one mailbox (so the headers are preserved).
Redirect to an EXTERNAL domain.
But the rule does not seem to kick in at all.
Email arrives in the user's mailbox, but the redirect to the external domain does not happen at all.
I did set up that remote domain as per: https://learn.microsoft.com/en-us/exchange/mail-flow-best-practices/remote-domains/manage-remote-domains
with everything allowed.
Yet I still only get a failure:
Reason: [{LED=250 2.1.5 RESOLVER.MSGTYPE.AF; handled AutoForward addressed to external recipient};{MSG=};{FQDN=};{IP=};{LRT=}]
Thanks
Seb
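For anyone hitting the same trace result: RESOLVER.MSGTYPE.AF is commonly the outbound spam filter policy blocking automatic forwarding to external recipients. A minimal Exchange Online PowerShell sketch to check and relax that setting; the policy name "Default" is an assumption.
Connect-ExchangeOnline
# Check how automatic external forwarding is currently handled.
Get-HostedOutboundSpamFilterPolicy | Select-Object Name, AutoForwardingMode
# Allow automatic forwarding on the default policy (scope to a custom policy if preferred).
Set-HostedOutboundSpamFilterPolicy -Identity Default -AutoForwardingMode On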