Month: June 2024
My watch button doesn’t work. What am I missing?
Hello all, I'm new to JSON/SharePoint. I've pieced together a format that I like, but I can't get the Watch button to work. ChatGPT said something about a button element not being able to be used to open a link, but I tried a div styled to look like a button and that didn't work either.
Can anyone help me correct the code? The code in question starts at line 250.
I'm trying to make this elmType link to the $Link column.
{
"$schema": "https://developer.microsoft.com/json-schemas/sp/v2/tile-formatting.schema.json",
"hideColumnHeader": true,
"commandBarProps": {
"commands": [
{
"key": "new",
"hide": true
},
{
"key": "export",
"hide": true
},
{
"key": "alertMe",
"hide": true
},
{
"key": "manageAlert",
"hide": true
},
{
"key": "undoCheckOut",
"hide": true
},
{
"key": "manageForms",
"hide": true
},
{
"key": "editInGridView",
"hide": true
},
{
"key": "integrate",
"hide": true
},
{
"key": "automate",
"hide": true
},
{
"key": "share",
"hide": true
},
{
"key": "delete",
"hide": true
},
{
"key": "edit",
"hide": true
}
]
},
"height": 370,
"width": 290,
"hideSelection": false,
"hideListHeader": true,
"fillHorizontally": true,
"formatter": {
"elmType": "div",
"attributes": {
"class": "sp-card-container",
"display": "flex",
"flex-wrap": "wrap",
"align-items": "stretch",
"padding": "8px",
"margin-bottom": "25px",
"max-width": "420px",
"border-radius": "18px",
"box-shadow": "4px 4px 8px lightblue"
},
"children": [
{
"elmType": "div"
},
{
"elmType": "div",
"attributes": {
"class": "ms-bgColor-white sp-css-borderColor-neutralLight sp-card-borderHighlight sp-card-subContainer"
},
"children": [
{
"elmType": "div",
"attributes": {
"class": "ms-fontSize-s"
},
"style": {
"width": "100%",
"line-height": "1.5em",
"padding": "4px",
"padding-left": "16px",
"background-color": "=if([$Track] == 'Zoom', '#042b48', if([$Track] == 'SimpleLists', '#042b48', if([$Track] == 'Breakout', '#042b48', '#042b48')))",
"color": "white",
"font-size": "13.5px",
"font-weight": "bold"
},
"txtContent": "[$Track]"
},
{
"elmType": "div",
"attributes": {
"class": "sp-card-previewColumnContainer"
},
"children": [
{
"elmType": "div",
"attributes": {
"class": "sp-card-imageContainer"
},
"children": [
{
"elmType": "div",
"attributes": {
"class": "ms-bgColor-neutralLight sp-card-imagePreviewBackground"
},
"children": [
{
"elmType": "img",
"style": {
"display": "=if([$Image] == '', 'none', '')"
},
"attributes": {
"src": "=getThumbnailImage([$Image], 400, 400)",
"title": "[$Image.fileName]",
"class": "sp-card-imagePreview"
}
},
{
"elmType": "svg",
"style": {
"display": "=if([$Image] == '', '', 'none')"
},
"attributes": {
"preserveAspectRatio": "none",
"viewBox": "0 0 210 105",
"class": "sp-card-defaultImage ms-bgColor-themeLighter"
},
"children": [
{
"elmType": "path",
"attributes": {
"id": "sp-card-defaultImage-path1",
"d": "M0 25.7896L126.5 53.8817L96 105H0V25.7896Z"
}
},
{
"elmType": "path",
"attributes": {
"id": "sp-card-defaultImage-path2",
"d": "M96 105L158.7 0H204C207.314 0 210 2.68629 210 6V105H96Z"
}
}
]
},
{
"elmType": "svg",
"style": {
"display": "=if([$Image] == '', '', 'none')"
},
"attributes": {
"class": "sp-card-defaultImageOverlay",
"viewBox": "0 0 40 40"
},
"children": [
{
"elmType": "path",
"attributes": {
"id": "sp-card-defaultImageOverlay-path1",
"d": "M 4 4 H 37 V 37 H 4 L 4 4"
}
},
{
"elmType": "path",
"attributes": {
"id": "sp-card-defaultImageOverlay-path2",
"d": "M24.17 21.151L21.66 24.741L17.54 19.191C17.3322 18.914 17.0062 18.751 16.66 18.751C16.3137 18.751 15.9877 18.914 15.78 19.191L9.20997 28.051C8.97126 28.3786 8.93818 28.813 9.12453 29.173C9.31088 29.533 9.68465 29.7567 10.09 29.751H29.91C30.3085 29.7562 30.6769 29.5396 30.866 29.1887C31.0551 28.8378 31.0335 28.411 30.81 28.081L26 21.151C25.7991 20.8407 25.4546 20.6533 25.085 20.6533C24.7153 20.6533 24.3709 20.8407 24.17 21.151Z"
}
},
{
"elmType": "path",
"attributes": {
"id": "sp-card-defaultImageOverlay-path3",
"d": "M28 15.751C29.3807 15.751 30.5 14.6317 30.5 13.251C30.5 11.8703 29.3807 10.751 28 10.751C26.6193 10.751 25.5 11.8703 25.5 13.251C25.5 14.6317 26.6193 15.751 28 15.751Z"
}
},
{
"elmType": "path",
"attributes": {
"id": "sp-card-defaultImageOverlay-path4",
"d": "M4.5 37.251H35.5C36.3284 37.251 37 36.5794 37 35.751V4.75098C37 3.92255 36.3284 3.25098 35.5 3.25098H4.5C3.67157 3.25098 3 3.92255 3 4.75098V35.751C3 36.5794 3.67157 37.251 4.5 37.251ZM4 4.75098C4 4.47483 4.22386 4.25098 4.5 4.25098H35.5C35.7761 4.25098 36 4.47483 36 4.75098V35.751C36 36.0271 35.7761 36.251 35.5 36.251H4.5C4.22386 36.251 4 36.0271 4 35.751V4.75098Z"
}
}
]
}
]
}
]
}
]
},
{
"elmType": "div",
"attributes": {
"class": "sp-card-displayColumnContainer",
"display": "flex",
"flex-wrap": "wrap",
"height": "400"
},
"children": [
{
"elmType": "p",
"attributes": {
"title": "[$Title]",
"class": "ms-fontColor-neutralPrimary sp-card-content sp-card-highlightedContent",
"role": "heading",
"aria-level": "3",
"display": "flex",
"flex-wrap": "wrap",
"padding": "40px"
},
"txtContent": "=if([$Title] == '', '–', [$Title])",
"style": {
"white-space": "wrap",
"word-break": "keep-all"
}
}
]
},
{
"elmType": "div",
"attributes": {
"class": "sp-card-lastTextColumnContainer"
},
"children": [
{
"elmType": "p",
"attributes": {
"title": "[$Description]",
"class": "ms-fontColor-neutralPrimary sp-card-content"
},
"txtContent": "=if([$Description] == '', '–', [$Description])",
"style": {
"white-space": "wrap",
"word-break": "keep-all"
}
},
{
"elmType": "button",
"attributes": {
"class": "sp-row-button ms-borderColor-blue sp-row-button ms-bgColor-purpleDark--hover ms-fontWeight-semibold ms-fontColor-black",
"href": "[$Link]",
"target": "_blank"
},
"customRowAction": {
"action": "defaultClick"
},
"style": {
"position": "absolute",
"left": "8px",
"top": "320px",
"width": "264px",
"margin-left": "10px",
"display": "span",
"background-color": "transparent",
"border-radius": "4px",
"border-color": "#0082f0"
},
"txtContent": "Watch"
}
]
}
]
}
]
}
}
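For reference on the question above: in SharePoint list/tile formatting, a button element fires its customRowAction rather than navigating to an href, so a commonly used alternative is an anchor ("a") element styled to look like a button. The snippet below is only a rough sketch; it reuses the classes and positioning from the code above and assumes Link is a hyperlink column (for hyperlink columns, "[$Link]" resolves to the URL and "[$Link.desc]" to the display text).

{
"elmType": "a",
"txtContent": "Watch",
"attributes": {
"href": "[$Link]",
"target": "_blank",
"class": "sp-row-button ms-borderColor-blue ms-bgColor-purpleDark--hover ms-fontWeight-semibold ms-fontColor-black"
},
"style": {
"position": "absolute",
"left": "8px",
"top": "320px",
"width": "264px",
"margin-left": "10px",
"text-align": "center",
"text-decoration": "none",
"background-color": "transparent",
"border": "1px solid #0082f0",
"border-radius": "4px"
}
}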
Issue with Provision Command in Teams Toolkit for Copilot Plugin
Hello,
I am writing to seek assistance with an issue I am encountering while using the Teams Toolkit in Visual Studio Code. I have made a prototype of a Copilot Plugin. I need to publish it for our company for testing. However, when I try to execute the Provision command, I repeatedly receive the error: “We couldn’t find a subscription.”
I have double-checked and confirmed that I have the Contributor role in the subscription (Pay-As-You-Go). I have selected it in both the Azure Portal and Visual Studio Code, but I still get this error.
Could someone please assist me in resolving this issue? Any guidance or suggestions on what I might be missing would be greatly appreciated.
Best regards,
Anton
Token Protection – getting “unbound” for admin user
Hello,
I've been looking at the Conditional Access policy for Token Protection, but before implementing it I checked the Azure sign-in logs. What I found is that when I use my admin credentials (different from my user credentials) to access, for example, the Azure Portal, the Token Protection sign-in session value is "Unbound". When I use my standard user, all sign-ins are logged with the token value "bound". We use hybrid-joined devices. I have checked with my colleague and his Token Protection values are always "bound", no matter whether he uses his standard or admin account. What can be the reason? I cannot find much information about how to troubleshoot it. I'm worried that when I enable the CA policy, I will cut myself off.
Read aloud function
My Read Aloud function does not work. I am using Word (on a monthly subscription as part of Office 2011). I am also working on a Mac. Can anyone help?
Can we report an idea connected to Copilot for M365?
Can we report an idea connected to Copilot for M365?
I can see it is possible for Copilot for Service here: Microsoft Copilot for Service Ideas – Microsoft Community Hub
Can I update checklist in Planner by API
Hi All
I would like to update the checklist in a task via the API. Could you please help me with how to do it?
Thanks
Troubleshooting Cross-Tenant Mailbox Migrations
In Part 1 of this series, we talked about cross-tenant (sometimes referred to as tenant-to-tenant, or T2T) mailbox migrations. In Part 2, we'll cover how to troubleshoot issues you may encounter during cross-tenant mailbox migrations. There are several tools we want to mention that can be useful when troubleshooting.
Here are the key tools and concepts:
We cannot stress enough how important it is to ensure the configuration is correct. Ensuring this saves time!
T2T migration errors and fixes: The most common errors in T2T migrations and how to address them. The errors are surfaced by Test-MigrationServerAvailability or Get-MigrationUserStatistics / Get-MoveRequestStatistics.
Test-MigrationServerAvailability: Simulates a cross-tenant move request and identifies configuration issues at the user level. When you get the same error for multiple users, it is likely that the configuration is incorrect at the organization level.
Get-MigrationUserStatistics / Get-MoveRequestStatistics: Powerful commands to troubleshoot your migrations and verify progress. -IncludeReport returns a move report that shows everything that happened during the migration; -DiagnosticInfo provides information on durations, throttling, and possible underlying causes of stalls.
Cross-tenant mailbox migration validation script: Checks the user-level and organization-level configuration for T2T migration so that you are prepared for migrations.
Permissions-related cmdlets: Show which permissions are migrated in Exchange Online and whether they are broken.
Data Consistency Score in migrations: Sometimes we cannot migrate all the data from source to target, so we skip it. There are four main categories of skipped items:
Bad / corrupt items,
Large items (MaxReceiveSize up to 150MB in Exchange Online),
Missing items in the target (which normally doesn't happen in move requests, where the target mailbox is locked until completion and the end user or other processes wouldn't be able to access the target mailbox at all),
Other category issues.
You can view what has been skipped in move reports (for example, permissions that couldn't be mapped to a user are considered bad items). Sometimes, if there is significant data loss (data not migrated), the admin will need to approve completing the migration.
Migration duration estimates: Duration estimates for T2T migrations, based on data retrieved for 50% (P50) and 90% (P90) of the statistics seen for that migration type. If you exceed P90, your migration may be slow.
Test-MigrationServerAvailability
A very useful tool when troubleshooting cross-tenant mailbox migration is Test-MigrationServerAvailability. You run this command in the target tenant after the migration endpoint has been created. You can run Get-MigrationEndpoint to view the identity of the endpoint.
Test-MigrationServerAvailability -TestMailbox <user@contoso.com> -Endpoint <T2T Migration Endpoint Name>
If the Result is Success, you can proceed with the migration of the user.
Note that a successful test doesn’t guarantee you will be able to migrate the user without any issues, but it is a good starting point to ensure the minimum prerequisites are met.
For a list of common failures, please see the Migration Failures section here: Cross-tenant mailbox migration
Here is an example, a situation where the source archive mailbox is a few bytes over 100GB.
Test-MigrationServerAvailability in target tenant:
Mailbox statistics for archive in the source tenant:
In situations where a source primary mailbox, archive, or dumpster is larger than the target tenant's user quotas, you will see errors like ArchiveExceedsTargetQuotaPermanentException, MailboxExceedsTargetQuotaPermanentException, or ArchiveDumpsterExceedsTargetQuotaPermanentException. In this case, you can contact Microsoft support for recommendations and see which options are best for you. Support may be able to provide an exception for an individual mailbox (bulk mailboxes cannot be processed), for example if you plan on enabling auto-expanding archives on the target tenant side and allocating room to grow in the target tenant. Depending on the situation, if you have enabled the auto-expanding archive feature on the source tenant, you can wait until the AuxArchive is provisioned and the MainArchive / primary mailbox size is reduced below the 100GB quota (this can take up to 30 days, but it's generally 7 days on average). Or, if possible, ask the user to clean up unnecessary data in the source mailboxes if auto-expanding is not an option in the target tenant. Sometimes, source mailboxes that are or were on hold at some point have a Recoverable Items quota set to 100GB as opposed to the default 30GB; in this case, you might want to enable litigation hold on the target tenant mail user to also increase the quotas there.
Coming back to Test-MigrationServerAvailability, if this fails for multiple users, then likely there is something wrong at the organization level and the cross-tenant mailbox migration validation script discussed in the next section can be helpful. For example, if you get an ‘Access is denied’ error (screenshot below) for some or all users, here are some possible causes at organization level:
Erroneous application registration;
Wrong credentials or expired credentials used in the migration endpoint; or
Source tenant organization relationship's OAuthApplicationId does not match the target migration endpoint's ApplicationId, or the org relationship is not using the tenant ID in DomainNames (a quick way to compare these values is shown below)
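Run the first command in the source tenant and the second in the target tenant to compare those values; the property selection is a minimal sketch rather than an exhaustive list:

# Source tenant: organization relationship that allows the target tenant to pull mailboxes
Get-OrganizationRelationship | Format-List Name, DomainNames, OAuthApplicationId, MailboxMoveEnabled, MailboxMoveCapability

# Target tenant: migration endpoint used by the batch (compare ApplicationId with OAuthApplicationId above)
Get-MigrationEndpoint | Format-List Identity, EndpointType, ApplicationId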
CrossTenantMailboxMigrationValidation script
When it comes to validating a large cross-tenant mailbox migration (CTMM), a better tool you can rely on is the Cross-Tenant Mailbox Migration validation script written by Alberto Pascual Montoya, published in the official CSS-Exchange GitHub repo and referenced by our official CTMM documentation.
This script will check and validate various related user and organization settings. It is very useful to run it before CTMM takes place to ensure the mailbox(es) can be migrated, and if they can, that they won’t face any validation issues during the move.
You can run the script to validate the configuration between both organizations by running:
.\CrossTenantMailboxMigrationValidation.ps1 -CheckOrgs -LogPath <LogFileLocation>
You can also run it to validate the objects on the target tenant by comparing them with the objects in the source tenant by running:
.\CrossTenantMailboxMigrationValidation.ps1 -CheckObjects -CSV <CSVFileLocation> -LogPath <LogFileLocation>
As you can see below, it will highlight any discrepancies found, and if the target object is not synced from any AD, it will ask if you would like to correct the discrepancy on the go:
There are other scenarios covered by the script too. For example, running it only from the source tenant side and collecting the needed data that can be sent to the target tenant admins and they run the script against their tenant; or simply collecting the data for support purposes.
Analyzing migration reports
Move request and migration user Exchange Online PowerShell commands are other tools that can help us troubleshoot cross-tenant migration.
Let’s do a quick walkthrough of these commands:
When we create a new migration batch, a migration user object (returned by Get-MigrationUser) is created in the background for each user specified in the CSV file. If things go well during the initial validation, a move request (returned by Get-MoveRequest) is created for each migration user.
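For reference, a cross-tenant batch is typically created in the target tenant along these lines; this is a hedged sketch where the batch name, endpoint name, CSV path, and delivery domain are placeholders:

# Target tenant: create the batch from a CSV that contains an EmailAddress column
New-MigrationBatch -Name "T2T-Pilot" -SourceEndpoint "T2T-Endpoint" -CSVData ([System.IO.File]::ReadAllBytes("C:\temp\users.csv")) -TargetDeliveryDomain "targettenant.onmicrosoft.com" -AutoStart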
In the example below, we can see that we have a migration batch (Get-MigrationBatch) with 1 user (one user was specified in the CSV file) and a migration user object (Get-MigrationUser) was created for the user, but because we don’t have a mail user with this identity, we failed the migration at this early stage. The move request was not created (Get-MoveRequest fails).
This user is not found at all on the target tenant (wrong identity of the user in the CSV):
And if I had run Test-MigrationServerAvailability on the EmailAddress specified in the CSV, before creating the batch, the error would have been relatively obvious:
Also, you’d want to ensure that the users specified in the migration CSV file don’t have a move request already (or they are not a part of another batch).
There might be situations where you see that the migration is stuck at creating the move request, that is, in the injection workflow stage:
If State: Waiting takes too long, you can look at DiagnosticInfo on Get-MigrationUserStatistics (check the Durations section in this blog post for more info) to see if you have more information on where migration is stuck.
In this case, we can see that it waited 19 minutes to inject the move request:
And 0 move requests injected successfully since we have 0 InjectedRequestCount and no timestamp at LastSuccessfulInjectionTime or LastInjectionTime.
After waiting 19 minutes for the injection of the move request, we finally got a permanent failure: ErrorCrossTenantSourceUserIsInHoldOrRetentionPolicyAppliedPermanentException, and the service gave up trying to inject the move request. The lesson is to check Test-MigrationServerAvailability before creating the batch (prerequisite missed) and avoid this waiting time and failure that could have been easily detected.
You might wonder when to use Get-MigrationUserStatistics versus Get-MoveRequestStatistics. It mostly depends on preference. I am more used to Get-MoveRequestStatistics, but in situations where we don't have a move request created (like in the example above), we are forced to use Get-MigrationUserStatistics. Normally I append -DiagnosticInfo "verbose,showtimeslots" to make sure I get the most details. Also, if you are looking to check skipped items, append -IncludeSkippedItems to Get-MigrationUserStatistics. If using Get-MoveRequestStatistics, it is enough to use -IncludeReport.
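Putting those switches together, the two calls look like this (the user identity is a placeholder):

# No move request yet, or you want to inspect skipped items
Get-MigrationUserStatistics user1@contoso.com -IncludeSkippedItems -DiagnosticInfo "Verbose,ShowTimeslots"

# Move request exists
Get-MoveRequestStatistics user1@contoso.com -IncludeReport -DiagnosticInfo "Verbose,ShowTimeslots"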
Let’s take another example where a move request was injected successfully by the service (New-MoveRequest) but this one failed. We will troubleshoot using the Get-MoveRequestStatistics command (since we have a Get-MoveRequest in place):
Move requests in cross-tenant migrations will have Onboarding_CrossTenant WorkloadType and the SourceServer and TargetServer are Exchange Online servers, usually in different forests.
In this case, the move request failed because we don’t have the Cross-tenant User Data Migration license on source or target tenant.
Get-MoveRequestStatistics EXOMailbox1 |FL MailboxIdentity, WorkloadType, SourceServer, TargetServer, RemoteHostName, FailureType, Message, FailureTimestamp
If I want to check the full details of the failure, I can run a command like this:
$stats = Get-MoveRequestStatistics <User> -IncludeReport
$stats.Report.Failures
If you’d want to have a quick overview of failures encountered, you would run:
$stats.Report.Failures | group Failuretype | FT -a
To check if any items would be skipped during the move, you can run the following command:
Get-MoveRequestStatistics <User> | FL DataConsistency*, BadItem*, LargeItem*, MissingItem*, Skipped*
For more information on DCS (Data Consistency Score) please see Improving Migrations Using Data Consistency Scoring and Track and prevent migration data loss. For troubleshooting DCS and approving skipped items, please see Migrations with Data Consistency Score (DCS) – more than you ever wanted to know! – Microsoft Community Hub.
If you need to look at the available properties of the source or target user before and after the move, you can use the IncludeReport switch in Get-MoveRequestStatistics.
Run the following in the target tenant after move is completed:
$stats = Get-MoveRequestStatistics <User> -IncludeReport
Then you would look into the properties like this:
$stats.report.SourceMailboxBeforeMove.Props
$stats.report.TargetMailUserBeforeMove.Props
$stats.report.SourceMailuserAfterMove.Props
$stats.report.TargetMailboxAfterMove.Props
Suppose that we want to see if migration service stamped the ExternalEmailAddress on the source recipient after the move (according to the TargetDeliveryDomain set in the migration batch), we would run this command:
$stats.report.SourceMailUserAfterMove.Props | ? PropertyName -eq "ExternalEmailAddress" | FL PropertyName, Values
Troubleshooting migration of permissions
Let's look at how to examine migration reports to see whether permissions were migrated correctly and whether there are any permission errors in the move reports. We also provide PowerShell commands for the target tenant to help corroborate the outputs with the migration report analysis.
We will first check mailbox Full Access permissions, which should move during the cross-tenant migration, and we check them by using the move reports and PowerShell commands.
In this example, we moved MyMailbox10 and MyMailbox19, from one tenant to another:
We can see that MyMailbox10 has maintained the FullAccess permission on MyMailbox19 after migration:
We can confirm this by looking at the property called ExchangeSecurityDescriptor (which is MsExchMailboxSecurityDescriptor in AD) and noticing the SID values in there.
Note: you first need to store the output of Get-MoveRequestStatistics <user> -IncludeReport in a variable, in this example $stats19.
$stats19.Report.SourceMailboxBeforeMove.Props | ? PropertyName -eq "ExchangeSecurityDescriptor"
$stats19.Report.TargetMailboxAfterMove.Props | ? PropertyName -eq "ExchangeSecurityDescriptor"
Source mailbox: before move the SID belongs to MyMailbox10 in the source tenant.
Target mailbox: after move the SID belongs to MyMailbox10 in the target tenant.
If you have multiple SID values listed here, you can use these commands to display them in a more readable format:
$valueAfterMove = ($stats19.Report.TargetMailboxAfterMove.Props | ? PropertyName -eq "ExchangeSecurityDescriptor").Values.StrValue
$sdobj = New-Object System.Security.AccessControl.RawSecurityDescriptor($valueAfterMove)
$sdObj.DiscretionaryAcl | select -Skip 1 |FT AceQualifier,AccessMask, SecurityIdentifier, AceFlags, IsInherited
The AccessMask 1 here means Full Access Permission:
The same information is also available in the DebugEntries in the move report when we look for "MailboxSecurityDescriptor".
$stats19.report.DebugEntries | ? LocalizedString -match "MailboxSecurityDescriptor" | % {[string] $_}
Besides Get-MailboxPermission cmdlet, we can look at the ExchangeSecurityDescriptor property with the Get-Mailbox command:
get-mailbox mymailbox19 | select -ExpandProperty ExchangeSecurityDescriptor | select -ExpandProperty discretionaryACL
A quick reminder that Send on Behalf permissions are NOT migrated, so we won’t see anything in move reports.
Send As permissions
The Send-As permissions are migrated. In target tenant, after migration:
You can similarly check the permissions in the source tenant.
When checking the move reports for Send-As permission, you can put the DebugEntries in a Notepad++ file and search for the user’s SID (source user SID and target user SID). These permissions would fall into the category UserSecurityDescriptor. You might see many AD permissions (screenshot below where entries were separated by new line), so it can be quite hard to spot them.
If you want to list all SIDs with Send-As, you can search specifically for 'CR' and the following GUID: ab721a54-1e2f-11d0-9819-00aa0040529b.
$stats.report.DebugEntries | ? LocalizedString -match "UserSecurityDescriptor" | % {[string] $_} | clip
Then paste from clipboard to Notepad++, for example:
To have them listed on new lines, you can do the following replacements in Notepad++ (or see the PowerShell one-liner below):
Replace (OA; with \n(OA;
Replace (A; with \n(A;
Replace (OD; with \n(OD;
These prefixes are called ACE types; for a full list, see the documentation.
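If you prefer to do that splitting in PowerShell rather than Notepad++, a simple one-liner works (assuming the raw SDDL string is in a variable named $sddl):

# Every ACE starts with "(", so insert a line break before each one
$sddl -replace '\(', "`n("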
Then, since Send-As is an Extended Right, we will see CR ab721a54-1e2f-11d0-9819-00aa0040529b, which is the SendAs Extended Right GUID.
Checking the first SID in the source tenant:
Checking the first SID in the target tenant:
On-premises, looking at the LDP dump of the security descriptor of the user, you would see something like this:
Mailbox folder permissions
We also migrate mailbox folder permissions.
In the target tenant, after migration, you will run Get-MailboxFolderPermission <user>:<Folder> to check permissions present. For example:
Based on this output alone, we would think that there are no permissions issues.
But expand the User property and notice that Cloud3 is of Unknown user type:
And, if we check the move report, we will see TargetPrincipal errors and FolderACL issues for cloud3.
In this case, the DataConsistencyScore is usually Good instead of Perfect, and we will have corresponding BadItems and Failures recorded if we don’t find the principal user with permissions, in the target tenant.
Here is a quick command to see the alias of the principal user that is not found in the target tenant (Cloud3 in our example).
$stats.report.BadItems | select Kind, Folder, ScoringClassifications, {$_.Failure.DataContext} | ft -a
We can further look into DataContext with |FT or |FL for full details:
$stats.report.BadItems | select {$_.Failure.DataContext} | ft -a
Or look at the entire failure for a specific bad item. BadItems[1] is the second CorruptFolderAcl in the screenshot above (indexing starts at 0, not 1).
$stats.report.BadItems[1].Failure
For migration of Calendar and FreeBusy data folder permissions entries, we can look at folder ACL in Report.DebugEntries:
$stats.report.DebugEntries | ? LocalizedString -match "FolderAcl" | % {[string] $_}
Checking migration durations
Now that you rock at troubleshooting migration of permissions (which was lengthy and a bit boring), we will get into another topic: duration of the cross-tenant migrations and when to recognize you have an issue.
To see how much time your migration took and how it is progressing, you can run these commands:
$stats = Get-MoveRequestStatistics <User> -IncludeReport -DiagnosticInfo "Verbose,showtimeslots"
$stats |FL *duration*
In my example, I see that overall duration was about 30 minutes (it was a very small 35 MB mailbox) but I have 51 days of TotalFailedDuration:
If I check the $stats.DiagnosticInfo property, we will see these durations in a more detailed and accurate way:
Overall Move = 57 days (about 2 months) and 28 min, out of which:
In Progress = 6 min (actual copying of the data)
Suspended = 21 min (I chose to complete it at a certain point after initial sync, so it went into suspended mode)
Failed = 51 days and about 8 hours (it was in a failed state for 51 days)
Completed = 5 days and 16 hours (the move completed 5 days ago but I ran the Get-MoveRequest command now, 5 days after completion)
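If you want a similar duration overview for every user in a batch rather than for a single mailbox, a hedged sketch (the batch name is a placeholder):

# Duration overview for all move requests in a batch
Get-MoveRequest -BatchName "MigrationService:T2T-Pilot" | Get-MoveRequestStatistics | Select-Object DisplayName, OverallDuration, TotalInProgressDuration, TotalFailedDuration, TotalSuspendedDuration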
You can check the table from Microsoft 365 and Office 365 migration performance and best practices | Microsoft Learn for duration estimates during mailbox migrations in Exchange Online.
For example, a mailbox with a size of less than 10 GB is estimated to be migrated within 1 day. You can open a case with Microsoft Support if P90 is exceeded and it is because of our service (for example it is not a configuration issue on your side that wasn’t remediated fast enough).
In my situation above, I had a permanent failure and left the move sitting there for 51 days in a Failed state. This doesn't count against the P90, as I neglected the move. The InProgress duration was 6 minutes for my 35 MB mailbox, which meets the P90 estimate (90% of mailbox migrations smaller than 10 GB complete within 1 day).
We would like to thank Anshul Dube, Roman Powell and Nino Bilic for contributing and reviewing this blog post.
Mirela Buruiana and Alberto Pascual Montoya
Make your voice chatbots more engaging with new text to speech features
In our increasingly digital world, the importance of giving a voice and image to chatbots cannot be overstated. Transforming a chatbot from an impersonal, automated responder into a relatable and personable assistant significantly enhances user engagement.
Today we’re thrilled to announce Azure AI Speech’s latest updates, enhancing text to speech capabilities for a more engaging and lifelike chatbot experience. These updates include:
A wider range of multilingual voices for natural and authentic interactions;
More prebuilt avatar options, with latest sample codes for seamless GPT-4o integration; and
A new text stream API that significantly reduces latency for ChatGPT integration, ensuring smoother and faster responses.
Introducing new multilingual and IVR-styled voices
We’re excited to introduce our newest collection of voices, equipped with advanced multilingual features. These voices are crafted from a variety of source languages, bringing a rich diversity of personas to enhance your user experience. With their authentic and natural interactions, they promise to transform your chatbot engagement through our technology.
Discover the diverse range of our new voices:
Voice name | Main locale | Gender
en-GB-AdaMultilingualNeural | en-GB (English – United Kingdom) | Female
en-GB-OllieMultilingualNeural | en-GB (English – United Kingdom) | Male
pt-BR-ThalitaMultilingualNeural | pt-BR (Portuguese – Brazil) | Female
es-ES-IsidoraMultilingualNeural | es-ES (Spanish – Spain) | Female
es-ES-ArabellaMultilingualNeural | es-ES (Spanish – Spain) | Female
it-IT-IsabellaMultilingualNeural | it-IT (Italian – Italy) | Female
it-IT-MarcelloMultilingualNeural | it-IT (Italian – Italy) | Male
it-IT-AlessioMultilingualNeural | it-IT (Italian – Italy) | Male
We’re also delighted to present two new optimized en-US voices, specifically designed for call center scenarios – a prevalent application of text-to-speech technology.
They are:
Voice name | Main locale | Gender
en-US-LunaNeural | en-US (English – United States) | Female
en-US-KaiNeural | en-US (English – United States) | Male
These voices are currently available for public preview in three regions: East US, West Europe, and Southeast Asia. Discover more in our Voice Gallery and delve deeper into the details via our developer documentation.
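If you want to try one of these preview voices from code, here is a minimal sketch using the Speech SDK for Python; the key, region, and output file name are placeholders, and you should pick a region where the preview voice is available:

import azure.cognitiveservices.speech as speechsdk

# Placeholder key/region for a Speech resource in a region that offers the preview voice
speech_config = speechsdk.SpeechConfig(subscription="YOUR_SPEECH_KEY", region="eastus")
speech_config.speech_synthesis_voice_name = "en-US-LunaNeural"  # one of the new call-center-styled voices

# Write the synthesized audio to a local file
audio_config = speechsdk.audio.AudioOutputConfig(filename="greeting.wav")
synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config, audio_config=audio_config)

result = synthesizer.speak_text_async("Thank you for calling. How can I help you today?").get()
if result.reason == speechsdk.ResultReason.SynthesizingAudioCompleted:
    print("Saved synthesized speech to greeting.wav")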
Announcing advanced features for text to speech avatars
Text to speech avatar, previewed at Ignite 2023, enables users to create realistic videos of speaking avatars simply by giving text input and allows users to create real-time interactive bots with visual elements that are more engaging. Since its preview, we have received great feedback and appreciation from customers in various industries. Today, we are glad to share what’s been added to the avatar portfolio.
More prebuilt avatar options and more regions available
Our prebuilt text-to-speech avatars offer ready-to-deploy solutions for our customers. We’ve recently enriched our portfolio’s diversity by introducing five new avatars. They can be used for both batch synthesis and real-time conversational scenarios. We remain committed to expanding our avatar collections to encompass a broader range of cultures and visual identities.
These newly introduced avatars can be accessed in Speech Studio for video creation and live chats. Dive deeper into the process of synthesizing a text-to-speech avatar using the Speech SDK for real-time synthesis in chatbot interactions, or batch synthesis for generating creative videos.
Beyond the previously available service regions – West US 2, West Europe, and Southeast Asia – we are excited to announce the expansion of our avatar service to three additional regions: Sweden Central, North Europe, and South Central US. Learn more here.
Enhanced text to speech avatar chat experience with Azure OpenAI capabilities
Text-to-speech avatars are increasingly leveraged for live chatbots, with many of our customers utilizing Azure OpenAI to develop customer service bots, virtual assistants, AI educators, and virtual tourist guides, among others. These avatars, with their lifelike appearance and natural sounding neural TTS or custom voice, combined with the advanced natural language processing capabilities of the Azure OpenAI GPT model, provide an interaction experience that closely mirrors human conversation.
The Azure OpenAI GPT-4o model is now part of the live chat avatar application in Speech Studio. This allows users to see firsthand the collaborative functioning of the live chat avatar and Azure OpenAI GPT-4o. Additionally, we provide sample code to aid in integrating the text-to-speech avatar with the GPT-4o model. Learn more about how to create lifelike chatbots with real-time avatars and Azure OpenAI GPTs, or dive into code samples here (JS code sample, and python code sample) .
This update also includes sample codes to assist in customizing Azure OpenAI GPT on your data. Azure OpenAI On Your Data is a new feature that enables users to tailor the chatbot’s responses according to their unique data source. This proves especially beneficial for enterprise customers aiming to develop an avatar-based live chat application capable of addressing business-specific queries from clients. For guidance on creating a live chat app using Azure OpenAI On Your Data, please refer to this sample code (search “On Your Data”).
More Responsible AI support for avatars
Ensuring responsibility in both the development and delivery of AI products is a core value for us. In line with this, we’ve introduced two features to bolster the responsible AI support for text-to-speech avatars, supplementing our existing transparency note, code of conduct, and disclosure guidelines.
We’ve integrated Azure AI Content Safety into the batch synthesis process of text to speech avatars for video creation scenarios. This added layer of text moderation allows for the detection of offensive, risky, or undesirable text input, thereby preventing the avatar from producing harmful output. The text moderation feature spans multiple categories, including sexual, violent, hate, self-harm content, and more. It’s available for batch synthesis of text-to-speech avatars both in Speech Studio and via the batch synthesis API.
In our bid to provide audiences with clearer insights into the source and history of video content created by text to speech avatars, we’ve adopted the Coalition for Content Provenance and Authenticity (C2PA) Standard. This standard offers transparent information about AI-generation of video content. For more details on the integration of C2PA with text to speech avatars, refer to Content Credentials in Azure Text to Speech Avatar .
Unlocking real-time speech synthesis with the new text stream API
Our latest release introduces an innovative Text Stream API designed to harness the power of real-time text processing to generate speech with unprecedented speed. This new API is perfect for dynamic text vocalization, such as reading outputs from AI models like GPT in real-time.
The Text Stream API represents a significant leap forward from traditional non-text stream TTS technologies. By accepting input in chunks (as opposed to whole responses), it significantly reduces the latency that typically hinders seamless audio synthesis.
Comparison: Non-Text Stream vs. Text Stream
Aspect | Non-Text Stream | Text Stream
Input Type | Whole GPT response | Each GPT output chunk
TTS First Byte Latency | High (Total GPT response time + TTS time) | Low (Few GPT chunks time + TTS time)
The Text Stream API not only minimizes latency but also enhances the fluidity and responsiveness of real-time speech outputs, making it an ideal choice for interactive applications, live events, and responsive AI-driven dialogues.
Utilizing the Text Stream API is straightforward. Simply follow the steps provided with the Speech SDK. For detailed implementation, see the sample code on GitHub.
Get started
Microsoft provides access to more than 500 neural voices spanning more than 140 languages and locales, complemented by avatar add-ons. These text-to-speech capabilities, part of the Azure AI Speech service, allow you to swiftly imbue chatbots with a natural voice and realistic image, thereby enriching the conversational experience for users. Furthermore, the Custom Neural Voice and Custom Avatar features facilitate the creation of a distinctive brand voice and image for your chatbots. With a unique voice and image, a chatbot can seamlessly integrate into your brand's identity, contributing to a cohesive and unforgettable brand experience.
For more information
Try our demo to listen to existing neural voices
Add Text-to-Speech to your apps today
Apply for access to Custom Avatar and Custom Neural Voice
Join Discord to collaborate and share feedback
Zheng Niu and Junwei Gan also contributed to this article.
Arc Length Continuation Method: Numerical Method
I am trying to plot a curve (frequency vs. amplitude) that has a saddle-node (SN) bifurcation, and I am trying to write code that uses arc length continuation to solve the nonlinear equations and obtain the frequency response curve (including regions where two solutions exist).
The methodology is as follows
Initialization and Initial Guess: Find an initial guess for a specific omega (parameter)
Residual Calculation: Using a function handle to calculate residuals, based on the nonlinear equation
Newton-Raphson Solver: Solves the nonlinear equation using the Newton-Raphson method.
Continuation Method: Continuation method uses the arc length continuation to trace the solution curve by extrapolating new guesses and solving using the Newton-Raphson method.
Arc Length Constraint: Ensures the next step in the continuation process is a fixed arc length from the current solution.
Plotting: Code for plotting the FRF is included but commented out because the code is incomplete
My issue is in the function called continuation, which continues to solve for nSteps (number of steps) by calculating the new guess and augmenting the Jacobian with the new constraint equation. I believe I have the correct(?) logic, but my code is incorrect.
(I know there are straightforward ways to obtain the FRF for the function defined in Analytic Equation; the reason I am testing it on a straightforward equation is to make sure it is running correctly.)
Any pointers appreciated
clc; clear; close all
w = [0]; % Path-finding parameters,omega–w, for initial search; starting at 0 Hz
x0 = [0;0]; % Initial guess (x(1) is the solution to the cosine and x(2) is the solution to the sine)
nSteps = 1000; % Number of steps to continue forward on the curve
% Initialize q_val
q_val = zeros(length(x0)+1,1); %this is the solution vector, it will have [x(1); x(2); w]
% Find the residuals using the initial w and the equation
equation_f = @(x, w) equation(x, w); % Function to calculate the residuals for each forcing
[sol, iter] = nlsolver(equation_f, x0, w, 10e-10); % Using Newton-Raphson to solve the equation
q_val = [sol;w]; % Store solution and corresponding w
% Perform continuation–trace the solution curve by extrapolating new guesses and solving using the Newton-Raphson method.
continuation_combined(q_val, nSteps)
% Plotting the results
% w = q_val(3,:);
% a = sqrt(q_val(1,:).^2 + q_val(2,:).^2);
% plot(w, a); grid on
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%% CONTINUATION FUNCTIONS
function q_val = continuation_combined(q, N)
q_val = q;
for n = 1:N
xguess = 2 * q_val(:, end) - q_val(:, end-1); % Extrapolate a new guess for the solution
xguess = nlsolver(continuation_side(xguess), xguess); % Solve using the extrapolated guess
q_val = [q_val, xguess]; % Append the new solution
end
function new_q = continuation_side(q_val)
w = q_val(end); % Extract the current parameter value
x = q_val(1:end-1); % Extract the state variables
% Constants
k = 1; m = 1; c = 0.02; f0 = 1; F = [f0; 0]; kc = 0.5;
% Linear part of the equation
A = [k - w^2 * m, -c * w; c * w, k - w^2 * m];
% Nonlinear part of the equation
B = [x(1) * (x(2)^2 + x(1)^2); x(2) * (x(2)^2 + x(1)^2)];
% Residuals
R = A * x + 0.75 * kc * B - F;
% Arc length constraint
S = 0.01;
aa = norm(q_val(:, end) - q_val) - S;
new_q = [R; aa]; % Return the combined residual and arc length constraint
end
end
%%%%% ANALYTIC EQUATION (OBTAINED FROM HBM)
function [R] = equation(x, w)
% Constants
k = 1; m = 1; c = 0.02; f0 = 1; F = [f0; 0]; kc = 0.5;
% Linear part of the equation
A = [k - w^2 * m, -c * w; c * w, k - w^2 * m];
% Nonlinear part of the equation
B = [x(1) * (x(2)^2 + x(1)^2); x(2) * (x(2)^2 + x(1)^2)];
% Residuals
R = A * x + 0.75 * kc * B - F;
end
%%%%% NEWTON-RAPHSON SOLVER
function [sol, iter] = nlsolver(func, guess, w, errorbound)
% Initializations
norm_R0 = 1;
iteration = 0;
hj = 1e-8; % Step size
x = guess;
while norm_R0 > errorbound && iteration < 1000 % Either error bound satisfied or iteration hits max threshold for each point
R0 = func(x, w); % Evaluate the function at x (initial conditions)
J = zeros(length(x), length(x)); % Initialize the Jacobian matrix
for r = 1:length(x)
for c = 1:length(x)
ej = zeros(length(x), 1);
ej(c) = 1;
Ri = func(x + hj * ej, w); % Evaluate the function at x + hj * ej
J(r, c) = (Ri(r) - R0(r)) / hj; % Compute the Jacobian entry
end
end
x = x - J \ R0; % Update x using the Jacobian and function values (Newton step)
iteration = iteration + 1;
norm_R0 = norm(R0); % Update the norm of R0
end
sol = x;
iter = iteration;
endI am trying plot a curve (frequency vs amplitude) that has a (sn) birfurcation, and I am trying to write a code that would use the Arc Length Continuation to get the solve the nonlinear equations and get the (frequency response) curve (include regions where two solutions exist).
The methodology is as follows
Initialization and Initial Guess: Find an initial guess for a specific omega (parameter)
Residual Calculation: Using a function handle to calculate residuals, based on the nonlinear equation
Newton-Raphson Solver: Solves the nonlinear equation using the Newton-Raphson method.
Continuation Method: Continuation method uses the arc length continuation to trace the solution curve by extrapolating new guesses and solving using the Newton-Raphson method.
Arc Length Constraint: Ensures the next step in the continuation process is a fixed arc length from the current solution.
Plotting: Code for plotting the FRF is included but commented out because the code is incomplete
My issue is in the function called continuation–which continunes to solve for nSteps (number of steps), by calculating the new guess, and augemeniting the Jacobain to have the new constraint equation–I believe I have the correct(?) logic but my code is incorrect.
(I know there are straight-forward ways to obtain the FRF for the function defined in Analytic Equation, the reason I am testing it out on a straight-forward equation is to make sure it is running correctly.
Any pointers appreciated
clc; clear; close all
w = [0]; % Path-finding parameters,omega–w, for initial search; starting at 0 Hz
x0 = [0;0]; % Initial guess (x(1) is the solution to the cosine and x(2) is the solution to the sine)
nSteps = 1000; % Number of steps to continue forward on the curve
% Initialize q_val
q_val = zeros(length(x0)+1,1); %this is the solution vector, it will have [x(1); x(2); w]
% Find the residuals using the initial w and the equation
equation_f = @(x, w) equation(x, w); % Function to calculate the residuals for each forcing
[sol, iter] = nlsolver(equation_f, x0, w, 10e-10); % Using Newton-Raphson to solve the equation
q_val = [sol;w]; % Store solution and corresponding w
% Perform continuation–trace the solution curve by extrapolating new guesses and solving using the Newton-Raphson method.
continuation_combined(q_val, nSteps)
% Plotting the results
% w = q_val(3,:);
% a = sqrt(q_val(1,:).^2 + q_val(2,:).^2);
% plot(w, a); grid on
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%% CONTINUATION FUNCTIONS
function q_val = continuation_combined(q, N)
q_val = q;
for n = 1:N
xguess = 2 * q_val(:, end) – q_val(:, end-1); % Extrapolate a new guess for the solution
xguess = nlsolver(continuation_side(xguess), xguess); % Solve using the extrapolated guess
q_val = [q_val, xguess]; % Append the new solution
end
function new_q = continuation_side(q_val)
w = q_val(end); % Extract the current parameter value
x = q_val(1:end-1); % Extract the state variables
% Constants
k = 1; m = 1; c = 0.02; f0 = 1; F = [f0; 0]; kc = 0.5;
% Linear part of the equation
A = [k – w^2 * m, -c * w; c * w, k – w^2 * m];
% Nonlinear part of the equation
B = [x(1) * (x(2)^2 + x(1)^2); x(2) * (x(2)^2 + x(1)^2)];
% Residuals
R = A * x + 0.75 * kc * B – F;
% Arc length constraint
S = 0.01;
aa = norm(q_val(:, end) – q_val) – S;
new_q = [R; aa]; % Return the combined residual and arc length constraint
end
end
%%%%% ANALYTIC EQUATION (OBTAINED FROM HBM)
function [R] = equation(x, w)
% Constants
k = 1; m = 1; c = 0.02; f0 = 1; F = [f0; 0]; kc = 0.5;
% Linear part of the equation
A = [k – w^2 * m, -c * w; c * w, k – w^2 * m];
% Nonlinear part of the equation
B = [x(1) * (x(2)^2 + x(1)^2); x(2) * (x(2)^2 + x(1)^2)];
% Residuals
R = A * x + 0.75 * kc * B – F;
end
%%%%% NEWTON-RAPHSON SOLVER
function [sol, iter] = nlsolver(func, guess, w, errorbound)
% Initializations
norm_R0 = 1;
iteration = 0;
hj = 1e-8; % Step size
x = guess;
while norm_R0 > errorbound && iteration < 1000 % Either error bound satisfied or iteration hits max threshold for each point
R0 = func(x, w); % Evaluate the function at x (initial conditions)
J = zeros(length(x), length(x)); % Initialize the Jacobian matrix
for r = 1:length(x)
for c = 1:length(x)
ej = zeros(length(x), 1);
ej(c) = 1;
Ri = func(x + hj * ej, w); % Evaluate the function at x + hj * ej
J(r, c) = (Ri(r) – R0(r)) / hj; % Compute the Jacobian entry
end
end
x = x – J R0; % Update x using the Jacobian and function values
iteration = iteration + 1;
norm_R0 = norm(R0); % Update the norm of R0
end
sol = x;
iter = iteration;
end I am trying plot a curve (frequency vs amplitude) that has a (sn) birfurcation, and I am trying to write a code that would use the Arc Length Continuation to get the solve the nonlinear equations and get the (frequency response) curve (include regions where two solutions exist).
The methodology is as follows
Initialization and Initial Guess: Find an initial guess for a specific omega (parameter)
Residual Calculation: Using a function handle to calculate residuals, based on the nonlinear equation
Newton-Raphson Solver: Solves the nonlinear equation using the Newton-Raphson method.
Continuation Method: Continuation method uses the arc length continuation to trace the solution curve by extrapolating new guesses and solving using the Newton-Raphson method.
Arc Length Constraint: Ensures the next step in the continuation process is a fixed arc length from the current solution.
Plotting: Code for plotting the FRF is included but commented out because the code is incomplete
My issue is in the function called continuation–which continunes to solve for nSteps (number of steps), by calculating the new guess, and augemeniting the Jacobain to have the new constraint equation–I believe I have the correct(?) logic but my code is incorrect.
(I know there are straight-forward ways to obtain the FRF for the function defined in Analytic Equation, the reason I am testing it out on a straight-forward equation is to make sure it is running correctly.
Any pointers appreciated
clc; clear; close all
w = [0]; % Path-finding parameters,omega–w, for initial search; starting at 0 Hz
x0 = [0;0]; % Initial guess (x(1) is the solution to the cosine and x(2) is the solution to the sine)
nSteps = 1000; % Number of steps to continue forward on the curve
% Initialize q_val
q_val = zeros(length(x0)+1,1); %this is the solution vector, it will have [x(1); x(2); w]
% Find the residuals using the initial w and the equation
equation_f = @(x, w) equation(x, w); % Function to calculate the residuals for each forcing
[sol, iter] = nlsolver(equation_f, x0, w, 10e-10); % Using Newton-Raphson to solve the equation
q_val = [sol;w]; % Store solution and corresponding w
% Perform continuation–trace the solution curve by extrapolating new guesses and solving using the Newton-Raphson method.
continuation_combined(q_val, nSteps)
% Plotting the results
% w = q_val(3,:);
% a = sqrt(q_val(1,:).^2 + q_val(2,:).^2);
% plot(w, a); grid on
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%% CONTINUATION FUNCTIONS
function q_val = continuation_combined(q, N)
q_val = q;
for n = 1:N
xguess = 2 * q_val(:, end) – q_val(:, end-1); % Extrapolate a new guess for the solution
xguess = nlsolver(continuation_side(xguess), xguess); % Solve using the extrapolated guess
q_val = [q_val, xguess]; % Append the new solution
end
function new_q = continuation_side(q_val)
w = q_val(end); % Extract the current parameter value
x = q_val(1:end-1); % Extract the state variables
% Constants
k = 1; m = 1; c = 0.02; f0 = 1; F = [f0; 0]; kc = 0.5;
% Linear part of the equation
A = [k – w^2 * m, -c * w; c * w, k – w^2 * m];
% Nonlinear part of the equation
B = [x(1) * (x(2)^2 + x(1)^2); x(2) * (x(2)^2 + x(1)^2)];
% Residuals
R = A * x + 0.75 * kc * B – F;
% Arc length constraint
S = 0.01;
aa = norm(q_val(:, end) – q_val) – S;
new_q = [R; aa]; % Return the combined residual and arc length constraint
end
end
%%%%% ANALYTIC EQUATION (OBTAINED FROM HBM)
function [R] = equation(x, w)
% Constants
k = 1; m = 1; c = 0.02; f0 = 1; F = [f0; 0]; kc = 0.5;
% Linear part of the equation
A = [k - w^2 * m, -c * w; c * w, k - w^2 * m];
% Nonlinear part of the equation
B = [x(1) * (x(2)^2 + x(1)^2); x(2) * (x(2)^2 + x(1)^2)];
% Residuals
R = A * x + 0.75 * kc * B - F;
end
%%%%% NEWTON-RAPHSON SOLVER
function [sol, iter] = nlsolver(func, guess, w, errorbound)
% Initializations
norm_R0 = 1;
iteration = 0;
hj = 1e-8; % Step size
x = guess;
while norm_R0 > errorbound && iteration < 1000 % Either error bound satisfied or iteration hits max threshold for each point
R0 = func(x, w); % Evaluate the function at x (initial conditions)
J = zeros(length(x), length(x)); % Initialize the Jacobian matrix
for r = 1:length(x)
for c = 1:length(x)
ej = zeros(length(x), 1);
ej(c) = 1;
Ri = func(x + hj * ej, w); % Evaluate the function at x + hj * ej
J(r, c) = (Ri(r) - R0(r)) / hj; % Compute the Jacobian entry
end
end
x = x - J \ R0; % Update x using the Jacobian and function values
iteration = iteration + 1;
norm_R0 = norm(R0); % Update the norm of R0
end
sol = x;
iter = iteration;
end
matlab MATLAB Answers — New Questions
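For illustration, here is a minimal, self-contained sketch (not the code above, and only one of several possible ways to set this up) of how the augmented system for arc-length continuation is often assembled: the parameter w is appended to the unknowns, the physics residual is evaluated at the current guess, and the arc-length constraint measures the distance from the previous converged point rather than from the guess itself, so the finite-difference Jacobian also picks up derivatives with respect to w. The function names (demo_arclength, hbm_residual, newton), the step length S = 0.01, the 500-step loop, and the small initial step in w are all assumptions chosen for the sketch.
function demo_arclength
    S = 0.01;                                    % assumed arc-length step
    w0 = 0; x0 = [0; 0];
    x0 = newton(@(x) hbm_residual(x, w0), x0);   % first converged point at fixed w0
    q = [x0; w0];                                % each column of q is a converged [x; w]
    q(:, 2) = [newton(@(x) hbm_residual(x, w0 + 1e-3), x0); w0 + 1e-3]; % seed a second point
    for n = 2:500
        qguess = 2*q(:, n) - q(:, n-1);                    % secant predictor
        Raug = @(qq) [hbm_residual(qq(1:2), qq(3));        % physics residual at the guess
                      norm(qq - q(:, n)) - S];             % distance from last converged point
        q(:, n+1) = newton(Raug, qguess);                  % corrector over the full [x; w]
    end
    plot(q(3, :), sqrt(q(1, :).^2 + q(2, :).^2)); grid on
    xlabel('\omega'); ylabel('amplitude')
end
function R = hbm_residual(x, w)                  % same residual as in the question
    k = 1; m = 1; c = 0.02; f0 = 1; F = [f0; 0]; kc = 0.5;
    A = [k - w^2*m, -c*w; c*w, k - w^2*m];
    B = [x(1)*(x(1)^2 + x(2)^2); x(2)*(x(1)^2 + x(2)^2)];
    R = A*x + 0.75*kc*B - F;
end
function x = newton(func, x)                     % Newton with forward-difference Jacobian
    h = 1e-8;
    for it = 1:100
        R0 = func(x);
        if norm(R0) < 1e-10, break; end
        J = zeros(numel(R0), numel(x));
        for cc = 1:numel(x)
            e = zeros(numel(x), 1); e(cc) = 1;
            J(:, cc) = (func(x + h*e) - R0) / h;
        end
        x = x - J \ R0;                          % solve the linearised system and step
    end
end
The key structural point is that the corrector's unknown vector is the full q = [x; w], so the Jacobian of the augmented residual is 3-by-3 and already contains the constraint row.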
Matlab Parallel pool test failure
I am using MATLAB R2019b and I have set up an HPC cluster. In the validation stage, the last test (the parallel pool test) fails: it shows that the host IP refuses the connection. I tried some other IPs and got the same issue. I have also added an inbound firewall rule for MATLAB and for TCP ports 27370-27370. In addition, starting a parallel pool from the bottom-left corner of MATLAB always fails. I can see my jobs reach the HPC job manager, but I suspect that when the data comes back it is refused by my local IP.
matlab MATLAB Answers — New Questions
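As a hedged troubleshooting sketch (assuming the failure is that the workers cannot reach the client back under the hostname or port range it advertises, and assuming the Parallel Computing Toolbox pctconfig function behaves in R2019b as documented), one common step is to pin the client to a hostname the cluster nodes can resolve and to a port range that matches the firewall rule, then retry a small pool. The hostname, profile name, pool size, and port range below are placeholders, not values taken from the question.
% Run in the MATLAB client session before opening a pool; this configures
% only the local client, not the cluster.
pctconfig('hostname', 'client-host.example.com');  % placeholder: a name/IP the workers can reach
pctconfig('portrange', [27370 27470]);             % placeholder: keep inside the firewall rule
c = parcluster('MyHPCProfile');                    % placeholder cluster profile name
p = parpool(c, 4);                                 % try a small pool first
delete(p)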
please sanity check my function for calculating Kendall’s tau
Hi, I’m trying to write my own function for calculating Kendall’s tau for practice, although I know there is a MATLAB function for it ( corr(A, 'type', 'Kendall') ). But I get two different answers when I run the MATLAB function and my function, and I don’t know the reason.
Here are example vectors:
xx = [1 0 1 1 0 1];
yy = [0 1 1 1 1 0];
The MATLAB corr function gives me -0.5, but my own function below gives me -0.26667.
function taua=myKendall(a,b)
%% preparations
a=a(:);b=b(:);
validEntryIs = ~isnan(a)&~isnan(b);
a=a(validEntryIs);b=b(validEntryIs);
n=size(a,1);
%% compute Kendall rank correlation coefficient tau-a
K = 0;
for k = 1:n-1
pairRelations_a=sign(a(k)-a(k+1:n));
pairRelations_b=sign(b(k)-b(k+1:n));
K = K + sum(pairRelations_a.*pairRelations_b);
end
taua=K/(n*(n-1)/2); % normalise by the total number of pairs
end%function
correlation, kendall’s tau MATLAB Answers — New Questions
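One explanation that is consistent with these numbers: the loop above computes tau-a (it divides by all n(n-1)/2 pairs), whereas MATLAB’s corr applies a tie correction (tau-b), and the 0/1 example vectors contain many ties. Below is a hedged sketch of the tie-corrected version that reuses the same concordant-minus-discordant count; the function name myKendallTauB and the use of findgroups/accumarray are illustrative choices, not part of the original code.
function taub = myKendallTauB(a, b)
    a = a(:); b = b(:);
    ok = ~isnan(a) & ~isnan(b);
    a = a(ok); b = b(ok);
    n = numel(a);
    K = 0;
    for k = 1:n-1                          % same concordant-minus-discordant count as above
        K = K + sum(sign(a(k) - a(k+1:n)) .* sign(b(k) - b(k+1:n)));
    end
    n0 = n*(n-1)/2;                        % total number of pairs
    ta = accumarray(findgroups(a), 1);     % sizes of tie groups in a
    tb = accumarray(findgroups(b), 1);     % sizes of tie groups in b
    n1 = sum(ta.*(ta-1)/2);                % tied pairs in a
    n2 = sum(tb.*(tb-1)/2);                % tied pairs in b
    taub = K / sqrt((n0 - n1)*(n0 - n2));  % tie-corrected denominator
end
For xx = [1 0 1 1 0 1] and yy = [0 1 1 1 1 0] this gives K = -4, n0 = 15, and n1 = n2 = 7, so taub = -4/8 = -0.5, which matches corr, while -4/15 = -0.2667 matches the original myKendall.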
Why am I receiving an error when generating a Trial License?
I am trying to generate a Trial License through my MathWorks Account. I receive an error message that says:
"We are unable to offer you a Trial. If you believe you are receiving this message in error, please contact your local Customer Support Representative"
Why am I receiving an error when generating a Trial License?
trial, license, error MATLAB Answers — New Questions
Need Help Resolving Quick-Books Error 6010, 100 – Assistance Appreciated!
I encountered Quick-Books Error 6010, 100 while trying to open my company file. The error message suggests a network issue, but I’ve checked my connection, and everything seems fine. I’m unable to access any of my data. Can anyone in the community help with a solution?
Need Help with Quick-Books Error 193: How to Fix?
Hello,
I’m experiencing Quick-Books Error 193 when trying to open my company file. The error message says the file is already in use, but no one else is accessing it. I’ve tried restarting Quick-Books and my computer, but the issue persists. Any advice?
Thanks!
Ordering custom term templates when creating a new Glossary term in Microsoft Purview
Hello
I have created multiple custom term templates for use in the Glossary section of Microsoft Purview. I would like these term templates to appear against new Glossary Terms in a specific order but despite multiple attempts including adding an alphabetical prefix to the custom term template names, Purview seems to order the term templates at random.
Does anyone have a solution that would allow me to customise the ordering of the custom term templates in a Glossary Term?
Thanks
Purview Information Protection for internal and external emails
I’m working with an organisation that is starting to use sensitivity labels. They have Office 365 E3 licenses. The current plan is to set up a default label for documents and emails called “Internal Only”. This label will encrypt contents and grant co-author permissions to all staff.
The challenge will be when emails include external recipients. Ideally, the user will change from the default label to one that grants access to any recipients. However, I can imagine that there will be many cases where they forget to do this.
If we had Office 365 E5 licenses, we would have the option to create a DLP policy that shows a policy tip. I would expect this to reduce incidents of mislabeling.
I have seen recommendations to avoid encrypting by default and only use it where needed. However this client is keen to use encryption to protect as much content as possible.
One suggestion could be to change the default email label to only grant access to the sender and recipients, regardless of whether they are internal or external.
I’m interested in any real-world feedback on how others have tackled this issue.
Is it possible to protect the Primary Refresh Token (PRT) if attacker has hands on keyboard
Hi everyone,
I want to ask whether anyone knows if it is possible to defend against a pass-the-PRT attack. We are about to embark on a journey to deploy privileged access workstations to all IT admins, with more or less no internet access. The idea is to have a clean source and heavily reduce the chance of an attacker getting hold of the credentials / PRT of an admin account. But because it is so heavily locked down, it is already causing issues for us.
So I want to find out how big of an issue it is if an attacker is able to get a foothold on a device used by a standard user account that has Microsoft Entra ID roles assigned via PIM.
We have Defender for Endpoint installed on all devices, Tamper Protection is on, and the ASR rule “Block credential stealing from the Windows local security authority subsystem (lsass.exe)” is set to block. Further to that, we require a FIDO2 security key for all IT admins, and CA policies are set to require both MFA and a compliant device.
But as mentioned above, if an attacker gets a foothold on a device used by an IT admin who logs in with his or her standard account and elevates into an Entra admin role, is it game over at that point?
If that is the case, it seems to me that the PRT is the weak point, and we would be better off not having the device used for privileged admin work joined to Microsoft Entra.
Mac Book Air M2
Hello,
For some weeks it has been impossible to open a link in my Outlook. I can send and receive emails, but I cannot open the links.
Thanks if somebody has a solution.
MailMessage Content-Type “multipart/mixed” to “multipart/report; report-type=feedback-report”
I’m updating our in-house security software to send XARF reports rather than having to use abuse portals (DigitalOcean). The only issue I’m having is that one requirement is for the mail message’s Content-Type header to be “multipart/report; report-type=feedback-report”. I can’t find any way to change the message header: it’s either the content type of the body, or, if I add attachments/alternate views, it’s “multipart/mixed”.
if (mailMessage.Headers["Content-Type"] != null)
    mailMessage.Headers.Remove("Content-Type");
mailMessage.Headers.Add("Content-Type", "multipart/report; report-type=feedback-report");
Read More