Category: News
Copilot Flight Crew: Earn Your Wings?
At the M365 Community Conference in Orlando, we were told that a “Copilot Flight Crew” challenge would be starting on June 1st but I am unable to find any information about it. See image from presentation below:
Excel file opening very slowly
I have a moderately complex Excel spreadsheet that opens very slowly.
Project Online (PWA) Issue: When I save a project, the next day all my tasks are gone.
I have a problem with Project Online (PWA). Every time I create a new project, I open it through Project Desktop, add all the tasks, save it, and check it in, and everything looks fine. But the next day, when I open it, all my tasks are gone. I have given my users owner permissions on the Project Web App SharePoint site so they can check off any task in the task list. Also, when I open the project website in SharePoint, all the tasks are there, but when I open it through Project Desktop with different users, everything is gone. Do you know why this is happening and how to fix it?
I have a backup to restore my project, but this is frustrating and I don’t know what to do. Please advise.
How can I write this update better?
The query below checks whether a specific date falls on a weekend and, if it does, updates the date to the Monday following the weekend. Example: date = 6/15/2024 falls on a Saturday, so the date is set to 6/17/2024.
The query works as expected; however, I’m curious to know if there is a shorter, more efficient version.
declare @day int, @date datetime
set @date = convert(varchar, getdate(), 101)   -- strips the time portion
select @day = datepart(dw, @date)              -- depends on @@DATEFIRST; 7 = Saturday, 1 = Sunday with the default US setting
-- assign the adjusted date back to @date so the UPDATE uses it
set @date = case when @day = 7 then @date + 2  -- Saturday -> following Monday
                 when @day = 1 then @date + 1  -- Sunday   -> following Monday
                 else @date
            end
update table_name set run_date = @date where id = 123
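For illustration only, the same roll-to-Monday rule can be sketched outside T-SQL; this hedged Python version uses the post's own example date (note that Python's weekday numbering differs from DATEPART(dw, ...)):

```python
from datetime import date, timedelta

def roll_to_monday(d: date) -> date:
    # Python: Monday = 0 ... Saturday = 5, Sunday = 6
    if d.weekday() == 5:              # Saturday -> following Monday
        return d + timedelta(days=2)
    if d.weekday() == 6:              # Sunday -> following Monday
        return d + timedelta(days=1)
    return d                          # weekdays are unchanged

print(roll_to_monday(date(2024, 6, 15)))  # 6/15/2024 is a Saturday -> 2024-06-17
```

In the T-SQL itself, the main point is to assign the CASE result back to @date before the UPDATE; otherwise the unadjusted value is written.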
TempDB space in Azure SQL DB appears much less than the published values
Issue
An issue was brought to our attention recently where an Azure SQL DB was throwing TempDB-related errors, although the customer felt that TempDB usage never came close to the value published in the official Microsoft documentation. Here’s the error the customer complained about:
Error
Here is the detailed error text:
The database 'tempdb' has reached its size quota. Partition or delete data, drop indexes, or consult the documentation for possible resolutions. Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Workaround/Fix:
The customer wanted to know how much TempDB was allocated to them. While the official documentation stated that the SLO had 1.2 TB of TempDB, in reality they always received a TempDB-full error after about 64 GB.
I asked their technical team to run the following command to check the current TempDB space allocated:
SELECT FileName = df.name,
       current_file_size_MB = df.size*1.0/128,
       max_size = CASE df.max_size
                      WHEN 0 THEN 'Autogrowth is off.'
                      WHEN -1 THEN 'Autogrowth is on.'
                      ELSE 'Log file grows to a maximum size of 2 TB.'
                  END,
       growth_value =
       CASE
           WHEN df.growth = 0 THEN df.growth
           WHEN df.growth > 0 AND df.is_percent_growth = 0 THEN df.growth*1.0/128.0
           WHEN df.growth > 0 AND df.is_percent_growth = 1 THEN df.growth
       END,
       growth_increment_unit =
       CASE
           WHEN df.growth = 0 THEN 'Size is fixed.'
           WHEN df.growth > 0 AND df.is_percent_growth = 0 THEN 'Growth value is MB.'
           WHEN df.growth > 0 AND df.is_percent_growth = 1 THEN 'Growth value is a percentage.'
       END
FROM tempdb.sys.database_files AS df;
GO
The output that was shared verified that the TempDB allocated was indeed 1.2 TB, as described in the public documentation for the DB SLO. Here’s the output:
The next step was to check the free space available in TempDB at that point because, per the customer, no heavy-duty jobs or queries were running on the DB at that time. I asked them to execute the queries below:
-- Determining the amount of free space in tempdb
SELECT SUM(unallocated_extent_page_count) AS [free pages],
(SUM(unallocated_extent_page_count)*1.0/128) AS [free space in MB]
FROM tempdb.sys.dm_db_file_space_usage;
-- Determining the amount of space used by the version store
SELECT SUM(version_store_reserved_page_count) AS [version store pages used],
(SUM(version_store_reserved_page_count)*1.0/128) AS [version store space in MB]
FROM tempdb.sys.dm_db_file_space_usage;
-- Determining the amount of space used by internal objects
SELECT SUM(internal_object_reserved_page_count) AS [internal object pages used],
(SUM(internal_object_reserved_page_count)*1.0/128) AS [internal object space in MB]
FROM tempdb.sys.dm_db_file_space_usage;
-- Determining the amount of space used by user objects
SELECT SUM(user_object_reserved_page_count) AS [user object pages used],
(SUM(user_object_reserved_page_count)*1.0/128) AS [user object space in MB]
FROM tempdb.sys.dm_db_file_space_usage;
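As an aside, the *1.0/128 factor in the queries above converts page counts to MB: SQL Server pages are 8 KB, so 128 pages make one MB. A tiny sketch of the arithmetic:

```python
PAGE_SIZE_KB = 8  # SQL Server page size

def pages_to_mb(page_count: int) -> float:
    # 1 MB = 1024 KB = 128 pages of 8 KB each
    return page_count * PAGE_SIZE_KB / 1024

print(pages_to_mb(128))     # -> 1.0
print(pages_to_mb(131072))  # -> 1024.0 (1 GB worth of pages)
```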
The output shared by the team, below, surprised even the customer, as they didn’t expect to see this in their DB. Here’s the output:
The next question from the team was what was occupying more than 95% of the space in their TempDB. While I pointed out a specific section from the official documentation, I had to send them a query to provide material evidence. Here’s what the public documentation states about the “user object pages used” section in the output above:
user_object_reserved_page_count – Total number of pages allocated from uniform extents for user objects in the database. Unused pages from an allocated extent are included in the count.
You can use the total_pages column in the sys.allocation_units catalog view to return the reserved page count of each allocation unit in the user object. However, note that the total_pages column includes IAM pages.
The following objects are included in the user object page counters:
User-defined tables and indexes
System tables and indexes
Global temporary tables and indexes
Local temporary tables and indexes
Table variables
Tables returned in the table-valued functions
After that, I shared the query below to help them investigate their TempDB further:
SELECT
    OBJECT_NAME(p.object_id) AS TableName,
    au.*
FROM tempdb.sys.allocation_units AS au
JOIN tempdb.sys.partitions AS p
    ON au.container_id = p.partition_id
JOIN tempdb.sys.objects AS o
    ON p.object_id = o.object_id
WHERE au.type_desc = 'IN_ROW_DATA'; -- Optional: add a condition based on allocation unit type
Here is the output that we received, and it explained the entire scenario to the customer:
The output above indicated that the third-party application connecting to the Azure SQL DB was creating a lot of global temp objects, or objects that seem to persist beyond the session lifetime, inside TempDB. The space occupied by those objects summed to a little over 1 TB, which explains why only about 64 GB of TempDB space was left for the rest of the queries. The customer also suspected that a new module from the third-party vendor could be creating either permanent objects inside TempDB or objects different from the usual temp objects. After the discussion, the customer started a separate conversation with their vendor to address the issue.
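To make the arithmetic concrete (with hypothetical object names and page counts in the spirit of this case, not the customer's actual output), summing the pages of ##-prefixed objects and subtracting from the SLO's quota shows how roughly 1 TB of global temp objects can leave only about 64 GB for everything else:

```python
# Hypothetical (table_name, total_pages) rows from the allocation-units query
rows = [
    ("##vendor_cache_a", 76_300_000),  # hypothetical global temp object
    ("##vendor_cache_b", 76_400_000),  # hypothetical global temp object
    ("#session_scratch", 12_800),      # ordinary session-scoped temp table
]

PAGES_PER_GB = 128 * 1024  # 128 pages per MB, 1024 MB per GB

global_gb = sum(pages for name, pages in rows if name.startswith("##")) / PAGES_PER_GB
quota_gb = 1.2 * 1024      # ~1.2 TB TempDB quota for this SLO

print(round(global_gb), "GB used by global temp objects;",
      round(quota_gb - global_gb), "GB left for everything else")
```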
By the end of the troubleshooting session, however, they had a complete understanding of the issue.
References
SORT_IN_TEMPDB Option For Indexes – SQL Server | Microsoft Learn
tempdb database – SQL Server | Microsoft Learn
Index Disk Space Example – SQL Server | Microsoft Learn
Azure SQL DB and TEMPDB usage tracking – Microsoft Community Hub
Microsoft Tech Community – Latest Blogs
BLDC Motor Control with boost converter
Hi, there is a BLDC Motor Control Simulink model in MATLAB and it works perfectly. It uses a buck converter to regulate the DC voltage. I created a boost converter model to step up my 24 V battery instead of using a constant DC voltage. It works perfectly separate from the BLDC model, but when I add it to the BLDC control model, the duty cycle coming out of the PID block is always 0 and the desired speed input doesn’t change anything. This is the BLDC Motor Control Simulink model with the buck converter:
And this is my boost converter and complete model:
And this is the duty cycle:
Anyone who can help solve this problem? Thank you!
bldc, electric_motor_control, boost converter MATLAB Answers — New Questions
Why is there 100% CPU use by keeping synchronizing certain files in MATLAB Connector?
MATLAB Connector keeps telling me that it’s updating just two files, main and HEAD, but it doesn’t seem to actually update them because those files do not exist in the MATLAB Drive folder.
While trying to update those files, my computer is slow because the Connector takes up 100% of the CPU. If synchronization is paused, CPU utilization drops; if synchronization is not stopped, it stays at 100%.
I uninstalled and reinstalled MATLAB Connector by following the manual, but this issue is not resolved.
https://www.mathworks.com/help/matlab/matlab_env/install-matlab-drive.html
matlabconnector MATLAB Answers — New Questions
How to compute ergodic channel capacity for an 8-element MIMO antenna
How can I compute the ergodic channel capacity for an 8-element MIMO antenna, and also plot frequency vs. channel capacity?
antenna, mimo MATLAB Answers — New Questions
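As a starting point (a generic sketch for an i.i.d. Rayleigh channel, not tied to any particular 8-element antenna design), the ergodic capacity is commonly estimated by Monte Carlo averaging of log2 det(I + (SNR/Nt)·H·H^H) over random channel realizations; a Python version of that loop:

```python
import numpy as np

def ergodic_capacity(nt=8, nr=8, snr_db=10.0, trials=500, seed=0):
    """Monte Carlo estimate of ergodic capacity (bits/s/Hz) for an
    nt x nr i.i.d. Rayleigh-fading MIMO channel, equal power allocation."""
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    caps = []
    for _ in range(trials):
        # complex Gaussian entries with unit average power per entry
        H = (rng.standard_normal((nr, nt)) +
             1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        m = np.eye(nr) + (snr / nt) * (H @ H.conj().T)
        caps.append(np.log2(np.linalg.det(m).real))
    return float(np.mean(caps))

print(ergodic_capacity())
```

To plot capacity versus frequency for a specific 8-element design, you would replace the random H with channel matrices derived from the antenna's simulated or measured response at each frequency point and repeat the average per frequency.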
How to convert displacement data in a time series to the frequency domain, i.e. from amplitude (m) vs. time (s) to amplitude (m) vs. frequency (Hz), using the Fourier transform
Hi,
I have an Excel sheet containing data of the heave response in meters vs. time from 1 to 1000 seconds. This is the heave response of a structure subjected to a regular wave of 3 m wave height. I need to convert it into frequency-domain data, i.e. heave response vs. frequency in Hz.
The attached file contains the time series data (Column A is time and Column B is the response).
Please help me regarding the same.
fft conversion MATLAB Answers — New Questions
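Since the attachment isn't available here, a minimal Python sketch of the usual single-sided amplitude spectrum (a synthetic 0.2 Hz, 1.5 m sine stands in for the heave column; with the real data, read the two columns from the sheet and use the actual sample interval). The same pattern carries over to MATLAB's fft:

```python
import numpy as np

dt = 0.1                                  # sample interval in seconds (assumed; use the real one)
t = np.arange(0, 100, dt)                 # synthetic time vector
x = 1.5 * np.sin(2 * np.pi * 0.2 * t)     # synthetic heave response (m), 0.2 Hz

n = len(x)
amp = 2 * np.abs(np.fft.rfft(x)) / n      # single-sided amplitude spectrum (m)
amp[0] /= 2                               # the DC bin is not doubled
f = np.fft.rfftfreq(n, dt)                # frequency axis (Hz)

print(f[np.argmax(amp)], amp.max())       # dominant frequency and its amplitude
```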
Better support for Crosstab in Excel
This may be a sweeping statement, but most data in Excel is stored as a crosstab and not as a basic table (unless it’s an export from another system).
It seems all LLMs, including Copilot, can only work with a tabular structure and not crosstabs, so unpivoting is a must before you can leverage Copilot.
My wish is that Excel would support the crosstab structure more formally, as it does with Excel Tables. This would allow Copilot to seamlessly read crosstab data without the time-intensive task of using Power Query to unpivot it. It would also radically improve the import of spreadsheets into Power BI!
Thoughts, folks?
Problems with DNS Replication after upgrade
I created a new AD Server 2022 to replace my AD Server 2012. The install completed, and the FSMO roles transferred fine. However, I cannot manage the AD GPOs. Looking at the logs, it appears that DNS cannot replicate.
Password hash synchronization failed for domain: mrc.net, domain controller hostname: MIDSRVR01.mrc.net, domain controller IP address: 172.16.1.43. Details:
Microsoft.Online.PasswordSynchronization.SynchronizationManagerException: Unable to open connection to domain: mrc.net. Error: There was an error establishing a connection to the directory replication service. Domain controller hostname: MIDSRVR01.mrc.net, domain controller IP address: 192.168.99.12 ---> Microsoft.Online.PasswordSynchronization.DirectoryReplicationServices.DrsCommunicationException: There was an error establishing a connection to the directory replication service. Domain controller hostname: MIDSRVR01.mrc.net, domain controller IP address: 192.168.99.12 ---> Microsoft.Online.PasswordSynchronization.DirectoryReplicationServices.DrsException: There was an error creating the connection context. ---> Microsoft.Online.PasswordSynchronization.DirectoryReplicationServices.DrsCommunicationException: RPC Error 1722: The RPC server is unavailable. Error creating the RPC binding handle
The original AD Server 2012 is multi-homed, and it appears DNS is trying to use an IP address on a private segment (192.168.99.12) that isn’t available to the new server. The new server is on segment 172.16.1.x.
If I look at DNS, the server IP addresses appear in this order. How can I make 172.16.1.43 the primary address?
How can I change the IP address to point to the other segment?
Issue with Batch Insertion using PreparedStatement in SQL Server
Hi,
We are experiencing an issue while executing batch insertion using the preparedStatement.executeBatch() method in our Scala code. The problem is that it is inserting object references instead of actual values for VARCHAR columns. Below are the settings and the code snippet we are using:
Settings:
sendStringParametersAsUnicode=false
connection.setUseBulkCopyForBatchInsert(true)
There is no Date type column in the target table. (issue not reproducible with date columns)
Mssql Driver version: Tested with latest driver
mssql-jdbc-12.6.2.jre11
Issue Description:
When using the above method, we notice that for VARCHAR columns, instead of inserting the actual values, object references are being inserted into the table.
Could you please provide guidance on how to resolve this issue or let us know if there are any specific configurations or code adjustments required?
Following is Code Example:
private def executeSqlBatch(query: String, records: List[scala.Array[Any]], conn: Connection): Unit = {
  var connection = conn
  var preparedStatement: PreparedStatement = null
  try {
    preparedStatement = connection.prepareStatement(query)
    records.foreach(record => {
      var idx = 0
      record.foreach { field =>
        idx += 1
        // Each field is bound with setObject; for values the driver cannot map
        // to a JDBC type, this can end up writing the object's string representation.
        preparedStatement.setObject(idx, field)
      }
      preparedStatement.addBatch()
    })
    if (records.nonEmpty) {
      preparedStatement.executeBatch()
    } else {
      preparedStatement.execute()
    }
    if (conn == null && connection != null) {
      connection.commit()
    }
  } catch {
    case ex: Exception =>
      try {
        if (conn == null && connection != null) connection.rollback()
      } catch {
        case rEx: Exception => logger.error("An exception occurred during rollback:", rEx)
      }
      throw ex
  } finally {
    if (preparedStatement != null) { // guard against prepareStatement having thrown
      preparedStatement.clearBatch()
      preparedStatement.clearParameters()
      DbUtils.closeQuietly(preparedStatement)
    }
    if (conn == null) {
      DbUtils.closeQuietly(connection)
    }
  }
}
MVP’s Favorite Content: Copilot+ PC, AI, Identity, Aspire
In this blog series dedicated to Microsoft’s technical articles, we’ll highlight our MVPs’ favorite articles along with their personal insights.
Tomokazu Kizawa, Windows and Devices MVP, Japan
“The Copilot+ PC is a new PC concept that allows AI processing, which was previously done in the cloud, to be performed at the edge (on the PC itself). It requires an NPU capable of performing 40 trillion operations per second, high-speed and large-capacity memory, and fast storage. The Surface Pro 11 and Surface Laptop 7, equipped with Qualcomm’s Snapdragon X series processors, have been announced as high-performance PCs that make the Copilot+ PC a reality.
This article provides an excellent explanation of the new Surface series.”
(In Japanese, Copilot+ PCは、クラウドで処理をしていたAI処理をエッジ(PC)でも実行できるようにした新しいPCのコンセプトです。1秒間に40兆回の処理を行うNPU、高速で大容量なメモリ、高速なストレージを持つことが条件になっています。そして、Copilot+ PCを実現する高性能PCとしてQualcommのSnapdragon Xシリーズプロセッサを搭載した、Surface Pro 11とSurface Laptop 7が発表されました。
この記事は新型Surfaceシリーズをわかりやすく解説した素晴らしい記事です。)
*Relevant Video: 第731回 パソコンが変わる。Surfaceが変える。Copilot+ PC・新型Surface Pro 11とSurface Laptop (2024/5/26) (youtube.com)
Komes Chandavimol, AI MVP, Thailand
“My favorite site is the Microsoft Generative AI Hackathon, where over 1,000 participants compete in creating multimodal applications. The winner of the competition is ChatEDU, an innovative tool designed to transform students’ use of generative AI from mere task and assignment automation into a dynamic copilot that collaborates and learns with them. The runner-up is Garvis, which eliminates the need for users to verbally describe visual problems by directly analyzing and understanding the scene and replaces text-based instructions with intuitive visual demonstrations directly in the user’s environment. Both projects are incredible. You can explore more interesting multimodal applications in the gallery at Microsoft Generative AI Hackathon Project Gallery.”
(In Thai, งาน Microsoft Generative AI Hackathon ล่าสุด มีผู้เข้าแข่งขันหลายพันคน โดยมีหลายๆทีมที่น่าสนใจมาใน AI Hackathon Gallery โดยผู้ชนะ คือ ChatEDU เครื่องมือช่วยนักเรียนใช้ Gen AI ช่วยในการเรียน)
Zheng Xing, Windows Development MVP, China
Use Identity to secure a Web API backend for SPAs | Microsoft Learn
“As the architect of the project, I am responsible for guiding the team in designing and implementing products that comply with the secure development lifecycle. For our project, in a web application that is split between front-end and back-end and will integrate multiple third-party systems in the future, introducing ASP.NET Core Identity to achieve comprehensive identity authentication and authorization is an efficient and reasonable solution. This article provides detailed guidance and a link to an example. I have benefited a lot from it. Meanwhile, through community interactions, I’ve found that not all developers are aware of ASP.NET Core Identity. Therefore, I seize this opportunity to highly recommend it to everyone!”
(In Chinese, 作为项目的架构师,我需要指导团队设计并实现符合安全开发生命周期的产品。对于我们的项目来说,一个前后端分离,且未来会集成多个三方系统的Web Application中,通过引入ASP.NET Core Identity来实现完善的身份认证和授权,是一个高效合理的方案。这篇文章给出了详细的指导,并提供了示例的链接。我受益良多,同时在社区的交流中我发现,并不是所有的开发者都知道ASP.NET Core Identity,借此机会强烈推荐给大家!)
Tomomitsu Kusaba, Developer Technologies MVP, Japan
“.NET Aspire has reached General Availability (GA). I believe this will have a significant impact on creating cloud-native applications.”
(In Japanese, .NET AspireがGAしました。これはクラウドネイティブアプリケーションを作成する上で大きなインパクトを与える事項になると考えています。)
Single-region deployment without Global Reach, using Secure Virtual WAN Hub with Routing-Intent
This article describes the best practices for connectivity and traffic flows with single-region Azure VMware Solution when using Azure Secure Virtual WAN with Routing Intent. You learn the design details of using Secure Virtual WAN with Routing-Intent without Global Reach. This article breaks down Virtual WAN with Routing Intent topology from the perspective of an Azure VMware Solution private cloud, on-premises sites, and Azure native. The implementation and configuration of Secure Virtual WAN with Routing Intent are beyond the scope and aren’t discussed in this document.
In regions without Global Reach support or with a security requirement to inspect traffic between Azure VMware Solution and on-premises at the hub firewall, a support ticket must be opened to enable ExpressRoute to ExpressRoute transitivity. ExpressRoute to ExpressRoute transitivity isn’t supported by default with Virtual WAN. – see Transit connectivity between ExpressRoute circuits with routing intent
Secure Virtual WAN with Routing Intent is only supported with the Virtual WAN Standard SKU. It provides the capability to send all Internet traffic and private network traffic (RFC 1918) to a security solution like Azure Firewall, a third-party Network Virtual Appliance (NVA), or a SaaS solution. In this scenario, we have a network topology that spans a single region. There’s one Virtual WAN with a single hub located in the region. The hub has its own instance of Azure Firewall deployed, essentially making it a Secure Virtual WAN Hub. Having a Secure Virtual WAN hub is a technical prerequisite to Routing Intent. The Secure Virtual WAN Hub has Routing Intent enabled.
The single region also has an Azure VMware Solution Private Cloud and an Azure Virtual Network. There’s also an on-premises site connecting to the region, which we review in more detail later in this document.
Note
If you’re using non-RFC1918 prefixes in your connected on-premises, Virtual Networks or Azure VMware Solution, make sure you have specified those prefixes in the “Private Traffic Prefixes” text box for Routing Intent. Keep in mind that you should always enter summarized routes only in the “Private Traffic Prefixes” section to cover your range. Do not input the exact range that is being advertised to Virtual WAN as this can lead to routing issues. For example, if the ExpressRoute Circuit is advertising 40.0.0.0/24 from on-premises, put a /23 CIDR range or larger in the Private Traffic Prefix text box (example: 40.0.0.0/23). – see Configure routing intent and policies through Virtual WAN portal
Note
When configuring Azure VMware Solution with Secure Virtual WAN Hubs, ensure optimal routing results on the hub by setting the Hub Routing Preference option to “AS Path.” – see Virtual hub routing preference
Understanding Topology Connectivity
Connection | Description
Connections (D) | Azure VMware Solution private cloud managed ExpressRoute connection to the hub.
Connections (E) | On-premises ExpressRoute connection to the hub.
The following sections cover traffic flows and connectivity for Azure VMware Solution, on-premises, Azure Virtual Networks, and the Internet.
This section focuses only on the Azure VMware Solution private cloud. The Azure VMware Solution private cloud has an ExpressRoute connection to the hub (connections labeled as “D”).
With ExpressRoute to ExpressRoute transitivity enabled on the Secure Hub and Routing-Intent enabled, the Secure Hub sends the default RFC 1918 addresses (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16) to Azure VMware Solution over connection “D”. In addition to the default RFC 1918 addresses, Azure VMware Solution learns more specific routes from Azure Virtual Networks and Branch Networks (S2S VPN, P2S VPN, SDWAN) that are connected to the hub. Azure VMware Solution doesn’t learn specific routes from on-premises networks. For routing traffic back to on-premises networks, it uses the default RFC 1918 addresses that it learned from connection “D”. This traffic transits through the Hub firewall, as shown in the diagram. The Hub firewall has the specific routes for on-premises networks and routes traffic toward the destination over connection “E”. Traffic from Azure VMware Solution, heading towards Virtual Networks, will transit the Hub firewall. For more information, see the traffic flow section.
The diagram illustrates traffic flows from the perspective of the Azure VMware Solution Private Cloud.
Traffic Flow Chart
Traffic Flow Number | Source | Direction | Destination | Traffic Inspected on Secure Virtual WAN Hub firewall?
1 | Azure VMware Solution Cloud | → | Virtual Network | Yes, traffic is inspected at the Hub firewall
2 | Azure VMware Solution Cloud | → | On-premises | Yes, traffic is inspected at the Hub firewall
This section focuses only on the on-premises site. As shown in the diagram, the on-premises site has an ExpressRoute connection to the hub (connection labeled as “E”).
With ExpressRoute to ExpressRoute transitivity enabled on the Secure Hub and Routing Intent enabled, the Secure Hub sends the default RFC 1918 addresses (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16) to on-premises over connection “E”. In addition to the default RFC 1918 addresses, on-premises learns more specific routes from Azure Virtual Networks and Branch Networks (S2S VPN, P2S VPN, SDWAN) that are connected to the hub. On-premises doesn’t learn specific routes from Azure VMware Solution networks. For routing traffic back to Azure VMware Solution networks, it uses the default RFC 1918 addresses that it learned from connection “E”. This traffic transits through the Hub firewall, as shown in the diagram. The Hub firewall has the specific routes for Azure VMware Solution networks and routes traffic toward the destination over connection “D”. Traffic from on-premises heading toward Virtual Networks transits the Hub firewall. For more information, see the traffic flow section.
As mentioned earlier, when you enable ExpressRoute to ExpressRoute transitivity on the Hub, it sends the default RFC 1918 addresses (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16) to your on-premises network. Therefore, you shouldn’t advertise the exact RFC 1918 prefixes (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16) back to Azure; advertising the exact same routes creates routing problems within Azure. Instead, advertise more specific routes for your on-premises networks back to Azure.
Note
If you’re currently advertising the default RFC 1918 addresses from on-premises to Azure and wish to continue this practice, you need to split each RFC 1918 range into two equal sub-ranges and advertise these sub-ranges back to Azure. The sub-ranges are 10.0.0.0/9, 10.128.0.0/9, 172.16.0.0/13, 172.24.0.0/13, 192.168.0.0/17, and 192.168.128.0/17.
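The split described in the note is just each RFC 1918 supernet divided into its two equal halves, which Python’s `ipaddress` module can generate directly. A quick sketch to verify the sub-ranges:

```python
import ipaddress

# Split each RFC 1918 supernet into its two equal halves; advertising these
# halves from on-premises avoids clashing with the exact supernets that the
# Secure Hub itself advertises.
supernets = ["10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16"]
halves = []
for prefix in supernets:
    halves.extend(ipaddress.ip_network(prefix).subnets(prefixlen_diff=1))

print([str(n) for n in halves])
# ['10.0.0.0/9', '10.128.0.0/9', '172.16.0.0/13', '172.24.0.0/13',
#  '192.168.0.0/17', '192.168.128.0/17']
```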
The diagram illustrates traffic flows from the perspective of on-premises.
Traffic Flow Chart

| Traffic Flow Number | Source | Direction | Destination | Traffic Inspected on Secure Virtual WAN Hub firewall? |
| --- | --- | --- | --- | --- |
| 3 | on-premises | → | Azure VMware Solution Cloud | Yes, traffic is inspected at the Hub firewall |
| 4 | on-premises | → | Virtual Network | Yes, traffic is inspected at the Hub firewall |
This section focuses only on connectivity from an Azure Virtual Network perspective. As depicted in the diagram, the Virtual Network has a Virtual Network peering directly to the hub.
The diagram illustrates how all Azure native resources in the Virtual Network learn routes under their “Effective Routes”. A Secure Hub with Routing Intent enabled always sends the default RFC 1918 addresses (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16) to peered Virtual Networks. Azure native resources in the Virtual Network don’t learn specific routes from outside their Virtual Network; with Routing Intent enabled, they hold the default RFC 1918 addresses and use the hub firewall as the next hop. All traffic ingressing and egressing the Virtual Networks transits the Hub firewall. For more information, see the traffic flow section.
The diagram illustrates traffic flows from an Azure Virtual Network perspective.
Traffic Flow Chart

| Traffic Flow Number | Source | Direction | Destination | Traffic Inspected on Secure Virtual WAN Hub firewall? |
| --- | --- | --- | --- | --- |
| 5 | Virtual Network | → | Azure VMware Solution Cloud | Yes, traffic is inspected at the Hub firewall |
| 6 | Virtual Network | → | on-premises | Yes, traffic is inspected at the Hub firewall |
This section focuses only on how internet connectivity is provided for Azure native resources in Virtual Networks and Azure VMware Solution Private Clouds in a single region. There are several options for providing internet connectivity to Azure VMware Solution; see Internet Access Concepts for Azure VMware Solution.
Option 1: Internet Service hosted in Azure
Option 2: Azure VMware Solution Managed SNAT
Option 3: Azure Public IPv4 address to NSX-T Data Center Edge
Although you can use all three options with single-region Secure Virtual WAN with Routing Intent, “Option 1: Internet Service hosted in Azure” is the best fit when using Secure Virtual WAN with Routing Intent, and it’s the option used to provide internet connectivity in this scenario, owing to its ease of security inspection, deployment, and manageability.
As mentioned earlier, when you enable Routing Intent on the Secure Hub, it advertises RFC 1918 to all peered Virtual Networks. However, you can also advertise a default route (0.0.0.0/0) for internet connectivity to downstream resources. With Routing Intent, you can choose to generate a default route from the hub firewall. This default route is advertised to your Virtual Network and to Azure VMware Solution. The discussion below is split into two parts: internet connectivity from the Azure VMware Solution perspective and from the Virtual Network perspective.
When Routing Intent is enabled for internet traffic, the default behavior of the Secure Virtual WAN Hub is to not advertise the default route across ExpressRoute circuits. To ensure the default route is propagated to the Azure VMware Solution from the Azure Virtual WAN, you must enable default route propagation on your Azure VMware Solution ExpressRoute circuits – see To advertise default route 0.0.0.0/0 to endpoints. Once changes are complete, the default route 0.0.0.0/0 is then advertised via connection “D” from the hub. It’s important to note that this setting shouldn’t be enabled for on-premises ExpressRoute circuits. As a best practice, it’s recommended to implement a BGP Filter on your on-premises equipment. A BGP Filter in place prevents the inadvertent learning of the default route, adds an extra layer of precaution, and ensures that on-premises internet connectivity isn’t impacted.
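On real on-premises equipment the recommended safeguard would be a vendor-specific inbound BGP prefix filter. As a language-neutral sketch of the intent, assuming a hypothetical set of received prefixes, a filter that rejects only the default route might look like:

```python
import ipaddress

def accept_inbound(prefix):
    """Hypothetical inbound BGP filter: reject only the default route so that
    Azure never inadvertently becomes the on-premises internet exit."""
    return ipaddress.ip_network(prefix).prefixlen > 0  # drops 0.0.0.0/0

# Routes hypothetically received from the hub over connection "E".
received = ["0.0.0.0/0", "10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16"]
accepted = [p for p in received if accept_inbound(p)]
print(accepted)  # ['10.0.0.0/8', '172.16.0.0/12', '192.168.0.0/16']
```

The RFC 1918 routes still arrive as expected; only an accidentally advertised default route is filtered out, matching the precaution described above.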
When Routing Intent for internet access is enabled, the default route generated from the Secure Virtual WAN Hub is automatically advertised to the hub-peered Virtual Network connections. You’ll notice under Effective Routes for the Virtual Machines’ NICs in the Virtual Network that the 0.0.0.0/0 next hop is the hub firewall.
For more information, see the traffic flow section.
The diagram illustrates traffic flows from a Virtual Network and Azure VMware Solution perspective.
Traffic Flow Chart

| Traffic Flow Number | Source | Direction | Destination | Traffic Inspected on Secure Virtual WAN Hub firewall? |
| --- | --- | --- | --- | --- |
| 7 | Virtual Network | → | Internet | Yes, traffic is inspected at the Hub firewall |
| 8 | Azure VMware Solution Cloud | → | Internet | Yes, traffic is inspected at the Hub firewall |
HCX Mobility Optimized Networking (MON) is an optional feature to enable when using HCX Network Extensions (NE). Mobility Optimized Networking (MON) provides optimal traffic routing under certain scenarios to prevent network tromboning between the on-premises-based and cloud-based resources on extended networks.
Enabling Mobility Optimized Networking (MON) for a specific extended network and virtual machine changes the traffic flow. With MON in place, egress traffic from that virtual machine no longer trombones back to on-premises; instead, it bypasses the Network Extensions (NE) IPsec tunnel and egresses through the Azure VMware Solution NSX-T Tier-1 Gateway > NSX-T Tier-0 Gateway > Azure Virtual WAN.
Enabling Mobility Optimized Networking (MON) for a specific extended network and virtual machine also changes routing: Azure VMware Solution NSX-T injects a /32 host route for that virtual machine back into Azure Virtual WAN. Azure Virtual WAN advertises this /32 route to on-premises, Virtual Networks, and Branch Networks (S2S VPN, P2S VPN, SDWAN). The purpose of this /32 host route is to ensure that traffic from on-premises, Virtual Networks, and Branch Networks destined for the MON-enabled virtual machine doesn’t use the Network Extensions (NE) IPsec tunnel; because of the learned /32 route, that traffic is directed straight to the virtual machine.
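The effect of the injected /32 is again longest-prefix match: the host route is more specific than the extended segment’s prefix, so it wins for the MON-enabled virtual machine while other VMs on the same segment are unaffected. A small sketch with illustrative prefixes and next-hop labels:

```python
import ipaddress

# Hypothetical extended segment and MON host route. Before MON, the whole
# segment is reached over the NE IPsec tunnel; after MON, NSX-T injects a
# /32 for the enabled VM that wins by longest-prefix match.
routes = {
    ipaddress.ip_network("192.168.50.0/24"): "ne-ipsec-tunnel",
    ipaddress.ip_network("192.168.50.10/32"): "avs-tier0-via-vwan",  # MON host route
}

def next_hop(dst):
    ip = ipaddress.ip_address(dst)
    best = max((n for n in routes if ip in n), key=lambda n: n.prefixlen)
    return routes[best]

print(next_hop("192.168.50.10"))  # avs-tier0-via-vwan (MON-enabled VM)
print(next_hop("192.168.50.20"))  # ne-ipsec-tunnel (other VMs unchanged)
```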
With ExpressRoute to ExpressRoute transitivity enabled on the Secure Hub and Routing Intent enabled, the Secure Hub sends the default RFC 1918 addresses (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16) to both on-premises and Azure VMware Solution. In addition to the default RFC 1918 addresses, both on-premises and Azure VMware Solution learn more specific routes from Azure Virtual Networks and Branch Networks (S2S VPN, P2S VPN, SDWAN) that are connected to the hub. However, on-premises networks don’t learn any specific routes from Azure VMware Solution, nor does the reverse occur; instead, both environments rely on the default RFC 1918 addresses to route back to one another via the Hub firewall. This means that more specific routes, such as HCX Mobility Optimized Networking (MON) host routes, aren’t advertised from the Azure VMware Solution ExpressRoute circuit to the on-premises ExpressRoute circuit, and vice versa. This inability to exchange specific routes introduces asymmetric traffic flows: traffic egresses Azure VMware Solution via the NSX-T Tier-0 Gateway, but return traffic from on-premises comes back over the Network Extensions (NE) IPsec tunnel.
To correct this traffic asymmetry, you need to adjust the HCX Mobility Optimized Networking (MON) policy routes. MON policy routes determine which traffic goes back to the on-premises gateway via the L2 extension and which traffic is routed through the Azure VMware Solution NSX-T Tier-0 Gateway.
If a destination IP matches a MON policy entry set to “allow”, the packet is sent to the on-premises gateway through the HCX Network Extension appliance.
If a destination IP doesn’t match any entry, or matches an entry set to “deny”, the packet is sent to the Azure VMware Solution Tier-0 Gateway for routing.
HCX Policy Routes

| Network | Redirect to Peer | Note |
| --- | --- | --- |
| Azure Virtual Network Address Space | Deny | Be sure to explicitly include the address ranges for all of your Virtual Networks. Traffic destined for Azure egresses via the Azure VMware Solution and doesn’t return to the on-premises network. |
| Default RFC 1918 Address Spaces | Allow | Add the default RFC 1918 addresses 10.0.0.0/8, 172.16.0.0/12, and 192.168.0.0/16. This ensures that any traffic not matching the entries above is routed back to the on-premises network. If your on-premises setup uses addresses outside RFC 1918, you must explicitly include those ranges. |
| 0.0.0.0/0 | Deny | Any traffic that doesn’t match the entries above, including addresses not covered by RFC 1918 (such as internet-routable IPs), egresses directly through the Azure VMware Solution and isn’t redirected back to the on-premises network. |
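The policy table can be modeled as a lookup over (network, allow/deny) entries, where allow sends traffic to the on-premises gateway through the NE appliance and deny (or no match) routes it via the AVS Tier-0. A sketch assuming most-specific-match semantics and an illustrative Virtual Network range (10.100.0.0/16 is hypothetical):

```python
import ipaddress

# Hypothetical MON policy-route table mirroring the rows above.
# Each entry: (network, allow) — allow=True redirects to the on-premises
# gateway via the NE appliance; allow=False routes via the AVS Tier-0.
policy = [
    (ipaddress.ip_network("10.100.0.0/16"), False),  # example VNet space: Deny
    (ipaddress.ip_network("10.0.0.0/8"), True),      # RFC 1918: Allow
    (ipaddress.ip_network("172.16.0.0/12"), True),
    (ipaddress.ip_network("192.168.0.0/16"), True),
    (ipaddress.ip_network("0.0.0.0/0"), False),      # everything else: Deny
]

def egress_path(dst):
    ip = ipaddress.ip_address(dst)
    matches = [(net, allow) for net, allow in policy if ip in net]
    # Most specific matching entry decides the action.
    net, allow = max(matches, key=lambda m: m[0].prefixlen)
    return "on-prem-gateway-via-NE" if allow else "avs-tier0"

print(egress_path("10.100.1.5"))  # avs-tier0 (VNet traffic stays in Azure)
print(egress_path("10.20.0.7"))   # on-prem-gateway-via-NE
print(egress_path("8.8.8.8"))     # avs-tier0 (internet egress)
```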
For more information on Virtual WAN hub configuration, see About virtual hub settings.
For more information on how to configure Azure Firewall in a Virtual Hub, see Configure Azure Firewall in a Virtual WAN hub.
For more information on how to configure the Palo Alto Next Generation SAAS firewall on Virtual WAN, see Configure Palo Alto Networks Cloud NGFW in Virtual WAN.
For more information on Virtual WAN hub routing intent configuration, see Configure routing intent and policies through Virtual WAN portal.
Microsoft Tech Community – Latest Blogs
Why does my GPU not outperform my CPU/another GPU? Troubleshooting Steps
Why does my GPU not outperform my CPU / another GPU?
Here are some troubleshooting steps for understanding factors that affect performance.
MATLAB Answers — New Questions
What does v51 or v37 or v46 etc. mean in license manager?
I am trying to make a tool to help my company organize our MATLAB license use. Currently I am using the command "lmutil lmstat -a -c <license server>" and parsing the string output. The command prints out each type of license we have, such as MATLAB, SIMULINK, etc., then within each license section it prints information about the license for a certain block of users in a nodelocked or floating license. I’m guessing this is grouped by location, like IP addresses of the same location or something similar. Within this information, right next to the license type it prints v51 or v46 or some two-digit number. For example, the section starts with MATLAB v51. Then below that section it prints information about each user currently using that license. Each user line also prints out a vXY, which can be the same as or different from the corresponding license above. For example, MATLAB v51 could have a user with v37. I’m just wondering what this v number means. Does it correspond to a MATLAB version somehow? If not, is there other information in the output that would tell me a user’s version?
MATLAB Answers — New Questions
Clearly Identifying circular regions on a chip in a noisy environment
Hey everyone
As the summary suggests, I have been working with chip images in hopes of clearly identifying the circles via pre-processing so that I can binarize the image and use regionprops on them afterwards. I haven’t had much success and any help would be much appreciated. I have shared some photos that I am working with that should help!
My current algorithm is very slow but also not very good at identification.
MATLAB Answers — New Questions
How can I use a custom board with the Zynq workflow provided by MATLAB/Simulink?
MathWorks offers an integrated workflow for targeting the Zynq platform using HDL Coder and Embedded Coder.
https://www.mathworks.com/help/hdlcoder/ug/getting-started-with-hardware-software-codesign-workflow-for-xilinx-zynq-platform.html
The currently supported boards are Zedboard, ZC702, ZC706, ZCU102.
However, I am using a board based on Zynq that is not supported by MathWorks (e.g. MicroZed, PicoZed, Arty), or a completely custom-made board. How can I use a custom board with the Zynq workflow provided by MATLAB/Simulink?
MATLAB Answers — New Questions
How to fix QuickBooks Error 12007 after update?
How can I resolve QuickBooks Error 12007? I encounter this error during software updates, and it seems to be related to network issues. What are the steps to troubleshoot and fix this problem?
autofill column only when data are added
I have a column (Column C) that calculates a formula based on the entry in another column (Column B). The spreadsheet will be used by other users, and the issue is that I do not know how many entries any particular user will have. They may have 20, or 200, or 2000. I do not want to autofill the entire column with the formula because (a) it slows everything down, (b) it is ugly/not user-friendly (displaying #DIV/0! until the column it depends on is filled), and (c) it can actually end up affecting the results. I’ve seen another sheet that does the same thing. For example, there are 22 entries right now. Cells B24 and C24 are currently blank – no formula is showing for cell C24. If I add a number to B24, then C24 auto-populates the formula and calculates it. I cannot figure out how to make my spreadsheet do that, despite much searching and exploration. I’ve tried various fill options, an array function, and a table, and I haven’t yet found a solution that works. Any help is appreciated!