Month: September 2024
Regarding the C2000 blockset, how would I have one SPI block transmit right after another SPI block?
Hello all,
I need to send 32 bits of data to an external DAC via a microcontroller. The SPI blocks in the C2000 blockset can only send a maximum of 16 bits. How would I go about sending 16 bits of data right after another 16 bits without CS going high in between?
Thank you in advance.
spi, c2000, code generation, simulink MATLAB Answers — New Questions
Azure Translator accessing a private storage account
I’m currently facing an issue with the Translator service and a storage account. As configured, the Translator service has the Storage Blob Data Contributor role, and the storage account has public network access disabled, with some private endpoints configured.
I’ve regenerated the SAS token, but the translation status is not as expected: “Cannot access source document location with the current permissions”.
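Independent of the network configuration, one quick local sanity check is to confirm that the SAS token itself has not expired and grants the permissions Document Translation needs on the source container (read and list). A small sketch of that check follows; the URL is a placeholder, and the function name is my own:

```python
from urllib.parse import urlparse, parse_qs
from datetime import datetime, timezone

def check_sas(url, required_perms="rl"):
    """Return a list of problems with a blob-container SAS URL.

    Checks only what is visible in the token itself: the expiry time
    ('se') and the granted permissions ('sp'). An empty list means the
    token *looks* usable; it says nothing about network/firewall rules.
    """
    params = parse_qs(urlparse(url).query)
    problems = []

    expiry = params.get("se", [None])[0]
    if expiry is None:
        problems.append("no 'se' (expiry) parameter - not a SAS URL?")
    else:
        # SAS expiry times are ISO 8601, e.g. 2024-09-30T12:00:00Z
        exp = datetime.fromisoformat(expiry.replace("Z", "+00:00"))
        if exp <= datetime.now(timezone.utc):
            problems.append(f"token expired at {expiry}")

    perms = params.get("sp", [""])[0]
    missing = [p for p in required_perms if p not in perms]
    if missing:
        problems.append(f"missing permissions: {''.join(missing)}")

    return problems

# Placeholder source-container URL with an already-expired, read-only token:
url = ("https://myaccount.blob.core.windows.net/source"
       "?sp=r&st=2024-09-01T00:00:00Z&se=2024-09-02T00:00:00Z&sig=abc")
print(check_sas(url))  # reports both the expired token and the missing 'l'
```

If the token looks fine, the failure is more likely the network path: with public network access disabled, the Translator resource generally needs a permitted route to the account (for example trusted-service access with a managed identity), and a valid SAS alone may not be enough.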
{
  "id": "809520ea-e237-4708-a247-a9ac3add53a6",
  "createdDateTimeUtc": "2024-09-30T04:32:07.5447751Z",
  "lastActionDateTimeUtc": "2024-09-30T04:32:07.8266224Z",
  "status": "ValidationFailed",
  "error": {
    "code": "InvalidRequest",
    "message": "Cannot access source document location with the current permissions.",
    "target": "Operation",
    "innerError": {
      "code": "InvalidDocumentAccessLevel",
      "message": "Cannot access source document location with the current permissions."
    }
  },
  "summary": {
    "total": 0, "failed": 0, "success": 0, "inProgress": 0,
    "notYetStarted": 0, "cancelled": 0, "totalCharacterCharged": 0
  }
}
Read More
Peer To Peer Replication
I want to ask about peer-to-peer replication. I created peer-to-peer replication using two nodes that can both pull and push subscriptions (multi-master). It works until I update, insert, or delete data. The problem is the conflict-detection cases: in an update/delete case, node 1 updates a row while node 2 deletes it, which produces a conflict. When we check the Conflict Detection Viewer, the conflict is still there and not resolved.
I have already enabled conflict detection and set the originator_id on node 1. Do you have any solution for this? In my opinion, in this multi-master case it should be possible to keep the parent data from node 1 (the originator_id), not from node 2, since the data on node 1 is now not in accordance with node 2.
Maybe you could help with my problem.
Thanks
Read More
Making Searching and Curating Data Assets in Microsoft Purview easier.
1. Introduction.
Currently, IT infrastructure stores and maintains data assets, even though IT doesn’t own or use the data.
There’s a disconnect between how data needs to be discovered and maintained within the business, and the teams that maintain it.
Without standardized procedures for data governance, data handling often relies on manual processes, leading to inefficiencies, data loss, insufficient data protection and higher operational costs.
Microsoft Purview is designed to help enterprises get the most value from their existing information assets.
The catalog makes data sources easily discoverable and understandable by the users who manage the data.
With Purview, organizations can gain insights into data lineage, data usage, and data connections, helping them to comply with regulations such as GDPR, CCPA, and others.
Microsoft Purview provides a cloud-based service into which you can register data sources. During registration, the data remains in its existing location, but a copy of its metadata is added to Microsoft Purview, along with a reference to the data source location.
After you register a data source, you can scan it and enrich its metadata.
Discovering and understanding data sources and their use is the primary purpose of registering the sources.
In this article, we describe smart features that allow you to search previously scanned data assets using natural language queries, along with automated metadata enrichment for curating these assets.
2. Data Catalog.
The Data Catalog is a Data Governance solution that enables business experts and technical data owners to collaborate and contribute to a shared understanding of data.
Among other functionalities, in the Data Catalog you can use data search to discover data assets from multiple data sources, including Microsoft Fabric items and workspaces. See: Exploring the Relationship Between Microsoft Fabric and Microsoft Purview: What You Need to Know.
Microsoft Purview’s Smart Data Searching primarily works with scanned data assets. For unscanned data assets, manual classification and tagging can be done, but they may not fully leverage the capabilities of Smart Data Searching. For real-time or “Live View” data, you would typically need to scan the data source first to make it searchable in Microsoft Purview.
2.1 Smart Data Search.
Once the metadata is ingested into the Microsoft Purview Data Map, it can be searched using Microsoft Purview’s Smart Data Searching in the Data Catalog.
Now you can use natural language descriptions to search for data assets in the Microsoft Purview Catalog.
Go to the new Microsoft Purview Portal: https://purview.microsoft.com
Select the Data Catalog solution and then, Data Search.
Once you enter your search, Microsoft Purview returns a list of data assets and glossary terms that match the entered keywords, provided the user has data reader permissions for them.
In the example below, the search phrase was “I want to know the data related with diseases in Latam”.
You should know that the search returns all data assets in the collection(s) that best match the query. If a collection contains data assets that match the phrase, all scanned items are returned, even if some items do not match exactly.
The correspondence between search results and desired results depends on your Data Map design, the registered data sources, and the scope applied in scanning, which helps narrow down the most common searches in your business.
See, for example, multiregional and business-concept designs for your Data Map.
The Microsoft Purview relevance engine sorts through all the matches and ranks them based on what it believes their usefulness is to a user.
Many factors determine an asset’s relevance score, and the Microsoft Purview search team is constantly tuning the relevance engine to ensure the top search results have value to you.
2.2 Browse by applying filters.
Once you have entered the search phrase and waited a few seconds, a Filters pane appears where you can apply the following filtering criteria:
Asset Type
Data Source Type
Collection
Classification
Contact
Endorsement
Assigned Term
Sensitivity Label
Rating
The next figure shows the Asset Type filter:
Filtering by “Data”, you can refine your search by selecting one or more data asset types according to your preferred data source:
The next figure shows the assets of type “Report”:
Then select any filter category you would like to narrow your results by, and select any values you would like to narrow results to. For some filters, you can select the ellipses to choose between an AND condition and an OR condition, as the next figure shows:
2.3 Curation process.
The process of contributing to the catalog by tagging, documenting, and annotating data sources that have already been registered is known as metadata curation [Metadata curation in Microsoft Purview | Microsoft Learn].
The curation process is facilitated by selecting one or more data assets returned in the search that are assumed to be the curator’s responsibility.
For example, in the next figure we show two selected data assets:
By clicking on “View selected,” you can access a screen to start adding attributes to the data asset’s metadata.
Click on “Bulk edit”:
Selecting an attribute, you can add new values, replace an existing value with another one or remove values.
You can add as many attributes as you need.
Depending on the attribute, you can manage the proper values.
Purview’s Copilot can help enrich metadata by suggesting additional context, classifications, and annotations based on a data asset’s content and usage. It can also help ensure consistency in metadata definitions and standards across the organization, reducing discrepancies and improving data quality.
Selecting “Suggestions”, you can observe many derived suggestions based on your business concepts.
You should know that the AI models use general internet knowledge, so they will not return company-specific or custom definitions or terms. All terms should be stewarded before being published, to ensure that each term and definition aligns with company use and the specific knowledge about the term. See: Microsoft Purview Data Catalog Responsible AI FAQ (Preview) | Microsoft Learn
Any terms and definitions provided via suggestions should be reviewed and aligned with the company’s specific language standards. When a term is selected and created, it will be in draft status, allowing the steward to complete and finalize the term before deciding to publish it.
3. Conclusions.
Smart data searching and automated metadata enrichment significantly enhance the cataloging process, making it more efficient, comprehensive, and insightful.
These advanced capabilities not only improve data discoverability and governance but also empower users with richer, more contextualized information, leading to more informed and effective decision-making.
Learn more:
Microsoft Purview collections architecture and best practices | Microsoft Learn
Scans and ingestion | Microsoft Learn
How to search the Data Catalog | Microsoft Learn
Discover data with natural language search – YouTube
Metadata curation in Microsoft Purview | Microsoft Learn
Best practices for describing data in Microsoft Purview | Microsoft Learn
How to create and manage glossary terms (Preview) | Microsoft Learn
Curate your data with Business Concepts (youtube.com)
How to configure and manage data catalog access policies (Preview) | Microsoft Learn
Microsoft Tech Community – Latest Blogs – Read More
Using Runge Kutta to solve second-order differential equations with two degrees of freedom
Hi everyone, I am a beginner in MATLAB programming. I have been trying to solve the differential equation system given below, but I am unable to establish a workflow. Any assistance or related work would be greatly appreciated. Thanks in advance.
In the following system of differential equations, x and xb are structural responses, omega is the frequency ratio, tau is time, and the other parameters are constants. The requirement is to provide the relationship between the structural response of the system and the frequency ratio omega.
MATLAB program:
function [T,X,dX] = ODE_RK4( Hfun,t,h,x0 )
if nargin < 4
    error('The initial value must be given');
end
if ischar(Hfun)            % accept a function name as well as a handle
    Hfun = str2func(Hfun);
end
n = length(t);
if n == 1
    T = 0:h:t;
elseif n == 2
    T = t(1):h:t(2);
else
    T = t;
end
T = T(:);
N = length(T);
x0 = x0(:);
x0 = x0';
m = length(x0);
X = zeros(N,m);
dX = zeros(N,m);
X(1,:) = x0;
for k = 2:N
    h = T(k) - T(k-1);
    K1 = Hfun( T(k-1)     , X(k-1,:)' );
    K2 = Hfun( T(k-1)+h/2 , X(k-1,:)' + h*K1/2 );
    K3 = Hfun( T(k-1)+h/2 , X(k-1,:)' + h*K2/2 );
    K4 = Hfun( T(k-1)+h   , X(k-1,:)' + h*K3 );
    X(k,:) = X(k-1,:)' + (h/6) * ( K1 + 2*K2 + 2*K3 + K4 );
    dX(k-1,:) = (1/6) * ( K1 + 2*K2 + 2*K3 + K4 );
end
dX(N,:) = Hfun( T(N), X(N,:)' );   % transpose so the state is a column, as in the loop
if nargout == 0
    plot(T,X)
end
end

% --- Driver script (save separately from ODE_RK4.m) ---
clc;
clear;
lambda_b = 0.01;
lambda_ns = -0.01;
lambda = 0.5;
zeta_b = 0.025;
chi = 0.0001;
alpha = 0.2;
zeta = 0.05;
z = 0.06;
omega_values = linspace(0.05, 1, 1000);
tau = linspace(0, 2*pi, 1000);
dt = tau(2) - tau(1);
X0 = [0; 0; 0; 0]; % Initial value
x_max = zeros(1, length(omega_values));
for i = 1:length(omega_values)
    omega = omega_values(i);
    f = @(t, X) [X(2);
        -(2*zeta*omega*X(2) + 4*(2*alpha^2*X(1)^3*lambda + 1/sqrt(alpha)) + lambda_b*(X(1) - X(3)) + z*cos(t)) / omega^2;
        X(4);
        -1/(chi*omega^2)*(2*zeta_b*omega*X(4) + lambda_ns*X(3) - lambda_b*(X(1) - X(3)))];
    [T, X] = ODE_RK4(f, tau, dt, X0);
    x_max(i) = max(abs(X(:,1))); % peak of the first state over time; X(i,:) indexed the wrong dimension
end
figure;
plot(omega_values, x_max);
xlabel('omega');
ylabel('x_{max}');
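As a cross-check of the stepper itself (separately from the structural model), the same classical RK4 update can be written in a few lines and verified against a problem with a known solution. A sketch in Python (the function name is my own):

```python
import math

def rk4(f, t0, t1, h, y0):
    """Classical 4th-order Runge-Kutta with a fixed step h.

    Mirrors the update used in ODE_RK4:
      y_{k+1} = y_k + (h/6) * (K1 + 2*K2 + 2*K3 + K4)
    Returns the list of (t, y) samples.
    """
    n = int(round((t1 - t0) / h))
    t, y = t0, y0
    out = [(t, y)]
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y = y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
        out.append((t, y))
    return out

# dy/dt = y, y(0) = 1  =>  exact solution y(1) = e
samples = rk4(lambda t, y: y, 0.0, 1.0, 0.01, 1.0)
print(samples[-1][1])  # very close to 2.718281828 (global error is O(h^4))
```

With h = 0.01 the endpoint agrees with e to many digits, which confirms the update formula itself; so if the MATLAB sweep looks wrong, the issue is more likely in the model function or in how `x_max` indexes `X` than in the RK4 scheme.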
runge-kutta, matlab MATLAB Answers — New Questions
Getting ‘default’ theme even though Teams is in dark mode.
I have been getting the ‘default’ theme even though my Teams is in dark mode, and I’m not sure why.
Read More
Comprehensive Nvidia GPU Monitoring for Azure N-Series VMs Using Telegraf with Azure Monitor
In today’s AI and HPC landscapes, GPU monitoring has become essential due to the complexity and high resource demands of these workloads. Effective monitoring ensures that GPUs are utilized optimally, preventing both underutilization and overutilization, which can negatively impact performance and drive up costs. By identifying bottlenecks such as memory limitations or thermal throttling, GPU monitoring allows for performance optimization, enabling smoother workflows. In cloud environments like Azure, where GPU resources can be costly, monitoring plays a key role in managing expenses by tracking usage patterns and facilitating efficient resource allocation. Additionally, monitoring helps with capacity planning, scaling workloads, and forecasting, ensuring that resources are properly allocated for future needs.
While Azure Monitor provides robust tools for tracking CPU, memory, storage, and network usage, it does not natively support GPU monitoring for Azure N-series VMs. To track GPU performance, additional configuration through third-party tools or integration such as Telegraf is required. At the time of writing, Azure Monitor lacks built-in GPU metrics without these external solutions.
Telegraf is an open-source, lightweight agent developed by InfluxData, designed to collect, process, and send metrics and event data from various systems, applications, and services. It supports a wide range of input plugins, allowing it to gather data from sources like system stats, databases, and APIs. Telegraf can then output this data to different destinations, such as monitoring platforms like InfluxDB, Azure Monitor, or other time-series databases. Its flexibility and low resource footprint make it ideal for monitoring infrastructure and applications in real-time, especially in cloud environments.
In this blog, we will explore how to configure Telegraf to send GPU monitoring metrics to Azure Monitor. This comprehensive guide will cover all the necessary steps to enable GPU monitoring, ensuring you can track and optimize GPU performance in Azure effectively.
Step 1: Configure Azure so that Telegraf agents on a VM or VMSS can send GPU metrics to Azure Monitor.
1. Register the microsoft.insights resource provider in your Azure subscription. Refer: Resource providers and resource types – Azure Resource Manager | Microsoft Learn
2. Enable Managed Service Identities to authenticate an Azure VM or Azure VMSS. In this example we are using a system-assigned Managed Identity for authentication. You can also use a User-Assigned Managed Identity or a Service Principal to authenticate the VM. Refer: telegraf/plugins/outputs/azure_monitor at release-1.15 · influxdata/telegraf (github.com)
Step 2: Set Up the Telegraf Agent Inside the VM or VMSS to Send Data to Azure Monitor
In this example, I will be using an Azure Standard_ND96asr_v4 VM with the Ubuntu-HPC 2204 image to configure the environment for both VM and VMSS. The Ubuntu-HPC 2204 image comes pre-installed with NVIDIA GPU drivers and CUDA. If you choose to use a different image, make sure to install the necessary GPU drivers and the CUDA toolkit.
Download and execute the `gpumon-setup.sh` script to install the Telegraf agent on Ubuntu 22.04. This script will also configure the NVIDIA SMI input plugin and set up the Telegraf configuration to send data to Azure Monitor.
Run the following commands:
wget -q https://raw.githubusercontent.com/vinil-v/gpu-monitoring/refs/heads/main/scripts/gpumon-setup.sh -O gpumon-setup.sh
chmod +x gpumon-setup.sh
./gpumon-setup.sh
Test the Telegraf configuration by executing the following command:
sudo telegraf --config /etc/telegraf/telegraf.conf --test
Step 3: Creating Dashboards in Azure Monitor to Check NVIDIA GPU Usage
Telegraf includes an output plugin specifically designed for Azure Monitor, enabling users to send custom metrics directly to the platform. Azure Monitor functions with a metric resolution of one minute; thus, the Telegraf output plugin automatically aggregates metrics into one-minute buckets, which are sent to Azure Monitor at each flush interval. Each input plugin’s metrics are recorded in a separate Azure Monitor namespace, defaulting to the prefix “Telegraf/” for easy identification.
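For reference, a minimal telegraf.conf wiring these two plugins together might look like the sketch below (the option values are illustrative assumptions; the setup script may generate something more complete):

```toml
# Collect GPU metrics by querying nvidia-smi
[[inputs.nvidia_smi]]
  # Path to the nvidia-smi binary (shown value is the usual default)
  bin_path = "/usr/bin/nvidia-smi"
  timeout = "5s"

# Send aggregated one-minute metrics to Azure Monitor as custom metrics
[[outputs.azure_monitor]]
  # When running on the VM itself, region and resource_id can usually be
  # left unset and are auto-detected from the Azure Instance Metadata
  # Service. Metrics then appear under the "Telegraf/nvidia_smi" namespace.
  namespace_prefix = "Telegraf/"
```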
To visualize NVIDIA GPU usage, navigate to the Metrics section in Azure portal. Select the VM name as the scope, and then choose the Metric Namespace as `telegraf/nvidia-smi`. From there, you can select various metrics to view NVIDIA GPU utilization. You can also apply filters and splits for a more detailed analysis of the data.
You can create GPU monitoring dashboards for both VM and VMSS. Below are some sample charts to consider.
Bonus: Simulating GPU usage using a sample training program.
If you’re testing and lack a program to simulate GPU usage, I have a solution for you! I’ve created a script that runs a multi-GPU distributed training model. This script will install the Anaconda software and set up the environment needed for executing the distributed training model using TensorFlow. By running this script, you can effectively simulate GPU usage, allowing you to verify the monitoring metrics you’ve set up.
To get started, run the following commands:
wget -q https://raw.githubusercontent.com/vinil-v/gpu-monitoring/refs/heads/main/scripts/gpu_test_program.sh -O gpu_test_program.sh
chmod +x gpu_test_program.sh
./gpu_test_program.sh
I hope you find this blog post helpful. With the right tools and insights, you can unlock the full potential of your GPU resources. Happy reading!
Reference:
telegraf/plugins/outputs/azure_monitor at release-1.15 · influxdata/telegraf (github.com)
telegraf/plugins/inputs/nvidia_smi at release-1.15 · influxdata/telegraf (github.com)
Microsoft Tech Community – Latest Blogs – Read More
SharePoint API keeps throttling even though we use exponential backoff and sleep for the Retry-After time.
The SharePoint API keeps throttling even though we use exponential backoff and sleep for the Retry-After interval. It happens on all the APIs below, and we’ve seen the Retry-After response header be 5 seconds or 300 seconds. But even if we wait 300 seconds, it still keeps responding with 429 Too Many Requests. Is there a problem with the SharePoint API recently?
/_api/Web, /_api/Web/Lists, /_api/Web/Lists(@lid)/RootFolder
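For what it’s worth, the usual client-side pattern is to treat Retry-After as a lower bound and layer jittered exponential backoff on top of it. A minimal sketch of such a delay policy (the function name is my own):

```python
import random

def backoff_delay(attempt, retry_after=None, base=2.0, cap=300.0):
    """Delay in seconds before retry number `attempt` (0-based).

    Exponential backoff with full jitter, but never less than the
    server-supplied Retry-After value when one is present.
    """
    exp = min(cap, base * (2 ** attempt))
    delay = random.uniform(0, exp)              # full jitter
    if retry_after is not None:
        delay = max(delay, float(retry_after))  # honour the header as a floor
    return delay

# Example: 3rd retry after the server asked for 30 s
d = backoff_delay(2, retry_after=30)
print(d)  # always 30.0 here: the jittered value (< 8 s) never exceeds the floor
```

Per Microsoft’s throttling guidance it also helps to decorate traffic with a descriptive User-Agent string so your requests can be identified. If correctly backed-off traffic is still throttled for long stretches, the limit may be tenant-wide (other applications consuming the same quota) rather than caused by this one client.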
Read More
Flow field over a flat plate airfoil
I would like to plot the velocity field over a flat plate airfoil inclined at an angle of attack alpha (alpha = 5°).
The velocity is defined by the Mach number, which is 5. The air is considered an ideal gas at sea level.
The dimensions of the airfoil are (2D):
chord length = 1 m
height = 0.05 m
I would also like a CFD solution.
Tell me if any information is missing.
Thank you in advance.
homework, compressible flow, cfd MATLAB Answers — New Questions
How to import result data from OpenFOAM to matlab?
hello,
I want to import the result data from OpenFOAM (which is compressed) into MATLAB, to plot the results and compare them with the results of other flow models. Can anyone please help me with the code?
Regards,
Mahendra
data import, zip file import, data import from openfoam to matlab MATLAB Answers — New Questions
Integrating Simulink with OpenFOAM.
Is there any way of integrating Simulink with OpenFOAM (e.g. having an engine model in the first and a propeller model in the second)?
simulink, openfoam, integration, matlab, communication MATLAB Answers — New Questions
Update issue and activation request
Update issue reads:
Update Stack Package – (Version 922.415.111.0) won’t download via update. The error says: “The update certification has expired. Make sure your date, time, and time zone settings are correct and we’ll try to install the update later.”
I said: I have restarted twice but the update has not installed successfully. Also, the PC requested that I activate Windows again on start-up.
I had a live chat session and was asked to post here, as it may be a bug with update on this build. The agent stated the following: “Upon checking the version of Windows that you have, it shows that you are using the Insider version, which is Windows 24H2. This is not the stable version yet and may still contain bugs. Since you are a member of Windows 11 Insider, you may need to post this error to our Insider community. The moderators will respond to you within 24 to 48 hours.”
This is the post. Read More
Azure Resource Graph query to get subscription properties
I am very new to ARG queries. I am struggling to figure out how to get a list of our Azure Subscriptions using ARG, including some of the properties you see on the properties pane when using the azure portal. In particular, I want the property visually labelled “ACCOUNT ADMIN”.
Can anyone point me in the right direction?
resourcecontainers
| where type == 'microsoft.resources/subscriptions'
| project subscriptionId, name, owner = ???
Read More
Why do I receive a privimporthdl error when importing the operator.vhd example
Hello,
I am trying to import a VHDL file using the operator.vhd example. However, I receive the following error when using the importhdl function.
I believe I have followed the example correctly, but I am at a loss as to why this is occurring. Is this a problem experienced by others at all? I’m not sure what I can try next.
Below is the operator.vhd code when it is opened within the MATLAB editor.
library IEEE;
use IEEE.STD_LOGIC_1164.ALL;
use IEEE.STD_LOGIC_ARITH.ALL;
use IEEE.STD_LOGIC_UNSIGNED.ALL;

entity Operator is
    Port ( A        : in  STD_LOGIC_VECTOR(3 downto 0);
           B        : in  STD_LOGIC_VECTOR(3 downto 0);
           OpSelect : in  STD_LOGIC_VECTOR(2 downto 0);
           Result   : out STD_LOGIC_VECTOR(3 downto 0));
end Operator;

architecture Behavioral of Operator is
begin
    process(A, B, OpSelect)
    begin
        case OpSelect is
            when "000" =>  -- Addition
                Result <= A + B;
            when "001" =>  -- Subtraction
                Result <= A - B;
            when "010" =>  -- Bitwise AND
                Result <= A and B;
            when "011" =>  -- Bitwise OR
                Result <= A or B;
            when "100" =>  -- Bitwise XOR
                Result <= A xor B;
            when others => -- Default case
                Result <= not (A + B);
        end case;
    end process;
end Behavioral;
importhdl MATLAB Answers — New Questions
MATLAB Analysis runs successfully on ThingSpeak, BUT always generates Server Error 500, so doesn’t send email
The error is:
"Failed to send alert: The server returned the status 500 with message "Internal Server Error" in response to the request to URL https://api.thingspeak.com/alerts/send."
The message says MATLAB Analysis ran successfully, so the code is good, but the email is never sent because of the server error.
Simple code from examples:
% Catch errors so the MATLAB code does not disable a TimeControl if it fails
try
webwrite(alertUrl , "body", alertBody, "subject", alertSubject, options);
catch someException
fprintf("Failed to send alert: %s\n", someException.message);
end
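The snippet above assumes alertUrl and options were defined earlier. A minimal sketch of that setup, following the standard ThingSpeak alerts example (the key value below is a placeholder; the Alerts API key starts with "TAK" and is found on your ThingSpeak profile page, not among the channel API keys), would be:

```matlab
% Hypothetical setup for the alerts request -- substitute your own key.
alertApiKey  = "TAKXXXXXXXXXXXXXXXX";  % Alerts API key from the profile page
alertUrl     = "https://api.thingspeak.com/alerts/send";
options      = weboptions("HeaderFields", ["ThingSpeak-Alerts-API-Key", alertApiKey]);
alertSubject = "Channel alert";
alertBody    = "Sensor reading crossed the threshold.";
```

Accidentally using a channel API key here instead of the Alerts API key is a common mix-up that may be worth ruling out.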
Where to go from here?
"Failed to send alert: The server returned the status 500 with message "Internal Server Error" in response to the request to URL https://api.thingspeak.com/alerts/send."
The message says MATLAB Analysis ran successfully, so the code is good, but never sends the email because of the server error.
Simple code from examples:
% Catch errors so the MATLAB code does not disable a TimeControl if it fails
try
webwrite(alertUrl , "body", alertBody, "subject", alertSubject, options);
catch someException
fprintf("Failed to send alert: %sn", someException.message);
end
Where to go from here? The error is:
"Failed to send alert: The server returned the status 500 with message "Internal Server Error" in response to the request to URL https://api.thingspeak.com/alerts/send."
The message says MATLAB Analysis ran successfully, so the code is good, but never sends the email because of the server error.
Simple code from examples:
% Catch errors so the MATLAB code does not disable a TimeControl if it fails
try
webwrite(alertUrl , "body", alertBody, "subject", alertSubject, options);
catch someException
fprintf("Failed to send alert: %sn", someException.message);
end
Where to go from here? thingspeak MATLAB Answers — New Questions
IntranetFileLinksEnabled does not work with shared file links
I refer to this setting https://learn.microsoft.com/en-us/DeployEdge/microsoft-edge-policies#intranetfilelinksenabled
Which allows links to local file resources using the file:// protocol
It works as expected if the file is on the local computer, e.g. C:/file/example.txt, but if the file is in a shared location, Edge strips the preceding slashes and cannot find the file, e.g.
<a href="file:////share/file/example.txt">link</a>
If you click on the link above, Edge will try to go to share/file/example.txt instead of //share/file/example.txt. We need it to go to the second location, as that is a valid file-share path.
Read More
Forwarded email to vtext
I’m trying to forward selected email to a vtext.com address. Forwards to a conventional email address work fine; the ones to a vtext.com address do not. Looking for help.
Thank you,
Jim
Read More
Issue with Image Upload in VIVA Engage via Power Automate
Hi Experts,
We’re currently using Power Automate to post data in VIVA Engage via the VIVA Engage Post connector. However, despite trying various HTML snippets, the images are posted as links instead of displaying inline.
Please see the attached screenshot for reference. Is there an alternative method to resolve this issue?
Thank you for your help!
Read More
Bulk Edit of List Propagating Slowly
Hi Folks,
I’m running into what I think is unusual behavior with a SharePoint list. Through the interface, if I select ~100 items, click Edit, change a few fields, and click Save, it can take up to a few minutes before the changed data shows up in REST calls filtering on those fields.
I’ve tested this behavior in 3 ways:
Get Items action in a power automate flow using these fields in an odata filter
Filtering using CAML in a GetItems post call
Connection & Query from Excel
In all cases the behavior is the same: over the span of 4-5 minutes, more and more of the items I modified become available in the results. The Modified timestamp on all of them, however, is the same (the time I made the change).
For example: if I select and edit 160 rows, set the Action column (a choice column) to “Add”, and then immediately run a Get Items in Power Automate with the filter Action eq ‘Add’, the first run returns 3 or 4 items, the next might return 10-15, a run 30 seconds later returns around 30-40, and so on until, a few minutes later, it returns all the records I had set to Action = Add. The Modified timestamps on all the items, however, are the same.
What’s also interesting is that on the list itself, it reflects the change immediately. So what I see in the list is different than what’s being returned by filters against those fields.
Has anyone run into weird behavior like this? It’s almost like there’s some sort of massive latency between the bulk edit action and the new data being available to filter against.
Read More
How to unblock UHRS
My Microsoft account for UHRS was blocked immediately after I created it, with a message that unusual activity was detected. How can I reopen it? Thanks.
Read More