Month: June 2024
How to find respective features from Principal Components
Hello Everyone!
I have a feature matrix of 4378*54. After doing PCA, I get a reduced feature matrix of 4378*29. I have used a threshold variance of 95%.
In summary, 29 principal components explain 95% of variance in my feature matrix.
How can I find the respective features in my feature matrix from the 29 principal components? Of course, these 29 principal components correspond to 29 features in my feature matrix, which has 54 features. How can I find those features?
I am using PCA function which gives me coeff, scores, latent and explained.
Secondly, do I have to use the same PCA for both the training and testing datasets? I have read in an article that I have to store the selected coeff (95% variance) from the training dataset and multiply it with my testing dataset, which yields a reduced feature matrix. I have tried it and it works, but I don’t understand how it works.
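One caveat worth illustrating: each principal component is a weighted combination of all 54 original features (the weights are the columns of coeff), so the 29 components do not map one-to-one onto 29 of the original features; the largest-magnitude entries in a column of coeff show which features dominate that component. The hedged NumPy sketch below (made-up sizes, not the 4378×54 data from the post) also shows why reusing the training coeff and training mean on the test set is exactly the right thing to do:

```python
import numpy as np

# Illustrative sketch of the train/test PCA workflow; sizes are assumptions.
rng = np.random.default_rng(0)
X_train = rng.standard_normal((100, 6))
X_test = rng.standard_normal((40, 6))

# Fit PCA on the training set only: center, then eigen-decompose the covariance.
mu = X_train.mean(axis=0)
Xc = X_train - mu
cov = Xc.T @ Xc / (Xc.shape[0] - 1)
eigvals, coeff = np.linalg.eigh(cov)          # columns of coeff = loadings
order = np.argsort(eigvals)[::-1]             # sort by explained variance
eigvals, coeff = eigvals[order], coeff[:, order]

# Keep enough components for 95% variance (the "latent"/"explained" step).
k = np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.95) + 1
W = coeff[:, :k]

# Scores = centered data times the stored loadings. The SAME mu and W are
# reused for the test set -- that is the multiplication the article
# recommends: it projects new data onto the axes learned from training data.
scores_train = (X_train - mu) @ W
scores_test = (X_test - mu) @ W
```

Applying a PCA fitted on the test set itself would put the two datasets in different coordinate systems, which is why the stored training coeff must be reused.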
Your comments will be highly appreciated.
pca, features, machine learning MATLAB Answers — New Questions
De-embed S parameters for 3 Fixtures/ Pads
Hello everyone,
I am a student and have the following problem:
I would like to use the function “deembedsparams” to de-embed my DUT. Unfortunately, I am not using two fixtures or pads as intended for this function, but three, so I would have to de-embed all three. Can someone tell me to what extent I can use the function for my problem and what I have to consider? Maybe there is a workaround, such as de-embedding each pad iteratively?
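For intuition, here is a hedged NumPy sketch of why iterative de-embedding works: in a cascade representation (ABCD or T parameters) the fixtures multiply, so each one can be peeled off with a matrix inverse. The three-pad arrangement below (two pads on the input side, one on the output) and all values are assumptions for illustration:

```python
import numpy as np

# Random, well-conditioned stand-ins for fixture and DUT cascade matrices.
rng = np.random.default_rng(1)
def rand_network():
    return np.eye(2) + 0.1 * rng.standard_normal((2, 2))

A_pad1, A_pad2, A_pad3 = rand_network(), rand_network(), rand_network()
A_dut = rand_network()

# The measurement sees everything cascaded: pad1 * pad2 * DUT * pad3.
A_meas = A_pad1 @ A_pad2 @ A_dut @ A_pad3

# De-embed one fixture at a time, outermost first:
step1 = np.linalg.inv(A_pad1) @ A_meas        # remove the outer input pad
step2 = np.linalg.inv(A_pad2) @ step1         # remove the inner input pad
A_recovered = step2 @ np.linalg.inv(A_pad3)   # remove the output pad
```

With measured S-parameters you would convert S to a cascade representation first and back afterwards. Since the fixtures peel off one at a time, applying deembedsparams more than once (e.g. with a thru/identity network standing in for the missing fixture on one side in the second pass) is a plausible workaround, though worth verifying against the documentation.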
I hope someone can help me with my problem.
Many thanks in advance
Patrick
s-parameter, deembedding, deembedsparams function, sparams MATLAB Answers — New Questions
Cannot echo Topics from Simulink in ROS on WSL2
I’m having trouble connecting Simulink and Unreal Engine through ROS on WSL2.
I want to send messages from Simulink to UE (ROS Integration) using ROS topics, but the topics from Simulink cannot be subscribed to. (Actually, the topic itself is subscribed, but no messages arrive.)
My environment is as follows:
MATLAB R2024a + ROS Toolbox
Unreal Engine 5.1.1 (+ ROS Integration)
Ubuntu 20.04 on WSL2
ROS Noetic
I have confirmed the ROS communications below:
Simulink to the same Simulink: success
Simulink to the other Simulink on the same computer: partly success (✔A→B, while ✘B→A. I suspected that the same port cannot be used for both publishing and subscribing, but alternating their roles did not work.)
Simulink to ROS: failed (The topics can be found by rostopic list, but I couldn’t rostopic echo.)
Simulink to UE: failed
UE to ROS: success (only String messages)
UE to Simulink: success (only String messages)
Regarding the rostopic list, the rosnode info /SimulinkSystem_xxxxx response (the node created by Simulink) says that the connection to the node failed.
I searched for similar issues on the internet; the cause may be that ROS cannot resolve the hostname.
I have tried setting the environment variables so that they are neither "localhost" nor "127.0.0.1", but it did not work. (When doing so, I could not connect to the ROS master.)
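Hostname resolution is indeed a common culprit here: WSL2 is a NATed VM, so Windows and Ubuntu see each other under different IP addresses and "localhost" does not cross the boundary. A hedged sketch of the usual environment-variable setup on the WSL2 side (the address below is a made-up placeholder; substitute the real ones from `hostname -I` in WSL2 and `ipconfig` on Windows):

```shell
# Hypothetical WSL2 address -- replace with your actual one.
export WSL2_IP=172.28.112.1
# Point at the master running in WSL2...
export ROS_MASTER_URI=http://$WSL2_IP:11311
# ...and advertise a routable IP instead of a hostname, so peers can
# connect back to this node for topic data (not just see it in the list).
export ROS_IP=$WSL2_IP
```

On the MATLAB/Simulink side, the corresponding step would be something like rosinit('http://<wsl2-ip>:11311','NodeHost','<windows-ip-reachable-from-wsl2>'), so that subscribers in WSL2 can reach the Simulink node's advertised address. Topics appearing in rostopic list but yielding no messages in rostopic echo is the classic symptom of an unreachable advertised address.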
Could anyone figure out what the real problem is, and how to solve it?
Thank you
matlab, simulink, ros, wsl2, noetic MATLAB Answers — New Questions
I need a safe & best MP4 to MP3 converter for Windows 11, Any suggestions?
I need some recommendations for a safe, high-quality MP4-to-MP3 converter for Windows 11. For my current project, I have a lot of MP4 videos that need to be converted to MP3 format. I have tried several online MP4-to-MP3 tools, but they are full of ads, have poor audio quality, or seem to be a security risk. For my work, ease of use and maintaining high audio quality during conversion are crucial, and I also want to be able to convert multiple MP4s to MP3 at once, as I don’t want to convert them one by one!
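One offline, ad-free option is ffmpeg, which handles batch conversion from a single loop. A hedged sketch (assumes ffmpeg is installed and the MP4 files are in the current folder):

```shell
# Batch MP4 -> MP3 with ffmpeg.
# -vn drops the video stream; -q:a 2 is a high-quality LAME VBR setting;
# -n refuses to overwrite existing output files.
for f in *.mp4; do
  [ -e "$f" ] || continue   # skip cleanly if no .mp4 files match
  ffmpeg -n -i "$f" -vn -codec:a libmp3lame -q:a 2 "${f%.mp4}.mp3"
done
```

The `${f%.mp4}.mp3` expansion just swaps the extension, so `lecture.mp4` becomes `lecture.mp3`; on Windows 11 the same loop works in Git Bash or WSL, or can be translated to a PowerShell foreach.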
Thank you very much for your help!
Read More
Certification and Learning Hub
Hey,
Does Microsoft offer a Learning Hub and certification programs with free vouchers for their partners, similar to Google’s offerings? If so, could someone share the relevant links for more information?
Cheers!
Read More
Co-Pilot pro (Work) suggested an online document from company’s own SharePoint
Hi I have a quick question (hopefully) about Co-Pilot pro for business.
One of my users was using the chat functionality in WEB to ask it some basic questions about excel functionality.
The response that came back, at the end of the prompt, seemingly found and suggested a document created by users in our tenant and hosted in SharePoint; it was able to identify the authors’ names too.
This is a really cool feature, but my question is more around how or if, Co-Pilot is able to understand who has permissions to the document that it’s suggesting. For example, could an unsuspecting user bump into a prompt reply suggesting a document that they should not have access to, or is it smart enough to only suggest a document should the user, using co-pilot have access to it?
Can anyone comment on this?
Read More
Calculating difference between dates and times
Trying to calculate the difference between planned finish date/time and actual finish date/time.
When the difference is positive the answer is returned in the correct format (line 3) but when negative the answer is to large to show (lines 1 & 2).
Planned Finish Time   Actual Finish Time   Diff
(10) 7:59 AM          (10) 9:00 AM         ############
(10) 9:30 AM          (12) 8:30 AM         ############
(12) 10:34 AM         (12) 7:00 AM         (00) 3:34 AM
How is this fixed so that the negative differences show in the correct format?
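The ############ appears because Excel’s default 1900 date system cannot display a negative time value. One common workaround (assuming Planned Finish is in A2 and Actual Finish in B2; cell references are illustrative) is to compute the magnitude and prepend the sign as text:

```
=IF(B2>=A2, TEXT(B2-A2, "[h]:mm"), "-" & TEXT(A2-B2, "[h]:mm"))
```

Note the result is text rather than a number, so it cannot be summed directly; the alternative is enabling the 1904 date system (File > Options > Advanced), which displays negative times natively but shifts any existing dates in the workbook.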
Read More
Identifying different sections to be completed in a work document for multiple people?
I don’t know if there is a way to actually do this, but I have a word document that needs to be filled out by 3 separate individuals from different departments (eg: sales/marketing/operations). Some bits to be filled are fields in a table, and others are simply typing a response below a question.
I am looking for a way to easily show the individuals which sections they need to complete, perhaps automatically highlighting the fields based on a department selection in a drop down? Does anyone have an idea on how to achieve this?
A point to note is that this will be an offline document that is only accessible by one individual at a time.
If anyone can suggest how to achieve my vision, or can come up with an alternative solution, that would be great!
Read More
Does Windows have a Guest Halt Polling feature?
Can a Windows virtual machine implement a function similar to Linux’s Guest Halt Polling? Specifically, can Windows be configured to poll for a period before executing the HLT (Halt) instruction, to facilitate faster response times?
For reference, the description of Guest Halt Polling is provided here: https://docs.kernel.org/virt/guest-halt-polling.html
Read More
How to identify the most impactful GenAI use case
Generative AI solutions are very transformational for businesses nowadays. Organisations have access to tools and AI capabilities that will enable them to generate immense value for the business.
At the same time, though, some businesses struggle to identify the areas in which they should invest in AI. Many AI engagements fail because the AI solution does not deliver real business value; at times we find ourselves with an AI solution looking for a problem.
This is what leads to failure. To make sure a Generative AI or AI solution is successful, the business value it will deliver must be clear before any technical work starts. It is very important to work with business and technical decision makers so they are aligned on where to invest in AI, why they think it is a good idea for their organisation to invest, and what business outcome the solution is expected to deliver. AI experts need to work alongside the customer’s decision makers in a Value Creation exercise. The objective of the Value Creation exercise is to identify the high-value use cases that impact an organisation’s key business processes, enterprise-wide, and deliver the highest business impact, thereby making clear the value the AI solution will deliver. That value can take many forms: cost savings, increased profit, bringing a new service to market, employee productivity, and so on.
When we work in a GenAI project, we should work alongside our customers and help uncover the following points:
What would we like to achieve in the AI space and by when?
To whom is this AI initiative important? Who is sponsoring it and why?
What are the core business processes within our organisation that deliver the highest business impact?
What are the core business processes within our organisation that would benefit from AI?
How are we measuring the impact of the core business processes?
Where do we think there is the highest ROI for our business if we were to invest in AI/Gen AI?
Where is the data at the moment? (i.e: Cloud, on-premise waiting for a DC migration?)
The above will help focus the conversation on what really matters for a given organisation that is planning to invest in AI.
Ultimately this conversation should be leading the business and technical decision makers to a list of use cases organised by: priority, complexity, business impact and ROI. Similar to the table below:
Use Case            Business Area   Priority   Complexity   ROI
M&A                 Legal           H          M            £xM
Document drafting   Legal           H          H            £xM
To derive the ROI, it is important for the organisation to work on a Strawman business case. The Strawman business case is an estimate of the value that the AI solution in scope is expected to deliver to the business.
Below is an example template of Strawman business case in the legal space:
The Strawman business case will help the business and technical stakeholders where they should be focusing their investment in AI.
Once a use case or a list of top X use cases has been identified, it’s important to derive the metrics and KPIs for the use case(s). These metrics will be the same ones the PoV needs to measure, so that the business can validate the expected ROI delivered by the PoV and then extract that ROI once the solution is in production.
After the KPIs and metrics have been identified, it’s important for the business and technical stakeholders to decide whether the ROI is compelling enough to invest in AI. If the business decides to press ahead, it’s important to create an evaluation team, normally made up of domain experts or end users of the AI solution.
The evaluation team is tasked with reviewing and validating the output of the solution, collecting the KPIs, and, at the end of the PoV, deriving and validating the actual ROI the AI solution delivers.
Once we have identified the right use case to focus on alongside the business case/ROI the GenAI solution will deliver, it’s necessary to understand which type of Copilot to use. That is, an out of the box GenAI capability like M365 Copilot or build a custom Copilot.
M365 Copilot is a GenAI SaaS offering from Microsoft that enables organisations to quickly adopt GenAI to extract insight not only within the M365 ecosystem but also from external systems (e.g. SQL databases).
The experience an end user gets with M365 Copilot is directly linked to how good/precise the prompt is and to the quality of the data that end user has access to.
M365 Copilot indexes all the data the end user has access to, so it’s important to remember that a given user has access to both work-in-progress data and final data. Ultimately, the quality of a prompt’s output is linked to the quality of the data as well as to the quality of the prompt itself.
What if we need to be in control of:
The knowledge base (data) the GenAI model is grounded to/has access to
The prompt structure and how the prompts are executed
Evaluating the quality of the prompts’ output
Getting back to the user to ask for more information if the submitted prompt is “too generic”
This is where a custom Copilot comes in handy.
When we talk about Custom Copilot, we have two options:
Low Code/No code option with Copilot Studio
Code first approach with Azure OpenAI
Copilot Studio (formerly Power Virtual Agents) is a cloud service that enables organisations to build intelligent bots, powered by GenAI if needed.
As the development team designs the interaction between the end user and the intelligent bot, we can cater for all those scenarios where we need to ask the end user for more information if the prompt is not accurate enough, or where we need to augment the data from an external system.
Copilot Studio integrates with Azure OpenAI and the wider Azure AI ecosystem, and with other systems via API integration.
Being a Low Code/No Code development tool, Copilot Studio works well for use cases that do not require heavy data integration, aggregation, filtering, etc. before providing context to the GenAI model to execute a prompt.
If, instead, an organisation needs control over the amount of compute the GenAI application uses, alongside a pro-code development approach, then a custom Copilot built with Azure OpenAI will help.
Via Azure AI Studio, developers have access to best-in-class data integration, LLM models, model evaluation, LLMOps, etc. to build GenAI applications that can be exposed within the organisation or externally to its own customers.
Moreover, with Copilot Extensions available within Copilot Studio, we can build a custom plugin around our custom Copilot (built either with Azure AI Studio or Copilot Studio) and publish it to M365 Copilot. This allows users to leverage more specialised Copilots as and when needed from within a single UI.
Below is a decision tree that helps navigate among the different Copilots and when to use which Copilot:
In the next article we will discuss Use Cases with Generative AI across industries : Potential Use Cases for Generative AI (microsoft.com)
@arung @Stephan Rhodes @Renata Bafaloukou @morgan Gladwell
Microsoft Tech Community – Latest Blogs –Read More
solving nonlinear wave equation
I need to solve this PDE below.
This is quite similar to Burgers’ equation. However, I can’t understand how to get an exact solution to this equation. Please give an explicit solution for a given initial condition. Any initial condition is fine, as long as the explicit solution is given; I will use it to check whether my FDM code is correct.
In addition, if anyone has code for solving this equation numerically, please post it here.
Thanks in advance!
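The PDE itself is not reproduced here, but judging from the flux term in the finite-difference update below, it appears to be the LWR traffic-flow equation rho_t + v_max*(1 - 2*rho/rho_max)*rho_x = 0, a quasilinear cousin of Burgers’ equation. For such equations, an exact solution (before characteristics cross and a shock forms) follows from the method of characteristics: rho is constant along lines x = x0 + c(rho0(x0))*t with c(rho) = v_max*(1 - 2*rho/rho_max). A hedged NumPy sketch that evaluates this implicit solution by fixed-point iteration, with the initial profile rho0(x) = 1/(1+exp(x)) assumed from the code:

```python
import numpy as np

v_max, rho_max = 5.0, 5.0
rho0 = lambda x: 1.0 / (1.0 + np.exp(x))          # assumed initial condition
c = lambda r: v_max * (1.0 - 2.0 * r / rho_max)   # characteristic speed

def rho_exact(x, t, iters=200):
    """Solve rho = rho0(x - c(rho)*t) by fixed-point iteration.

    Valid while characteristics have not crossed; for this decreasing
    rho0, c increases with x, so the solution is a rarefaction (no shock).
    """
    r = rho0(x)
    for _ in range(iters):
        r = rho0(x - c(r) * t)
    return r

x = np.linspace(-10, 10, 5)
r = rho_exact(x, 0.3)
```

Evaluating rho_exact on the FDM grid gives a pointwise reference to compare against the numerical solution.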
Below is my code.
%----------- set up ---------------
tspan = [0, 3]; % t
xspan = [-10, 10]; % x
step = [1000, 1000]; % step(1) : t, step(2) : x
t_values = linspace(tspan(1), tspan(2), step(1)+1);
x_values = linspace(xspan(1), xspan(2), step(2)+1);
v_max = 5; rho_max = 5; % vmax, rhomax
dt = (tspan(2) - tspan(1)) / step(1);
dx = (xspan(2) - xspan(1)) / step(2);
rho = zeros(step(1)+1, step(2)+1);
rhoo = zeros(step(1)+1, step(2)+1);
for j = 1:(step(1)+1)
    rho(1, j) = 1/(1 + exp(j*dt));
end
%------------- (FDM) -------------
for i = 2:(step(1)+1) % t
    for j = 2:(step(2)+1) % x
        rho(i, j) = rho(i-1, j) - v_max*(1-2*(rho(i-1, j)/rho_max))*(dt/dx)*(rho(i-1, j) - rho(i-1, j-1));
    end
end
% for i = 1:(step(1)+1)
%     for j = 1:(step(2)+1)
%         rhoo(i, j) =
%     end
% end
%------------ animation --------------
filename = 'animation.gif';
figure;
for i = 1:(step(1)+1)
    plot(x_values, rho(i, :));
    xlabel('Position');
    ylabel('Density');
    title(['Time = ', num2str(t_values(i))]);
    drawnow;
    frame = getframe(gcf);
    img = frame2im(frame);
    [imind, cm] = rgb2ind(img, 256);
    if i == 1
        imwrite(imind, cm, filename, 'gif', 'Loopcount', inf, 'DelayTime', dt);
    else
        imwrite(imind, cm, filename, 'gif', 'WriteMode', 'append', 'DelayTime', dt);
    end
end
pde MATLAB Answers — New Questions
How to create a random signal
random signal how to MATLAB Answers — New Questions
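Since the question gives no detail, a minimal sketch of two common kinds of random signal, uniform and white Gaussian noise, sampled at an assumed rate of 1 kHz for 1 second (NumPy; the MATLAB equivalents are rand and randn):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000                                # sampling rate in Hz (assumption)
t = np.arange(0, 1, 1 / fs)              # 1-second time axis
x_uniform = rng.uniform(-1, 1, t.size)   # uniform noise in [-1, 1)
x_gauss = rng.standard_normal(t.size)    # zero-mean, unit-variance noise
```

For noise with a specific bandwidth or spectrum, the usual next step is filtering one of these white signals.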
USRP X310 HDL coder
Does HDL Coder support USRP X310? x310, hdl coder, usrp MATLAB Answers — New Questions
Roadmap – Languages?
Hi,
I couldn’t find any information about languages that might be supported in Viva Engage in the future. Does anyone know, or can anyone give me a hint, whether or not
Arabic
Bulgarian
Malay
Serbian
Slovak
are coming up anytime soon?
Thanks in advance
Read More
Eventid 5014 is Missing in Documentation
Eventid 5014 is Missing in Documentation Microsoft Defender Antivirus event IDs and error codes – Microsoft Defender for Endpoint | Microsoft Learn
Log Name: Microsoft-Windows-Windows Defender/Operational
Source: Microsoft-Windows-Windows Defender
Date: 12.06.2024 11:20:33
Event ID: 5014
Task Category: None
Level: Error
Keywords:
User: SYSTEM
Computer: pc.some.domain
Description:
Microsoft Defender Antivirus Resource Monitor: Memory consumption exceeded its limit.
Hit count: 49
Current Threshold: 3211316
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
<System>
<Provider Name="Microsoft-Windows-Windows Defender" Guid="{11cd958a-c507-4ef3-b3f2-5fd9dfbd2c78}" />
<EventID>5014</EventID>
<Version>0</Version>
<Level>2</Level>
<Task>0</Task>
<Opcode>0</Opcode>
<Keywords>0x8000000000000000</Keywords>
<TimeCreated SystemTime="2024-06-12T09:20:33.3043903Z" />
<EventRecordID>11449</EventRecordID>
<Correlation />
<Execution ProcessID="6168" ThreadID="11428" />
<Channel>Microsoft-Windows-Windows Defender/Operational</Channel>
<Computer>PC.some.domain</Computer>
<Security UserID="S-1-5-18" />
</System>
<EventData>
<Data Name="Product Name">Microsoft Defender Antivirus</Data>
<Data Name="Product Version">4.18.24050.7</Data>
<Data Name="Hit Count">49</Data>
<Data Name="Threshold">3211316</Data>
</EventData>
</Event>
Read More
Viva Amplify Campaigns – Visibility / Teams Approvals
Hello everyone,
with recent demos of Viva Amplify, I have run into two issues:
Campaign Visibility
Users can only see campaigns they themselves have created in their dashboard, no matter their role.
So Campaign 1, created by User A, will have User B as editor or approver. User B, however, sees only their own campaign, Campaign 2, in their Dashboard, but not Campaign 1. Using the campaign’s URL, User B can view and edit Campaign 1 without issues. Direct navigation to the campaign’s page from the dashboard, however, is not possible.
The issue is the same regardless of role and of whether a user is part of the campaign from the start or added later. Both users are part of the M365 group and can navigate to the sites, e.g. via the associated team sites.
Approvals
Approvals seem to generally be causing errors at the moment. If an approver does receive an approval, the approval does not contain an option to view details / jump to the element that the approver needs to review. Below is a full view of the approval request details – I am almost certain there was a link leading to Amplify page included in the approval request not two months ago.
I am thankful for any thoughts on this :).
Massive reduction in Threat Intelligence IP data since Monday 10th June
Hi,
Has anyone else seen a massive reduction in Threat Intelligence IP data flowing into Sentinel since Monday 10th June? I operate two Sentinel environments and both have seen the same change.
The screenshot below is the past 30 days.
The past 48 hours still reports some IP information being sent but at a very reduced rate.
What’s changed with the feed?
SPO individual file views – extracting/adding to views
Hi, is there any progress with being able to export, or add to views, individual file views in SPO, i.e. a way to surface them without manually selecting documents individually and looking at the information panel? Thanks
Fingerprint based SIT
Hello,
I wanted to ideally create Fingerprint Based SIT for certain document templates in the organization.
After creating the fingerprint-based SIT, I noticed the following:
-I can use SIT in DLP policies.
-I am not able to find it as an option in Labeling settings (for Auto Labeling configuration in Label settings)
Steps:
Compliance portal -> Information Protection -> Labels -> Edit label or create new -> go to the steps “Auto Labeling for files and emails” ->Content Contains -> Add -> Sensitive info types -> Look for the Fingerprint based SIT: Not found
ICYMI: Fine-tuning – Azure AI Discord Community Roundtable
Hello Everyone!
Last week we hosted a Community Roundtable in the Azure AI Discord on Fine-tuning.
The Community Roundtables are a time where members of the Azure AI Discord can come and learn together on a specific AI topic that is voted on by the community. For this session, we covered fine-tuning LLMs. The session started with a brief lightning talk from @nitya, and then we opened the stage for members to ask questions and talk about their experiences with fine-tuning.
Here is the recording for anyone who missed it:
Here is a summary of some of the questions and topics discussed, generated with the help of Copilot:
Question: What is fine-tuning in the context of AI?
Answer: Fine-tuning is described as retraining an existing model with new data to improve its performance. 7:40
Question: Why should we fine-tune AI models?
Answer: Fine-tuning is appropriate if the response quality is not achievable with other techniques, considering the trade-offs such as cost efficiency. 8:50
Question: When should we consider fine-tuning AI models?
Answer: Fine-tuning should be considered only if the benefits outweigh the costs, after trying other approaches first. 10:22
Question: What are the steps involved in fine-tuning an AI model?
Answer: The steps include preparing data, training and evaluating the model, and then deploying and using it. 11:20
Question: How does one decide on the amount of data needed for fine-tuning?
Answer: The amount of data needed depends on the model, provider, and use case, ranging from a few hundred to thousands of samples. 11:45
Question: What are the considerations for deploying a fine-tuned AI model?
Answer: Considerations include deployment constraints, model rate limits, and validating the model in a playground before production. 13:53
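The data-preparation step mentioned in the answers above is often the bulk of the work. Below is a minimal illustrative sketch, assuming the chat-style JSONL format that Azure OpenAI fine-tuning jobs accept; the sample pairs are hypothetical, loosely based on the Q&A above, and the exact schema should be checked against your provider's documentation:

```python
import json

# Illustrative only: the pairs below are made up, and the chat-style JSONL
# layout is the format commonly used by Azure OpenAI fine-tuning jobs;
# check your provider's docs for the exact schema it expects.
samples = [
    {"prompt": "What is fine-tuning?",
     "completion": "Retraining an existing model on new data to improve its performance."},
    {"prompt": "When should I fine-tune?",
     "completion": "Only when the benefits outweigh the costs, after trying other approaches."},
]

def to_chat_jsonl(pairs):
    """Turn prompt/completion pairs into one chat-format JSON record per line."""
    lines = []
    for p in pairs:
        record = {"messages": [
            {"role": "user", "content": p["prompt"]},
            {"role": "assistant", "content": p["completion"]},
        ]}
        lines.append(json.dumps(record))
    return "\n".join(lines)

print(to_chat_jsonl(samples))
```

Each output line is one independent JSON object, which is what makes the file easy to stream and validate sample-by-sample before uploading it as a training file.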
Our next session in the AI Discord is an Office Hours session on RAG on Thursday, June 13th. This is an ask-me-anything, open-floor discussion.
See you in the Discord!
– Korey