Category: News
Multiple lines to write using fid open
I have a matrix, and some operation is performed on the rows, e.g., the average of each row. I want to store the rows whose average is above some threshold; if multiple rows satisfy the condition, then all of them should be written to a text file. My code only writes the last row [10 11 2], whereas I want both the 2nd and 3rd rows.
M=[2 3 4; 6 7 9; 10 11 2];
R1=sum(M(1,:))/3; % ans is 3
R2=sum(M(2,:))/3;% ans is 7.3333
R3=sum(M(3,:))/3; % ans is 7.6667
T=[R1,R2,R3];
for i = 1:3
if T(i) >= 7.3333
fid = fopen(['Result/T(i)',',','.txt'], 'w');
fprintf(fid,'%d',M(i,:));
fprintf(fid,',');
fclose(fid);
end
end
fid open MATLAB Answers — New Questions
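The symptom described above (only the last qualifying row surviving) typically comes from reopening the same file in write mode on every loop iteration, which truncates it each time. This is not the asker's MATLAB; it is a minimal Python sketch of the same file-mode behaviour, with hypothetical rows standing in for the matrix rows that pass the threshold:

```python
import os
import tempfile

# Hypothetical stand-ins for the matrix rows whose average passes the threshold.
rows = [[6, 7, 9], [10, 11, 2]]

path = os.path.join(tempfile.mkdtemp(), "result.txt")

# Reopening the file in "w" mode on every iteration truncates it each time,
# so only the last qualifying row survives -- the symptom described above.
for row in rows:
    with open(path, "w") as f:
        f.write(",".join(str(v) for v in row) + "\n")
print(open(path).read())  # only the last row remains

# Opening the file once before the loop (or appending with mode "a")
# keeps every qualifying row.
with open(path, "w") as f:
    for row in rows:
        f.write(",".join(str(v) for v in row) + "\n")
print(open(path).read())  # both rows remain
```

The same fix applies in MATLAB: open the file once before the loop (or open with append permission) rather than calling fopen with 'w' inside the loop.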
Find minimum values based on unique number
I have a dataset of temperatures collected over multiple depth profiles (profiles 1-147). The data is all listed in one long table, not by profile (attached).
Each profile has a different temperature minimum, and I want to find this minimum for each profile and colour all of the temperatures above it grey in a figure (essentially discarding them).
Evidently I'm going about this the wrong way, as my output (T_min) is all the same number (see code below).
Once I have the T_min for each profile, when I do a scatter plot, how can I colour grey each dot that is less than the T_min for that particular profile?
Thanks in advance; sorry if this isn't very clear.
j=1;
for i=1:length(dives)
T_min(j) = nanmin(SG579_FINAL_table_MF.Cons_temp(dives));
j=j+1;
end
unique, minimum, loop MATLAB Answers — New Questions
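The loop above takes the minimum of the whole temperature column on every iteration, which is why every T_min comes out identical; the usual pattern is to compute the minimum per group label instead. A Python sketch of that idea, with hypothetical dive numbers and temperatures (not the attached dataset):

```python
import math

# Hypothetical flat table: one dive number and one temperature per sample.
dives = [1, 1, 2, 2, 2, 3, 3]
temps = [7.2, 6.9, 8.1, float("nan"), 7.5, 6.4, 6.8]

# Minimum per dive, ignoring NaNs -- the per-profile analogue of nanmin.
t_min = {}
for d, t in zip(dives, temps):
    if not math.isnan(t):
        t_min[d] = min(t_min.get(d, math.inf), t)

print(t_min)  # {1: 6.9, 2: 7.5, 3: 6.4}

# Flag samples warmer than their own profile's minimum (e.g., to grey them out
# in a scatter plot). NaN samples compare False and stay unflagged.
grey = [t > t_min[d] for d, t in zip(dives, temps)]
```

In MATLAB the equivalent is to index the temperature column by the rows belonging to each profile (for example via logical indexing on the dive number) before taking the minimum, rather than passing the whole column every time.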
Roles For Quick Access/Responsibility Assignment
It would be great if there were "focused" roles that could quickly be assigned to select users so that they can manage the following for the entire organisation, or even for a selected subset of up to most of the users in an organisation.
-Meeting responses: there are times when persons within the organisation respond via email rather than accepting/declining a meeting invitation. This would be a great focused role, supplying only the feature/capability required.
-Mailbox delegation: a role should be created that can quickly grant persons holding such responsibility the ability to manage a user's mailbox whilst they are out of office. Such a feature could provide the ability to choose whether access is enabled automatically only when a user is out of office, or at all times. It would also be great for those required to manage out-of-office user mailboxes for the entire organisation, if so required.
I’m sure there are other possible focused roles also.
Additional Organiser Rights/Capabilities For Meetings
Additional rights/capabilities should be provided to organisers of meetings. There are times when, instead of clicking the accept/decline button, for whatever reason, recipients instead send an email response stating that they will be attending the meeting.
Outlook should, perhaps with the help of Copilot, recognise this and automatically accept/decline the meeting invite for the recipient, placing them in the attendee list under Yes or No. Alternatively, the meeting organiser should be able to accept/decline the meeting on the recipient's behalf, which would send a follow-up email to the recipient making them aware that the meeting was accepted/declined.
This would be a great feature and could surely utilise the AI features that Microsoft looks to implement in M365.
SharePoint library PDF viewer looks different for a user
One of the users reported that his files are rendered differently compared to other users. When the user opens a PDF in a SharePoint document library, it opens in a black transparent screen, and the top bar is also different from other users'. What could be the reason for this issue? Here is a screenshot of the user's view and the normal view.
JSON Header Formatting SharePoint list – Subheading
Hi
I have a SharePoint list form, and I have been asked to add "Something small in the header to say 'For Internal Use Only'".
I am struggling to change the JSON to add a subheading. Can anyone advise?
{
    "elmType": "div",
    "attributes": {
        "class": "ms-borderColor-Green"
    },
    "style": {
        "width": "99%",
        "border-top-width": "0px",
        "border-bottom-width": "1px",
        "border-left-width": "0px",
        "border-right-width": "0px",
        "border-style": "solid",
        "margin-bottom": "16px",
        "background-color": "#13A10E"
    },
    "children": [
        {
            "elmType": "div",
            "style": {
                "display": "flex",
                "box-sizing": "border-box",
                "align-items": "center"
            },
            "children": [
                {
                    "elmType": "div",
                    "attributes": {
                        "iconName": "Page",
                        "class": "ms-fontSize-42 ms-fontWeight-regular ms-fontColor-#0C0C0C",
                        "title": "Details"
                    },
                    "style": {
                        "flex": "none",
                        "padding": "0px",
                        "padding-left": "0px",
                        "height": "36px"
                    }
                }
            ]
        },
        {
            "elmType": "div",
            "attributes": {
                "class": "ms-fontColor-#0C0C0Cy ms-fontWeight-bold ms-fontSize-24"
            },
            "style": {
                "box-sizing": "border-box",
                "width": "100%",
                "text-align": "left",
                "padding": "21px 12px",
                "overflow": "hidden"
            },
            "children": [
                {
                    "elmType": "div",
                    "txtContent": "= [$Title] + ' PID '"
                }
            ]
        }
    ]
}
Thanks if anyone can help
Conn
Group/DL Meetings- Ability to see all responses and non-responses rather than just accepted
Organisers of meetings sent to Distribution Lists or Groups (Microsoft 365 Groups included) are unable to view who has not responded to a meeting in one list. They receive emails about responses, but it would be great for this information to be placed in one location for review so that they know who, if anyone, to chase for a response.
Yes, it is great to see who has accepted but it is not the whole picture that meeting organisers need to see. Maybe this info can be included on either the Scheduling Poll or the Tracking section in Outlook.
Azure – PowerShell Script to delete a specific Tag for any resources in all your Subscriptions
A classic question after many months of usage and delegation to different admins concerns tag cleanup.
You can be faced with a large diversity of tags created at one moment but no longer useful and, mainly, not maintained.
This small script will help you execute this cleanup in all the subscriptions you are in charge of.
Import-Module Az
Connect-AzAccount

[string]$TagName = "YourSpecificTagKey"
$TagCount = 0
$All_Az_Subscriptions = Get-AzSubscription

foreach ($Az_Subscription in $All_Az_Subscriptions)
{
    Write-Host " "
    Write-Host " --------------------------------------- "
    Write-Host "Working on subscription ""$($Az_Subscription.Name)""" -ForegroundColor "Yellow"
    $TagCount = 0
    Set-AzContext -SubscriptionObject $Az_Subscription | Out-Null
    $AllTaggedResources = Get-AzResource -TagName $TagName
    $TagCount = $AllTaggedResources.Count
    Write-Host " >> TAG "" $($TagName) "" found "" $($TagCount) "" times" -ForegroundColor "Green"
    if ($TagCount -gt 0)
    {
        $AllTaggedResources.ForEach{
            # Drop the key if present, then write the remaining tags back.
            if ($_.Tags.ContainsKey($TagName)) {
                $_.Tags.Remove($TagName)
            }
            $_ | Set-AzResource -Tag $_.Tags -Force
        }
    }
}
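The core of the script, deleting one key from each resource's tag dictionary before writing the rest back, reduces to a small dictionary operation. A language-agnostic Python sketch with hypothetical resources (the real write-back is the Set-AzResource call above):

```python
tag_name = "YourSpecificTagKey"

# Hypothetical resources, each carrying a tag dictionary.
resources = [
    {"name": "vm-1", "tags": {"YourSpecificTagKey": "old", "Owner": "ops"}},
    {"name": "sa-1", "tags": {"Owner": "dev"}},
]

for res in resources:
    # Mirror of the PowerShell ContainsKey/Remove pair: drop the key if
    # present; the remaining tags would then be written back to Azure.
    res["tags"].pop(tag_name, None)
```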
This script was inspired by these pages:
https://stackoverflow.com/questions/54162372/how-to-fix-this-error-in-azure-powershell-can-not-remove-tag-tag-value-becaus
https://learn.microsoft.com/en-us/powershell/module/az.resources/set-azresource?view=azps-11.6.0
Fabrice Romelard
Clinical Trials Custom Copilot
I'm a Sr. Architect and Responsible AI Champ at Microsoft, Industry Solutions Delivery, Healthcare and Life Sciences OU, with extensive experience: over 24 years in the industry. I've always tied technology to its use cases, the problems it can solve, and the business outcomes. I am very technical, with a background in computer engineering, computer science, programming, and development, but approaching solutions only from a technical perspective hasn't always panned out without looking at how they'll be used and who's using them.
Idea:
I wanted to learn hands-on about Azure OpenAI resources and model deployments of a chat model (e.g., gpt-35-turbo-16k, gpt-4), with some kind of use case in mind.
Since I'm in the Healthcare and Life Sciences industry, I was looking for use cases that could be helpful and ended up asking, "Is there a way to search clinical trials data using ChatGPT?" Of course, there are other ways to search clinical trials via ClinicalTrials.gov, and the dataset is public, but I wanted to combine Azure OpenAI + AI Search + ChatGPT.
So I embarked on answering this question and, learning along the way, ended up deploying the following components:
Azure OpenAI Service
Blob Storage (Several ways to get data here: https://classic.clinicaltrials.gov/ct2/resources/download. I just went with a simple storage so I can house the XML files)
Azure AI Search Index
A Chat App UX (Luckily there is a sample app available I can use to start from https://github.com/microsoft/sample-app-aoai-chatGPT/). I’ve cloned that repo to start the clinical trials copilot repo.
Azure CosmosDB to store chat history (this is also included in the sample AOAI chat app)
PowerBI to view chat data from CosmosDB
Conceptual Architecture:
See the full details in my repo, Enjoy! https://github.com/dondinulos/clinical-trials-copilot/
Microsoft Tech Community – Latest Blogs
Host Microsoft Defender data locally in Switzerland
We are pleased to announce that local data residency support in Switzerland is now generally available for Microsoft Defender for Endpoint and Microsoft Defender for Identity.
This announcement demonstrates our commitment to providing customers with the highest levels of security and compliance by offering services that are aligned to local data sovereignty requirements. Swiss customers can now confidently onboard to Defender for Endpoint and Defender for Identity in Switzerland, knowing that their data at rest will remain within Swiss boundaries, which ensures that customers in Switzerland can meet their regulatory obligations and maintain control over their data.
In addition to Switzerland, Defender data can also be hosted in other regions including the United States, European Union, the United Kingdom, and Australia.
Configure Microsoft Defender for Endpoint with local data hosted in Switzerland.
Prerequisites
Your Microsoft Entra ID tenant needs to be set to Switzerland so that the Microsoft Defender for Endpoint tenant will also be provisioned in this geo.
To access the GoLocal Geo instance in Switzerland, you need to ensure each device is onboarded using Streamlined Connectivity for devices on their network (see Enable access to Microsoft Defender for Endpoint service URLs in the Proxy Server for further details).
I am a new Defender for Endpoint customer
Once the Microsoft Entra ID tenant is created, access the security portal (https://security.microsoft.com) and continue with the onboarding in the GoLocal geo.
Once that process is completed, the Microsoft Defender for Endpoint / Microsoft Defender XDR tenant should be located in the GoLocal geo.
Confirmation: in the portal, go to Settings > Microsoft Defender XDR > Account to see where the service is storing your data at rest.
For example: in the image below, the service location for this Microsoft Defender XDR demo tenant is Switzerland.
However, if the location of the data at rest is in one of the current service locations of US/UK/EU/AU, then a tenant reset needs to be requested via Customer Service and Support (CSS) (see next section).
I am a Defender for Endpoint customer with existing tenants in geographies different from the Swiss GoLocal Geo and want to move to the local Geo in Switzerland.
Existing customers have to request a tenant reset by contacting Microsoft Customer Support. Support can be reached by clicking the "?" at the top right of the portal when signed in as an admin. If you are a Microsoft Unified support customer, please reach out to your Customer Success Account Manager to support you through the process.
Microsoft Defender for Endpoint will store and process data in the same location as used by Microsoft Defender XDR. If Microsoft Defender XDR has not been turned on yet, onboarding to Microsoft Defender for Endpoint will also turn on Microsoft Defender XDR and a new data center location is automatically selected based on the location of active Microsoft 365 security services. https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/production-deployment?view=o365-worldwide#data-center-location
Configure Microsoft Defender for Identity data to be hosted in Switzerland
Prerequisites
Your Microsoft Entra ID tenant needs to be set to Switzerland so that the Microsoft Defender for Identity workspace is provisioned in this geo as well.
I am a new Microsoft Defender for Identity customer
Once the Microsoft Entra ID tenant is created, access the security portal (https://security.microsoft.com) and continue with the Microsoft Defender for Identity workspace onboarding in the GoLocal geo.
The previous point is required because a Microsoft Defender for Identity workspace is created in the Azure region closest to the customer's Microsoft Entra ID tenant location. See Microsoft Defender for Identity frequently asked questions – Microsoft Defender for Identity | Microsoft Learn.
I am a Defender for Identity customer with existing tenants in geographies different from the Swiss GoLocal Geo and want to move to the local Geo in Switzerland.
Existing customers have to request a workspace reset by contacting Microsoft Customer Support. Support can be reached by clicking the "?" at the top right of the portal when signed in as an admin. If you are a Microsoft Unified support customer, please reach out to your Customer Success Account Manager to support you through the process.
With both our Endpoint Detection and Response, as well as our Identity Threat Detection and Response (ITDR) products now available for local data residency in Switzerland, we are giving more organizations the ability to meet local data sovereignty requirements, while deploying the best security solutions for their estate.
More information:
Ready to go local? Read our documentation for more information on how to get started.
Not yet a customer? Start a 90-day trial for Defender for Endpoint
Check out our website to learn more about our industry leading Endpoint protection platform
Discover why ITDR is critical to keep your organization safe against rising identity threats
Setting up Azure API on Postman and Azure CLI – Step-by-step guide
Dive into the World of Azure APIs on Postman – Step-by-Step Guide by Suzaril Shah
Hello tech enthusiasts! I’m Suzaril Shah, a Gold Microsoft Learn Student Ambassador, here to guide you through the exciting process of setting up Azure API on Postman and Azure CLI. Whether you’re a student or a seasoned developer, our comprehensive guide is designed to enhance your skills in managing Azure resources effectively.
What You’ll Learn:
Efficient Setup: Begin with the basics as we walk you through the Azure CLI setup and login process, crucial for interacting with Azure from your command line.
Hands-On Experience: Create and configure your Azure service principal credentials and dive into real-world API testing scenarios using Postman.
Practical Insights: Gain practical insights on how to leverage Azure APIs for managing resources, understanding CLI toolsets (Azure CLI, Azure Developer CLI, GitHub CLI), and much more.
Join us not only to enhance your technical skills but also to prepare yourself for a future in cloud services and infrastructure management. Let's explore the vast capabilities of Azure together! In this step-by-step guide, I will walk you through setting up the Azure API on Postman and the Azure CLI.
Prerequisites:
Postman Client
Azure CLI
Azure subscription
If you don’t already have one, you can sign up for an Azure free account.
For students, you can use the free Azure for Students offer, which doesn't require a credit card, only your school email.
Step 1 – Getting Started
On the Azure CLI, run the "az login" command. You will be redirected to log in to your Azure account in a web browser, and upon successful login you will be presented with your account details, as shown below. Please take note of the "id" value, as we will need it later; it is our subscription ID on Azure.
Select the subscription for the Azure account using the az account set command. Use the -n parameter to specify the subscription name, e.g.: az account set -n "MSDN Platforms Subscription"
After that, create a resource group with the CLI using the command az group create --location [Azure Location, e.g. westus] --resource-group [Resource Group]
Next, create a service principal credential on Azure using this command:
az ad sp create-for-rbac -n [SP_Name] --role Owner --scope "/subscriptions/[Subscription_ID]/resourceGroups/[Resource Group]"
The output should look like this:
This command will provide the credentials we need in Postman to test some Azure APIs:
appId
displayName
password
tenant
Copy the credentials somewhere safe. Please do not expose the credentials! You can also explore other roles when creating a service principal by using the --role flag, and specify the scope of the SP credentials with the --scope flag. Documentation is linked here.
Some built-in roles in Azure RBAC include:
Owner – total control of a resource group
Contributor – has control over actions on a resource group, like modifying it (e.g., deleting a VM), but cannot assign permissions to the RG.
Reader – only has the ability to view the resource group. Learn More
Step 2 – Rocking with Postman!
Create a new collection in your current workspace and click on the collection name. Under the collection name, you should find the "Variables" tab. Create the variables listed below and map in the values from the service principal and subscription ID output from earlier.
clientId = appId
clientSecret = password
tenantId = tenant
resource = https://management.azure.com/
subscriptionId = [Subscription ID]
resourceGroup = [Resource Group]
bearerToken = (leave it blank; we will programmatically fill this field later)
The configuration should look like this:
Click on “Save” and head to the Pre-request Script Tab and copy and paste the script below:
pm.test("Check for collectionVariables", function () {
    let vars = ['clientId', 'clientSecret', 'tenantId', 'subscriptionId'];
    vars.forEach(function (item, index, array) {
        console.log(item, index);
        pm.expect(pm.collectionVariables.get(item), item + " variable not set").to.not.be.undefined;
        pm.expect(pm.collectionVariables.get(item), item + " variable not set").to.not.be.empty;
    });

    if (!pm.collectionVariables.get("bearerToken") || Date.now() > new Date(pm.collectionVariables.get("bearerTokenExpiresOn") * 1000)) {
        pm.sendRequest({
            url: 'https://login.microsoftonline.com/' + pm.collectionVariables.get("tenantId") + '/oauth2/token',
            method: 'POST',
            header: 'Content-Type: application/x-www-form-urlencoded',
            body: {
                mode: 'urlencoded',
                urlencoded: [
                    { key: "grant_type", value: "client_credentials", disabled: false },
                    { key: "client_id", value: pm.collectionVariables.get("clientId"), disabled: false },
                    { key: "client_secret", value: pm.collectionVariables.get("clientSecret"), disabled: false },
                    { key: "resource", value: pm.collectionVariables.get("resource") || "https://management.azure.com/", disabled: false }
                ]
            }
        }, function (err, res) {
            if (err) {
                console.log(err);
            } else {
                let resJson = res.json();
                pm.collectionVariables.set("bearerTokenExpiresOn", resJson.expires_on);
                pm.collectionVariables.set("bearerToken", resJson.access_token);
            }
        });
    }
});
This script gets the bearer token used to authenticate against the Azure API and programmatically populates it into the bearerToken variable we created earlier. Click the "Run" button to run the script. You should see that the bearerToken has been generated and placed in the "Current Value" field on the "Variables" tab.
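For reference, the same client-credentials token request that the pre-request script sends can be expressed outside Postman. The sketch below only builds the request (no network call is made), using placeholder credentials rather than real ones:

```python
from urllib.parse import urlencode

def build_token_request(tenant_id, client_id, client_secret,
                        resource="https://management.azure.com/"):
    """Builds the OAuth2 client-credentials request the script above sends."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    headers = {"Content-Type": "application/x-www-form-urlencoded"}
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,
    })
    return url, headers, body

# Placeholder values -- substitute the service principal output from Step 1.
url, headers, body = build_token_request("contoso-tenant-id", "app-id", "secret")
```

Posting that body to the URL (with any HTTP client) returns a JSON payload whose access_token field is the bearer token the script stores in the bearerToken variable.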
Head to the Authorization tab, make sure the authorization method is set to "Bearer Token", and set the token to the variable as displayed below:
Step 3 – Testing Phase!
Right-click on the collection name and click the "Add Request" option. Name the request "Get Resource Group Info".
For the request type, select "GET", and, using the variables set up on the collection folder, type:
{{resource}}/subscriptions/{{subscriptionId}}/resourcegroups/{{resourceGroup}}?api-version=2020-09-01
This GET request will fetch information about the specified resource group on Azure. Click the "Save" and "Send" buttons.
You should see this output:
Voila! You have successfully set up Azure API authentication and performed an Azure API GET Request on Postman!
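The request URL that Postman assembles from the collection variables can likewise be built directly. A small Python sketch with placeholder names (the trailing slash is dropped from the resource base to avoid a double slash in the path):

```python
def resource_group_url(subscription_id, resource_group,
                       resource="https://management.azure.com",
                       api_version="2020-09-01"):
    """ARM GET URL for a resource group, mirroring the {{...}} template above."""
    return (f"{resource}/subscriptions/{subscription_id}"
            f"/resourcegroups/{resource_group}?api-version={api_version}")

# Placeholder subscription and resource group names.
url = resource_group_url("0000-sub-id", "my-rg")
```

Sending a GET to that URL with the bearer token in the Authorization header returns the same resource-group JSON shown in the Postman output.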
Create API Requests on the Collection.
You can explore the list of Azure APIs in the documentation.
What’s next?
To kickstart your Azure API journey, you can find the Azure Cloud Onboarding collection that I have been working on in this Postman collection.
Introducing Microsoft Learn for Organizations Playbook, customizable Plans
As AI changes the way everyone works, many organizations struggle to take advantage of its transformative potential. If you’re like the learning managers and team leaders we work with, you’re facing a talent crunch and wondering where to start. Plus, our customers report that off-the-shelf training options don’t fit their business objectives or provide the flexibility that workers with busy schedules need.
As part of the solution to the talent crunch, organizations are adopting a skills-first approach to talent acquisition and development—that is, the focus shifts to an individual’s proven skills and capabilities and not just traditional qualifications like degrees and résumés. And as for training, developing technical skills is a clear imperative for any organization, but it’s only one strategy in a larger playbook.
That’s why we’re happy to announce a new playbook and innovative planning functionality that can help you chart your organization’s unique learning journey. By helping your learners build skills and earn credentials for Microsoft’s AI apps and services, Azure, and other in-demand technologies, you can help them thrive as individuals while building a culture of continuous learning that empowers your workforce to handle disruption with greater resilience and can advance your organization’s success.
Take the long view, and empower a culture of continuous learning
Your organization may be investing in reskilling and upskilling programs, custom courseware, or other learning tools. Yet the Association for Talent Development (ATD), in its 2023 State of the Industry report, points out that the average organization spent less per employee on workplace learning in 2023—and employees spent less time in training—compared to previous years. Microsoft Learn is always looking for ways to help our customers’ and partners’ learners develop and deepen technical skill sets, and we offer a wide variety of learning resources. To help with the sheer volume, the Microsoft Learn for Organizations Playbook puts these resources into perspective for you. It helps you organize your skill-building objectives so that each investment in learning, like a smart play on the field, moves you closer to the goal of building a culture of continuous learning.
The playbook helps you to get started developing and carrying out a plan that identifies and assesses skills gaps in your organization. And it provides resources to maximize your impact on the individual, team, and organization level through actions you can plan, tailor, implement, measure, and optimize. Plus, you can take advantage of Microsoft resources, such as the Microsoft Learn Assessments hub and the Microsoft Learn AI learning hub. You can also find recommendations for building and rolling out skilling plans, measuring progress, earning credentials, and sharing successes throughout the process. You can even connect with Training Services Partners.
We’re happy to make this skill-building resource available through Microsoft Learn for Organizations, which is the front door to all that Microsoft Learn has to offer learners engaged in team training.
Close skills gaps with customizable Plans on Microsoft Learn
The newly announced Plans on Microsoft Learn provide a reliable way for your organization to build and validate skills across your learning community. Plans help learners, teams, and organizations accelerate the achievement of their learning goals by using curated sets of content combined with milestones and automated nudges to keep learners focused and motivated. Plans are made up of thoughtfully created milestones of Microsoft Learn content across a wide variety of training and resources, along with the recommended number of days to complete them.
Experts who are familiar with Microsoft technology can create a custom technical Plan, targeting specific learning goals, and then can share that Plan with others. A Plan’s creator can track the progress of learners and generate reports to verify that the Plans have been completed. Learners who join the Plan can progress at their own speed or follow the recommended pace. If they’ve opted in to receive email notifications from Microsoft Learn, they get reminders that help them to stay on track and complete their Plans.
The owner of a Plan can tailor the collection of content as needed to create a curriculum that efficiently leads individuals and groups to achieve desired outcomes, such as learning how to use new technologies or to quickly get project-ready. They can even incorporate Microsoft Official Plans—curated collections created by Microsoft that help close the skills gap for common scenarios. Sharing a Plan is as easy as sharing a document, and any learner with the link can start and complete that Plan while the Plan’s owner measures progress.
Key features of Plans include:
⦁ Clear learning outcomes.
⦁ Content milestones.
⦁ Optional email reminders.
⦁ Tracking and reporting.
Plans not only help your organization build in-demand technical skills but also can accelerate individual success. For more details on Plans and the growing Plans library, along with links to helpful videos and instructions on how to create, customize, and work through Plans, read the announcement post, Introducing Plans on Microsoft Learn.
Using Plans on Microsoft Learn, you can tailor content to your learners’ needs and track their progress.
Together, the Microsoft Learn for Organizations Playbook and Plans on Microsoft Learn can help your organization address immediate skills gaps while cultivating a resilient culture of continuous learning.
Learn more
Make the most of the Microsoft Learn for Organizations Playbook.
Explore Microsoft Learn for Organizations.
Read Introducing Plans on Microsoft Learn.
Discover Plans on Microsoft Learn.
Microsoft Tech Community – Latest Blogs –Read More
Securing your API Management service from day one with Defender for APIs
Introduction
We are excited to announce that you can now secure your Azure API Management (APIM) managed APIs from day one with Defender for APIs, which you can enable as soon as you create your APIM service in the Azure portal. Security for APIs is no longer an afterthought: API management administrators no longer need to leave the Azure API Management portal experience to turn on protection for their APIs, a critical entry point into the API attack surface.
Defender for APIs provides full lifecycle protection, detection, and response coverage. Defender for APIs includes unified visibility across your APIM Services within the Azure subscription, security insights with hardening recommendations, classification of sensitive data exposure, and continuous monitoring of APIs with machine learning and threat intelligence-based detections to alert against top OWASP API risks.
Enabling Defender for APIs from APIM instance creation experience in Azure portal
Step 1 – Create a new API Management Service
From the Azure Portal, select Create a resource. You can also select Create a resource on the Azure Home page.
On the Create a resource page, select Integration > API Management.
On the API Management services page, select Create.
Step 2 – Enable Defender for APIs
After filling out the information in the Basics tab, select the Monitor + secure tab. Select the Enable check box to enable the Defender for APIs plan. To enable the plan, you must have the proper role and permissions, which can be found here.
Note: Defender for APIs is enabled at the Azure subscription level and applies to all APIM services within that subscription.
Step 3 – Select Pricing plan
Finally, select the Choose a plan dropdown menu to choose the correct Defender plan for your environment.
Note: For detailed pricing information, click View all plans to see more details on each individual plan. After selecting your desired pricing plan, click Save. To estimate the right plan for you, see our documentation on checking your API Management traffic analytics and use the Defender for APIs cost estimator script, which helps you accurately estimate plan costs.
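For teams that provision subscriptions with infrastructure-as-code rather than the portal checkbox, the same subscription-level setting can in principle be expressed declaratively. The fragment below is an assumption-laden sketch, not content from this article: the resource name (`Api`), API version, and sub-plan value should all be verified against the `Microsoft.Security/pricings` reference before use.

```bicep
// Hypothetical sketch: enable the Defender for APIs plan at subscription
// scope. Resource name, API version, and subPlan are assumptions — confirm
// them in the Microsoft.Security/pricings documentation.
targetScope = 'subscription'

resource defenderForApis 'Microsoft.Security/pricings@2023-01-01' = {
  name: 'Api'
  properties: {
    pricingTier: 'Standard'
    subPlan: 'P1' // assumed entry-level tier; pick the plan sized to your API traffic
  }
}
```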
After completing the rest of the setup for your API Management Service, select the Review + Install tab and select Create after you validate all information is correct. Your APIs that are onboarded to that APIM Service will now be protected with the added security of Defender for APIs!
Note: All APIs must still be onboarded manually. Any new APIs that are added to your APIM Service after this action will still need to be manually onboarded to Defender for APIs.
Conclusion and More Resources
To learn more about Defender for APIs please visit Overview of the Microsoft Defender for APIs plan – Microsoft Defender for Cloud | Microsoft Learn. To provide feedback on this article visit https://aka.ms/MDCUserVoice
Reviewers
Ajinkya Gore, Senior Product Manager – Defender for APIs
Haris Sohail, Product Manager 2 – Defender for APIs
Preetham Anand Naik, Senior Product Manager – Defender for APIs
Yuri Diogenes, Principal PM Manager – CxE Defender for Cloud
A guide to Viva Connections 1st party dashboard cards
Viva Connections serves as a gateway to a modern employee experience, and dashboard cards are the building blocks of this experience. From streamlining tasks to enhancing communication, dashboard cards bring in actionable tasks and information from virtually anywhere into a single dynamic app for employees. Here’s a full list of first party dashboard cards released by Microsoft to help you boost productivity and foster a more connected and efficient work environment. If you’re new to dashboard cards or Viva Connections, we recommend you start with this overview before checking out the rest of this blog.
News card
Add the News card to the Viva Connections Dashboard to promote news from a variety of sources that you wish to prominently display, including boosted news from SharePoint. If you choose any news posts that have already been boosted, they will display in the News card for the duration of the boost period.
People card
The People Search card will automatically retrieve contact information from members of your organization using Microsoft Entra ID (formerly Azure Active Directory). Users can access the People Search card to look up contact information and can jump into chat, email, or a call with the contact directly from the card view.
Events card
Surface organizational events on the Viva Connections dashboard to spread awareness and reach more of the relevant audience for each event.
Praise card
Site owners/authors of Viva Connections can add the Praise card from Microsoft Viva Insights on the dashboard, allowing employees to send praise to their colleagues. The Praise card shows the same experience as the one available within Microsoft Viva Insights today.
Learning card
The Learning card in Viva Connections serves as a convenient entryway for employees to access their assigned learning resources and training materials. It encourages continuous learning and skill development by providing personalized training content and integrating seamlessly with Viva Learning— it can even show upcoming learning deadlines.
Pulse card
The Viva Pulse card in Viva Connections allows you to create Pulse survey(s) via the dashboard, giving employees a quick and targeted way to share feedback and participate in workplace surveys. Once the survey is complete, the owner can view the response rate right on the card.
• Availability – Coming CY24Q2
Approvals card
The Approvals card allows you to create new approvals, view incoming requests, streamline requests, and access a history of previous approvals—all conveniently in one place. You can select the card type (small, medium, or large) and even target it to specific audiences by adding users or groups in the property pane.
Shifts Card
The Shifts card shows users information about their next or current shift from the Shifts app in Teams. Built with hourly and frontline workers in mind, it allows employees to clock in and clock out and track break time directly from their employee app. Note: this requires Time clock to be enabled in Teams.
Assigned Tasks Card
The Assigned Tasks card automatically displays information to users about their assigned tasks. This information is retrieved from the Tasks app in Teams and enables users to quickly take action and accomplish tasks right from their dashboard.
Teams App card
With the Teams app card, you can create a card for an existing Teams app to drive traffic and create easy access for users. There are a variety of built-in Teams apps and integrations available to choose from.
Web link Card
Add a web link card to integrate any external websites seamlessly into your dashboard. It’s designed to provide quick access to important web resources, ensuring that relevant information is just a click away. The web link card can point to any internal or external URL, and can be a quick way to drive traffic without building or customizing a card.
Card Designer
You can choose the Card designer option to design your own card that includes a quick view, without needing to write code. The updated card designer allows for advanced dynamic content capabilities, including data pulled from the Microsoft Graph and SharePoint APIs.
To design your own cards, you should be familiar with JSON and Adaptive Card templates. There are also templates available to help you get started. For more information, see Adaptive Cards Templating.
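As a rough illustration of the kind of JSON the card designer consumes, a minimal Adaptive Card template might look like the following. The text, action, and URL are placeholder assumptions for illustration, not content from any shipping card:

```json
{
  "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
  "type": "AdaptiveCard",
  "version": "1.5",
  "body": [
    {
      "type": "TextBlock",
      "text": "Hello from the dashboard",
      "wrap": true
    }
  ],
  "actions": [
    {
      "type": "Action.OpenUrl",
      "title": "Open resource",
      "url": "https://example.com"
    }
  ]
}
```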
Get the most out of your Viva connections dashboard
The Viva Connections dashboard cards are more than just features; they represent a shift towards a more integrated and intuitive work environment where employees can access all of the tools, information, and tasks they need to do their jobs, all from a personalized employee app that can be targeted and edited for different employee needs and roles. The power of Viva Connections lies in its flexibility and user-centric design, so take the time to explore and experiment with these cards to discover the full potential of your dashboard.
Learn more about how to get started with Viva Connections with this quick video and learn more about designing your own cards with Adaptive Cards Templating.
Stay tuned as Microsoft and partners continue to build and release new cards. We’ll be sharing a follow up to this blog that highlights 3rd party and industry cards soon.
Spring I/O 2024 – Join Microsoft and Broadcom to Celebrate All Things Spring and Azure!
Get ready for the ultimate Spring conference in Barcelona from May 30-31! Connect with over 1200 attendees, enjoy 70 expert speakers, and engage in 60 talks and 7 hands-on workshops. Microsoft Azure (as a Gold Sponsor) and VMware Tanzu (as a Platinum sponsor) bring you in-depth sessions on Spring and AI development, a dynamic full-day workshop, and an interactive booth experience. Learn, network, and enhance your Java skills with the latest tools and frameworks. Don’t miss out on this exciting event!
Java on Azure
We are making it easier for Java developers to use cloud technology. Java is popular for its flexibility, and with Azure, developers can work efficiently with familiar tools and frameworks. This helps them deploy and scale their applications in the cloud smoothly.
We have our own Microsoft Build of OpenJDK, reinforcing our commitment to core development and community collaboration. Over 2.5 million Java developers use Visual Studio Code. Java is widely used on platforms like GitHub and Azure OpenAI, especially for AI development. We partner with companies like Broadcom, IBM, Oracle, and Red Hat to support and promote Java in the developer community.
We offer tools for Java developers, like GitHub Copilot for code suggestions and Azure OpenAI Service for integrating advanced AI. We also provide services like Azure Spring Apps, Azure App Service, Azure Container Apps, Azure Kubernetes Service, and Azure Red Hat OpenShift to offer a complete cloud experience for Java applications, from deployment to management and scaling.
What’s new with Spring applications on Azure
Savings Plan:
Starting January 1, 2024, Azure Spring Apps users can save up to 47% through an expanded Azure Savings Plan. This plan offers significant cost reductions for those committing to one or three-year terms.
Enhanced Reliability and Reduced Downtime:
Azure Spring Apps now offers a 99.95% SLA, reducing potential downtime to around 4.34 hours annually. This comprehensive assurance covers all the Azure resources used to build the service like Azure Kubernetes Service, Azure Storage, and Spring Cloud Gateway.
Increased Scalability:
The service can now support up to 1000 app instances, 8000 virtual CPUs, and 32 terabytes of memory per service instance. It also offers larger app instances with up to 8 vCPUs and 32GB of memory, plus more efficient build processes with increased resource allocation.
Bring Your Own Azure Container Registry:
Users can now bring their own Azure Container Registry (ACR), allowing seamless application deployment across different environments and regions.
Landing Zone Accelerator:
This tool helps establish secure, compliant, and scalable cloud environments for development, testing, and production within 15-30 minutes.
Extended Spring Boot Support:
Azure Spring Apps provides extended support for Spring Boot 2.x.x until August 2025, giving users more time to upgrade their applications.
Azure Migrate Support:
Azure Migrate now supports the discovery and assessment of Spring applications, making cloud transitions smoother and more efficient.
Java Native Image Support:
Support for Java native images is now available, providing faster startup times and optimized memory usage. This includes the ability to compile Java applications to standalone executables using GraalVM, benefiting from significant memory and performance improvements.
Session, Workshop and Booth Experience
At Spring I/O 2024, we’re showcasing Microsoft Azure’s commitment to the Spring community. We are offering a session filled with expert insights on using Spring and Azure together, a full-day workshop for hands-on learning with Azure for Spring Developers, and an interactive booth featuring demos, Q&A sessions, and exciting giveaways.
Making coding fun with Spring and Azure
Come see the ultimate way to combine Spring and Azure to make coding a blast! Each is amazing on its own; put them together and you get developers who are over the moon, making cool apps that people love. However, navigating the vastness of Spring and Azure can be daunting. Our talk is like a fast-forward button: we will show you the fastest ways to combine Spring and Azure, so your coding is quick, your tests are smooth, your apps can grow and be intelligent (AI), and your design is solid. You will watch quick demos that highlight the top tricks for using Spring with Azure. And you will not leave empty-handed: you will get a Git repo with the best Spring plus Azure setups, straight from the experts.
Speaker(s): Adib Saikali – Broadcom / Asir Selvasingh – Microsoft
Full-day Workshop on May 29: Azure for Spring Developers
Join us for “Azure for Spring Developers”, a dynamic full-day workshop designed specifically for developers, DevOps, and operations personnel looking to leverage Azure for deploying and scaling Spring applications. This workshop will guide you through the essentials of Azure, from setting up your environment and managing applications to deploying and monitoring Spring applications efficiently using Azure Spring Apps. Additionally, you’ll explore how to enhance your applications with AI capabilities using Azure OpenAI.
The workshop is hands-on—bring your laptop and a willingness to learn. We will provide Azure Subscription, sample code, and steps to learn and experiment. By the end of the day, you will know how to effectively deploy and scale Spring projects on Azure, making use of cloud resources to build more scalable and robust applications. Whether you are new to Azure or looking to deepen your existing skills, this workshop will equip you with the practical knowledge and hands-on experience to enhance your development capabilities.
Workshop instructors: Sandra Ahlgrimm – Microsoft / Neven Cvetkovic – Broadcom / David Caron – Broadcom / Monica Calleja – Microsoft
How to register: The workshop will take place the day before the conference at Hotel Catalonia Barcelona Plaza, Plaça d’Espanya, 6-8, 08014 Barcelona. You can register for the workshop separately and do not need to pay for the conference registration. For workshop registration, please visit: https://reg.springio.net/event/afsdt
Make sure to stop by at the Microsoft Azure booth
Watch live demos, exclusive insights, and your chance to score cool swag! Fill out a survey, participate in our booth talks, and interact with Microsoft and Broadcom experts. Ask insightful questions, and you might even walk away with a pair of Microsoft earbuds!
Make sure to stop by the Microsoft Azure booth to supercharge your Spring-Azure skills, troubleshoot your toughest challenges, and discover the latest innovations.
Mark your calendars for Spring I/O 2024 in beautiful Barcelona! Don’t miss the Microsoft and Broadcom sessions, workshop, and booth – it is your chance to accelerate your Spring development experience with Microsoft Azure and VMware Tanzu. We can’t wait to connect!
Resources
Quickstart: Deploy your first application to Azure Spring Apps
Azure Spring Apps product page
Deploy to Azure Spring Apps video tutorial
Azure Spring Apps integrated with landing zones architecture guide
Setting Up Slurm Cloud Bursting Using CycleCloud on Azure
Azure CycleCloud is an enterprise-friendly tool for orchestrating and managing High-Performance Computing (HPC) environments on Azure. With CycleCloud, users can provision infrastructure for HPC systems, deploy familiar HPC schedulers, and automatically scale the infrastructure to run jobs efficiently at any scale.
Slurm is a widely used open-source HPC scheduler that can manage workloads across clusters of compute nodes. Slurm can also be configured to interact with cloud resources, such as Azure CycleCloud, to dynamically add or remove nodes based on the demand of the jobs. This allows users to optimize their resource utilization and cost efficiency, as well as to access the scalability and flexibility of the cloud.
In this blog post, we discuss how to integrate an external Slurm scheduler with CycleCloud for cloud bursting (sending on-premises workloads to the cloud for processing) or hybrid HPC scenarios. For demonstration purposes, we create a Slurm scheduler node in Azure, acting as an external scheduler in one VNET, while the execute nodes run in CycleCloud in a separate VNET. We do not cover the networking complexities involved in hybrid scenarios.
Prerequisites
Before we start, we need to have the following items ready:
An Azure subscription
CycleCloud Version: 8.6.0-3223
OS version in Scheduler and execute nodes: Alma Linux release 8.7 (almalinux:almalinux-hpc:8_7-hpc-gen2:latest)
Slurm Version: 23.02.7-1
cyclecloud-slurm Project: 3.0.6
An external Slurm Scheduler node in Azure or on-premises. In this example, we are using an Azure VM running Alma Linux 8.7.
A network connection between the external Slurm Scheduler node and the CycleCloud cluster. You can use Azure Virtual Network peering, VPN gateway, ExpressRoute, or other methods to establish the connection. In this example, we are using a very basic network setup.
A shared file system between the external Slurm Scheduler node and the CycleCloud cluster. You can use Azure NetApp Files, Azure Files, NFS, or other methods to mount the same file system on both sides. In this example, we are using the Scheduler VM as an NFS server.
Steps
After we have the prerequisites ready, we can follow these steps to integrate the external Slurm Scheduler node with the CycleCloud cluster:
1. On CycleCloud VM:
Ensure CycleCloud 8.6 VM is running and accessible via cyclecloud CLI.
Clone this repository and import a cluster using the provided CycleCloud template (slurm-headless.txt).
We are importing a cluster named hpc1 using the slurm-headless.txt template.
git clone https://github.com/vinil-v/slurm-cloud-bursting-using-cyclecloud.git
cyclecloud import_cluster hpc1 -c Slurm-HL -f slurm-cloud-bursting-using-cyclecloud/templates/slurm-headless.txt
Output:
[vinil@cc86 ~]$ cyclecloud import_cluster hpc1 -c Slurm-HL -f slurm-cloud-bursting-using-cyclecloud/cyclecloud-template/slurm-headless.txt
Importing cluster Slurm-HL and creating cluster hpc1….
———-
hpc1 : off
———-
Resource group:
Cluster nodes:
Total nodes: 0
2. Preparing Scheduler VM:
Deploy a VM using the specified AlmaLinux image (If you have an existing Slurm Scheduler, you can skip this).
Run the Slurm scheduler installation script (slurm-scheduler-builder.sh) and provide the cluster name (hpc1) when prompted.
This script will install and configure Slurm Scheduler.
git clone https://github.com/vinil-v/slurm-cloud-bursting-using-cyclecloud.git
cd slurm-cloud-bursting-using-cyclecloud/scripts
sh slurm-scheduler-builder.sh
Output:
——————————————————————————————————————————
Building Slurm scheduler for cloud bursting with Azure CycleCloud
——————————————————————————————————————————
Enter Cluster Name: hpc1
——————————————————————————————————————————
Summary of entered details:
Cluster Name: hpc1
Scheduler Hostname: masternode2
NFSServer IP Address: 10.222.1.26
3. CycleCloud UI:
Access the CycleCloud UI, edit the hpc1 cluster settings, and configure VM SKUs and networking settings.
Enter the NFS server IP address for /sched and /shared mounts in the Network Attached Storage section.
Save & Start hpc1 cluster
4. On Slurm Scheduler Node:
Integrate External Slurm Scheduler with CycleCloud using the cyclecloud-integrator.sh script.
Provide CycleCloud details (username, password, and URL) when prompted. (Enter the details manually rather than copying and pasting; pasted text may contain stray whitespace that can break the connection.)
cd slurm-cloud-bursting-using-cyclecloud/scripts
sh cyclecloud-integrator.sh
Output:
[root@masternode2 scripts]# sh cyclecloud-integrator.sh
Please enter the CycleCloud details to integrate with the Slurm scheduler
Enter Cluster Name: hpc1
Enter CycleCloud Username: vinil
Enter CycleCloud Password:
Enter CycleCloud URL (e.g., https://10.222.1.19): https://10.222.1.19
——————————————————————————————————————————
Summary of entered details:
Cluster Name: hpc1
CycleCloud Username: vinil
CycleCloud URL: https://10.222.1.19
——————————————————————————————————————————
5. User and Group Setup:
Ensure consistent user and group IDs across all nodes.
It is better to use a centralized user management system, such as LDAP, to keep UIDs and GIDs consistent across all nodes.
In this example we are using the users.sh script to create a test user vinil and group for job submission. (User vinil exists in CycleCloud)
cd slurm-cloud-bursting-using-cyclecloud/scripts
sh users.sh
6. Testing & Job Submission:
Log in as a test user (vinil in this example) on the Scheduler node.
Submit a test job to verify the setup.
su - vinil
srun hostname &
Output:
[root@masternode2 scripts]# su - vinil
Last login: Tue May 14 04:54:51 UTC 2024 on pts/0
[vinil@masternode2 ~]$ srun hostname &
[1] 43448
[vinil@masternode2 ~]$ squeue
JOBID PARTITION NAME USER ST TIME NODES NODELIST(REASON)
1 hpc hostname vinil CF 0:04 1 hpc1-hpc-1
[vinil@masternode2 ~]$ hpc1-hpc-1
You will see a new node getting created in hpc1 cluster.
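Beyond the single srun smoke test, a batch script can exercise autoscaling across multiple bursted nodes. The sketch below uses the hpc partition shown in the squeue output above, but the node count and time limit are illustrative assumptions; the submission command is left commented so the sketch can be read without a live cluster:

```shell
# Write a minimal Slurm batch script that asks for two nodes in the 'hpc'
# partition; CycleCloud should start nodes on demand to satisfy the request.
cat > burst-test.sh <<'EOF'
#!/bin/bash
#SBATCH --job-name=burst-test
#SBATCH --partition=hpc
#SBATCH --nodes=2
#SBATCH --time=00:05:00
srun hostname
EOF

# Submit it from the scheduler node:
# sbatch burst-test.sh
# squeue    # watch nodes move from CF (configuring) to R (running)
```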
Congratulations! You have successfully set up Slurm bursting with CycleCloud on Azure.
Conclusion
In this blog post, we have shown how to integrate an external Slurm Scheduler node with Azure CycleCloud for cloud bursting or hybrid HPC scenarios. This enables users to leverage the power and flexibility of the cloud for their HPC workloads, while maintaining their existing Slurm workflows and tools. We hope this guide helps you to get started with your HPC journey on Azure.
Reference:
GitHub repo – slurm-cloud-bursting-using-cyclecloud
Azure CycleCloud Documentation
I am using a UV sensor and want the real-time data to be exported to MATLAB continuously. I need the continuous changes to be recorded as they happen
I want to work on the data that I get from the sensor, but I couldn't find a way to save the COM3 output to a CSV file on the newer system. uv sensor MATLAB Answers — New Questions
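A minimal sketch of one possible approach, assuming a device on COM3 at 9600 baud that streams one numeric reading per line (the port, baud rate, and line format are assumptions about the unnamed sensor, and the serialport/writematrix workflow requires a recent MATLAB release):

```matlab
% Sketch: log numeric lines from a serial sensor on COM3 to a CSV file.
% Assumes the device sends one reading per line, terminated by LF.
s = serialport("COM3", 9600);          % adjust port and baud to your sensor
configureTerminator(s, "LF");
logFile = "uv_readings.csv";

for k = 1:100                          % read 100 samples; loop longer if needed
    rawLine = readline(s);
    value = str2double(rawLine);
    if ~isnan(value)
        % Append a timestamp and the reading to the CSV as data arrives.
        writematrix([posixtime(datetime("now")), value], logFile, ...
            "WriteMode", "append");
    end
end
clear s                                % release the serial port
```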
Error during implementation of wavelet-based denoising using a hybrid approach of adaptive filtering and singular value decomposition
I want to denoise the signal using a hybrid approach of adaptive filtering and singular value decomposition (SVD). I have written the following code, but it's showing an error at line 32:
S_thresh = diag(S - threshold*(S > threshold));
Following is the code and the signal image with noise:
clear all
clc
load("Signal")
% Load the noisy signal or image
noisy_signal = C;
% Set parameters
wavelet = 'db2'; % Choose wavelet type
level = 5; % Decomposition level
threshold_type = 's'; % 's' for soft thresholding, 'h' for hard thresholding
threshold_multiplier = median(abs(C)) / 0.6745; % Threshold multiplier for adaptive thresholding
% Perform wavelet decomposition
[c, l] = wavedec2(noisy_signal, level, wavelet);
% Initialize denoised coefficients
denoised_c = cell(1, level + 1);
% Loop through each detail coefficient subband for denoising
for i = 1:level
% Extract detail coefficients for current subband
cd = detcoef2('all', c, l, i);
% Apply adaptive filtering
denoised_cd = zeros(size(cd));
for j = 1:size(cd, 1)
% Apply adaptive thresholding using singular value decomposition
[U, S, V] = svd(cd(j, :));
if threshold_type == 's'
threshold = threshold_multiplier * median(diag(S));
S_thresh = diag(S - threshold*(S > threshold));
elseif threshold_type == 'h'
S_thresh = diag(S.*(abs(S) > threshold_multiplier * median(diag(S))));
end
denoised_cd(j, :) = U * S_thresh * V';
end
% Update denoised coefficients
denoised_c{i} = denoised_cd;
end
% Keep approximation coefficients as is
denoised_c{level + 1} = appcoef2(c, l, wavelet, level);
% Reconstruct the denoised signal or image
denoised_signal = waverec2(denoised_c, l, wavelet);
% Display the results
figure;
subplot(1, 2, 1);
imshow(noisy_signal);
title('Noisy Signal');
subplot(1, 2, 2);
imshow(uint8(denoised_signal));
title('Denoised Signal'); wavelet denoising, partial discharge signal, singular value decomposition (SVD), adaptive filtering MATLAB Answers — New Questions
trucksim-simulink co-simulation
I use Simulink and TruckSim for co-simulation. Sometimes it works well, but sometimes it keeps reporting errors indicating that the model in TruckSim is unstable. However, running the simulation alone in TruckSim proceeds normally. co-simulation, simulink, trucksim MATLAB Answers — New Questions
Shortcut to clean filters
Does anyone know if there is a keyboard shortcut to clear filters?
I just want to clear the filter, not remove the entire filter.