Category: News
pull data from list
Hi everyone
I’m very much a beginner with Excel and I’m in need of some help.
I’m currently using an Excel sheet for work that pulls data from various other Excel sheets, and I wanted to create a bit of a summary sheet. I currently have quite a long list of locations, and using XLOOKUP it pulls which item code is in each location and whether there are orders for it today, as below:
I want it to show me all the “Empty” locations and all the “no order for today” locations. I know I can use filter, but I want to see both lists side by side, and I will be adding some more in future too.
I’ve tried using =FILTER but can’t seem to get it to work.
Any help will be massively appreciated.
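Not part of the original thread, but the underlying idea — splitting one status column into two side-by-side lists — can be sketched language-neutrally in Python (the location names and statuses are made-up sample values):

```python
# Hypothetical data: (location, status) pairs such as XLOOKUP might return.
locations = [
    ("A-01", "Empty"),
    ("A-02", "ITEM-123"),
    ("A-03", "no order for today"),
    ("A-04", "Empty"),
]

# Two independent filters, analogous to two Excel FILTER formulas
# placed in adjacent columns, e.g. =FILTER(A:A, B:B="Empty").
empty_locations = [loc for loc, status in locations if status == "Empty"]
no_order_locations = [loc for loc, status in locations if status == "no order for today"]
```

In Excel itself, two separate FILTER formulas in adjacent columns give the side-by-side view, and each spilled list grows independently as rows are added.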
Unable to print on macOS 15 beta
Universal Print has stopped working on macOS 15 beta, and I’m wondering if anyone else has encountered this or knows of a workaround.
The job enters the print queue but it essentially says “Printer is busy”, even though the printer is sleeping, and the progress bar continually swipes back and forth. Trying to use ipptool results in “GETTOKEN”, and cupsError() returns “Unauthorized”.
I’ve logged in to the UP preference pane; I have permission to print to the printer, and it’s shared to the Mac, checked via the portal. I’ve deleted and re-added the printer via the UP preference pane, and the Printers & Scanners panel shows the printer as idle. In all cases, the Mac just won’t communicate with the server to submit the job.
None of the troubleshooting steps are relevant or helpful. Does anyone have an idea of what to try next?
Thanks,
Dave
Does the Storage Size shown inside SharePoint online admin center cover both SharePoint and OneDrive
When we access the SharePoint Online admin center, it shows the available storage out of the total size.
So do the available size and the total size cover both SharePoint sites and OneDrive, or only SharePoint? And are Exchange mailboxes also counted here?
Copilot, Sharepoint Online, and LLM Training Data
When one creates a Copilot for an SPO site, what data becomes part of the model’s training? Are prompts used by Microsoft in training the model? Is any part of the site data or the AI’s responses used in training the model? This is a critical issue for any organization with protected data, like law firms, health care delivery orgs, etc.
File locations in SPO subsites vs. Teams group sites
Current practice: Org is a law firm with one tenant that has multiple subsites and some secondary subsites below those. Subsites are topical; one contains active clients, and each of those has a subsite. Matters are folders within the subsites. Collaboration with externals is either by invitation per file or by access via Entra groups that include the necessary externals. This allows easy syncing via the OneDrive client, easy searching, and easy archiving.
Objective: To use Teams for internals and externals on cases requiring a lot of collaboration.
Problem: If one sets up a team in Teams with channels for cases, unless something has changed, those files are kept in a dedicated SPO site located hierarchically directly under the tenant. That interrupts the current file management system. Is there a way to use Teams and channels with externals while keeping the files in their current SPO subsites and folders? Or am I envisioning incorrectly how this would work?
How to create a component reference in the system composer from a simscape component
Hello,
I have a Simscape Isothermal Fluid component that is an isolated component (it has its own test harness, test cases, and requirements). You can see it is composed of Simscape signals and Simulink signals.
I would like to build a System Composer architecture that this component is part of. But when I add a "reference component" in System Composer and link to the component, the physical lines do not come through.
The only way I made it work is by creating the Simscape component inside System Composer, but that way I need to manage changes and so on inside the composer.
Any solution to create a Simscape component that can be referenced inside System Composer and have the physical connections available?
simscape, system composer MATLAB Answers — New Questions
In Simulink, How to decode CAN data with length greater than 8?
There is an issue using CAN Unpack to decode a CAN message with a length greater than 8 bytes. Is there any other recommended way to decode CAN data? For a start, I have used Bit Extract and bitwise AND / Shift Arithmetic blocks to extract bits in Simulink. I’m not sure whether a MATLAB Function block in Simulink could also handle this, if it’s possible.
data, control MATLAB Answers — New Questions
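As an illustration of the bit-extraction approach the question describes (a sketch, not a Simulink-specific answer; the function name and the little-endian bit-numbering convention are assumptions), a signal spanning any number of bytes can be pulled out of a long payload by treating the whole payload as one integer:

```python
def extract_signal(payload: bytes, start_bit: int, length: int) -> int:
    """Extract an unsigned, little-endian (Intel) signal of `length` bits
    starting at `start_bit` from a CAN payload of any length,
    e.g. a 64-byte CAN FD frame."""
    value = int.from_bytes(payload, "little")  # whole payload as one integer
    return (value >> start_bit) & ((1 << length) - 1)

# Example: a 10-byte payload, longer than the classic 8-byte limit.
payload = bytes([0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77, 0x88, 0x99, 0xAA])
```

Big-endian (Motorola) signals need a different bit-numbering convention, but the same shift-and-mask logic can be written inside a MATLAB Function block.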
Why do I get an invalid Python executable path error when running the “pyenv” function?
Previously I was able to use the "pyenv" function in MATLAB with my Python environment. However, now when I try to use the "pyenv" function, I get the following error message. Why am I suddenly receiving this error?
Error using pyenv
‘<path>’ is not a path to a valid Python executable.
python, pyenv MATLAB Answers — New Questions
Finding the closest coordinate from a surface plot based on a X, Y location
Hello, I want to extrapolate a point (longitude, latitude) from the coordinates of a surface file (attached here as "slab_strike"), as the result is empty (Strike = NaN) when using interp2 because the points are outside the boundary. Despite using the "nearest" option, it is empty anyway.
% Coordinates of the points:
lat_GMM= -17.8990;
lon_GMM=-73.5295;
% The surface plot
load slab_strike % Loading the slab strike
Slab_strike.x=x
Slab_strike.y=y
Slab_strike.z=z
Strike = interp2(Slab_strike.x,Slab_strike.y,Slab_strike.z,lon_GMM,lat_GMM)
As Strike = NaN, is there a way I can choose the closest point value from the surface instead?
I would appreciate the help.
selecting the closest point reference MATLAB Answers — New Questions
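The usual fallback when interp2 returns NaN outside the grid is a brute-force nearest-neighbour lookup over the valid points. A minimal sketch of that idea in Python (illustrative only; in MATLAB the same thing is a min over (x - lon_GMM).^2 + (y - lat_GMM).^2 restricted to the non-NaN points):

```python
import math

def nearest_value(xs, ys, zs, x0, y0):
    """Return the z at the grid point closest to (x0, y0), skipping NaNs."""
    best_z, best_d = float("nan"), math.inf
    for x, y, z in zip(xs, ys, zs):
        if math.isnan(z):
            continue  # ignore empty (NaN) surface points
        d = math.hypot(x - x0, y - y0)
        if d < best_d:
            best_d, best_z = d, z
    return best_z
```

Note that for geographic coordinates a plane distance is only an approximation; over small regions it is usually fine for picking the closest grid node.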
In Outlook how to create common event Categorize for the whole organization
How to create event categories for calendar events to be selected by the user when creating a new event. Example:
Needs Analysis, Demo, Contract Renewal
Powershell masking password
Hello Everyone,
I have the script (API POST) below which is working fine.
$UserPass = "username:password"
[string]$stringToEncode = $UserPass
$encodedString = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($stringToEncode))
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("content-type", "application/json")
$headers.Add("accept", "application/json")
$headers.Add("Authorization", "Basic " + $encodedString)
Invoke-RestMethod -Uri "https://api_base_url/session" -Headers $headers -Method POST
I want to mask the password instead of using plain text, so I modified it as below, which prompts for the masked password.
$UserPass = "username:$(Read-Host -Prompt 'Please type the password' -AsSecureString)"
But the whole script is not working anymore. When I troubleshoot, there is a difference in the encoding/decoding. I ran an online Base64 encode and decode, and the results are different.
Plain text password – username:password
dXNlcm5hbWU6cGFzc3dvcmQ= —> username:password
Masked password – username:$(Read-Host -Prompt “Please type the password” -AsSecureString)
dXNlcm5hbWU6U3lzdGVtLlNlY3VyaXR5LlNlY3VyZVN0cmluZw== —> username:System.Security.SecureString
How can I mask the password but still be able to read the System.Security.SecureString as the actual password?
Thank you in advance.
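The encoding step itself is language-agnostic. A hedged Python sketch of the same flow shows why the Base64 input must be the decoded plain-text password, not the SecureString object's type name:

```python
import base64
import getpass  # getpass.getpass() reads a password without echoing it

def basic_auth_header(username: str, password: str) -> str:
    """Build an HTTP Basic Authorization header from plain-text credentials."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return "Basic " + token

# Interactive use (masked input):
#   header = basic_auth_header("username", getpass.getpass("Please type the password: "))
```

In PowerShell the equivalent fix is to convert the SecureString back to plain text before encoding — for example by wrapping it in a PSCredential and reading GetNetworkCredential().Password — so the encoded string contains the real password rather than the literal "System.Security.SecureString".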
Expose Prompt Flow as API
Hi,
We have a few POCs implemented with Azure AI Prompt Flow.
Is there any way to expose a Prompt Flow as an API and have it called by an external web app (e.g., one developed with React)?
I searched the Internet and got no promising results.
Thanks
365 CoPilot for Outlook using shared calendars
Is there any update coming where I can use 365 copilot to look across multiple shared calendars to schedule a meeting. This seems like it would be a prime use of CoPilot.
Cannot switch users in microsoft to-do
Because multiple services use Microsoft, I have to log in with multiple Microsoft accounts. When going to the To Do webpage, it says my account isn’t on Exchange Online. It isn’t, because it is an admin login, and there is no way to switch to the other account that has To Do.
API: Anonymous user can start meeting
Dear community.
This is similar to this request: https://techcommunity.microsoft.com/t5/microsoft-teams/allowing-guests-to-start-a-meeting-without-the-organizer-present/m-p/2080397/page/2 , which recommends setting this at the account level.
Is it possible to enable this setting via the API just for a specific call? I was unsuccessful in finding this in the documentation.
Thank you for any help!
Timeline Not Matching Other Filters
Hello all,
I tried to find a similar discussion on here but have not seen one. I currently have a pivot table with a few slicers and one timeline filter. One of the filters is a “Six Month” filter, which shows items that are due for review within the next six months. This filter comes from a column in the original table that shows either “Six Months Review” or “Do Not Need to be Reviewed” and is based on a formula over a date column. My issue is, when I filter on “Six Months Review”, the date timeline doesn’t filter to those six months; it still shows all the dates present in the table. Is there a way for the Timeline filter to show data based on the other filters as well?
Would appreciate any insight at all, thank you!
selecting certain cells to search for a name if the condition for the row exists.
Hi there, I am a basic Excel user and have hit a major hurdle creating a summary sheet of occurrences for people.
I have 31 sheets which are identical, as below. If any row’s column E contains “Liv”, I need a summary sheet to check the name cells further along that row to see if the person’s name exists. If so, I need this to be added together across all the sheets for a monthly figure.
Hope this explains what I mean.
I use the COUNTIF function for other set areas in the sheet, and that works fine, as it does not require any specific condition to be met before checking.
Any help on this would be greatly appreciated. Thanks in advance, Mike.
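A sketch of the counting logic being asked for (illustrative only — the row layout and sample names are assumptions), mirroring a conditional count across daily sheets:

```python
def monthly_count(sheets, person):
    """Count, across all sheets, rows where column E is 'Liv' and the
    person's name appears among the name cells further along that row."""
    total = 0
    for rows in sheets:            # one entry per daily sheet
        for col_e, names in rows:  # row = (column E value, [name cells...])
            if col_e == "Liv" and person in names:
                total += 1
    return total

# Two hypothetical daily sheets:
sheets = [
    [("Liv", ["Mike", "Anna"]), ("Hold", ["Mike"])],
    [("Liv", ["Mike"])],
]
```

In Excel itself this is typically a COUNTIFS per sheet (one condition on column E, one on the name range) summed over the list of sheet names.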
Revolutionizing log collection with Azure Monitor Agent
With the deprecation of the Log Analytics agent (also called MMA or OMS), it’s a great opportunity to discuss its successor, the Azure Monitor Agent, or AMA for short, and why it is so much better and keeps improving!
AMA is a lightweight log collection agent, designed to consume as few resources as possible when collecting metrics and logs from your server. It can be installed on various flavors and OS versions of both Linux and Windows machines hosted in Azure, on-premises, or in any other cloud environment. When installed on non-Azure machines, AMA requires the installation of the Azure Arc agentry to provide mirroring and centralized cloud management capabilities for your machine.
When associated with a Microsoft Sentinel workspace, all logs collected from AMA-installed machines are sent to the various Microsoft Sentinel tables, depending on the source type from which they were collected (Windows DNS, Windows security events, firewall, IIS, Syslog, CEF, etc.).
AMA is controlled using Data Collection Rules (DCRs), enabling you to define where to collect the logs from, what data manipulations to perform with KQL transformations (enabling filtering, parsing, enrichment, and more), and where to send the logs, whether that be a workspace, Event Hubs (for Azure VMs only), the Auxiliary tier, and so on. You can group machines by using different DCRs.
DCRs for AMA can be created in multiple ways:
Through the Portals (Azure or Unified Security Operations Platform): This method provides a user-friendly interface for defining data collection scenarios:
Configuring an AMA-based connector in Microsoft Sentinel > Configuration > Data collection (in either the Azure or security portal) will create the DCR for you. Through this option, data is directed to Microsoft Sentinel tables, some of which are not accessible when defining the DCR in the Azure Monitor portal (e.g., CommonSecurityLog, SecurityEvent, WindowsEvent, and ASIM tables).
Creating and editing the DCRs through Azure Monitor by browsing to Azure portal > Azure Monitor > Settings > Data Collection Rules. Note: DCRs created through this UI do not have access to some Microsoft Sentinel tables and require editing the DCR’s outputStream to divert the data to Microsoft Sentinel’s tables.
Azure Resource Manager (ARM) Templates: ARM templates allow you to define DCRs as code, enabling you to deploy and manage them programmatically. This is useful for automation and consistency across environments.
Azure CLI and PowerShell: These command-line tools provide another way to create and manage DCRs programmatically. They are particularly useful for scripting and automation.
REST API: The Azure Monitor REST API allows you to create, update, and manage DCRs programmatically. This method offers the most flexibility and integration with other systems.
PowerShell: Use Azure cmdlets to create, edit, and deploy DCRs.
So why do we like AMA better?
Ah! Easy:
AMA is more performant than the legacy agent, reaching a 25% increase in performance compared to the Linux OMS agent, and 500% better EPS than the MMA for Windows!
It’s centrally configured through the cloud via the DCRs, enabling grouping of machines. Using Azure policies enables scaled deployments, upgrades, and configuration changes over time. Any change in the DCR configuration is automatically rolled out to all agents without the need for further action on the client side.
AMA supports multi-homing, sending events to multiple workspaces, cross-region and cross-tenant (with Azure Lighthouse).
Most importantly, AMA is more secure, connecting to the cloud using Managed Identity and Microsoft Entra tokens. For non-Azure machines, Arc enhances the security of the connection by enforcing and handling authentication and identity tokens.
The greatest thing is that AMA keeps evolving through multiple enhancements and improvements we’re constantly working on!
Next, we’ll cover a few noticeable changes to connectors.
Windows Security Events:
We’ve enhanced the schema of the SecurityEvent table, which hosts Windows security events, and have added new columns that AMA version 1.28.2 and up will populate. These enhancements are designed to provide better coverage and visibility of the events collected.
New columns added are:
Keywords: A 64-bit mask of keywords defined in the event. Keywords classify types of events (e.g., events associated with reading data), with the leftmost two bits representing audit success or failure.
Opcode: An operation code identifying the location in the application from which the event was logged, together with Task.
EventRecordId: The record number assigned to the event when it was logged.
SystemThreadId: The thread that generated the event.
SystemProcessId: The process that generated the event.
Correlation: Activity identifiers that consumers can use to group related events together.
Version: The version number of the event’s definition.
SystemUserId: The ID of the user responsible for the event.
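As an illustration of the Keywords description above (a sketch, not AMA code; the constants are the standard Windows audit keyword bits, included here as an assumption worth verifying against your own events):

```python
# Audit bits of the 64-bit Keywords mask (assumed standard winmeta values).
AUDIT_FAILURE_BIT = 0x0010000000000000
AUDIT_SUCCESS_BIT = 0x0020000000000000

def audit_outcome(keywords: int) -> str:
    """Classify a SecurityEvent Keywords mask by its audit bits."""
    if keywords & AUDIT_SUCCESS_BIT:
        return "Audit Success"
    if keywords & AUDIT_FAILURE_BIT:
        return "Audit Failure"
    return "Not an audit event"
```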
Common Event Format (CEF) and Syslog:
We all know how important it is to collect and analyze data from various sources, such as firewalls, routers, switches, servers, DNS and applications. Two of the most common protocols used by many devices to emit their logs are CEF and Syslog.
With the legacy agent you had to configure a connector for each source separately, which could be tedious and time-consuming. That’s why we are excited to announce updates to the Syslog and CEF data connectors via AMA, which will improve your overall experience with Microsoft Sentinel data connectors. All devices will now depend on either the generic CEF or the generic Syslog connector, based on the protocol the log source uses. The relevant generic connector is deployed as part of the device solution (don’t forget to check the box to select it for installation after you click the ‘install with dependencies’ button!).
To monitor the ingestion of your logs from the separate device types with graphs, we’ve added a dedicated workbook, installed with the solution, where device types are aggregated in a single location. You can further filter the view by device type or connectivity status.
To help you set up the source device to stream the logs, we’ve included instructions or relevant references for many common CEF or Syslog appliances in our documentation.
Windows Events:
What happens if you wish to collect other Windows audit events? You cannot send them to the SecurityEvent table, as those events are not from the Security channel and do not match that table’s schema. Instead, the non-security events can be directed to the WindowsEvents table using the Windows Forwarded Events data connector, which can stream both forwarded events collected from a WEC/WEF server and events from the Windows server itself, by setting the DCR wizard to the Custom option and specifying an XPath expression pointing to the desired events.
Windows Firewall Logs:
This connector enables the collection of the machine’s Windows Firewall logs. We’ve added a granular selection of the profile from which to collect and stream logs to the ASimNetworkSessionLogs table.
Custom Logs:
Some appliances packaged in Content Hub solutions stream data to _CL tables. For those 15 specific devices, and to enable quick setup of file collection, we’ve added the Custom Logs connector.
We hope this post was informative and that you have already upgraded your agents to AMA, or plan to do so shortly. For more information on other connectors, agent-based or otherwise, refer to our Data connectors documentation or browse the Content Hub to locate your source of interest. If you would like more content about using AMA, please let us know in the comments below!
Lastly, to stay current with the latest updates and announcements stay tuned to our What’s new page.
Inside Look: How Azure Kubernetes Service supports OSSKU Migration between Ubuntu and Azure Linux
Microsoft recently announced the general availability of OSSKU Migration in Azure Kubernetes Service (AKS). This new feature enables users to take an existing AKS node pool and update its OSSKU for an in-place move between Ubuntu and Azure Linux. Previously, when OSSKU was immutable, users had to create new node pools and explicitly drain their workloads into them, which was labor-intensive and required additional VM quota.
In this blog post we will dive into how to use this feature, the tech stack that supports it, and some considerations to make sure the upgrade goes smoothly.
Using OSSKU Migration
OSSKU Migration is supported by az-cli, ARM/Bicep templates, and Terraform. All three options put the affected node pools into the Upgrading state, which takes several minutes to resolve. During this time your cluster scales up depending on your max surge setting; then your pods are drained and scheduled onto other VMs in your node pool or cluster.
If you are using az-cli your version must be 2.61.0 or higher. To trigger a migration with az-cli run the following command on your node pool.
az aks nodepool update --resource-group myResourceGroup --cluster-name myAKSCluster --name myNodePool --os-sku AzureLinux
If you are using ARM/Bicep templates you must update your apiVersion to 2023-07-01 or newer. Then update the ossku field in your agentPoolProfile section to “AzureLinux” and redeploy your template.
If you are using Terraform your azurerm provider version must be v3.111.0 or higher. Then update the os_sku field of your node pools to “AzureLinux” and redeploy your Terraform plan.
How it Works
When you send a request to AKS (1) and it notices that your node pool’s OSSKU value has changed, it performs some additional validation to make sure that the change is allowed:
OSSKU Migration cannot change node pool names.
Only Ubuntu and AzureLinux are supported OSSKU targets.
Ubuntu node pools with UseGPUDedicatedVHD enabled cannot change OSSKU.
Ubuntu node pools with CVM 20.04 enabled cannot change OSSKU.
AzureLinux node pools with Kata enabled cannot change OSSKU.
Windows node pools cannot change OSSKU.
If all these conditions pass, then AKS puts the node pool into the upgrading state and picks the latest available image for your new chosen OSSKU. It will then follow the exact same flow as a node image upgrade, scaling the node pool up based on your max surge value (2), then replacing the image on your existing VMs one by one until each node is on the latest image for your new chosen OSSKU (3). Once all your VMs are upgraded to the new image, AKS then removes the surge nodes and signals back to the caller that the upgrade is complete (4).
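The validation checks above can be sketched as a single function (illustrative only — the field names are assumptions, not the actual AKS API surface):

```python
def validate_ossku_migration(pool: dict, target_ossku: str) -> list[str]:
    """Return the reasons an OSSKU migration would be rejected (empty if allowed)."""
    errors = []
    if target_ossku not in ("Ubuntu", "AzureLinux"):
        errors.append("only Ubuntu and AzureLinux are supported OSSKU targets")
    if pool.get("os_type") == "Windows":
        errors.append("Windows node pools cannot change OSSKU")
    if pool.get("ossku") == "Ubuntu" and pool.get("use_gpu_dedicated_vhd"):
        errors.append("UseGPUDedicatedVHD node pools cannot change OSSKU")
    if pool.get("ossku") == "Ubuntu" and pool.get("cvm_20_04"):
        errors.append("CVM 20.04 node pools cannot change OSSKU")
    if pool.get("ossku") == "AzureLinux" and pool.get("kata"):
        errors.append("Kata-enabled node pools cannot change OSSKU")
    return errors
```

An empty result corresponds to AKS proceeding with the node-image-upgrade flow described above.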
Things to Consider
Before running an OSSKU Migration in your production clusters, there are two very important things to check:
Deploy a node pool of your new target OSSKU into both development and production environments to confirm that everything works as expected on your new OSSKU before performing the migration on the rest of your node pools.
Ensure that your workload has sufficient Pod Disruption Budget to allow AKS to move pods between VMs during the upgrade. This is necessary for OSSKU migration and any AKS node image upgrade to safely move workloads around your cluster while nodes are restarting. For information on troubleshooting PDB failures during upgrade see this documentation.
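For the second point, a minimal PodDisruptionBudget might look like the following sketch. The app label and replica threshold are illustrative assumptions; the intent is simply that enough replicas stay available while AKS drains each node.

```yaml
# Illustrative PDB: keeps at least 2 replicas of a hypothetical "myapp"
# deployment available while nodes are drained during the migration.
apiVersion: policy/v1
kind: PodDisruptionBudget
metadata:
  name: myapp-pdb
spec:
  minAvailable: 2
  selector:
    matchLabels:
      app: myapp
```

Without a PDB (or with one that is too strict to ever be satisfied), node drains during the migration can either evict all replicas at once or stall the upgrade.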
Conclusion
Throughout public preview, multiple teams within Microsoft have utilized OSSKU Migration to seamlessly move their workloads over to the Azure Linux OSSKU without large surge capacity and without the need for manual intervention within their clusters. We’re looking forward to more users experiencing how easy it is now to update the OSSKU on an existing AKS node pool.
Microsoft Tech Community – Latest Blogs
Issue connecting Simulink to Arduino Nano 33 IoT
I’m having an issue connecting my Arduino Nano 33 IoT to Simulink. When I try to run it, I get the error message: "Unrecognized field name 'HostCOMPort'". I’m unsure how to resolve this and would greatly appreciate the help!
The device shows up in my device manager and I can run code in the Arduino IDE and MATLAB without issues. It’s only when I use Simulink that there is a problem. I’m using the cable that came in the Engineering Rev 2 kit from Arduino, as well as a USB to USB-C converter, to connect my Surface Go to the Nano. I am able to run a regular Arduino Uno using the same setup through Simulink.
I’ve noticed that every time I try to run the Simulink program, the port in the device manager changes from "Arduino Nano 33 IoT" to the bootloader version, and MATLAB tells me that an Arduino device is detected again. My guess is that the Nano is restarting every time or something? I’m not really sure.
MATLAB Answers — New Questions