Category: News
PowerShell: masking a password
Hello Everyone,
I have the script (API POST) below which is working fine.
$UserPass = "username:password"
[string]$stringToEncode = $UserPass
$encodedString = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($stringToEncode))
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("content-type", "application/json")
$headers.Add("accept", "application/json")
$headers.Add("Authorization", "Basic " + $encodedString)
Invoke-RestMethod -Uri "https://api_base_url/session" -Headers $headers -Method POST
I want to mask the password instead of typing it in plain text, so I modified the script as below, which prompts for the password with masked input.
$UserPass = "username:$(Read-Host -Prompt 'Please type the password' -AsSecureString)"
But the script no longer works. When I troubleshot it, I found a difference in the encoding/decoding: running the values through an online Base64 encoder/decoder gives different results.
Plain text password – username:password
dXNlcm5hbWU6cGFzc3dvcmQ= —> username:password
Masked password – username:$(Read-Host -Prompt 'Please type the password' -AsSecureString)
dXNlcm5hbWU6U3lzdGVtLlNlY3VyaXR5LlNlY3VyZVN0cmluZw== —> username:System.Security.SecureString
How can I mask the password but still read the actual password back out of the System.Security.SecureString?
Thank you in advance.
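For anyone comparing notes: a SecureString interpolated into a string renders as its type name rather than the secret, so the header ends up encoding the literal text. A minimal sketch of one common fix, converting the SecureString back to plain text just before encoding (standard .NET Marshal calls; this is illustrative, not the original script):
$secure = Read-Host -Prompt "Please type the password" -AsSecureString
# Decode the SecureString to plain text only at the moment it is needed
$bstr = [Runtime.InteropServices.Marshal]::SecureStringToBSTR($secure)
try {
    $plain = [Runtime.InteropServices.Marshal]::PtrToStringBSTR($bstr)
}
finally {
    # Zero and free the unmanaged copy as soon as possible
    [Runtime.InteropServices.Marshal]::ZeroFreeBSTR($bstr)
}
$UserPass = "username:$plain"
$encodedString = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($UserPass))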
Expose Prompt Flow as API
Hi,
We have a few POCs implemented with Azure AI Prompt Flow.
Is there any way to expose Prompt Flow as an API and have it called by an external web app (for instance, one developed with React)?
I searched the Internet and got no promising results.
Thanks
365 CoPilot for Outlook using shared calendars
Is there any update coming where I can use Microsoft 365 Copilot to look across multiple shared calendars to schedule a meeting? This seems like it would be a prime use of Copilot.
Cannot switch users in Microsoft To Do
Because multiple services use Microsoft accounts, I have to log in with several of them. When I go to the To Do web page, it says my account isn't on Exchange Online. It isn't, because it is an admin login, and there is no way to switch to the other account that does have To Do.
API: Anonymous user can start meeting
Dear community.
This is similar to this request: https://techcommunity.microsoft.com/t5/microsoft-teams/allowing-guests-to-start-a-meeting-without-the-organizer-present/m-p/2080397/page/2, which recommends setting this at the account level.
Is it possible to enable this setting via API just for a specific call? I was unsuccessful in finding this in the documentation.
Thank you for any help!
Timeline Not Matching Other Filters
Hello all,
I tried to find a similar topic discussion on here but have not seen one. I currently have a pivot table with a few slicers and one timeline filter. One of the slicers is a “Six Month” filter, which shows items that are due for review within the next 6 months. This slicer is driven by a column on the original table that shows either “Six Months Review” or “Do Not Need to be Reviewed”, based on a formula over a date column. My issue is that when I filter on “Six Months Review”, the timeline doesn't narrow to those 6 months; it still shows all the dates present in the table. Is there a way for the timeline filter to reflect the other filters as well?
Would appreciate any insight at all, thank you!
Selecting certain cells to search for a name if a condition for the row exists
Hi there, I am a basic Excel user and have hit a major hurdle creating a summary sheet of occurrences for people.
I have 31 sheets which are identical, as below. If column E in any row contains “Liv”, I need the summary sheet to check the name cells further along that row to see if the person's name exists. If so, I need these occurrences added together across all the sheets for a monthly figure.
Hope this explains what I mean.
I use the COUNTIF function for other set areas in the sheet and that works fine, as it does not require any specific condition to be met before checking.
Any help on this would be greatly appreciated. Thanks in advance, Mike.
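One pattern that may help (a sketch under assumptions: the 31 sheet names listed in H1:H31 of the summary sheet, the condition in column E, the person's name in column G of each sheet, and the person to count in A2 — adjust to your actual layout):
=SUMPRODUCT(COUNTIFS(INDIRECT("'"&$H$1:$H$31&"'!E:E"), "*Liv*", INDIRECT("'"&$H$1:$H$31&"'!G:G"), $A2))
INDIRECT builds one range per sheet name, COUNTIFS returns one count per sheet, and SUMPRODUCT adds them up for the monthly figure.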
Revolutionizing log collection with Azure Monitor Agent
With the deprecation of the Log Analytics agent (also called MMA or OMS), it's a great opportunity to discuss its successor, the Azure Monitor Agent (AMA), and why it is so much better and keeps improving!
AMA is a lightweight log collection agent, designed to consume as few resources as possible when collecting metrics and logs from your server. It can be installed on various flavors and OS versions of both Linux and Windows machines, hosted in Azure, on-premises, or in any other cloud environment. When installed on non-Azure machines, AMA requires the Azure Arc agents to provide mirroring and centralized cloud management capabilities for your machine.
Once associated with a Microsoft Sentinel workspace, all logs collected from AMA-installed machines are sent to the various Microsoft Sentinel tables, depending on the source type from which they were collected (Windows DNS, Windows security events, firewall, IIS, Syslog, CEF, etc.).
AMA is controlled using Data Collection Rules (DCRs), which let you define where to collect logs from, what data manipulations to perform with KQL transformations (enabling filtering, parsing, enrichment, and more), and where to send the logs, whether that be a workspace, Event Hubs (for Azure VMs only), the Auxiliary tier, and so on. You can group machines by using different DCRs.
DCRs for AMA can be created in multiple ways (a command-line sketch follows the list below):
Through the portals (Azure or the unified security operations platform): this method provides a user-friendly interface for defining data collection scenarios, in one of two ways:
Configuring an AMA-based connector in Microsoft Sentinel > Configuration > Data collection (in either the Azure or the security portal) will create the DCR for you. Through this option, data is directed to Microsoft Sentinel tables, some of which are not accessible when defining the DCR in the Azure Monitor portal (e.g., CommonSecurityLog, SecurityEvent, WindowsEvent, and the ASIM tables).
Creating and editing DCRs through Azure Monitor, by browsing to Azure portal > Azure Monitor > Settings > Data Collection Rules. Note: DCRs created through this UI cannot target the Microsoft Sentinel tables; you need to edit the DCR's outputStream to divert the data to them.
Azure Resource Manager (ARM) Templates: ARM templates allow you to define DCRs as code, enabling you to deploy and manage them programmatically. This is useful for automation and consistency across environments.
Azure CLI and PowerShell: These command-line tools provide another way to create and manage DCRs programmatically. They are particularly useful for scripting and automation.
REST API: The Azure Monitor REST API allows you to create, update, and manage DCRs programmatically. This method offers the most flexibility and integration with other systems.
PowerShell: use the Az cmdlets to create, edit, and deploy DCRs.
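As a sketch of the command-line route (assuming the Azure CLI monitor-control-service extension; resource names are placeholders), a DCR can be created from a JSON definition:
az monitor data-collection rule create --resource-group myRG --location eastus --name myDcr --rule-file dcr.json
Inside dcr.json, the dataFlows section is where the outputStream mentioned above is set, for example:
"dataFlows": [
  {
    "streams": [ "Microsoft-SecurityEvent" ],
    "destinations": [ "myWorkspace" ],
    "outputStream": "Microsoft-SecurityEvent"
  }
]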
So why do we like AMA better?
Ah! Easy:
AMA is more performant than the legacy agent, with up to a 25% performance increase compared to the Linux OMS agent, and 500% higher EPS than the Windows MMA!
It's centrally configured through the cloud, via the DCRs, enabling grouping of machines. Using Azure Policy enables at-scale deployments, upgrades, and configuration changes over time. Any change in the DCR configuration is automatically rolled out to all agents, with no further action needed on the client side.
AMA supports multi-homing, sending events to multiple workspaces, cross-region and cross-tenant (with Azure Lighthouse).
Most importantly, AMA is more secure, connecting to the cloud using managed identities and Microsoft Entra tokens. For non-Azure machines, Arc enhances the security of the connection by enforcing and handling authentication and identity tokens.
The greatest thing is that AMA keeps evolving through multiple enhancements and improvements we’re constantly working on!
Next, we’ll cover a few noticeable changes to connectors.
Windows Security Events:
We've enhanced the schema of the SecurityEvent table, which hosts Windows security events, and have added new columns that AMA version 1.28.2 and up will populate. These enhancements are designed to provide better coverage and visibility of the events collected.
New columns added are:
Keywords: A 64-bit mask of keywords defined in the event. Keywords classify types of events (e.g., events associated with reading data), with the leftmost two bits representing audit success or failure.
Opcode: Operation code identifying the location in the application from where the event was logged, together with Task.
EventRecordId: The record number assigned to the event when it was logged
SystemThreadId: The thread that generated the event
SystemProcessId: The process that generated the event
Correlation: Activity identifiers that consumers can use to group related events together
Version: The version number of the event’s definition
SystemUserId: The ID of the user responsible for the event
Common Event Format (CEF) and Syslog:
We all know how important it is to collect and analyze data from various sources, such as firewalls, routers, switches, servers, DNS and applications. Two of the most common protocols used by many devices to emit their logs are CEF and Syslog.
With the legacy agent you had to configure a connector for each source separately, which could be tedious and time-consuming. That's why we are excited to announce updates to the Syslog and CEF data connectors via AMA, which will improve your overall experience with Microsoft Sentinel data connectors. All devices now depend on either the generic CEF or the generic Syslog connector, based on the protocol the log source uses. The relevant generic connector is deployed as part of the device's solution (don't forget to check the box to select it for installation after you click the 'Install with dependencies' button!).
To monitor the ingestion of your logs from the separate device types, we've added a dedicated workbook, installed with the solution, where device types are aggregated in a single location. You can further filter the view by device type or connectivity status.
To help you configure the source device to stream the logs, we've included instructions or relevant references for many common CEF and Syslog appliances in our documentation.
Windows Events:
What happens if you wish to collect other Windows audit events? You cannot send them to the SecurityEvent table, as those events are not from the Security channel and do not match that table's schema. Instead, the non-security events can be directed to the WindowsEvent table using the Windows Forwarded Events data connector, which can stream both forwarded events collected from a WEC/WEF server and events from the Windows server itself, by setting the DCR wizard to the Custom option and specifying an XPath expression that points to the desired events.
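For illustration, the Custom option takes expressions of the form Channel!XPathQuery. One example of this family of queries, collecting critical, error, and warning events from the Application log (the channel and levels here are placeholders to adapt to your source):
Application!*[System[(Level=1 or Level=2 or Level=3)]]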
Windows Firewall Logs:
This connector enables collection of the machine's Windows Firewall logs. We've added granular selection of the profile from which to collect, with logs streamed to the ASimNetworkSessionLogs table.
Custom Logs:
Some appliances packaged in Content Hub solutions stream data to custom (_CL) tables. For those 15 specific devices, and to enable quick setup of file collection, we've added the Custom logs connector.
We hope this post was informative and that you have already upgraded your agents to AMA, or plan to do so shortly. For more information on other connectors, agent-based or otherwise, refer to our Data connectors documentation or browse the Content Hub to locate your source of interest. If you would like more content about using AMA, please let us know in the comments below!
Lastly, to stay current with the latest updates and announcements stay tuned to our What’s new page.
Inside Look: How Azure Kubernetes Service supports OSSKU Migration between Ubuntu and Azure Linux
Microsoft recently announced the general availability of OSSKU Migration in Azure Kubernetes Service (AKS). This new feature enables users to take an existing AKS node pool and update its OSSKU for an in-place move between Ubuntu and Azure Linux. Previously, when the OSSKU was immutable, users had to create new node pools and explicitly drain their workloads into them, which was labor intensive and required additional VM quota.
In this blog post we will dive into how to use this feature, the tech stack that supports it, and some considerations to make sure the upgrade goes smoothly.
Using OSSKU Migration
OSSKU Migration is supported by az-cli, ARM/Bicep templates, and Terraform. All three options will put the affected node pools into the upgrading state, which will take several minutes to resolve. During this time your cluster will scale up depending on your max surge setting, then your pods will be drained and scheduled onto other VMs in your node pool or cluster.
If you are using az-cli your version must be 2.61.0 or higher. To trigger a migration with az-cli run the following command on your node pool.
az aks nodepool update --resource-group myResourceGroup --cluster-name myAKSCluster --name myNodePool --os-sku AzureLinux
If you are using ARM/Bicep templates, you must update your apiVersion to 2023-07-01 or newer. Then update the osSKU field in your agentPoolProfile section to "AzureLinux" and redeploy your template.
If you are using Terraform your azurerm provider version must be v3.111.0 or higher. Then update the os_sku field of your node pools to “AzureLinux” and redeploy your Terraform plan.
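As a hedged Terraform sketch (resource names and sizes are placeholders; the only change on an existing node pool is the os_sku line):
resource "azurerm_kubernetes_cluster_node_pool" "example" {
  name                  = "mynodepool"
  kubernetes_cluster_id = azurerm_kubernetes_cluster.example.id
  vm_size               = "Standard_D4s_v3"
  node_count            = 3
  os_sku                = "AzureLinux" # previously "Ubuntu"; changing this triggers the in-place migration
}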
How it Works
When you send a request to AKS (1) and it notices that your node pool’s OSSKU value has changed, it performs some additional validation to make sure that the change is allowed:
OSSKU Migration cannot change node pool names.
Only Ubuntu and AzureLinux are supported OSSKU targets.
Ubuntu node pools with UseGPUDedicatedVHD enabled cannot change OSSKU.
Ubuntu node pools with CVM 20.04 enabled cannot change OSSKU.
AzureLinux node pools with Kata enabled cannot change OSSKU.
Windows node pools cannot change OSSKU.
If all these conditions pass, then AKS puts the node pool into the upgrading state and picks the latest available image for your new chosen OSSKU. It will then follow the exact same flow as a node image upgrade, scaling the node pool up based on your max surge value (2), then replacing the image on your existing VMs one by one until each node is on the latest image for your new chosen OSSKU (3). Once all your VMs are upgraded to the new image, AKS then removes the surge nodes and signals back to the caller that the upgrade is complete (4).
Things to Consider
Before running an OSSKU Migration in your production clusters, there are two very important things to check:
Deploy a node pool of your new target OSSKU into both development and production environments to confirm that everything works as expected on your new OSSKU before performing the migration on the rest of your node pools.
Ensure that your workload has sufficient Pod Disruption Budget to allow AKS to move pods between VMs during the upgrade. This is necessary for OSSKU migration and any AKS node image upgrade to safely move workloads around your cluster while nodes are restarting. For information on troubleshooting PDB failures during upgrade see this documentation.
Conclusion
Throughout public preview, multiple teams within Microsoft have utilized OSSKU Migration to seamlessly move their workloads over to the Azure Linux OSSKU without large surge capacity and without the need for manual intervention within their clusters. We’re looking forward to more users experiencing how easy it is now to update the OSSKU on an existing AKS node pool.
Issue connecting Simulink to Arduino Nano 33 IoT
I'm having an issue connecting my Arduino Nano 33 IoT to Simulink. When I try to run it, I get the error message: "Unrecognized field name 'HostCOMPort'". I'm unsure how to resolve this and would greatly appreciate the help!
The device shows up in my device manager and I can run code in the Arduino IDE and MATLAB without issues. It's only when I use Simulink that there is a problem. I'm using the cable that came in the Engineering Rev 2 kit from Arduino, as well as a USB to USB-C converter, to connect my Surface Go to the Nano. I am able to run a regular Arduino Uno using the same setup through Simulink.
I've noticed that every time I try to run the Simulink program, the port in the device manager changes from "Arduino Nano 33 IoT" to the bootloader version, and MATLAB tells me that an Arduino device is detected again. My guess is that the Nano is restarting every time or something? I'm not really sure.
How can I create an offset of a boundary?
How can I create an offset of a boundary?
rng('default')
x = rand(30,1);
y = rand(30,1);
plot(x,y,'.')
xlim([-0.2 1.2])
ylim([-0.2 1.2])
k = boundary(x,y);
hold on;
plot(x(k),y(k));
My desired output:
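One possible approach (a sketch, not from the original thread): convert the boundary to a polyshape and expand it with polybuffer, available in R2018a and later; the offset distance here is an arbitrary example value.
rng('default')
x = rand(30,1);
y = rand(30,1);
k = boundary(x,y);
% boundary returns a closed loop (first index repeated), so drop the last point
pgon = polyshape(x(k(1:end-1)), y(k(1:end-1)));
offsetPgon = polybuffer(pgon, 0.05);   % offset outward by 0.05 units
plot(x,y,'.')
hold on
plot(x(k),y(k))
plot(offsetPgon)                       % the offset outline
xlim([-0.2 1.2])
ylim([-0.2 1.2])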
How can I control the output data type of the “Signal Editor” block?
How can I control the output data type of the "Signal Editor" block?
Use PowerShell to print O365 Outlook Out of Office rules
I am looking into what I can pull for Outlook using PowerShell, but I have only come across ways to get generic rules & alerts and generic settings for Out of Office.
I want to be able to pull the rule settings for Out of Office, including the Actions and Conditions. Any suggestions?
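A starting point, assuming the Exchange Online PowerShell module is installed (the mailbox address is a placeholder):
Connect-ExchangeOnline
# Out of Office (automatic reply) settings, including the scheduled window and reply messages
Get-MailboxAutoReplyConfiguration -Identity user@contoso.com | Format-List
# Inbox rules, with conditions and actions summarized in the Description property
Get-InboxRule -Mailbox user@contoso.com | Format-List Name, Enabled, Description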
Azure Firewall Application Rules – “MSSQL” not available in Rule Collection Groups
Hi,
Working on a IaC project for Azure Firewall.
Have created Azure Firewall, Azure Firewall Policy and working on implementing rules using Rule Collection Groups.
In the Portal, application rules support the protocol types “http”, “https” and “mssql”.
However, when provisioning this using the Rule Collection Group module, that is just not an option at all; only HTTP and HTTPS are available:
However, in the Azure Firewall module, you have all three:
I prefer doing this modularly, so I would like to avoid having to define the rules directly in the Azure Firewall module.
Is there any particular reason why Mssql is not available directly from the “Rule Collection Group” module?
Is there any Github issue page for Azure networking where I could report this?
Thanks!
Can CSP partners authorize indirect resellers to create new Azure CSP subscriptions
Can a CSP direct partner authorize indirect resellers to create new Azure subscriptions that will then be billed under the customer's CSP account, with billing going to the partner and the indirect customer receiving invoices from their partner?
OR
Can that customer create a new Azure subscription, where the customer would have to accept another MCA and supply a credit card (MOSP Azure Plan direct only)?
Azure OpenAI: Error when calling completions API with GPT-4o-mini
Hello,
I am getting the following error when calling the completions API with the GPT-4o-mini model.
{
  "error": {
    "code": "OperationNotSupported",
    "message": "The completion operation does not work with the specified model, gpt-4o-mini. Please choose different model and try again. You can learn more about which models can be used with each operation here: https://go.microsoft.com/fwlink/?linkid=2197993."
  }
}
If I use an older model like GPT-3.5, I get no errors.
The endpoint I am using is similar to https://dev-swedencentral-openai.openai.azure.com/openai/deployments/dev-gpt-4o-mini/completions?api-version=2024-04-01-preview.
I saw at https://learn.microsoft.com/en-us/azure/ai-services/openai/overview that GPT-4o-mini is already supported.
Can you help to solve this error?
Thanks
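For context, the newer chat models are generally served by the chat completions operation rather than the legacy completions one, so a sketch of the request shape (reusing the deployment and API version from the post; the key and message are placeholders) would be:
POST https://dev-swedencentral-openai.openai.azure.com/openai/deployments/dev-gpt-4o-mini/chat/completions?api-version=2024-04-01-preview
Content-Type: application/json
api-key: <your-key>

{
  "messages": [
    { "role": "user", "content": "Hello" }
  ]
}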
Understanding Disposition Review
Is it possible that the Disposition Review uses the Managed Folder Assistant to bring the labeled documents to the Content Review?
That’s where my question comes from:
I extended a labeled document by two days during the disposition review, but it never came back for another disposition review. The document was not deleted either; it simply remained labeled in the corresponding SharePoint library.
Could it be that the disposition review relies on the Managed Folder Assistant and only checks every seven days whether a new document is due for review? And if you select a period of less than seven days as the disposition period, does it simply drop out of the loop and no longer appear?
Is there anyone who knows the technical facts about this and can enlighten me?
Many thanks in advance and best regards, Sophie
Date diff in Power Automate for 2 columns
I am creating a flow that will automatically send an email once a week if the difference between the dates in 2 columns becomes < 30 days.
I created a switch where, if the project is Closed, nothing happens; if it is Open, it gets the items for the 2 values, but I am unsure how to do a date diff.
Do I do another switch where, if the difference is above 30, nothing happens?
Also, if I have a switch that does nothing when there is no match, is there a standard practice for closing any potential loops? Ty.
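One common pattern (a sketch; the column names and loop name are placeholders): compute the difference in days using ticks, where 864000000000 ticks equal one day, then use a Condition on the result being less than 30:
div(sub(ticks(items('Apply_to_each')?['EndDate']), ticks(items('Apply_to_each')?['StartDate'])), 864000000000)
If the Condition is false, leaving the "No" branch empty is the standard way to close that path; the flow simply ends for that item.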
New Blog | Microsoft Defender for Endpoint’s Safe Deployment Practices
By jweberMSFT
For customers, it is key to understand that software vendors use safe deployment practices that help them build resilient processes that maintain productivity. This blog addresses Microsoft Defender for Endpoint's architectural design and its approach to delivering security updates, which is grounded in Safe Deployment Practices (SDP).
Microsoft Defender for Endpoint helps protect organizations against sophisticated adversaries while optimizing for resiliency, performance, and compatibility, following best practices for managing security tools in Windows.
Security tools running on Windows can balance security and reliability through careful product design, as described in this post by David Weston. Security vendors can use optimized sensors which operate within kernel mode for data collection and enforcement, limiting the risk of reliability issues. The remainder of the security solution, including managing updates, loading content, and user interaction, can occur isolated within user mode, where any reliability issues are less impactful. This architecture enables Defender for Endpoint to limit its reliance on kernel mode while protecting customers in real-time.
Read the full post here: Microsoft Defender for Endpoint’s Safe Deployment Practices
Windows Server 2025: The upgrade and update experience
Windows Server 2025 is the most secure and performant release yet! Download the evaluation now!
Looking to migrate from VMware to Windows Server 2025? Contact your Microsoft account team!
The 2024 Windows Server Summit was held in March and brought three days of demos, technical sessions, and Q&A, led by Microsoft engineers, guest experts from Intel®, and our MVP community. For more videos from this year’s Windows Server Summit, please find the full session list here.
This article focuses on the upgrade experience to Windows Server 2025.
Windows Server 2025: The upgrade and update experience
Discover the streamlined upgrade process to Windows Server 2025 in our session. We will cover N-4 media-based upgrades, feature upgrades through Windows Update, and efficient management of feature and quality updates with Windows Server Update Services (WSUS). Gain insights into best practices and tools for a smooth transition, ensuring your infrastructure aligns seamlessly with the latest advancements. Don’t miss this opportunity for valuable insights, practical tips, and a roadmap to upgrade your Windows Servers effectively.