Tag Archives: microsoft
Selecting certain cells to search for a name if a condition for the row exists
Hi there, I am a basic Excel user and have hit a major hurdle creating a summary sheet of occurrences for people.
I have 31 sheets which are identical, as below. If column E in any row contains "Liv", I need a summary sheet to check the name cells further along that row to see if the person's name exists. If so, I need these to be added together across all the sheets for a monthly figure.
Hope this explains what I mean.
I use the COUNTIF function for other set areas in the sheet and that works fine, as it does not require any specific condition to be met before counting.
Any help on this would be greatly appreciated, Thanks in advance Mike.
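One way to sketch this (column letters, row ranges, and the sheet-name list are assumptions, since the actual layout isn't shown) is a COUNTIFS per sheet driven by INDIRECT, with a named range SheetList holding the 31 sheet names:

```
=SUMPRODUCT(COUNTIFS(
    INDIRECT("'" & SheetList & "'!E2:E100"), "*Liv*",
    INDIRECT("'" & SheetList & "'!H2:H100"), "Mike"))
```

COUNTIFS applies both conditions per row (column E contains "Liv" and the hypothetical name column H matches), and SUMPRODUCT adds the per-sheet counts into one monthly figure. Note that INDIRECT is volatile, so on very large workbooks a plain sum of 31 explicit COUNTIFS terms may recalculate faster.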
Revolutionizing log collection with Azure Monitor Agent
With the deprecation of the Log Analytics agent (also called MMA or OMS), it's a great opportunity to discuss its successor, the Azure Monitor Agent (AMA), and why it is so much better and keeps improving!
AMA is a lightweight log collection agent, designed to consume as few resources as possible when collecting metrics and logs from your server. It can be installed on various flavors and OS versions of both Linux and Windows machines hosted in Azure, on-premises, or in any other cloud environment. When installed on non-Azure machines, AMA requires the Azure Arc agent to provide mirroring and centralized cloud management capabilities for your machine.
When associated with a Microsoft Sentinel workspace, all logs collected from AMA-installed machines are sent to the various Microsoft Sentinel tables, depending on the source type from which they were collected (Windows DNS, Windows security events, Firewall, IIS, Syslog, CEF, etc.).
AMA is controlled using Data Collection Rules (DCRs), enabling you to define where to collect the logs from, what data manipulations to perform with KQL transformations (enabling filtering, parsing, enrichment and more), and where to send the logs, whether that be a workspace, Event Hubs (for Azure VMs only), the Auxiliary tier and so on. You can group machines by using different DCRs.
DCRs for AMA can be created in multiple ways:
Through the Portals (Azure or Unified Security Operations Platform): This method provides a user-friendly interface for defining data collection scenarios:
Configuring an AMA-based connector in Microsoft Sentinel > Configuration > Data collection (in either the Azure or the security portal) will create the DCR for you. Through this option, data will be directed to Microsoft Sentinel tables, some of which are not accessible when defining the DCR in the Azure Monitor portal (e.g. CommonSecurityLog, SecurityEvent, WindowsEvent and ASIM tables).
Creating and editing the DCRs through Azure Monitor, by browsing to Azure portal > Azure Monitor > Settings > Data Collection Rules. Note: DCRs created through this UI cannot target Microsoft Sentinel tables directly; you must edit the DCR's outputStream to divert the data to Microsoft Sentinel's tables.
Azure Resource Manager (ARM) Templates: ARM templates allow you to define DCRs as code, enabling you to deploy and manage them programmatically. This is useful for automation and consistency across environments.
Azure CLI and PowerShell: These command-line tools provide another way to create and manage DCRs programmatically. They are particularly useful for scripting and automation.
REST API: The Azure Monitor REST API allows you to create, update, and manage DCRs programmatically. This method offers the most flexibility and integration with other systems.
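As a sketch of the CLI route (resource group, rule, and machine names are placeholders, and the rule definition lives in a JSON file you author separately):

```shell
# Create a DCR from a JSON definition file (hypothetical names)
az monitor data-collection rule create \
  --resource-group myResourceGroup \
  --name myLinuxSyslogDcr \
  --rule-file ./syslog-dcr.json

# Associate the DCR with a machine so its AMA picks up the configuration
az monitor data-collection rule association create \
  --name myDcrAssociation \
  --rule-id "/subscriptions/<sub-id>/resourceGroups/myResourceGroup/providers/Microsoft.Insights/dataCollectionRules/myLinuxSyslogDcr" \
  --resource "/subscriptions/<sub-id>/resourceGroups/myResourceGroup/providers/Microsoft.Compute/virtualMachines/myVM"
```

Grouping machines is then just a matter of associating the same DCR with multiple resources.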
So why do we like AMA better?
Ah! Easy:
AMA is more performant than the legacy agent, delivering a 25% performance increase compared to the Linux OMS agent and 500% higher EPS than the Windows MMA!
It’s centrally configured through the cloud, by the DCRs, enabling grouping of machines. Using Azure policies enables scale deployments, upgrades, and configuration changes over time. Any change in the DCR configuration is automatically rolled out to all agents without the need for further action at the client-side.
AMA supports multi-homing, sending events to multiple workspaces, cross region and cross-tenant (with Azure Lighthouse).
Most importantly, AMA is more secure, connecting to the cloud using managed identities and Microsoft Entra tokens. For non-Azure machines, Arc enhances the security of the connection by enforcing and handling authentication and identity tokens.
The greatest thing is that AMA keeps evolving through multiple enhancements and improvements we’re constantly working on!
Next, we’ll cover a few noticeable changes to connectors.
Windows Security Events:
We've enhanced the schema of the SecurityEvent table, which hosts Windows security events, and have added new columns that AMA version 1.28.2 and up will populate. These enhancements are designed to provide better coverage and visibility of the events collected.
New columns added are:
Keywords: A 64-bit mask of keywords defined in the event. Keywords classify types of events (e.g. events associated with reading data), with the two leftmost bits representing audit success or failure.
Opcode: Operation code identifying the location in the application from where the event was logged, together with Task.
EventRecordId: The record number assigned to the event when it was logged.
SystemThreadId: The thread that generated the event.
SystemProcessId: The process that generated the event.
Correlation: Activity identifiers that consumers can use to group related events together.
Version: The version number of the event's definition.
SystemUserId: The ID of the user responsible for the event.
Common Event Format (CEF) and Syslog:
We all know how important it is to collect and analyze data from various sources, such as firewalls, routers, switches, servers, DNS and applications. Two of the most common protocols used by many devices to emit their logs are CEF and Syslog.
With the legacy agent you had to configure a connector for each source separately, which could be tedious and time-consuming. That's why we are excited to announce updates to the Syslog and CEF data connectors via AMA, which will improve your overall experience with Microsoft Sentinel data connectors. All devices will now depend on either the generic CEF or the generic Syslog connector, based on the protocol used by the log source. The relevant generic connector will be deployed as part of the device solution (don't forget to check the box to select it for installation after you click the 'Install with dependencies' button!).
To monitor the ingestion of your logs from the separate device types, we've added a dedicated workbook, installed with the solution, where device types are aggregated with graphs in a single location. You can further filter the view based on device type or connectivity status.
To help you configure the source device to stream the logs, we've included instructions or relevant referrals for many common CEF appliances and Syslog sources in our documentation.
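On a Linux log forwarder, the setup typically adds an rsyslog rule that accepts messages from your devices and relays them to the local AMA listener. A sketch, assuming the TCP 28330 listener that the Microsoft forwarder installation script configures:

```
# /etc/rsyslog.d/10-azuremonitoragent.conf (sketch)
# Accept Syslog/CEF from network devices on UDP/TCP 514
module(load="imudp")
input(type="imudp" port="514")
module(load="imtcp")
input(type="imtcp" port="514")

# Relay everything to the local Azure Monitor Agent listener
*.* @@127.0.0.1:28330
```

The DCR then decides which of the relayed messages are actually collected and to which table they are sent.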
Windows Events:
What happens if you wish to collect other Windows audit events? You cannot send them to the SecurityEvent table, as those events do not come from the Security channel and do not match that table's schema. Instead, non-security events can be directed to the WindowsEvents table using the Windows Forwarded Events data connector. This connector can stream both forwarded events collected from a WEC/WEF server and events collected directly from Windows servers, by setting the DCR wizard to the Custom option and specifying an XPath expression that points to the desired events.
Windows Firewall Logs:
This connector enables the collection of the machine’s Firewall logs. We’ve added a granularity selection of the profile from which to collect and stream logs to the ASimNetworkSessionLogs table.
Custom Logs:
Some appliances packaged in Content Hub solutions stream data to _CL tables. For those 15 specific devices, and to enable quick setup of file collection, we've added the Custom logs connector.
We hope this post was informative and that you have already upgraded your agents to AMA, or plan to do so shortly. For more information on other connectors, agent-based or otherwise, refer to our Data connectors documentation or browse the content hub to locate your source of interest. If you would like more content about using AMA, please let us know in the comments below!
Lastly, to stay current with the latest updates and announcements stay tuned to our What’s new page.
Microsoft Tech Community – Latest Blogs –Read More
Inside Look: How Azure Kubernetes Service supports OSSKU Migration between Ubuntu and Azure Linux
Microsoft recently announced the general availability of OSSKU Migration in Azure Kubernetes Service (AKS). This new feature enables users to take an existing AKS node pool and update the OSSKU for an in-place move between Ubuntu and Azure Linux. Previously when OSSKU was immutable, users had to create new node pools and explicitly drain their workloads into them, which was both labor intensive and required additional VM quota.
In this blog post we will dive into how to use this feature, the tech stack that supports it, and some considerations to make sure the upgrade goes smoothly.
Using OSSKU Migration
OSSKU Migration is supported by az-cli, ARM/Bicep templates, and Terraform. All three options will put the affected node pools into the upgrading state, which will take several minutes to resolve. During this time your cluster will scale up depending on your max surge setting, then your pods will be drained and scheduled onto other VMs in your node pool or cluster.
If you are using az-cli your version must be 2.61.0 or higher. To trigger a migration with az-cli run the following command on your node pool.
az aks nodepool update --resource-group myResourceGroup --cluster-name myAKSCluster --name myNodePool --os-sku AzureLinux
If you are using ARM/Bicep templates you must update your apiVersion to 2023-07-01 or newer. Then update the ossku field in your agentPoolProfile section to “AzureLinux” and redeploy your template.
If you are using Terraform your azurerm provider version must be v3.111.0 or higher. Then update the os_sku field of your node pools to “AzureLinux” and redeploy your Terraform plan.
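Once the node pool leaves the upgrading state, you can confirm the migration took effect; a sketch reusing the same hypothetical resource names as the command above:

```shell
# Verify the node pool now reports the new OS SKU
az aks nodepool show \
  --resource-group myResourceGroup \
  --cluster-name myAKSCluster \
  --name myNodePool \
  --query "osSku" -o tsv

# Optionally inspect the node OS images from inside the cluster
kubectl get nodes -o wide
```

The kubectl output's OS-IMAGE column should list the Azure Linux (or Ubuntu) image for every node in the migrated pool.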
How it Works
When you send a request to AKS (1) and it notices that your node pool’s OSSKU value has changed, it performs some additional validation to make sure that the change is allowed:
OSSKU Migration cannot change node pool names.
Only Ubuntu and AzureLinux are supported OSSKU targets.
Ubuntu node pools with UseGPUDedicatedVHD enabled cannot change OSSKU.
Ubuntu node pools with CVM 20.04 enabled cannot change OSSKU.
AzureLinux node pools with Kata enabled cannot change OSSKU.
Windows node pools cannot change OSSKU.
If all these conditions pass, then AKS puts the node pool into the upgrading state and picks the latest available image for your new chosen OSSKU. It will then follow the exact same flow as a node image upgrade, scaling the node pool up based on your max surge value (2), then replacing the image on your existing VMs one by one until each node is on the latest image for your new chosen OSSKU (3). Once all your VMs are upgraded to the new image, AKS then removes the surge nodes and signals back to the caller that the upgrade is complete (4).
Things to Consider
Before running an OSSKU Migration in your production clusters, there are two very important things to check:
Deploy a node pool of your new target OSSKU into both development and production environments to confirm that everything works as expected on your new OSSKU before performing the migration on the rest of your node pools.
Ensure that your workload has sufficient Pod Disruption Budget to allow AKS to move pods between VMs during the upgrade. This is necessary for OSSKU migration and any AKS node image upgrade to safely move workloads around your cluster while nodes are restarting. For information on troubleshooting PDB failures during upgrade see this documentation.
Conclusion
Throughout public preview, multiple teams within Microsoft have utilized OSSKU Migration to seamlessly move their workloads over to the Azure Linux OSSKU without large surge capacity and without the need for manual intervention within their clusters. We’re looking forward to more users experiencing how easy it is now to update the OSSKU on an existing AKS node pool.
Use Powershell to print O365 Outlook Out of Office rules
I am looking into what I can pull for Outlook using PowerShell, but have only come across ways to get generic Rules & Alerts and generic settings for Out of Office.
I want to be able to pull the rule settings for Out of Office, including Actions and Conditions. Any suggestions?
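With Exchange Online PowerShell (rather than Outlook itself), the out-of-office configuration and server-side inbox rules, including their conditions and actions, can be pulled roughly like this (the mailbox address is a placeholder):

```powershell
# Requires the ExchangeOnlineManagement module
Connect-ExchangeOnline

# Automatic-replies (Out of Office) settings for a mailbox
Get-MailboxAutoReplyConfiguration -Identity "user@contoso.com" | Format-List

# Server-side inbox rules with their conditions and actions
Get-InboxRule -Mailbox "user@contoso.com" | Format-List Name, Enabled, Description
```

Note that client-only rules stored in the Outlook profile are not visible to these cmdlets; only server-side rules are returned.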
Azure Firewall Application Rules – “MSSQL” not available in Rule Collection Groups
Hi,
Working on an IaC project for Azure Firewall.
I have created an Azure Firewall and an Azure Firewall Policy, and am working on implementing rules using Rule Collection Groups.
In the Portal, Application Groups support protocol type “http”, “https” and “mssql”.
However, when provisioning this using the Rule Collection Group module, that is just not an option at all; only HTTP and HTTPS are available:
However, in the Azure Firewall module, you have all three:
I am more a fan of doing this modularly, so I would like to avoid having to define the rules directly in the Azure Firewall module.
Is there any particular reason why Mssql is not available directly from the "Rule Collection Group" module?
Is there a GitHub issues page for Azure networking where I could report this?
Thanks!
Can CSP partners authorize indirect resellers to create new Azure CSP subscriptions?
Can a CSP Direct partner authorize indirect resellers to create new Azure subscriptions that will then be billed under the customer's CSP account, with billing going to the partner and the indirect customer receiving invoices from their partner?
OR
That customer can create a new Azure subscription, where the customer would have to accept another MCA and supply a credit card (MOSP / Azure Plan Direct only)?
Azure OpenAI: Error when calling completions API with GPT-4o-mini
Hello,
I am getting the following error when calling the completions API with the GPT-4o-mini model.
{
  "error": {
    "code": "OperationNotSupported",
    "message": "The completion operation does not work with the specified model, gpt-4o-mini. Please choose different model and try again. You can learn more about which models can be used with each operation here: https://go.microsoft.com/fwlink/?linkid=2197993."
  }
}
If I use an older model like GPT-3.5, I get no errors.
The endpoint I am using is similar to https://dev-swedencentral-openai.openai.azure.com/openai/deployments/dev-gpt-4o-mini/completions?api-version=2024-04-01-preview.
I saw at https://learn.microsoft.com/en-us/azure/ai-services/openai/overview that GPT-4o-mini is already supported.
Can you help to solve this error?
Thanks
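The error is consistent with gpt-4o-mini supporting only the chat completions operation, not the legacy completions endpoint. Switching the path and payload shape looks roughly like this (the resource and deployment names are the poster's examples; the API key is a placeholder):

```shell
curl "https://dev-swedencentral-openai.openai.azure.com/openai/deployments/dev-gpt-4o-mini/chat/completions?api-version=2024-04-01-preview" \
  -H "Content-Type: application/json" \
  -H "api-key: $AZURE_OPENAI_API_KEY" \
  -d '{
        "messages": [
          {"role": "user", "content": "Hello"}
        ]
      }'
```

Older models such as GPT-3.5 Turbo Instruct still accept the plain completions path, which is why they return no error.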
Understanding Disposition Review
Is it possible that the Disposition Review uses the Managed Folder Assistant to bring the labeled documents to the Content Review?
That’s where my question comes from:
I extended a labeled document for two days during the Disposition Review, but it never came back for another Disposition Review. The document was not deleted either. It simply remained labeled in the corresponding SharePoint library.
Could it be that the Disposition Review works with the Managed Folder Assistant and only checks after seven days whether a new document is due for review? And if you select a period of less than 7 days as the disposition period, then it simply drops out of the loop and no longer appears?
Is there anyone who knows the technical facts about this and can enlighten me?
Many thanks in advance and best regards, Sophie
Date diff in Power Automate for 2 columns
I am creating a flow that will automatically send an email once a week if the difference between the dates in 2 columns becomes < 30 days.
I created a switch where, if the project is Closed, nothing happens; if it is Open, it will get the items for the 2 values, but I am unsure how to do a date diff.
Do I do another switch where, if above 30, nothing happens?
Also, if I have a switch that does nothing when it doesn't match, is there a standard practice to close any potential loops? ty
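One common way to get a day difference in a Power Automate expression is via ticks (100-nanosecond intervals, 864000000000 per day); the column and action names below are hypothetical:

```
div(sub(ticks(items('Apply_to_each')?['DueDate']), ticks(items('Apply_to_each')?['StartDate'])), 864000000000)
```

A Condition action checking "less than 30" on that value is usually simpler than a second switch, and branches that should do nothing can just be left empty; no explicit close is needed.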
New Blog | Microsoft Defender for Endpoint’s Safe Deployment Practices
By jweberMSFT
It is key for customers to understand that software vendors use safe deployment practices that help them build resilient processes and maintain productivity. This blog addresses Microsoft Defender for Endpoint's architectural design and its approach to delivering security updates, which is grounded in Safe Deployment Practices (SDP).
Microsoft Defender for Endpoint helps protect organizations against sophisticated adversaries while optimizing for resiliency, performance, and compatibility, following best practices for managing security tools in Windows.
Security tools running on Windows can balance security and reliability through careful product design, as described in this post by David Weston. Security vendors can use optimized sensors which operate within kernel mode for data collection and enforcement, limiting the risk of reliability issues. The remainder of the security solution, including managing updates, loading content, and user interaction, can occur isolated within user mode, where any reliability issues are less impactful. This architecture enables Defender for Endpoint to limit its reliance on kernel mode while protecting customers in real-time.
Read the full post here: Microsoft Defender for Endpoint’s Safe Deployment Practices
Windows Server 2025: The upgrade and update experience
Windows Server 2025 is the most secure and performant release yet! Download the evaluation now!
Looking to migrate from VMware to Windows Server 2025? Contact your Microsoft account team!
The 2024 Windows Server Summit was held in March and brought three days of demos, technical sessions, and Q&A, led by Microsoft engineers, guest experts from Intel®, and our MVP community. For more videos from this year’s Windows Server Summit, please find the full session list here.
This article focuses on the upgrade experience to Windows Server 2025.
Windows Server 2025: The upgrade and update experience
Discover the streamlined upgrade process to Windows Server 2025 in our session. We will cover N-4 media-based upgrades, feature upgrades through Windows Update, and efficient management of feature and quality updates with Windows Server Update Services (WSUS). Gain insights into best practices and tools for a smooth transition, ensuring your infrastructure aligns seamlessly with the latest advancements. Don’t miss this opportunity for valuable insights, practical tips, and a roadmap to upgrade your Windows Servers effectively.
Simplify development with Dev Container templates for Azure SQL Database
What are Dev Containers?
A development container essentially packages up your project’s development environment using the Development Container Specification (devcontainer.json). This specification enriches your container with metadata and content necessary to enable development from inside a container.
Workspace files are mounted from the local file system or copied or cloned into the container. Extensions are installed and run inside the container, where they have full access to the tools, platform, and file system. This means that you can seamlessly switch your entire development environment just by connecting to a different container.
Dev Container Templates are source files packaged together that encode configuration for a complete development environment, while Dev Container Features allow us to add runtimes, tools, and libraries inside a container. As a result, all this put together ensures a consistent and reproducible development environment from any tool that supports the Development Container Specification.
When you open your project in the dev container, your code will just work without downloading anything on your local machine. Furthermore, the best part is that when connected to a dev container, your developer experience is exactly the same as if you opened the project locally in VS Code.
Introducing Dev Container Templates for Azure SQL Database
We are excited to introduce new Dev Container templates specifically designed for Azure SQL Database. These templates support multiple programming languages, including .NET 8, .NET Aspire, Python, and Node.js, making it easier for developers to get started quickly and focus on building their applications.
Dev Containers streamline the development process by providing an out-of-the-box environment configured for Azure SQL Database. This eliminates the need for developers to spend time searching for and setting up VS Code extensions to interact with their database and preferred programming language. With these templates, you can dive straight into coding, boosting productivity and reducing setup friction.
Included with the templates is a pre-built demo database called Library, which serves as a practical example to help developers get started quickly. While these Dev Containers use the Azure SQL Edge container image, which offers a surface area close to Azure SQL Database, using SQL Database Projects ensures that your database code remains compatible with Azure SQL Database. With this demo project, you can easily use the dacpac artifact created by SQL Database Projects and deploy it to Azure SQL Database using the Azure SQL Action for GitHub Actions. This process streamlines your workflow and ensures seamless integration with your production environment.
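That dacpac deployment can be sketched as a minimal GitHub Actions workflow; the azure/sql-action step is the one mentioned above, while the secret, branch, and project-file names are placeholders:

```yaml
name: deploy-library-db
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Builds the SQL Database Project and publishes it to Azure SQL Database
      - uses: azure/sql-action@v2.3
        with:
          connection-string: ${{ secrets.AZURE_SQL_CONNECTION_STRING }}
          path: ./Library/Library.sqlproj
          action: publish
```

The connection string should point at the target Azure SQL Database and be stored as a repository secret, never committed to source.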
Whether working locally or in the cloud, dev containers ensure consistency across development environments, making it easier to collaborate and maintain high standards across your team. With the inclusion of essential tools like SQLCMD, SqlPackage, Azure Command-Line Interface (CLI) and Azure Developer CLI (AZD), these templates offer a comprehensive solution for enhancing your development workflow with Azure SQL Database.
Benefits of Using Dev Containers
Dev Containers ensure a consistent and seamless experience, promoting smooth collaboration across teams and workflows, and facilitating an easy transition to Azure environments. Key benefits include:
Preconfigured environments: These come with all necessary tools and dependencies.
Consistency: Maintain uniformity across different development setups.
Simplified setup: Reduce time spent on configuration.
Enhanced collaboration: Improve teamwork within development teams.
Seamless transition to Azure: Leverage the scalability and reliability of Azure SQL Database for production deployments.
Accelerated time-to-market: Streamline development workflows and integrate seamlessly with existing toolsets, giving businesses a competitive edge.
Cost-efficient development: Reduce dependencies on cloud resources during the development and testing phases.
By using dev containers, developers can avoid the hassle of setting up and configuring their local development environment manually.
Prerequisites
Before you begin, make sure you have the following tools installed on your local machine:
Git: For version control
Docker: Required for running containers
Visual Studio Code: The primary IDE to use Dev Containers
Dev Containers extension for Visual Studio Code: Enables working with Dev Containers
To set up your environment, follow these steps:
First, ensure you have Git installed for version control.
Then, install Docker, which is necessary for running containers.
After that, download and install Visual Studio Code, as it will be your primary IDE for using Dev Containers.
Lastly, add the Dev Containers extension to Visual Studio Code to enable seamless containerized development.
Setting up the Dev Container template for Azure SQL Database
Creating a Dev Container
Open the command palette and run the Dev Containers: Add Dev Container Configuration Files command. Select the Add configuration file to workspace option if you want to add the dev container configuration file to your current local repository. Alternatively, choose the Add configuration file to user data folder option. For this quickstart, select the Add configuration file to workspace option.
Visual Studio Code prompts you to select a Dev Container template. The available templates are based on the tools and dependencies required for the specific development environment. Select Show All Definitions to view all available templates.
Next, select the desired Dev Container template for Azure SQL Database by typing Azure SQL into the command palette. This action displays a list of available templates designed for Azure SQL Database development.
Building the Container
Upon selection, Visual Studio Code automatically generates the necessary configuration files tailored to the chosen template. These files include settings for the development environment, extensions to install, and Docker configuration details. They’re stored in a .devcontainer folder within your project directory, ensuring a consistent and reproducible development environment.
Following the configuration file generation, Visual Studio Code prompts you to transition your project into the newly created Dev Container environment. You can do it by selecting Reopen in Container. This step is crucial as it moves your development inside the container, applying the predefined environment settings for Azure SQL development.
If you haven't already, you can also initiate this transition manually at any time using the Dev Containers extension. Use the Reopen in Container command from the command palette, or select the blue icon at the bottom left corner of Visual Studio Code and select Reopen in Container.
This action initiates the setup process, where Visual Studio Code generates the necessary configuration files and builds the development container based on the selected template. The process ensures that your development environment is precisely configured for Azure SQL Database development.
Exploring and verifying the Dev Container
After you build the dev container, start exploring and verifying the setup. Open a terminal within Visual Studio Code to check that all necessary tools are installed and working correctly.
As an optional step, you can also run predefined tasks directly from the command palette, streamlining your development workflow and allowing you to focus on writing code.
For more detailed information about specific templates, visit Azure SQL Database Dev Container templates.
Conclusion
Dev Containers for Azure SQL Database offer a powerful and efficient way to streamline your development process. By providing a consistent, portable environment, they help you focus on writing code and building features rather than configuring your setup. We encourage you to explore these templates and see how they can enhance your development workflow for Azure SQL Database.
Looking ahead, we will delve into more advanced topics like integrating Azure services with Dev Containers to further optimize your development process. Stay tuned for more insights and practical guides to help you get the most out of Azure SQL Database and Dev Containers.
More about Dev Container templates for Azure SQL Database.
Using array calculations in Excel
In a rather complex Excel spreadsheet that I am studying, I came across a calculation step that uses arrays in such a compact way that I cannot decipher it.
I searched the internet and asked Copilot about this calculation approach, but nothing has hit the nail on the head for this particular solution.
In essence, the spreadsheet has a drop-down combo box with a list of elements, and it determines the static characteristics of whichever element is chosen.
In another part of the spreadsheet, a one-dimensional vertical array lists all the elements that appear in the drop-down combo box, and next to each of them the value of a certain coefficient that depends on the type of element.
The spreadsheet reports the analysis of only one element (the one chosen in the drop-down box), while the array calculates and reports the coefficient in question for all the elements in the drop-down list.
To make this concrete, I have reproduced the situation on an Excel sheet, extracted from its context for clarity, hoping that someone can recognize the steps used and explain them to me step by step, because I am not familiar with the use of arrays in Excel.
Thanking in advance anyone willing to help me understand the problem, I send cordial greetings to everyone.
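The question does not include the formula itself, but the behavior described (one coefficient computed for every element in the list at once) is the hallmark of a lookup-style array calculation. A minimal Python sketch of that idea, with hypothetical element names and coefficients:

```python
# Hypothetical data: the items in the drop-down list
elements = ["IPE 200", "HEA 160", "HEB 200"]

def coefficient(element):
    # Stand-in for whatever the sheet computes per element type
    table = {"IPE 200": 1.10, "HEA 160": 1.25, "HEB 200": 1.40}
    return table[element]

# The "array" part: evaluate the coefficient for every listed element at once,
# the way an Excel array formula spills one result per list entry
coefficients = [coefficient(e) for e in elements]
print(coefficients)
```

In the spreadsheet, the single-element analysis uses only the drop-down selection, while the array column applies the same rule to every entry in one step.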
Conditional formatting range of cells based upon specific value of cell for that column
Hi, I want to create a tracker template with conditional formatting for a range of sales-growth cells: red if the value is less than a specific cell’s value, and green if it is greater than the value of a cell in that particular column. For example:
Highlight the cells in column O green if their value is greater than the value in O4, and red if it is less, and so on. Please help.
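In Excel this is typically done with two formula-based conditional formatting rules such as =O8>O$4 (green) and =O8<O$4 (red), where the row reference stays relative and the threshold row is anchored. The per-cell logic those rules evaluate can be sketched as:

```python
def format_color(value, threshold):
    """Mirror two conditional-formatting rules compared against the
    column's threshold cell (O4 in the example)."""
    if value > threshold:
        return "green"
    if value < threshold:
        return "red"
    return "none"  # equal to the threshold: neither rule fires

print(format_color(12, 10))  # → green
```

Note that a value exactly equal to the threshold matches neither rule, so decide up front which color (if any) ties should get.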
Windows 11 Upgrade via Intune with Delivery Optimization MCC
Hi
Good day
We are planning to set up a Microsoft Connected Cache (MCC) in our environment for Delivery Optimization. We would like to know whether it will also help us in the upgrade process from Windows 10 to Windows 11, and how. From where exactly will the upgrade content download? How will the background mechanism work (will content be shared with requesting devices without each one downloading it separately)?
We have already created a test policy in Intune for Windows 10 and it works, but we want to know whether, once we set up the MCC, any specific settings changes are required in Intune for the Windows 11 upgrade. Please advise.
Thanks in Advance
Karimulla
Advice needed: Multitenant organization issues
Hey peeps, a client of mine is asking for an optimal solution to their sub-optimal organization structure. I want to see if there’s something more I can do here or if we are stuck with our environment the way it is. It’s such a strange ask that it will take a few paragraphs to describe, so bear with me.
Client has a central corporate entity, but the “branch” entities operate separately and have a fair amount of self-governance.
This central corporate entity has a Microsoft365 tenant and that’s what everyone’s email matches, including branch members. Let’s call it corp.onmicrosoft.com with a verified domain of corp.com. So, everyone at corporate and the branches have addresses/UPNs of @corp.com.
Before my time, one of the self-governing branches chose to set up a SharePoint site specific to their branch. They put all the files on a separate 365 tenant, corp-ny.onmicrosoft.com, with the verified domain corp-ny.com. There are a couple of identities on that tenant, but since everyone uses their corp.com email, they access the SharePoint data from their primary corporate identities as GUESTS of the branch’s tenant. So the branch tenant has 3 members and 100+ guests.
We perform IT for just the BRANCH, not the corporate structure. Since corporate IT is not interested in changing infrastructure at this time, we would like to convert all the guest identities on the branch tenant to members and we can then leverage technologies like Intune & CA and move them off of their on-premise AD server that is not doing AD Connect. I have a quick script that will do all of that – convert, license, set some properties for all 100 members. Seems okay! After the change, members will have their corporate identity for email, and the branch identity for Sharepoint and Windows login.
We’ve identified a problem, however, with notifications. When you comment on a file in Sharepoint, a notification is generated for anyone that participates in that file. The notification is sent from the commenter’s identity. Currently, that means notifications come from @corp.com . However, after the change those notifications will come from corp-ny.com. This domain does NOT have an MX record associated with it 🙁 and we think this will lead to a LOT of confusion if people try to reply directly to the emails. It might also have the potential(?) to fail email spoofing checks or be flagged as suspicious by email servers. Additionally, the notifications would be sent to their branch identities, which I assume would not deliver. Even if it did deliver and we added an MX record, it would be in an inbox that’s not checked by the team.
My question is:
Can I mask the notification email to be from “email address removed for privacy reasons” for all of the notifications? Or,
Can I “spoof” the emails so that they appear to be sent from the corporate identity?
Secondly,
What’s the best way to deal with notifications headed to the wrong inbox? Can a transport rule redirect these emails to their corporate emails?
File Saving macro from three cells
I am in search of a macro to save all Excel workbooks in the same folder with a filename that is a combination of cells A1, B3, and B4. Windows 10 and Office 365. Thank you for sharing your expertise and time.
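The actual macro would be VBA (a SaveAs call built from the three cell values), but the name-building logic it needs can be sketched in Python. The helper name, separator, and extension here are hypothetical choices; the key step is stripping characters Windows forbids in filenames:

```python
import re

def build_filename(a1, b3, b4, ext=".xlsx"):
    """Join three cell values into one filename (hypothetical naming scheme),
    replacing characters Windows does not allow in filenames."""
    raw = f"{a1}_{b3}_{b4}"
    safe = re.sub(r'[\\/:*?"<>|]', "_", raw)
    return safe + ext

print(build_filename("Report", "2024", "Q1"))  # → Report_2024_Q1.xlsx
```

In VBA the equivalent would concatenate Range("A1").Value, Range("B3").Value, and Range("B4").Value the same way before calling SaveAs; sanitizing is important because a stray "/" or ":" in any of the cells makes SaveAs fail.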
#NUM error
Hi everyone
Kinda new here, but I believe I’m in the right place for the problem I would like to solve. This formula…
=SUM(SMALL((G8,I8,K8,M8,O8,Q8,U8,W8,Y8,AA8,AC8),{1,2,3,4,5}))+S8
… doesn’t seem to work on my Windows PC, but it works on my Android phone. What could be wrong with Excel on my Windows PC? I have tried many things, like enabling iterative calculations, but nothing changes. What should I do? I’ll appreciate your support.
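For reference, what the formula is intended to compute, independent of the #NUM! issue, can be sketched in Python: SMALL(range, {1,2,3,4,5}) picks the five smallest of the eleven listed cells, SUM adds them, and S8 is added on top:

```python
def five_smallest_plus(cells, s8):
    """Sum the five smallest values in `cells`, then add s8 —
    the same result SUM(SMALL(range, {1,2,3,4,5})) + S8 should give."""
    return sum(sorted(cells)[:5]) + s8

# Eleven sample values standing in for G8, I8, K8, ..., AC8
print(five_smallest_plus([9, 1, 8, 2, 7, 3, 6, 4, 5, 10, 11], 100))  # → 115
```

One thing worth checking: SMALL returns #NUM! when it is asked for more values than the range supplies, so if any of the eleven cells are empty or non-numeric on the PC copy, the k-values 1 through 5 may exceed the count of usable numbers.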
Random selection of questions from an archive
Hi there,
I’m new to Forms, migrating from Moodle.
What I used to do was set up a quiz that draws 31 random questions from an archive of 170, so that each student gets a personalized, always-different quiz.
Currently, my archive of questions is a Word document, and I’ve verified that the quick import in Forms can read it correctly, but it treats the whole archive as a single quiz with 170 questions. Is there any way to do the random selection trick?
Thanks a lot!
Jordi
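The behavior described above (31 distinct questions per student, drawn from an archive of 170) is random sampling without replacement. Whether Forms can do this natively is the open question here, but the selection itself is simple; a minimal Python sketch of the idea:

```python
import random

def pick_questions(archive, n=31, seed=None):
    """Draw n distinct questions from the archive (no repeats in one quiz).
    Pass a seed only if you need a reproducible draw."""
    rng = random.Random(seed)
    return rng.sample(archive, n)

# Hypothetical archive of 170 question titles
archive = [f"Q{i}" for i in range(1, 171)]
quiz = pick_questions(archive, n=31)
print(len(quiz))  # → 31
```

Each call produces a different 31-question subset, which is exactly the "personalized and always different quiz" the Moodle setup provided.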
All information is lost due to OneDrive
I didn’t want synchronization, so I stopped it. Then I went to OneDrive via the Internet and deleted all the folders and from the recycle bin as well. Everything was fine. But after a couple of minutes all the information disappeared from my computer. Who can retrieve all my work files and all the information to me now??? My PC is empty now. Disaster.