Azure Messaging and Streaming update – May 2024
This update covers features being released at the Microsoft Build 2024 conference for Azure Service Bus, Azure Event Hubs, and Azure Event Grid.
Azure Event Hubs Updates
Event Hubs has several new features, including one of the most frequently requested by our customers.
New features:
Event Hubs Emulator (Public Preview)
The Event Hubs Emulator is a containerized instance of Azure Event Hubs that can run on Windows or Linux for development and test purposes. This has been our most requested feature, and it is great to finally deliver it for customers. In this first release, the emulator supports only AMQP traffic.
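As a small sketch of what local development against the emulator looks like, the snippet below parses the documented development connection-string format (with `UseDevelopmentEmulator=true`). The parsing helper is stdlib-only illustration; actually sending events would additionally require the azure-eventhub package and the emulator container running.

```python
# A minimal sketch of the Event Hubs emulator connection string. The format
# below follows the documented local-development pattern; "SAS_KEY_VALUE" is
# the placeholder key the emulator accepts for local use.
CONN_STR = (
    "Endpoint=sb://localhost;"
    "SharedAccessKeyName=RootManageSharedAccessKey;"
    "SharedAccessKey=SAS_KEY_VALUE;"
    "UseDevelopmentEmulator=true;"
)

def parse_connection_string(conn_str: str) -> dict:
    """Split an Event Hubs connection string into its key/value parts."""
    parts = {}
    for segment in conn_str.split(";"):
        if "=" in segment:
            key, _, value = segment.partition("=")
            parts[key] = value
    return parts

parts = parse_connection_string(CONN_STR)
print(parts["Endpoint"])                # sb://localhost
print(parts["UseDevelopmentEmulator"])  # true

# With the azure-eventhub package installed and the emulator container
# running, you would pass CONN_STR to
# EventHubProducerClient.from_connection_string(...) exactly as you would
# for a cloud namespace, keeping in mind the AMQP-only limitation above.
```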
Large Message size (Public Preview)
Namespaces in Azure Event Hubs Dedicated can now support messages as large as 20 MB.
The Kafka Compression feature is now generally available.
In addition to the existing Avro schema support, the Schema Registry has added support for two more schema types.
JSON Schema in Schema Registry (GA) – The JSON Schema support is now GA.
Protobuf in Schema Registry (Public Preview) – Protobuf support is now available in preview in the Schema Registry.
Azure Service Bus Update
New features in Azure Service Bus:
Azure Service Bus now supports Batch Delete, providing a more convenient way to manage your queues, including dead-letter queues.
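As a conceptual illustration only (this is not the Service Bus SDK), the difference batch delete makes can be sketched with an in-memory stand-in: removing many messages in one operation instead of receiving and settling them one at a time.

```python
from collections import deque

# A conceptual stand-in for a queue holding dead-lettered messages. The
# message names and counts are fabricated sample data.
dead_letter_queue = deque(f"msg-{i}" for i in range(10))

def batch_delete(queue: deque, max_count: int) -> int:
    """Remove up to max_count messages from the front of the queue,
    returning how many were actually removed."""
    removed = 0
    while queue and removed < max_count:
        queue.popleft()
        removed += 1
    return removed

deleted = batch_delete(dead_letter_queue, max_count=4)
print(deleted, len(dead_letter_queue))  # 4 6
```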
Azure Event Grid Updates
Azure Event Grid continues to evolve with the addition of new features to achieve MQTT compliance, simplify security for IoT and event-driven solutions, and facilitate seamless integrations.
MQTT features:
MQTT Last Will and Testament (LWT) (GA)
Enables MQTT clients to notify other MQTT clients of their abrupt disconnections. You can use LWT to ensure predictable and reliable flow of communication among MQTT clients during unexpected disconnections, which is valuable for scenarios where real-time communication, system reliability, and coordinated actions are critical.
OAuth 2.0 authentication for MQTT clients (Public Preview)
Allows MQTTv3.1.1 and MQTTv5 clients to authenticate and connect with the MQTT broker using JSON Web Tokens (JWT) issued by any third-party OpenID Connect (OIDC) identity provider. This authentication method provides a lightweight, secure, and flexible option for MQTT clients that are not provisioned in Azure.
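For readers unfamiliar with the token format, the sketch below builds a JWT with Python's standard library. It is illustrative only: it uses a made-up HS256 secret and claims, whereas Event Grid expects tokens issued by a third-party OIDC identity provider (typically asymmetrically signed).

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(claims: dict, secret: bytes) -> str:
    """Assemble the standard header.payload.signature JWT shape (HS256)."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

# The subject, audience, and secret are placeholder values.
token = make_jwt({"sub": "device-01", "aud": "example-broker"}, b"demo-secret")
print(token.count("."))  # 2 (three dot-separated segments)
```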
Namespace features:
Custom domain names support (Public Preview)
Allows users to assign their own domain names to Event Grid namespace’s MQTT and HTTP endpoints, enhancing security and simplifying client configuration.
Push delivery to Azure Event Hubs (GA)
Allows users to configure event subscriptions on namespace topics to send messages to Azure Event Hubs for streaming purposes.
Push delivery to Webhooks (Public Preview)
Allows users to configure event subscriptions on namespace topics to send messages to your application’s public endpoint using a simple and reliable delivery mechanism.
CloudEvents 1.0 Binary Content Mode (GA)
Offers the ability to produce messages whose payload is encoded in any media type.
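Binary content mode means the CloudEvents attributes ride in `ce-*` HTTP headers while the body stays in its native media type. A small sketch, with illustrative event attribute values:

```python
# Build the headers/body pair for a CloudEvents 1.0 binary-mode HTTP request.
# The event type, source, and id below are made-up examples.
def to_binary_mode(event: dict, payload: bytes, media_type: str):
    headers = {
        "ce-specversion": "1.0",
        "ce-type": event["type"],
        "ce-source": event["source"],
        "ce-id": event["id"],
        "content-type": media_type,  # the payload's own media type
    }
    return headers, payload

headers, body = to_binary_mode(
    {"type": "com.example.reading", "source": "/sensors/1", "id": "42"},
    payload=b"\x00\x01\x02",  # opaque binary payload, untouched by the envelope
    media_type="application/octet-stream",
)
print(headers["ce-specversion"], len(body))  # 1.0 3
```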
Shared Access Signature (SAS) tokens authentication (Public Preview)
Allows users to publish or receive (pull delivery) messages using a simple authentication mechanism.
Namespace Topic as a destination (GA)
Enables users to create an event subscription on custom, system, domain, and partner topics (Event Grid Basic) that forwards events to namespace topics. Forwarding events to the namespace topic allows you to take advantage of its pull delivery support and flexibility in consumption.
Event sources:
Microsoft Graph API events (GA)
Enables users to react to resource changes in Microsoft Entra ID, Microsoft Teams, Outlook, SharePoint, etc. This feature is key for enterprise scenarios such as auditing, onboarding, and policy enforcement, to name a few.
Azure Resource Notifications health resources events to Azure Monitor alerts (Public Preview)
Enables near real-time notifications when your workload is impacted. With this feature, you can get a better understanding of any service issues that may be affecting your resources.
API Center system topic (Public Preview)
Enables you to receive real-time updates when an API definition is added or updated. This means you can keep track of your APIs and ensure they are always up to date, making it easier for stakeholders throughout your organization to discover, reuse, and govern APIs.
To learn about new features released in Azure Event Grid, see this announcement.
To learn more about each of the Messaging and Streaming services:
Azure Service Bus
Azure Event Hubs
Azure Event Grid
Azure Stream Analytics
Microsoft Tech Community – Latest Blogs –Read More
Azure WAF integration in Copilot for Security- Protect web applications using Gen AI
Today, we are launching the public preview of Azure Web Application Firewall (WAF) integration in Microsoft Copilot for Security. The Azure WAF capabilities available in the standalone Copilot for Security experience are: Get Top Rules Triggered, Get Top Blocks By IP, Get SQLi Blocks By WAF, and Get XSS Blocks By WAF.
Network security analysts working with Azure WAF face many challenges. Much of their time goes into researching why certain requests were blocked, a manual and time-consuming task.
With the Azure WAF integration in Copilot for Security, security and IT teams can move faster and focus on high-value tasks. Copilot summarizes data and generates in-depth contextual insights into the WAF threat landscape. This enables analysts to determine whether the WAF policy is blocking a request it should not have blocked, or whether their WAF policy needs to be fine-tuned. It results in time and cost savings, since Copilot can reason over terabytes of data in a matter of minutes, not hours or days.
Another productivity gain is simplifying the complex: analysts no longer have to write intricate KQL queries. Instead, they can simply ask questions in natural language, and Copilot for Security understands the context and generates the response. This saves time and unlocks new skills for junior analysts, while Tier 1 analysts can take on more complex tasks, focusing on strategic rather than tactical work.
Let’s take a closer look at what each of these new Azure WAF Skills in Copilot for Security do to help network security professionals investigate logs via natural language prompts.
Azure WAF Skills in Copilot for Security
The four WAF Skills available are:
Get Top Rules Triggered: Retrieve contextual details about WAF detections.
Get Top Blocks By IP: Retrieve the top malicious IPs in the environment along with related WAF rules and patterns triggering the attack.
Get SQLi Blocks By WAF: Explains why Azure WAF blocked SQL Injection (SQLi) attacks. Analyzes Azure WAF diagnostic logs and connects related logs over a specific time period to generate a summary of the attack.
Get XSS Blocks By WAF: Explains why Azure WAF blocked Cross-site Scripting (XSS) attacks. Analyzes Azure WAF diagnostic logs and connects related logs over a specific time period to generate a summary of the attack.
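As a toy illustration of the kind of pattern matching behind such detections, the sketch below classifies query parameters with two simplistic regexes. These bear no relation to Azure WAF's actual managed rules, which are far more sophisticated.

```python
import re

# Deliberately naive SQLi/XSS signatures, for illustration only.
SQLI = re.compile(r"('|--|\bunion\b\s+\bselect\b|\bor\b\s+1=1)", re.IGNORECASE)
XSS = re.compile(r"(<script\b|javascript:|\bonerror\s*=)", re.IGNORECASE)

def classify(query_param: str) -> str:
    """Return which toy signature class a query parameter matches."""
    if SQLI.search(query_param):
        return "sqli"
    if XSS.search(query_param):
        return "xss"
    return "clean"

print(classify("id=1 OR 1=1"))                # sqli
print(classify("<script>alert(1)</script>"))  # xss
print(classify("page=2"))                     # clean
```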
Using the Get Top Rules Triggered Skill
This Copilot Skill summarizes, in natural language, the overall threat landscape in the WAF environment. The Skill reasons over terabytes of WAF logs and generates a list of the top WAF rules triggered, the detection logic used, and the malicious client IPs triggering the rules. The list is ordered by the number of times each rule was hit, with the most-hit rules displayed at the top.
The screenshot below shows the response generated when a prompt is issued for the top WAF rules in a regional WAF over the last day.
The default timespan for any of the WAF Skills is 24 hours but prompts can be tailored specific to a request.
Using the top WAF rules triggered Skill, an analyst can get details on any of the WAF rule sets: the Default Rule Set, the Bot Rule Set, or a custom rule set.
The screenshot below looks for details of the bot rules triggered.
Furthermore, it is possible to use this Skill to obtain details of a specific vulnerability. In the following example, an analyst checks whether any Remote Code Execution (RCE) activity has been seen by the WAF and receives details about an RCE attempt, including the Log4j CVE details. The analyst can use other Copilot for Security plugins, such as Microsoft Defender Threat Intelligence, to obtain further details about the CVE.
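Under the hood, "top rules triggered" is an aggregation over log entries. A stand-in sketch with fabricated sample data (the rule IDs are examples, not a claim about what any real WAF logged):

```python
from collections import Counter

# Fabricated WAF log entries: each records which rule fired and for which
# client IP. The IPs use documentation-reserved ranges.
logs = [
    {"ruleId": "942100", "clientIp": "203.0.113.7"},   # example SQLi rule id
    {"ruleId": "942100", "clientIp": "203.0.113.9"},
    {"ruleId": "941100", "clientIp": "198.51.100.2"},  # example XSS rule id
    {"ruleId": "942100", "clientIp": "203.0.113.7"},
]

# Count hits per rule and list the most-hit rules first, mirroring the
# ordering the Skill's summary uses.
top_rules = Counter(entry["ruleId"] for entry in logs).most_common()
print(top_rules)  # [('942100', 3), ('941100', 1)]
```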
Using the Get Top Blocks By IP Skill
This Skill generates a list of the most frequently triggered offending IPs along with related WAF contextual information.
By using the response from this Skill, analysts can get a holistic picture of WAF rules triggered by the offending IPs and overall exposure of the WAF policy to the IPs.
Furthermore, the malicious IPs discovered by this WAF Skill can be searched in other Copilot for Security plugins, such as Microsoft Defender Threat Intelligence, to get other attack vectors associated with the IPs.
Using the Get SQLi Blocks By WAF Skill
This Skill provides contextual insights into WAF detections of SQL Injection (SQLi) attacks. It helps analysts understand the details of the SQLi attack, such as the WAF resources under attack and the attack patterns, for example the query parameters triggering the detection.
Using the Get XSS Blocks By WAF Skill
This Skill provides contextual insights into WAF detections of cross-site scripting (XSS) attacks. It helps analysts understand the details of the attack, such as the WAF resources under attack and the attack patterns, for example the query parameters triggering the detection.
How to use Azure WAF integration in Copilot for Security
Copilot for Security is available to organizations through a pay-as-you-go consumption model. After Security Compute Units (SCUs) are provisioned and Azure WAF logs are present in Azure Log Analytics, the WAF Skills are ready for use.
Select "Sources" in the prompt bar and ensure the Azure Web Application Firewall plugin is enabled. Ensure the WAF's Log Analytics workspace name, Log Analytics resource group name, and Log Analytics subscription ID are configured.
With the Azure WAF integration in Copilot for Security, security and IT teams can move faster, upskill, and transition into the age of AI. The integration announced today combines Microsoft’s expertise in security with generative AI, packaged together to empower network security analysts to outpace adversaries with the speed and scale of AI.
Sowmya Mahadevaiah
Principal Product Manager, Azure Networking
Building Intelligent Apps with Azure Cache for Redis, EntraID, Azure Functions, E1 SKU, and more!
We’re excited to announce the latest updates to Azure Cache for Redis that will improve your data management and application performance as we kick off Microsoft Build 2024. Coming soon, the Enterprise E1 SKU (Preview) will offer a lower entry price, Redis modules, and enterprise-grade features. The Azure Functions triggers and bindings for Redis are now generally available, simplifying your workflow with seamless integration. Microsoft Entra ID in Azure Cache for Redis is also now generally available, providing enhanced security management. And there’s more: we are also sharing added resources for developing intelligent applications using Azure Cache for Redis Enterprise, enabling you to build smarter, more responsive apps. Read on to find out more about these updates and how they can enhance your Azure Cache for Redis experience.
Building Intelligent Apps with Azure Cache for Redis
Developers can leverage the power and versatility of Azure Cache for Redis Enterprise to build and enhance intelligent apps. In this Azure .NET session, you will learn how to use various libraries and SDKs, such as Semantic Kernel, Redis OM for .NET, and .NET 8 caching abstractions, to implement scenarios such as AI chatbots, vector similarity search, semantic caching, and more. These scenarios are also supported across various languages, such as Java, Python, Node.js, and Go. You will also see how to integrate Azure Cache for Redis with other Azure services, such as Cognitive Services and Azure Cosmos DB, to create responsive intelligent applications. Check out the video, documentation, and demo to discover how Azure Cache for Redis Enterprise can help you take your apps to the next level of intelligence and performance.
Enterprise E1 SKU (Preview)
The Azure Cache for Redis Enterprise tier will have a new E1 SKU available in preview soon. The E1 SKU reduces the cost to get started with Azure Cache for Redis Enterprise. This tier will continue to support all Redis modules, such as RediSearch, RedisBloom, RedisTimeSeries, and vector search for generative AI applications.
Azure Function Triggers and Bindings in Azure Cache for Redis (GA)
We are also happy to announce that the Azure Functions triggers and bindings in Azure Cache for Redis are now generally available. This feature allows you to easily build serverless applications that connect with your Azure Cache for Redis data, without writing repetitive code. You can use different triggers, such as pub/sub channels, lists, streams, and key space notifications, to run your functions based on events in your cache. You can also use input and output bindings to read and write data from and to your cache within your function code. The Azure Functions triggers and bindings for Redis support various languages, such as C#, Java, Node.js, Python, and PowerShell. They work with both premium and durable functions, and support for consumption functions is being rolled out now on a regional basis. To learn more about this feature and how to get started, check out the tutorial in our documentation, or read about how to use Functions to refresh expired keys in Redis.
Read through cache using Azure Functions
Event based architectures with Azure Cache for Redis & Azure Functions triggers
Microsoft Entra ID for Authentication and Authorization (GA)
Lastly, we are pleased to announce the general availability of Microsoft Entra ID for authentication and authorization. Microsoft Entra ID allows you to assign permissions to your Entra ID identities to control data access policies for your cache.
Using Microsoft Entra ID for authentication provides you with a secure and flexible way to manage your data access policies and allows you to use Microsoft Entra ID identities, such as service principals and managed identities, to authenticate to your cache. This eliminates the need to store and rotate access keys and simplifies the credential management process. It also enables you to assign permissions to your Microsoft Entra ID identities, and control which commands and keys they can access in your cache. This helps you enforce the principle of least privilege and protect your data from unauthorized access. Learn more about Microsoft Entra ID here and how to configure it for your Azure Cache for Redis here.
Resources
Azure Cache for Redis
Building .NET Based Intelligent Apps with Azure Cache for Redis
Making .NET intelligent apps smarter and consistent with Redis
ChatGPT + Enterprise data with Azure OpenAI and Azure Cognitive Search (.NET) Demo
Vector similarity search in Azure Cache for Redis
Enterprise E1 SKU (Preview)
Azure Function Trigger and Bindings in Azure Cache for Redis (GA)
How to Refresh Expired Keys in Redis using Azure Functions
Get started with Azure Functions triggers and bindings in Azure Cache for Redis
Create a write-behind cache by using Azure Functions and Azure Cache for Redis
Microsoft Entra ID Authentication and Authorization (GA)
Microsoft Entra ID documentation
Azure Firewall integration in Copilot for Security: protect networks at machine speed with Gen AI
Azure Firewall is a cloud-native and intelligent network firewall security service that provides best-of-breed threat protection for your cloud workloads running in Azure. It’s a fully stateful firewall as a service with built-in high availability and unrestricted cloud scalability. In this blog we will focus on the newly announced Azure Firewall integration in Copilot for Security.
The Azure Firewall integration in Copilot for Security helps analysts perform detailed investigations of the malicious traffic intercepted by the IDPS feature of their firewalls across their entire fleet using natural language questions in the Copilot for Security standalone experience.
These capabilities were announced at RSA. Take a look at this blog to learn more about the user journey and value that Copilot can deliver: Bringing generative AI to Azure network security with new Microsoft Copilot integrations.
There are four primary capabilities now in public preview which are outlined below.
Get top IDPS signature hits
This capability retrieves the top IDPS signature hits for an Azure Firewall. It helps the user get information about the traffic intercepted by the IDPS feature by simply asking natural language questions instead of the user having to construct KQL queries manually.
Get details on an IDPS signature
This capability enriches the threat profile of an IDPS signature beyond the information found in logs. It helps the user get additional details about an IDPS signature instead of requiring them to manually source this information. The Microsoft Defender Threat Intelligence plugin is another source that Copilot may use to provide threat intelligence for IDPS signatures.
Search across firewalls for an IDPS signature
This capability looks for a given IDPS signature across your tenant, subscription or resource group. It helps users perform a fleet-wide search (over any scope) for a threat across all their Firewalls instead of searching for the threat manually.
Secure your environment using IDPS
This capability generates recommendations to secure your environment using Azure Firewall’s IDPS feature. It helps users get information from documentation about using Azure Firewall’s IDPS feature to secure their environment instead of having to look up this information manually. Copilot for Security may also use the Ask Microsoft Documentation capability to provide this information.
Get started
Learn more in our documentation about these capabilities and how to access them in Microsoft Copilot for Security today!
Abhinav Sriram,
Product Manager
Deploy and Scale Spring Batch in the Cloud – with Adaptive Cost Control
You can now use Azure Spring Apps to effectively run Spring Batch applications with adaptive cost control. You only pay when batch jobs are running, and you can simply lift and shift your Spring Batch jobs with no code change.
Spring Batch is a framework for processing large amounts of data in Java applications. It provides reusable functions for logging, transaction management, job statistics, job restart, skipping errors, and resource management. It also supports high-performance tasks through optimization and partitioning. Introduced in March 2008, Spring Batch is popular among Java developers and is part of the Spring portfolio. It is widely used in modern enterprise systems to handle complex batch processing tasks efficiently.
Running Spring Batch jobs in the cloud presents several challenges:
Scalability: Ensuring batch jobs can scale efficiently to handle large volumes of data.
Cost Management: Controlling costs by only paying for resources when jobs are running.
Job Lifecycle Management: Managing the lifecycle of batch jobs, including scheduling, monitoring, and restarting jobs if they fail.
Infrastructure Management: Handling the underlying infrastructure, such as servers and storage, required to run batch jobs.
Security: Securing the batch jobs and the data they process.
Monitoring: Setting up effective monitoring and logging for job performance and errors.
Again, you can now use Azure Spring Apps to effectively run Spring Batch applications with adaptive cost control:
You only pay when batch jobs are running.
You can simply lift and shift your Spring Batch jobs with no code change.
We are announcing the public preview of Jobs in Azure Spring Apps to enable you to deploy and scale Spring Batch applications without worrying about job scalability, cost control, lifecycle, infrastructure, security, and monitoring. This makes it easier to handle large-scale data processing efficiently, leveraging the flexibility and scalability of the cloud.
Introduction to Jobs in Azure Spring Apps
Jobs in Azure Spring Apps are tasks with a finite lifespan — they start, perform processing, and exit upon completion. Each job execution typically handles a single unit of work and can run from minutes to hours, with multiple executions running simultaneously. Examples include batch processes that run on demand and scheduled tasks — a great fit for scenarios such as data processing, machine learning, building intelligence for AI applications, and any scenario where on-demand processing is required. This capability enables developers to efficiently manage and scale tasks within their applications, ensuring optimized performance and resource usage in a cloud environment.
Jobs in Azure Spring Apps enable you to run containerized, run-to-completion tasks within your environment. They will support three trigger types:
Manual: Triggered on demand by a user or application.
Schedule: Runs on a recurring schedule.
Event: Triggered by an event, like a message in a queue, and can be used for CI/CD pipeline build agents.
Currently, the public preview supports manual triggers. Our engineering team is actively working on adding support for scheduled and event-based triggers, which will be available soon. This ongoing development ensures that you can fully leverage the flexibility and power of Azure Spring Apps for all your batch processing needs.
Jobs share the same environment as your Spring applications, enabling shared resources like networking and storage. You can create and manage jobs, bind secrets with Azure Key Vault, secure communications, and monitor jobs, just like your Spring applications in Azure Spring Apps. You can combine Jobs and Apps to build powerful solutions.
Deploy Spring Batch Jobs in 3 Easy Steps
With these simple steps, you can quickly deploy and run your Spring Batch jobs on Azure Spring Apps.
Achieve Cost Efficiency and Simplicity with Adaptive Cost Control for Spring Batch Jobs
Let’s use an example to explain adaptive cost control. Suppose you have a Spring Batch job needing 8 vCPUs and 16 GB of memory. Normally, you’d use a larger virtual machine, like an Azure Virtual Machine D16v5, costing around $572 USD per month. Even if you run the job for only 2 hours a day, you still pay for the full month and handle maintenance for the OS, packages, JDK, and APM.
With Azure Spring Apps, you allocate 8 vCPUs and 16 GB for just the job’s runtime, say 60 hours a month. This costs around $45 USD per month, with all underlying infrastructure maintenance — OS, packages, JDK, and APM — handled for you. This reduces both infrastructure costs and the effort required by your developers and platform engineers. This approach is known as adaptive cost control.
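A quick back-of-envelope check of the figures above, using the article's $572-per-month VM price and the common approximation of about 730 hours in a month. The derived hourly rate is not an official Azure price, just arithmetic on the article's own numbers:

```python
# Compare paying for a VM all month versus paying only for job runtime.
vm_monthly_usd = 572   # D16v5 figure from the article
hours_per_month = 730  # standard ~730 h/month approximation
job_hours = 60         # the article's example: ~2 hours/day

vm_hourly = vm_monthly_usd / hours_per_month
pay_per_use = vm_hourly * job_hours
print(f"~${pay_per_use:.0f}/month for {job_hours} job hours")  # ~$47/month
```

The result lands in the same ballpark as the article's roughly $45 per month, the gap being explained by the fact that the per-second job rate is priced independently of the VM rate.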
Deploy Spring Batch Jobs and Share Your Feedback
Azure Spring Apps delivers simplicity and productivity, and you can leverage Spring experts to make your projects even more successful. You can easily deploy your Spring and polyglot applications – and now Spring Batch jobs – to the cloud and get them up and running in no time. It’s a golden path to production that simplifies the deployment process and optimizes your resource usage. We’ll continue to innovate tools and optimize services for streamlining Spring app migration to the cloud at scale and running those Spring apps efficiently and economically – Faster, Cheaper, and Better.
And the best part? We’re offering FREE monthly grants on all tiers – 50 vCPU hours and 100 GB hours per tier. This is the number of FREE hours you get BEFORE any usage is billed, giving you a chance to test out the service without any financial charges.
So why wait? Take advantage of our FREE monthly grants and deploy your first Spring Batch Job to Azure Spring Apps today!
Go to aka.ms/first-spring-batch-job!
.NET8 MAUI Image “gif” animation Debug vs. Release
Has anyone figured out how to get a MAUI Image “gif” animation to work in Release mode?
Using Visual Studio 2022’s Android Device Manager, Emulator set to Tablet M-DPI 10.1in – API 34, Android 14.0 – API 34, the MAUI Image animation in Debug mode works every single time! Awesome! However, when I switch the Build to Release mode and Deploy to the Emulator, the application responds just fine but I see an Image control presenting a FROZEN “gif” and I don’t know how to solve the problem.
I experience the same FROZEN “gif” problem if the Emulator is running Pixel 6 Pro Android 14 – API 34.
Using the Debug build, Pixel 6 Pro Android 14 – API 34 Emulator shows the Image control animating the “gif” perfectly!
However, switching Build to Release mode and Deploying to the Pixel 6 Pro Emulator, again I experience the application responding just fine, but the Image control presents a FROZEN “gif”.
Here’s my XAML definition for my Image element:
<Image
x:Name="ClintHatGif"
Source="clinteastwood.gif"
IsAnimationPlaying="True"
Aspect="AspectFit"
VerticalOptions="Center"
HeightRequest="180" />
When I select my Project and visit “Manage NuGet Packages” and select the “Updates” tab, no updates appear. So, I think I’ve got the latest.
Maybe you know of a NuGet Package or a Build Release setting that solves the problem? I’m unsure how to proceed.
Thanks for reading this post.
FILTER and COUNTA function returning 1 even when no data is found
Hi,
I am using the formula below to track a unique count of Red Hat Enterprise OS devices in a specific migration wave. However, the formula is returning a count of “1” even though there are no Red Hat Enterprise OSes in the wave in question. How can I modify this formula to provide an accurate unique count of Red Hat devices?
=COUNTA(UNIQUE(FILTER(MasterServerToApp[Server], (MasterServerToApp[Wave] = B4) * ISNUMBER(SEARCH("Red Hat Enterprise", MasterServerToApp[OS Trim])))))
Below are two screenshots which prove there are “0” Red Hat Enterprise devices, however the dashboard still shows a value of “1”
Thanks,
Connor
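A common cause worth checking: when FILTER finds no matches and no if_empty argument is supplied, it returns a #CALC! error, and COUNTA counts that single error value as 1. For clarity, here is a plain-Python restatement of what the formula intends; the sample rows are made up, and note that an empty filter correctly yields 0:

```python
# Fabricated stand-ins for the MasterServerToApp table rows.
rows = [
    {"Server": "srv-01", "Wave": "Wave 2", "OS Trim": "Red Hat Enterprise 8"},
    {"Server": "srv-02", "Wave": "Wave 2", "OS Trim": "Windows Server 2019"},
    {"Server": "srv-01", "Wave": "Wave 2", "OS Trim": "Red Hat Enterprise 8"},
]

def unique_redhat_count(rows, wave):
    """Filter to the wave, keep Red Hat Enterprise rows, de-duplicate
    server names, and count: the FILTER -> UNIQUE -> COUNTA pipeline."""
    matches = {
        r["Server"]
        for r in rows
        if r["Wave"] == wave and "Red Hat Enterprise" in r["OS Trim"]
    }
    return len(matches)

print(unique_redhat_count(rows, "Wave 2"))  # 1
print(unique_redhat_count(rows, "Wave 3"))  # 0
```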
Permanently deleted log analytics workspace in Azure and how to recover it ?
Have permanently deleted log analytics workspace in Azure environment and need to recover the deleted workspace.
Reason: I recreated a new Log Analytics workspace, but when I tried to check under Conditional Access –> Insights and reporting,
I received an “insufficient permissions” error message highlighting the deleted Log Analytics workspace.
/subscriptions/xxxxxxxxxxxxxxxxx/resourceGroups/rg-test-prod-uks-001/providers/Microsoft.OperationalInsights/workspaces/
Error code: 403 | Content: NewLogAnalyticsBlade
Any idea on how to recover the deleted workspace or how to fix this permission issue?
Impact: Not able to get audit logs from Conditional access policies to Log analytics workspace.
Thanks.
Scheduling a meeting with many required attendees
I’m trying to book a 2-hour meeting across 3 time zones (Eastern, Pacific, Mountain), and although the Scheduling Assistant is helpful, it’s still really tedious.
Does anyone know of a plugin or helper app that can do that work for me? Find two hours and then I can create the meeting?
Thanks
Join Our Post-Build AMA on Copilot for Microsoft 365 – May 23rd at 9 AM PDT
We are hosting a post-Build AMA event on Copilot for Microsoft 365 on Thursday, May 23rd, at 9 AM PDT / 12 PM EDT. This session will focus on the announcements made at Microsoft Build 2024 about Copilot for Microsoft 365. Be sure to RSVP and join us in the Copilot for Microsoft 365 Tech Community.
NEW outlook will NOT play sound when notifications appear.
I have looked absolutely everywhere and tried so many workarounds and settings changes: in the NEW Outlook, the WEB Outlook, and reverting back to the OLD Outlook and back again. I ENSURED my Windows notification settings, battery settings, and focus assist settings are all CORRECT. I have enabled notifications for Outlook on Windows and for all apps, and I have gone through ALL the old Outlook, web, and new (desktop) Outlook notification settings, and even my Windows control center’s sound settings.
I receive sounds for EVERY other app I have notifications set for, except Outlook. I am very unhappy with the new Outlook but am determined to continue using it, as I am now comfortable with it. I am using an HP Dragonfly 3 laptop with Windows 11. How difficult can it be to get the sound working properly? I work in tech and, as I’ve said, I’ve gone through ALL the Windows help forum discussions and many others outside of Windows/Outlook trying to find a solution.
Has anyone with this issue actually found a solution?
Need Help With Rule
Hi! I am trying to create a new rule that will successfully sort emails like the attached example into a separate folder. The problem is that there is nothing in either the subject line or the body of the email that is unique. In the attached example, what you see for the body of the message is the entirety of the message; there is nothing beyond the reference number line that I could key on. Every email produced by our ticketing system includes “Ref:MSG” followed by a number, so my rule cannot key on that without picking up a lot of other messages that I don’t want to catch with this rule.
In the attached example, the text “Short Description: Move” would be enough to make the rule work, but I cannot figure out how to get at that. It’s not part of the subject line, it’s not part of the body, and I don’t find it in the message headers. However, if I right-click on that text and select the View Source option, it appears to be HTML. I get this:
</head><body><div>Short Description: Move
Is there any way that I can use that in a rule?
Thanks for any help that you can offer!
–Tom
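For context on why this is hard: standard Outlook rules match only the subject, the plain-text body, or the message headers, so text that exists only in the HTML source is out of their reach. As a hypothetical workaround outside the rule engine (the function and sample below are illustrative, not an Outlook feature), any script that can see the raw HTML body could match on that marker:

```python
import re

def matches_ticket_rule(html_body: str) -> bool:
    """Return True if the raw HTML contains the marker text that an
    Outlook rule cannot see ('Short Description: Move')."""
    return re.search(r"Short Description:\s*Move", html_body) is not None

# Sample raw HTML fragment like the one seen via View Source:
sample = "</head><body><div>Short Description: Move</div></body>"
```

Such a script would have to run against the mailbox itself (for example via an add-in or an automation flow), since rules alone cannot inspect the HTML source.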
Duplicate Domains Different Tenants
We have an issue: our tenant is registered as abwidget.com and our internal AD is abw.com. Once we started using Azure AD sync (hybrid mode), we found out there is a company that has registered abw.com as their domain. Now when our users try to log in, it directs them to the other company even though they are using abwidget.com to identify. Azure AD
Introducing Model Customization for Azure AI
We are thrilled to announce the launch of our Model Customization for Azure AI, an engineering service designed to accelerate our co-innovation with customers to deliver tailored AI solutions. Our commitment to empowering our customers extends beyond the provision of tools and platforms; we are offering an opportunity for selected customers to collaborate closely with our engineering and research teams to develop custom models tailored to their unique domain-specific needs.
Custom models can offer significant benefits for enterprises and complement other techniques such as fine-tuning, retrieval-augmented generation (RAG), and prompt engineering by encapsulating specialized domain knowledge and understanding nuanced context in the domain. By refining the model’s parameters for a specific domain, custom models can improve accuracy and enhance the model’s ability to comprehend the subtleties of language and context, as well as specific domain knowledge. This refinement aids in better generalization within that domain, enabling the model to perform well with new data while minimizing overfitting. Additionally, custom models can increase robustness, equipping the model to handle diverse scenarios and protect against potential vulnerabilities. They can also incorporate safety and ethical considerations, ensuring responsible and fair AI behavior. Moreover, custom models will be able to enhance language proficiency by refining the model’s ability to process and generate text in a specific language. This can lead to more efficient use of tokens, resulting in smoother and more coherent language output.
Engineering Excellence Meets Domain Expertise
At the heart of this co-innovation approach lies the synergy between Microsoft’s engineering excellence and the domain expertise of our customers. We understand that the challenges faced in specialized fields require customized AI solutions to maximize value realization.
For businesses, the ability to leverage AI that precisely understands and operates within their specific context can be highly beneficial, as it not only understands the intricacies of a specific domain but can also enhance the capabilities within that sphere. This level of customization can potentially improve accuracy in tasks such as customer service, predictive analytics, and decision-making processes, directly contributing to improved operational efficiency and customer satisfaction. Additionally, custom-trained models are designed to handle tasks that require understanding complex, specialized knowledge, offering the possibility of enhanced performance over standard models in these scenarios.
Our Model Customization service offers customers the opportunity to work hand-in-glove with our world-class AI engineers. By collaborating closely, we can develop models that are uniquely tailored to specific business needs, leveraging advanced techniques and extensive expertise to ensure that AI solutions are both accurate and contextually relevant. That is why we are offering this paid-for service with our expert engineering and science resources to help our customers.
For more information, please reach out to your Microsoft representatives or account managers.
As we embark on this journey together, we are not just providing a service; we are creating innovations that can define the future of domain-specific AI applications.
Learn more about Azure AI
Build with Azure AI Studio: ai.azure.com
Get the latest Azure AI news and resources
Apply now for access to Azure OpenAI Service
Learn more about What’s new in Azure OpenAI Service?
If you are a current Azure OpenAI customer and would like to add additional use cases, fill out the Azure OpenAI Additional Use Case form.
Responsible AI: Transparency Note for Azure OpenAI Service
Microsoft Tech Community – Latest Blogs –Read More
Announcing SharePoint Embedded General Availability
Today we’re pleased to announce the general availability of SharePoint Embedded, a new way to build file and document centric apps. SharePoint Embedded allows you to integrate advanced Microsoft 365 features into your apps including full featured collaborative functions from Office, Purview’s security and compliance tools, and Copilot capabilities. It also helps you build both enterprise line of business apps and independent software vendor (ISV) apps. SharePoint Embedded is a metered service with pay-as-you-go pricing. In addition, we are also excited to announce a private preview of SharePoint Embedded custom copilot experiences.
View all documentation on Microsoft Learn and register here to stay up to date with the latest newsletters and upcoming webinars.
Enterprises today often have files and documents spread across multiple systems, all with different capabilities, lowering user satisfaction and increasing administrative complexity. SharePoint Embedded delivers Microsoft 365 superpowers as part of any app and consolidates all files and documents within a universal document layer. Apps that manage files and documents with SharePoint Embedded have a common set of collaboration, compliance, security, and AI capabilities, all designed to delight users and admins.
SharePoint Embedded is a headless, API-only version of SharePoint, specifically built for apps. SharePoint Embedded introduces the ability for an app developer to create and manage a dedicated partition for their app within their Microsoft 365 tenant. This partition is logically separated from existing storage areas like SharePoint Online and OneDrive, but integrated with core Microsoft 365 services, including Office co-authoring, search, compliance, Copilot, business continuity, and more. And, since it’s a pay-as-you-go service, apps built on it have their own limits around things like API transaction rates, rather than being part of shared Microsoft 365 limits. SharePoint Embedded apps build and manage their own user experience layer and are managed by admins through familiar Microsoft 365 admin centers. ISVs can now create their own partitions within a customer’s M365 tenant, surfacing the same capabilities as part of their app. With an ISV app, tenants remain in control of their documents, and tenant-specific compliance settings such as retention periods automatically apply.
Building a file and document centric application presents unique challenges, from compliance to collaboration to AI. SharePoint Embedded handles all of this and simplifies and accelerates your file and document management roadmap, for any app. Developers leverage the robust and secure document management features of Microsoft 365 without the need to build or maintain their own infrastructure. IT professionals benefit from centralized administration and governance, ensuring compliance and security across all applications that use it. Users get the collaboration experience and productivity tools they love.
Teams at Microsoft already use SharePoint Embedded to provide apps like Microsoft Loop and Microsoft Designer with rich file and document management capabilities for use around the world. When you choose SharePoint Embedded, you’re using the exact same platform that Microsoft uses to build our own apps.
Many customers and partners like KPMG, Peppermint Technologies, BDO, AvePoint and more are already working with SharePoint Embedded to solve common business process and content management problems.
Proventeq, a long time Microsoft partner, is using SharePoint Embedded to build apps that help customers rationalize their document management footprint into a universal document layer powered by Microsoft 365.
“SharePoint Embedded is a great approach to managing documents originating in systems outside of Microsoft 365,” said Rakesh Chenchery, Chief Technology Officer at Proventeq, whose product Proventeq Document Management for Salesforce is generally available today. “SharePoint Embedded was simple to integrate into our existing app and gives us a high-performance solution with the easy to manage security and rich collaboration tools our customers are looking for.”
Announcing custom copilot experiences for SharePoint Embedded
In addition to out of the box integration with Microsoft 365 Copilot, today we are pleased to announce that custom copilot experiences based on your SharePoint Embedded managed data and built on the Copilot platform are now in private preview. With custom copilot experiences, you can create robust interactions with your SharePoint Embedded managed data, and easily surface these within your app. If you would like to nominate your company for the SharePoint Embedded custom copilot private preview, please complete this form.
Resources
Discover a new way of building and operating apps with SharePoint Embedded at SharePoint Embedded Overview | Microsoft Learn.
Learn more about SharePoint Embedded development on the Microsoft 365 Community Call SharePoint Embedded playlist.
Watch the SharePoint Embedded announcement at Microsoft BUILD.
Join the next SharePoint Embedded webinar here.
Register here to stay up to date with the latest from the SharePoint Embedded team.
General Availability of license-free standby replica for Azure SQL database
We are excited to announce the general availability of the license-free standby replica for Azure SQL Database, letting you save on licensing costs by designating your secondary disaster recovery database as a standby replica. Typically, license costs constitute about 40% of the total cost, so with a license-free standby replica the secondary will be about 40% less expensive.
To protect the database powering your application from region failures and to achieve higher business continuity, it is crucial to enable disaster recovery for the database. In some industries, having disaster recovery in place and conducting drills frequently is mandatory and part of compliance requirements. One of the biggest hindrances to enabling disaster recovery has been cost, as the secondary database is mainly used only in the event of a disaster.
When a secondary database replica is used only for disaster recovery, and doesn’t have any workloads running on it, or applications connecting to it, you can save on licensing costs by designating the database as a standby replica. Microsoft provides you with the number of vCores licensed to the primary database at no extra charge under the failover rights benefit in the product licensing terms for standby replica. You’re still billed for the compute and storage that the secondary database uses.
The standby database replica must only be used for disaster recovery. The following lists the only activities that are permitted on the standby database:
Perform maintenance operations, such as DBCC CHECKDB
Connect monitoring applications
Run disaster recovery drills
You can designate one secondary database (single database deployment model) as a license-free standby replica in the General Purpose and Business Critical service tiers with the provisioned compute tier. A license-free standby replica can be configured using the portal, PowerShell, or the CLI.
Additional capabilities added for general availability release are:
Perform an in-place update of a geo replica to a standby replica using the portal and REST API.
Assign a standby replica while creating a failover group using the portal and REST API.
Estimate the cost of a standby replica by using the Azure pricing calculator and selecting Standby replica in the Disaster Recovery dropdown.
For comprehensive details on the license-free standby replica, including limitations and frequently asked questions, please refer to the documentation.
The #1 factor in ADX/KQL database performance
In Power BI or any other tool
In this article I’ll show many variations of a query executed on a large table that contains public events arriving at GitHub.
The query summarizes data for 10 or 20 days and I compare the CPU consumption of the query in different syntax variations.
I mention only CPU time and not execution time because execution can vary by the cluster size and load on the cluster.
My purpose is to demonstrate how the query performs well when the date filter is used by the engine to limit the number of scanned extents (aka shards).
In some cases, the query scans all extents, and it takes a lot of CPU.
In other cases, only a small subset of the extents are scanned and performance is good.
In a follow-up article I’ll explain how Power BI and ADX dashboards can be used to filter and join tables in an optimal way.
Queries on a single table
1. The query summarizes 10 days of data.
An element is extracted from a JSON structure and a distinct count operation is performed on the extracted value. These two operations contribute significantly to the overall cost.
Above each query you can see the CPU seconds, the volume of scanned data and the number of scanned extents.
// 6.53 1.98GB 128
EventsFromLiveStream
| where CreatedAt between(datetime(2024-4-1)..10d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
2. The same query for 20 days. The cost is almost exactly double, which is expected.
This is the benchmark against which we can compare all other variations.
// 12.5 3.63GB 132
EventsFromLiveStream
| where CreatedAt between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
3. A function is applied to the datetime column, so the effect of filtering is lost. All the data is scanned and the cost is 4 times higher.
// 49.87 8.67GB all
EventsFromLiveStream
| extend shiftdata=datetime_add('hour',2,CreatedAt)
| where shiftdata between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
4. Another variation: shifting the datetime 2 hours forward and then filtering. Equally as bad as #3.
// 49.3 8.67GB all
EventsFromLiveStream
| extend shiftdata=CreatedAt + 2h
| where shiftdata between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
5. Another function (bin) is applied to the datetime column, but this time the filter is applied correctly. The cost is a bit higher because the bin function itself needs to be calculated.
// 13.42 3.79GB 132
EventsFromLiveStream
| extend Day=bin(CreatedAt,1d)
| where Day between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
6. Same as #5; startofday and startofmonth are also applied correctly.
// 13.51 3.79GB 132
EventsFromLiveStream
| extend Day=startofday(CreatedAt)
| where Day between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
7. The worst-case scenario: 45 times slower than the base.
Trying to shift the datetime value using a very expensive function that needs to be applied to all rows; the filter cannot be used either.
In this case, filtering on 10 days or 20 days costs the same because almost all the CPU is spent on the datetime_utc_to_local function.
// 9:51.67 8.7GB all
EventsFromLiveStream
| extend LocalTime=datetime_utc_to_local(CreatedAt,'America/Buenos_Aires')
| where LocalTime between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
8. Shifting the filter range instead of shifting the data.
Cost is back close to the base.
Notice that leaving in the statement that calculates the local time doesn’t cost anything: the result is not used, so it is never calculated.
// 19.5 3.53GB 154
EventsFromLiveStream
| extend LocalTime=datetime_utc_to_local(CreatedAt,'America/Buenos_Aires')
| where CreatedAt between(datetime_local_to_utc(datetime(2024-4-1),'America/Buenos_Aires')..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
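The pattern in #8 is not specific to KQL: convert the local-time boundaries to UTC once, then compare against the stored UTC column, instead of converting every row. A language-agnostic sketch in Python (the timestamps and function are illustrative):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

TZ = ZoneInfo("America/Buenos_Aires")
UTC = ZoneInfo("UTC")

def filter_rows(rows_utc, local_start, days):
    """Shift the *filter boundaries* to UTC once; the per-row
    comparison stays on the raw UTC column, so no row is converted."""
    start_utc = local_start.replace(tzinfo=TZ).astimezone(UTC)
    end_utc = start_utc + timedelta(days=days)
    return [t for t in rows_utc if start_utc <= t < end_utc]

# One event at noon UTC on each day of April 2024.
rows = [datetime(2024, 4, d, 12, tzinfo=UTC) for d in range(1, 30)]
subset = filter_rows(rows, datetime(2024, 4, 1), 20)
```

In a columnar engine this is exactly what lets the filter prune extents: the predicate stays on the indexed column.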
9. Add another where clause on the base datetime column.
Still more expensive, but not by such a big margin.
Notice that although the filter on the original column appears after the calculation of the shifted datetime value, it is executed before it, so only a small subset of the data is actually shifted.
// 19.5 3.63GB 154
EventsFromLiveStream
| extend LocalTime=datetime_utc_to_local(CreatedAt,'America/Buenos_Aires')
| where CreatedAt between (datetime(2024-3-30) ..21d )
| where LocalTime between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
Applying the filter on the left side of a join
10. A dates table is joined with the Events table.
The dates table is on the left side of the join.
The filter on the dates table is applied to the right side of the join when the filter uses in or ==.
// 1:16 8.14GB 134
let Calendar = range Day from datetime(2024-1-1) to datetime(2024-12-31) step 1d;
Calendar | where Day in(
datetime(2024-04-01T00:00:00Z),
datetime(2024-04-02T00:00:00Z),
datetime(2024-04-03T00:00:00Z),
datetime(2024-04-04T00:00:00Z),
datetime(2024-04-05T00:00:00Z),
datetime(2024-04-06T00:00:00Z),
datetime(2024-04-07T00:00:00Z),
datetime(2024-04-08T00:00:00Z),
datetime(2024-04-09T00:00:00Z),
datetime(2024-04-10T00:00:00Z),
datetime(2024-04-11T00:00:00Z),
datetime(2024-04-12T00:00:00Z),
datetime(2024-04-13T00:00:00Z),
datetime(2024-04-14T00:00:00Z),
datetime(2024-04-15T00:00:00Z),
datetime(2024-04-16T00:00:00Z),
datetime(2024-04-17T00:00:00Z),
datetime(2024-04-18T00:00:00Z),
datetime(2024-04-19T00:00:00Z),
datetime(2024-04-20T00:00:00Z))
| join kind=inner hint.strategy=broadcast
(EventsFromLiveStream | extend Day=startofday(CreatedAt)) on Day
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
11. When the filter on the left side uses between, <, or >, it is not applied to the right side of the join.
The results are correct, but performance is bad.
// 47:40 206.5GB all
let Calendar = range Day from datetime(2024-1-1) to datetime(2024-12-31) step 1d;
Calendar | where Day between(datetime(2024-4-1)..20d)
| join kind=inner hint.strategy=broadcast
(EventsFromLiveStream | extend Day=startofday(CreatedAt)) on Day
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
12. Improvement on the join: extract the Login value inside the parentheses of the right-side table. The join still has a cost, but the overall cost is much less than in #10.
The reason is that the dynamic column Actor does not need to be part of the join, only the Login value.
// 24 8.33GB 150
let Calendar = range Day from datetime(2024-1-1) to datetime(2024-12-31) step 1d;
Calendar | where Day in(
datetime(2024-04-01T00:00:00Z),
datetime(2024-04-02T00:00:00Z),
datetime(2024-04-03T00:00:00Z),
datetime(2024-04-04T00:00:00Z),
datetime(2024-04-05T00:00:00Z),
datetime(2024-04-06T00:00:00Z),
datetime(2024-04-07T00:00:00Z),
datetime(2024-04-08T00:00:00Z),
datetime(2024-04-09T00:00:00Z),
datetime(2024-04-10T00:00:00Z),
datetime(2024-04-11T00:00:00Z),
datetime(2024-04-12T00:00:00Z),
datetime(2024-04-13T00:00:00Z),
datetime(2024-04-14T00:00:00Z),
datetime(2024-04-15T00:00:00Z),
datetime(2024-04-16T00:00:00Z),
datetime(2024-04-17T00:00:00Z),
datetime(2024-04-18T00:00:00Z),
datetime(2024-04-19T00:00:00Z),
datetime(2024-04-20T00:00:00Z))
| join kind=inner hint.strategy=broadcast
(EventsFromLiveStream | extend Day=startofday(CreatedAt),Login=tostring(Actor.login)) on Day
| summarize count(),dcount(Login) by Type
activate matlab.sh file is missing in the bin folder
While installing MATLAB according to the documentation, the activate file is missing. matlab, installation MATLAB Answers — New Questions
Partner Case Study Series | proMX: Dynamics 365 add-ons improve project management at Interflex
Helping businesses improve efficiency and achieve digital transformation
proMX is a Microsoft partner headquartered in Nuremberg, Germany. Since its founding in 2000, proMX has helped small and large businesses transform themselves into digital organizations, and it has supported them in their efforts to become more efficient. One of the ways proMX does this is through applications designed for Dynamics 365. Numerous add-ons, such as Time Tracking for Dynamics 365 Project Service Automation, are available on a free-trial basis on Microsoft AppSource.
proMX reports that, across several companies, the integration of its project management add-ons has led to improvements regarding administrative working hours, project documentation costs, and, most significantly, per-employee capacity utilization. On average, revenue per employee has increased by 15 percent, while costs have remained stable.
Continue reading here
**Explore all case studies or submit your own**
function-loop-if statement
"Write a function that asks the user to input 10 numbers and calculates the sum of the odd numbers entered (use for loop + if)" I couldn’t figure it out at all as functions cannot be added in loops, so im not sure"Write a function that asks the user to input 10 numbers and calculates the sum of the odd numbers entered (use for loop + if)" I couldn’t figure it out at all as functions cannot be added in loops, so im not sure "Write a function that asks the user to input 10 numbers and calculates the sum of the odd numbers entered (use for loop + if)" I couldn’t figure it out at all as functions cannot be added in loops, so im not sure for loop, if statement, function MATLAB Answers — New Questions