Category Archives: Microsoft
Need Help With Rule
Hi! I am trying to create a new rule that will successfully sort emails like the attached example into a separate folder. The problem is that there is nothing in either the subject line or the body of the email that is unique. In the attached example, what you see for the body of the message is the entirety of the message; there is nothing beyond the reference number line that I could key on. Every email produced by our ticketing system includes “Ref:MSG” followed by a number, so my rule cannot key on that without picking up a lot of other messages that I don’t want to catch with this rule.
In the attached example, the text “Short Description: Move” would be enough to make the rule work, but I cannot figure out how to get at that. It’s not part of the subject line, it’s not part of the body, and I don’t find it in the message headers. However, if I right-click on that text and select the View Source option, it appears to be HTML. I get this:
</head><body><div>Short Description: Move
Is there any way that I can use that in a rule?
Thanks for any help that you can offer!
–Tom
Duplicate Domains Different Tenants
We have an issue: we have a tenant registered as abwidget.com, and our internal AD is abw.com. Once we started using Azure AD sync (hybrid mode), we found out there is a company that has registered abw.com as their domain. Now when our users try to log in, it directs them to the other company, even though they are using abwidget.com to identify themselves.
Introducing Model Customization for Azure AI
We are thrilled to announce the launch of our Model Customization for Azure AI, an engineering service designed to accelerate our co-innovation with customers to deliver tailored AI solutions. Our commitment to empowering our customers extends beyond the provision of tools and platforms; we are offering an opportunity for selected customers to collaborate closely with our engineering and research teams to develop custom models tailored to their unique domain-specific needs.
Custom models can offer significant benefits for enterprises and complement other techniques such as fine-tuning, retrieval-augmented generation (RAG), and prompt engineering by encapsulating specialized domain knowledge and understanding nuanced context in the domain. By refining the model’s parameters for a specific domain, custom models can improve accuracy and enhance the model’s ability to comprehend the subtleties of language and context, as well as specific domain knowledge. This refinement aids in better generalization within that domain, enabling the model to perform well with new data while minimizing overfitting. Additionally, custom models can increase robustness, equipping the model to handle diverse scenarios and protect against potential vulnerabilities. They can also incorporate safety and ethical considerations, ensuring responsible and fair AI behavior. Moreover, custom models will be able to enhance language proficiency by refining the model’s ability to process and generate text in a specific language. This can lead to more efficient use of tokens, resulting in smoother and more coherent language output.
Engineering Excellence Meets Domain Expertise
At the heart of this co-innovation approach lies the synergy between Microsoft’s engineering excellence and the domain expertise of our customers. We understand that the challenges faced in specialized fields require customized AI solutions to maximize value realization.
For businesses, the ability to leverage AI that precisely understands and operates within their specific context can be highly beneficial, as it not only understands the intricacies of a specific domain but also enhances the capabilities within that sphere. This level of customization can potentially improve accuracy in tasks such as customer service, predictive analytics, and decision-making processes, directly contributing to improved operational efficiency and customer satisfaction. Additionally, custom-trained models are designed to handle tasks that require understanding complex, specialized knowledge, offering the possibility of enhanced performance over standard models in these scenarios.
Our Model Customization service offers customers the opportunity to work hand-in-glove with our world-class AI engineers. By collaborating closely, we can develop models that are uniquely tailored to specific business needs, leveraging advanced techniques and extensive expertise to ensure that AI solutions are both accurate and contextually relevant. That is why we are offering this paid-for service with our expert engineering and science resources to help our customers.
For more information, please reach out to your Microsoft representatives or account managers.
As we embark on this journey together, we are not just providing a service; we are creating innovations that can define the future of domain-specific AI applications.
Learn more about Azure AI
Build with Azure AI Studio: ai.azure.com
Get the latest Azure AI news and resources
Apply now for access to Azure OpenAI Service
Learn more about What’s new in Azure OpenAI Service?
If you are a current Azure OpenAI customer and would like to add additional use cases, fill out the Azure OpenAI Additional Use Case form.
Responsible AI: Transparency Note for Azure OpenAI Service
Announcing SharePoint Embedded General Availability
Today we’re pleased to announce the general availability of SharePoint Embedded, a new way to build file and document centric apps. SharePoint Embedded allows you to integrate advanced Microsoft 365 features into your apps including full featured collaborative functions from Office, Purview’s security and compliance tools, and Copilot capabilities. It also helps you build both enterprise line of business apps and independent software vendor (ISV) apps. SharePoint Embedded is a metered service with pay-as-you-go pricing. In addition, we are also excited to announce a private preview of SharePoint Embedded custom copilot experiences.
View all documentation on Microsoft Learn and register here to stay up to date with the latest newsletters and upcoming webinars.
Enterprises today often have files and documents spread across multiple systems, all with different capabilities, lowering user satisfaction and increasing administrative complexity. SharePoint Embedded delivers Microsoft 365 superpowers as part of any app and consolidates all files and documents within a universal document layer. Apps that manage files and documents with SharePoint Embedded have a common set of collaboration, compliance, security, and AI capabilities, all designed to delight users and admins.
SharePoint Embedded is a headless, API-only version of SharePoint, specifically built for apps. SharePoint Embedded introduces the ability for an app developer to create and manage a dedicated partition for their app within their Microsoft 365 tenant. This partition is logically separated from existing storage areas like SharePoint Online and OneDrive, but integrated with core Microsoft 365 services, including Office co-authoring, search, compliance, Copilot, business continuity, and more. And, since it’s a pay-as-you-go service, apps built on it have their own limits around things like API transaction rates, rather than being part of shared Microsoft 365 limits. SharePoint Embedded apps build and manage their own user experience layer and are managed by admins through familiar Microsoft 365 admin centers. ISVs can now create their own partitions within a customer’s M365 tenant, surfacing the same capabilities as part of their app. With an ISV app, tenants remain in control of their documents, and tenant-specific compliance settings such as retention periods automatically apply.
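As a purely illustrative sketch of what the API-only model looks like from an app's point of view, the snippet below creates a storage container (the app's partition) with a plain REST call in Python. The Graph route, the container type ID, and the app permissions shown are assumptions for illustration, not details taken from this announcement; consult the SharePoint Embedded documentation for the exact API.

```python
# Hypothetical sketch: creating a SharePoint Embedded storage container via Microsoft Graph.
# Assumptions (not from this announcement): the Graph beta fileStorage containers endpoint,
# an app registration with the required permissions, and the placeholder containerTypeId.
import requests
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<app-client-id>",
    client_secret="<app-client-secret>",
)
token = credential.get_token("https://graph.microsoft.com/.default").token

resp = requests.post(
    "https://graph.microsoft.com/beta/storage/fileStorage/containers",  # assumed Graph route
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json={
        "displayName": "Contoso Invoices",              # name of the app's storage partition
        "containerTypeId": "<your-container-type-id>",  # placeholder container type
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["id"])  # container id used for subsequent file and document operations
```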
Building a file and document centric application presents unique challenges, from compliance to collaboration to AI. SharePoint Embedded handles all of this and simplifies and accelerates your file and document management roadmap, for any app. Developers leverage the robust and secure document management features of Microsoft 365 without the need to build or maintain their own infrastructure. IT professionals benefit from centralized administration and governance, ensuring compliance and security across all applications that use it. Users get the collaboration experience and productivity tools they love.
Teams at Microsoft already use SharePoint Embedded to provide apps like Microsoft Loop and Microsoft Designer with rich file and document management capabilities for use around the world. When you choose SharePoint Embedded, you’re using the exact same platform that Microsoft uses to build our own apps.
Many customers and partners like KPMG, Peppermint Technologies, BDO, AvePoint and more are already working with SharePoint Embedded to solve common business process and content management problems.
Proventeq, a long time Microsoft partner, is using SharePoint Embedded to build apps that help customers rationalize their document management footprint into a universal document layer powered by Microsoft 365.
“SharePoint Embedded is a great approach to managing documents originating in systems outside of Microsoft 365,” said Rakesh Chenchery, Chief Technology Officer at Proventeq, whose product Proventeq Document Management for Salesforce is generally available today. “SharePoint Embedded was simple to integrate into our existing app and gives us a high-performance solution with the easy to manage security and rich collaboration tools our customers are looking for.”
Announcing custom copilot experiences for SharePoint Embedded
In addition to out of the box integration with Microsoft 365 Copilot, today we are pleased to announce that custom copilot experiences based on your SharePoint Embedded managed data and built on the Copilot platform are now in private preview. With custom copilot experiences, you can create robust interactions with your SharePoint Embedded managed data, and easily surface these within your app. If you would like to nominate your company for the SharePoint Embedded custom copilot private preview, please complete this form.
Resources
Discover a new way of building and operating apps with SharePoint Embedded at SharePoint Embedded Overview | Microsoft Learn.
Learn more about SharePoint Embedded development on the Microsoft 365 Community Call SharePoint Embedded playlist.
Watch the SharePoint Embedded announcement at Microsoft BUILD.
Join the next SharePoint Embedded webinar here.
Register here to stay up to date with the latest from the SharePoint Embedded team.
General Availability of license-free standby replica for Azure SQL database
We are excited to announce the general availability of the license-free standby replica for Azure SQL Database, letting you save on licensing costs by designating your secondary disaster recovery database as a standby replica. Licensing typically accounts for about 40% of the cost of a database, so with a license-free standby replica the secondary will be about 40% less expensive.
To protect the database powering your application from region failures and achieve higher business continuity, it is crucial to enable disaster recovery for the database. In some industries, having disaster recovery in place and conducting drills frequently is mandatory and part of compliance requirements. One of the biggest hindrances to enabling disaster recovery has been cost, since the secondary database is used mainly in the event of a disaster.
When a secondary database replica is used only for disaster recovery, and doesn’t have any workloads running on it, or applications connecting to it, you can save on licensing costs by designating the database as a standby replica. Microsoft provides you with the number of vCores licensed to the primary database at no extra charge under the failover rights benefit in the product licensing terms for standby replica. You’re still billed for the compute and storage that the secondary database uses.
The standby database replica must only be used for disaster recovery. The following lists the only activities that are permitted on the standby database:
Perform maintenance operations, such as checkDB
Connect monitoring applications
Run disaster recovery drills
You can designate one secondary database (single database deployment model) as a license-free standby replica in the General Purpose and Business Critical service tiers with the provisioned compute tier. You can configure the license-free standby replica using the Azure portal, PowerShell, or the CLI.
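For illustration, here is a minimal Python sketch of designating a geo-secondary as a standby replica through the ARM REST API. The api-version and resource names are placeholders, and the createMode/secondaryType property values should be verified against the documentation before use.

```python
# Rough sketch: designating a geo-secondary as a license-free standby replica via ARM.
# Assumptions (verify against the documentation): the "Standby" secondaryType value,
# the api-version shown, and the placeholder subscription/server/database names.
import requests
from azure.identity import DefaultAzureCredential

SUB, RG = "<subscription-id>", "<resource-group>"
PRIMARY_DB = (
    f"/subscriptions/{SUB}/resourceGroups/{RG}"
    "/providers/Microsoft.Sql/servers/<primary-server>/databases/<db>"
)
SECONDARY_URL = (
    f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
    "/providers/Microsoft.Sql/servers/<secondary-server>/databases/<db>"
    "?api-version=2023-05-01-preview"  # assumed api-version
)

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
resp = requests.put(
    SECONDARY_URL,
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json={
        "location": "<secondary-region>",
        "properties": {
            "createMode": "Secondary",        # create the database as a geo-replica
            "sourceDatabaseId": PRIMARY_DB,
            "secondaryType": "Standby",       # request the license-free standby designation
        },
    },
    timeout=60,
)
resp.raise_for_status()
```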
Additional capabilities added for general availability release are:
Perform in place update of geo replica to standby replica using portal and REST API.
Assign standby replica while creating failover group using portal and REST API.
Estimate cost for standby replica by using Azure pricing calculator and selecting Standby replica in Disaster Recovery dropdown.
For comprehensive details on the license-free standby replica, including limitations and frequently asked questions, please refer to the documentation.
The #1 factor in ADX/KQL database performance
In Power BI or any other tool
In this article I’ll show many variations of a query executed on a large table that contains public events arriving at GitHub.
The queries summarize data for 10 or 20 days, and I compare the CPU consumption of the query across different syntax variations.
I mention only CPU time and not execution time, because execution time can vary with the cluster size and the load on the cluster.
My purpose is to demonstrate how the query performs well when the date filter is used by the engine to limit the number of scanned extents (aka shards).
In some cases, the query scans all extents, and it takes a lot of CPU.
In other cases, only a small subset of the extents are scanned and performance is good.
In a follow-up article I’ll explain how Power BI and ADX dashboards can be used to filter and join tables in an optimal way.
Queries on a single table
1. The query summarizes 10 days of data.
An element is extracted from a JSON structure and a distinct count operation is performed on the extracted value. These two operations contribute significantly to the overall cost.
Above each query you can see the CPU seconds, the volume of scanned data and the number of scanned extents.
// 6.53 1.98GB 128
EventsFromLiveStream
| where CreatedAt between(datetime(2024-4-1)..10d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
2. The same for 20 days. The cost is almost exactly double, which is expected.
This is the benchmark against which we can compare all other variations.
// 12.5 3.63GB 132
EventsFromLiveStream
| where CreatedAt between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
3. A function is applied to the datetime column, so the effect of filtering is lost. All data is scanned and the cost is 4 times higher.
// 49.87 8.67GB all
EventsFromLiveStream
| extend shiftdata=datetime_add('hour',2,CreatedAt)
| where shiftdata between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
4. Another variation: shifting the datetime 2 hours forward and then filtering. Just as bad as #3.
// 49.3 8.67GB all
EventsFromLiveStream
| extend shiftdata=CreatedAt + 2h
| where shiftdata between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
5. Another function (bin) is applied to the datetime column but this time the filter is applied correctly. Cost is a bit higher because the actual bin function needs to be calculated.
// 13.42 3.79GB 132
EventsFromLiveStream
| extend Day=bin(CreatedAt,1d)
| where Day between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
6. Same as #5: startofday and startofmonth are also applied correctly.
// 13.51 3.79GB 132
EventsFromLiveStream
| extend Day=startofday(CreatedAt)
| where Day between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
7. The worst-case scenario: 45 times slower than the base.
Trying to shift the datetime value using a very expensive function that needs to be applied to all rows. Also, the filter cannot be used.
In this case, filtering on 10 days or 20 days costs the same because almost all the CPU is spent on the datetime_utc_to_local function.
// 9:51.67 8.7GB all
EventsFromLiveStream
| extend LocalTime=datetime_utc_to_local(CreatedAt,'America/Buenos_Aires')
| where LocalTime between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
8. Shifting the filter range instead of shifting the data.
Cost is back to base.
Notice that leaving in the statement that calculates the local time doesn’t cost anything: because the result is not used, it is not calculated.
// 19.5 3.53GB 154
EventsFromLiveStream
| extend LocalTime=datetime_utc_to_local(CreatedAt,'America/Buenos_Aires')
| where CreatedAt between(datetime_local_to_utc(datetime(2024-4-1),'America/Buenos_Aires')..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
9. Add another where clause on the base datetime column.
Still more expensive than the base, but not by such a big margin.
Notice that although the filter on the original column is mentioned after the calculation of the shifted datetime value, it is executed before it, so only a small subset of the data is actually shifted.
// 19.5 3.63GB 154
EventsFromLiveStream
| extend LocalTime=datetime_utc_to_local(CreatedAt,'America/Buenos_Aires')
| where CreatedAt between (datetime(2024-3-30) ..21d )
| where LocalTime between(datetime(2024-4-1)..20d)
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
Applying the filter on the left side of a join
10. A dates table is joined with the Events table.
The dates table is on the left side of the join.
The filter on the dates table is applied to the right side of the join when the filter uses in or ==.
// 1:16 8.14GB 134
let Calendar = range Day from datetime(2024-1-1) to datetime(2024-12-31) step 1d;
Calendar | where Day in(
datetime(2024-04-01T00:00:00Z),
datetime(2024-04-02T00:00:00Z),
datetime(2024-04-03T00:00:00Z),
datetime(2024-04-04T00:00:00Z),
datetime(2024-04-05T00:00:00Z),
datetime(2024-04-06T00:00:00Z),
datetime(2024-04-07T00:00:00Z),
datetime(2024-04-08T00:00:00Z),
datetime(2024-04-09T00:00:00Z),
datetime(2024-04-10T00:00:00Z),
datetime(2024-04-11T00:00:00Z),
datetime(2024-04-12T00:00:00Z),
datetime(2024-04-13T00:00:00Z),
datetime(2024-04-14T00:00:00Z),
datetime(2024-04-15T00:00:00Z),
datetime(2024-04-16T00:00:00Z),
datetime(2024-04-17T00:00:00Z),
datetime(2024-04-18T00:00:00Z),
datetime(2024-04-19T00:00:00Z),
datetime(2024-04-20T00:00:00Z))
| join kind=inner hint.strategy=broadcast
(EventsFromLiveStream | extend Day=startofday(CreatedAt)) on Day
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
11. When the filter on the left side uses between, <, or >, it is not applied to the right side of the join.
The results are correct, but performance is bad.
// 47:40 206.5GB all
let Calendar = range Day from datetime(2024-1-1) to datetime(2024-12-31) step 1d;
Calendar | where Day between(datetime(2024-4-1)..20d)
| join kind=inner hint.strategy=broadcast
(EventsFromLiveStream | extend Day=startofday(CreatedAt)) on Day
| extend Login=tostring(Actor.login)
| summarize count(),dcount(Login) by Type
12. An improvement on the join: extract the Login value inside the parentheses of the right-side table. The join still has a cost, but the overall cost is much less than in #10.
The reason is that the dynamic column Actor does not need to be part of the join, only the Login value.
// 24 8.33GB 150
let Calendar = range Day from datetime(2024-1-1) to datetime(2024-12-31) step 1d;
Calendar | where Day in(
datetime(2024-04-01T00:00:00Z),
datetime(2024-04-02T00:00:00Z),
datetime(2024-04-03T00:00:00Z),
datetime(2024-04-04T00:00:00Z),
datetime(2024-04-05T00:00:00Z),
datetime(2024-04-06T00:00:00Z),
datetime(2024-04-07T00:00:00Z),
datetime(2024-04-08T00:00:00Z),
datetime(2024-04-09T00:00:00Z),
datetime(2024-04-10T00:00:00Z),
datetime(2024-04-11T00:00:00Z),
datetime(2024-04-12T00:00:00Z),
datetime(2024-04-13T00:00:00Z),
datetime(2024-04-14T00:00:00Z),
datetime(2024-04-15T00:00:00Z),
datetime(2024-04-16T00:00:00Z),
datetime(2024-04-17T00:00:00Z),
datetime(2024-04-18T00:00:00Z),
datetime(2024-04-19T00:00:00Z),
datetime(2024-04-20T00:00:00Z))
| join kind=inner hint.strategy=broadcast
(EventsFromLiveStream | extend Day=startofday(CreatedAt),Login=tostring(Actor.login)) on Day
| summarize count(),dcount(Login) by Type
Partner Case Study Series | proMX: Dynamics 365 add-ons improve project management at Interflex
Helping businesses improve efficiency and achieve digital transformation
proMX is a Microsoft partner headquartered in Nuremberg, Germany. Since its founding in 2000, proMX has helped small and large businesses transform themselves into digital organizations, and it has supported them in their efforts to become more efficient. One of the ways proMX does this is through applications designed for Dynamics 365. Numerous add-ons, such as Time Tracking for Dynamics 365 Project Service Automation, are available on a free-trial basis on Microsoft AppSource.
proMX reports that, across several companies, the integration of its project management add-ons has led to improvements regarding administrative working hours, project documentation costs, and, most significantly, per-employee capacity utilization. On average, revenue per employee has increased by 15 percent, while costs have remained stable.
Continue reading here
Explore all case studies or submit your own
What’s New in Azure App Service at Build 2024
Welcome to Build 2024!
The team will be covering the latest AI enhancements for migrating web applications, how AI helps developers monitor and troubleshoot applications, examples of integrating generative AI into both classic ASP.NET and .NET Core apps, and platform enhancements for scaling, load testing, observability, WebJobs, and sidecar extensibility.
Drop by the breakout session “Using AI with App Service to deploy differentiated web apps and APIs” on Thursday May 23rd (12:30PM to 1:15PM Pacific time – BRK125 – In-Person and Online) to see live demonstrations of all of these topics!
Azure App Service team members will also be in attendance at the Expert Meetup area on the fifth floor – drop by and chat if you are attending Build in-person!
There are additional demos and presentations from partner teams that will cover (in part) App Service specific scenarios, so if you have time consider the additional sessions as well!
Using AI with App Service to deploy differentiated web apps and APIs
BRK125
Thursday, May 23rd
12:30 PM – 1:15 PM Pacific Daylight Time
Breakout Session – In-Person and Online
App innovation in the AI era: cost, benefits, and challenges
BRK120
Tuesday, May 21st
4:45 PM – 5:30 PM Pacific Daylight Time
Breakout Session – In-Person and Online
Conversational app and code assessment in Azure Migrate
DEM713
Wednesday, May 22nd
10:30 AM – 10:45 AM Pacific Daylight Time
Demo Session – In-Person Only
Leverage Azure Testing Services to build high quality applications
BRK183
Thursday, May 23rd
1:45 PM – 2:30 PM Pacific Daylight Time
Breakout Session – In-Person and Online
Vision to value – SAS accelerates modernization at scale with Azure
BRK170
Thursday, May 23rd
1:45 PM – 2:30 PM Pacific Daylight Time
Breakout Session – In-Person and Online
GitHub Copilot Skills for Azure Migrate
In a recent IDC study of 900 IT decision makers worldwide, 74% of the respondents cited faster innovation, faster time to market, and/or improved business agility as one of the top benefits driving the business case for migrating and modernizing apps with a managed cloud service. Microsoft has been continuously investing in first party tools to make it easier and faster to migrate using the tools you already use and love. We are excited to announce that Azure Migrate application and code assessment, which was released at Microsoft Ignite 2023, now adds GitHub Copilot Chat enhancement to the Visual Studio migration extension!
Once you have the updated migration extension installed in Visual Studio and have enabled the Visual Studio GitHub Copilot Chat extension, GitHub Copilot Chat will guide you through the individual items found in the application migration report. You can ask questions like “Can I migrate this app to Azure?” or “What changes do I need to make to this code?” and get answers and recommendations from Azure Migrate. (Note: GitHub Copilot licenses are sold separately.)
You can get started by clicking on the “Open Chat” button in the compatibility report as shown below.
This will open an interactive chat session where you can chat with Copilot to iterate through the various assessment suggestions. In this example the migration report recommends moving secrets like database connection strings out of web.config or code, and into a secure location such as Azure Key Vault.
You can interactively step through recommended remediations for each issue:
In this example, after selecting “No, I don’t have an Azure Key Vault…”, Copilot will show the commands necessary to set up Key Vault in Azure:
You can continue to walk through all of the migration suggestions and issues found in the assessment report in this manner, leveraging Copilot to provide specific steps, CLI commands, and code remediations to prepare your application for migration into Azure!
Sidecar Scenarios in Azure App Service on Linux
Sidecar patterns are a way to add extra features to an application, such as logging, monitoring and caching, without changing the application’s core code. Sidecar support for container based applications on Azure App Service on Linux is now in public preview! Public preview for using sidecars with source-code based applications is expected to be available this summer.
Common scenarios include attaching monitoring solutions to your application, including popular third-party application performance monitoring (APM) offerings. This example shows a container-based application configured with an OpenTelemetry (OTel) collector sidecar which exports metrics to OTel compatible targets. There are also additional examples showing how to integrate with commonly used ISV solutions such as Datadog with your web applications.
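As an illustration of the application side of that scenario, the sketch below (Python, using the OpenTelemetry SDK) exports metrics over OTLP to a collector that is assumed to be reachable at the sidecar's default gRPC endpoint, localhost:4317. The meter and counter names are arbitrary examples, not values from the App Service samples.

```python
# Minimal sketch: an app exporting metrics to an OpenTelemetry collector sidecar.
# Assumption: the collector listens on the default OTLP gRPC endpoint localhost:4317.
from opentelemetry import metrics
from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import OTLPMetricExporter
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader

exporter = OTLPMetricExporter(endpoint="http://localhost:4317", insecure=True)
reader = PeriodicExportingMetricReader(exporter, export_interval_millis=10_000)
metrics.set_meter_provider(MeterProvider(metric_readers=[reader]))

meter = metrics.get_meter("sample-web-app")
request_counter = meter.create_counter("http.requests", description="Handled HTTP requests")

# Somewhere in the request handling path:
request_counter.add(1, {"route": "/", "status": "200"})
```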
Other common scenarios include attaching a sidecar for in-memory caching using Azure Cache for Redis, and attaching a vector cache sidecar to reduce traffic to back-end LLM resources when adding generative AI to your application.
At Microsoft Build 2024, breakout session BRK125 includes demonstrations of sidecar scenarios for both container-based and source-code based applications!
WebJobs for Azure App Service on Linux
WebJobs are background tasks that run on the same server as the web app and can perform various functions, such as sending emails, executing bash scripts, and running scheduled jobs. WebJobs are now integrated with Azure App Service on Linux, which means they share the same compute resources as the web app to help save costs and ensure consistent performance. WebJobs support for both Azure App Service on Linux and Windows Containers on Azure App Service is broadly available in public preview.
WebJobs enable developers to easily run arbitrary code and scripts, in the language of their choice, on a variety of schedules including continuously, manually on-demand, or on a periodic schedule defined via a crontab expression. For example, Linux developers can continuously run shell scripts that perform background “infra-glue” tasks like scanning through a back-end database and sending email reports.
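As a generic illustration of that kind of "infra-glue" task (not App Service-specific code), the sketch below scans a database for recent errors and emails a short report. The database path, table, and SMTP details are placeholders, and the schedule itself would be defined separately, for example via a crontab expression.

```python
#!/usr/bin/env python3
# Illustrative background job (generic sketch): scan a backend database for
# recent error rows and email a short report. All connection details are placeholders.
import sqlite3
import smtplib
from email.message import EmailMessage

def build_report() -> str:
    conn = sqlite3.connect("/home/data/app.db")  # placeholder database path
    rows = conn.execute(
        "SELECT id, message FROM errors WHERE created_at >= datetime('now', '-1 day')"
    ).fetchall()
    conn.close()
    return "\n".join(f"{rid}: {msg}" for rid, msg in rows) or "No errors in the last 24 hours."

def send_report(body: str) -> None:
    msg = EmailMessage()
    msg["Subject"] = "Daily error report"
    msg["From"] = "webjob@example.com"   # placeholder addresses
    msg["To"] = "ops@example.com"
    msg.set_content(body)
    with smtplib.SMTP("smtp.example.com", 587) as smtp:  # placeholder SMTP host
        smtp.starttls()
        smtp.send_message(msg)

if __name__ == "__main__":
    send_report(build_report())
```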
The full list of supported scripting options, as well as information on how to run jobs in a specific language, is available in the updated WebJobs documentation.
Automatic Scaling on Azure App Service
We’re happy to announce that the automatic scaling feature in Azure App Service is now generally available! Automatic scaling provides significant performance improvements for any web app without writing new code or making code changes. With this feature, Azure App Service automatically adjusts the number of application instances and worker instances based on dynamically assessing the incoming HTTP request rate and observed load on the underlying app service plan.
We improved the Automatic Scaling feature based on your feedback during the preview phase with expanded SKU availability and a new scaling metric:
Automatic scaling expanded support to encompass the P0v3 and P*mv3 SKUs.
A new metric called “AutomaticScalingInstanceCount” was added which shows the number of worker instances your application is consuming.
Let Azure App Service adjust the worker count of your App Service plan to match your web application load, without worrying about auto-scale profiles or manual control. It is like an “automatic cruise control” for your web apps! Also check out our community standup to see this feature in action!
Four Nines’ Resiliency is Kind of a Big Deal!
As of May 1st Azure App Service officially supports 99.99% resiliency when your app service plan is running in an Availability Zone based configuration! Availability Zones are isolated locations within an Azure region that provide high availability and fault tolerance. Please refer to the Service Level Agreement (SLA) documentation dated May 01, 2024 to learn more about the higher SLA.
Azure App Service Environment version 3: New and Notable
For customers using the Isolatedv2 SKU on App Service Environment v3 (ASEv3) with Windows, the new memory-optimized pricing tiers, denoted with an ‘m’ such as in Imv2, are now available and can be configured using the Azure CLI as well as ARM/Bicep! The memory optimized tiers provide a higher memory-to-core ratio than their regular counterparts. For instance, in one of the larger Isolated v2 tiers, both I5v2 and I5mv2 provide the same number of cores at 32 vCPU, but the memory-optimized tier has double the RAM at 256GB. Support for Linux and Windows Containers is expected to be available later this year. Portal support for Windows source-code based apps running on ASEv3 will also be available shortly after Build! Please refer to the product documentation to learn more about the new tiers and availability.
Friendly Reminder: While on the subject of Azure App Service Environment, allow me to rerun our public service announcement about the upcoming retirement of Azure App Service Environment v1 and v2 on August 31 2024. We recommend starting the migration process as soon as possible (time is quickly running out!). Many customers have already completed this migration with little to no downtime. Please visit product documentation for detailed steps, tools, and useful resources to help you. Our next community standup scheduled for June 5th will also cover this in detail.
TLS 1.3 and More!
We are pleased to announce that TLS 1.3 has been rolled out worldwide and is now generally available across App Service on Public Cloud and Azure for US Government! Customers can configure an application to require TLS 1.3 via the minimum TLS setting available in the Azure portal, as well as via ARM.
With the availability of TLS 1.3, App Service has also updated the TLS cipher suite order to account for recommended TLS 1.3 cipher suites. You will see the following two TLS cipher suites listed on the minimum TLS cipher suite feature:
TLS_AES_256_GCM_SHA384
TLS_AES_128_GCM_SHA256
As part of the TLS updates, App Service on both Windows and Linux supports End to End (E2E) TLS Encryption (in public preview). Incoming HTTPS requests are usually terminated at the App Service front-ends, with the requests proxied to individual workers over HTTP. With the updated E2E TLS Encryption feature, both Windows and Linux applications can choose to encrypt the requests between the App Service front-ends and the workers running applications. E2E TLS Encryption is available for Standard App Service Plans and above, and can be enabled in the Azure portal as well as via ARM and Azure CLI.
If you have an Azure Key Vault that uses Azure role-based access control (RBAC), you can now import that Key Vault certificate to your web app. Because newly created Key Vaults are configured to use RBAC by default, instead of the legacy access policies, this new support in Azure App Service will make it easier for you to integrate your Key Vault certificates with App Service. Support for importing certificates into App Service from Key Vault using RBAC permissions is available via ARM and the Azure CLI, with Azure portal support planned for the future. Developers can read more about this new support in the documentation.
For more information regarding TLS 1.3 on App Service, the new minimum cipher suites, and updates to E2E TLS Encryption refer to the all-inclusive article on the Microsoft Community Hub!
Better Together with Recommended Azure Services
You can now find recommendations in the Azure Portal for services commonly deployed with Azure App Service! The initial list is curated and primarily focuses on connecting newly created Azure resources to your existing App Service applications. An example of showing recommended services is shown below.
In addition to the curated list, the new Recommended Services capability in Copilot for Azure offers quick recommendations tailored to your specific application. For instance, it can suggest a popular database suitable for your application type or ensure that you are “on the right track” with commonly deployed services, drawing insights from similar applications.
To use the new Copilot integrated capability, navigate to the Azure Portal and open Copilot for Azure. Examples of the types of questions that you can ask include: “What are commonly deployed services for my app?” or “What is the recommended database for my app?” Read more about these capabilities and try out the new Recommend Services Copilot capability today!
Azure Load Testing Integration
How many times has new code been released to production only to encounter unexpected performance related problems? With the recent release of Azure Load Testing integration with Azure App Service, there has never been a better time to easily run load tests on your web applications. Discover performance problems before they make it into production and uncover race conditions and other load related bugs ahead of time!
You can start setting up load tests directly from the Overview page of your web applications.
As part of this, you configure one or more URLs to include in the test run.
You also configure the size of the load test, along with other parameters governing startup behavior and load test duration. After the load test is completed, you will see summarized results for the specific load test where you can also drill down to more detailed metrics.
For more advanced, high-scale production scenarios, Azure Load Testing integration also makes it easy for developers to experiment with different scaling strategies and compare the results to achieve the desired workload performance.
Language and Deployment Updates
App Service regularly updates major and minor language versions across both the Windows and Linux variants of the platform. As part of that continuing cadence, App Service on Linux just released PHP 8.3 last week! And just last month, WordPress on Linux App Service made its Free Tier option generally available, which includes a twelve-month no-cost backend database running on Azure Database for MySQL!
An interesting technical tidbit for the curious, there is also a great write-up here on how to use WordPress on App Service as a headless CMS back-end in conjunction with Azure Static Web Apps.
gRPC has been generally available for App Service on Linux since last November. We’re happy to announce that gRPC support is now available in public preview for App Service on Windows! The team demonstrated gRPC on Windows and Linux at the recently concluded .NET Day 2024.
Azure App Service on Linux has also added a new deployment status tracking API that surfaces detailed deployment log information when deploying source-code based applications. The deployment status tracking API surfaces detailed step-by-step progress information including specific failure information, a link to follow for more detailed deployment failure logs, and post-deployment app startup information. The platform is continuing to expand this capability with additional integration planned for the Azure Portal. For more details on the new deploy status tracking API and guidance on how to use it see this article!
Next Steps
Developers can learn more about Azure App Service at Getting Started with Azure App Service. Stay up to date on new features and innovations on Azure App Service via Azure Updates as well as the Azure App Service (@AzAppService) X feed. There is always a steady stream of great deep-dive technical articles about App Service as well as the breadth of developer focused Azure services over on the Apps on Azure blog.
Take a look at innovation with .NET, and .NET on Azure App Service, with the recently completed .NET Day 2024 event, where the new code assessment migration tools were demonstrated, as well as gRPC functionality running on both Windows and Linux App Service.
And lastly take a look at Azure App Service Community Standups hosted on the Microsoft Azure Developers YouTube channel. The Azure App Service Community Standup series regularly features walkthroughs of new and upcoming features from folks that work directly on the product!
Azure SQL DB availability portal metric
Azure SQL Database is a modern, cloud-based relational database service that powers a wide variety of applications, including mission-critical, resource-intensive, and the latest generative AI applications. Azure SQL Database provides an industry-leading availability SLA of 99.99%. We know customers want to monitor the availability of critical Azure services like Azure SQL Database in a granular, consistent way and in near real time, with high-quality data.
We are excited to announce the public preview of the Availability portal metric, enabling you to monitor SLA-compliant availability. This Azure Monitor metric is emitted at a one-minute frequency and has up to 93 days of history. Typically, the latency to display availability is less than three minutes. You can visualize the metric in Azure Monitor and set up alerts, too.
Availability is determined based on the database being operational for connections. A minute is considered downtime, or unavailable, if all continuous attempts by users to establish a connection to the database within that minute fail due to a service issue. If there is intermittent unavailability, the duration of continuous unavailability must cross the minute boundary to be counted as downtime.
Availability metric data is available for databases in both the DTU and vCore purchasing models and in all service tiers (Basic, Standard, Premium, General Purpose, Business Critical, and Hyperscale). Both singleton and elastic pool deployments are supported. You can monitor the metric by adding the Availability metric in the Azure portal, or pull it programmatically, as sketched below.
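For illustration, here is a minimal Python sketch using the azure-monitor-query package. The metric name "availability" and the resource ID format are assumptions to verify against the documentation.

```python
# Sketch: query the per-minute availability metric for an Azure SQL database.
# Assumptions: the metric name "availability" and placeholder resource names.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient, MetricAggregationType

client = MetricsQueryClient(DefaultAzureCredential())
resource_id = (
    "/subscriptions/<sub>/resourceGroups/<rg>"
    "/providers/Microsoft.Sql/servers/<server>/databases/<db>"
)

response = client.query_resource(
    resource_id,
    metric_names=["availability"],            # assumed metric name
    timespan=timedelta(days=1),
    granularity=timedelta(minutes=1),
    aggregations=[MetricAggregationType.AVERAGE],
)

# Print any minute where availability dropped below 100%.
for metric in response.metrics:
    for series in metric.timeseries:
        for point in series.data:
            if point.average is not None and point.average < 100:
                print(point.timestamp, point.average)
```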
For comprehensive details on the Availability metric, such as the logic used for computing availability, please refer to the documentation. To learn more about Azure SQL Database Service Level Agreements (SLA), refer to the SLA.
Leveraging Azure AI Services to Build, Deploy, and Monitor AI Applications with .NET
Azure AI services offer robust tools and platforms that enable developers to bring their AI solutions from concept to production seamlessly. Using .NET 8 alongside these services, developers can experiment, build, and scale their AI applications effectively. This post explores how you can harness the power of Azure AI and .NET to transform your ideas into production-ready AI solutions.
From Prototyping to Production with Azure AI
Start your AI journey by experimenting with local prototypes using Azure AI’s extensive suite of tools. Azure Machine Learning and Azure Cognitive Services provide the necessary components to plug in different AI models and build comprehensive solutions. When you’re ready to scale, Azure OpenAI Service and .NET Aspire enable you to run and monitor your applications efficiently, ensuring high performance and reliability.
Why Build AI Apps with Azure AI Services?
Integrating AI into your applications with Azure AI offers numerous benefits:
Enhanced User Engagement: Deliver more relevant and satisfying user interactions.
Increased Productivity: Automate tasks to save time and reduce errors.
New Business Opportunities: Create innovative, value-added services.
Competitive Advantage: Stay ahead of market trends with cutting-edge AI capabilities.
Getting Started with Azure AI and .NET
Explore the new Azure AI and .NET documentation to learn core AI development concepts. These resources include quickstart guides to help you get hands-on experience with code and start building your AI applications.
Utilizing Semantic Kernel
Semantic Kernel, an open-source SDK, simplifies building AI solutions by enabling easy integration with various models like OpenAI, Azure OpenAI, and Hugging Face. It supports connections to popular vector stores such as Weaviate, Pinecone, and Azure AI Search. By providing common abstractions for dependency injection in .NET, Semantic Kernel allows you to experiment and iterate on your apps with minimal code impact.
Testing and Monitoring with .NET Aspire
.NET Aspire offers robust support for debugging and diagnostics, leveraging the .NET OpenTelemetry SDK. It simplifies the configuration of logging, tracing, and metrics, making it easy to monitor your applications. Azure Monitor and Prometheus can be used to keep an eye on your production deployments, ensuring your applications run smoothly.
Real-World Example: H&R Block’s AI Tax Assistant
H&R Block has developed an innovative AI Tax Assistant using .NET and Azure OpenAI, transforming how clients handle tax-related queries. This assistant provides personalized advice and simplifies the tax process, showcasing the capabilities of Azure AI in building scalable, AI-driven solutions. This project serves as an inspiring example for developers looking to integrate AI into their applications.
Join H&R Block at Microsoft Build as they discuss their journey and experience building AI with .NET and Azure in the session, Infusing your .NET Apps with AI: Practical Tools and Techniques.
Learn More
To dive deeper into AI development with Azure AI and .NET:
Explore the latest .NET and Azure AI documentation
Get started with our quickstart guides for Azure AI and Semantic Kernel
Read the Semantic Kernel announcement post
Share your feedback and connect with our team
Announcing new supported formats for Azure Schema Registry
Ever since its general availability in November 2021, Azure Schema Registry has provided a central repository for schema documents, essential for event-driven and messaging-centric applications, greatly simplifying schema management, governance and evolution and streamlining data pipelines for customers.
When we began, we supported Avro format due to its popularity in the open-source community and within the Apache Kafka ecosystem. However, as architectures have evolved, customers have asked us to enable additional formats so that they can onboard more workflows and use Azure Schema Registry for ALL their schema management needs.
On that note, we’re excited to make a few announcements.
General availability of JSON Schema formats for Kafka applications
Today we are excited to announce the General Availability of JSON Schema support in Azure Schema Registry for Kafka applications.
JSON provides a simple, extensible model for development in an increasingly cloud-native world. JSON Schema is the standard for supporting reliable use of the JSON data format in production-grade solutions. Additionally, JSON Schema’s rich ecosystem supercharges development with tools to generate documentation, interfaces, code, and other artifacts, significantly reducing operational overhead.
Real time streaming workloads on Azure Event Hubs and analytics workloads on Microsoft Fabric can leverage JSON Schema with Azure Schema Registry to simplify schema management at scale.
To learn more about how to use JSON Schema with Azure Schema Registry, for Azure Event Hubs and Apache Kafka applications, refer to the documentation.
Examples
You can find examples of how to use JSON Schema with Azure Schema Registry SDK for different languages in the following links:
Public preview of Protobuf support
We’re also excited to announce preview support for the Protobuf data format.
Protocol Buffers, often abbreviated as protobuf, is a language-neutral, platform-neutral, and extensible mechanism developed for the serialization of structured data. Protobuf’s rich ecosystem allows developers to define data structures once in .proto files, and then use generated source code to supercharge development workflows.
To utilize protobuf in your client applications, you can use the Schema Registry REST API for various management operations.
To create a new schema group, you can call PutSchemaGroup and specify the "schemaType" in the request body as below:
"schemaType": "Protobuf"
Once the schema group has been created, you can call PutSchema in that group and specify the "contentType" in the request headers as below:
"content-type": text/vnd.ms.protobuf
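For illustration, here is a rough Python sketch of those two calls. The route shapes, api-version, and token audience are assumptions to verify against the Schema Registry REST API reference; only the schemaType body field and the Protobuf content type shown above are taken from this post, and the namespace, group, and schema names are placeholders.

```python
# Rough sketch of the PutSchemaGroup and PutSchema calls described above.
# Assumptions: the /$schemaGroups route shape, the api-version, and the token scope.
import requests
from azure.identity import DefaultAzureCredential

NAMESPACE = "<your-namespace>.servicebus.windows.net"
API_VERSION = "2023-07-01"  # assumed api-version
token = DefaultAzureCredential().get_token("https://eventhubs.azure.net/.default").token
auth = {"Authorization": f"Bearer {token}"}

# 1. Create a schema group with schemaType set to Protobuf.
requests.put(
    f"https://{NAMESPACE}/$schemaGroups/orders-group?api-version={API_VERSION}",  # assumed route
    headers={**auth, "Content-Type": "application/json"},
    json={"schemaType": "Protobuf"},
    timeout=30,
).raise_for_status()

# 2. Register a .proto schema in that group using the Protobuf content type.
with open("order.proto", "rb") as f:
    requests.put(
        f"https://{NAMESPACE}/$schemaGroups/orders-group/schemas/Order?api-version={API_VERSION}",
        headers={**auth, "Content-Type": "text/vnd.ms.protobuf"},
        data=f.read(),
        timeout=30,
    ).raise_for_status()
```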
Learn more
To learn more about JSON Schema, visit the official website at JSON Schema (json-schema.org)
To learn more about Azure Schema Registry, visit the documentation at Use Azure Schema Registry from Apache Kafka and other apps – Azure Event Hubs | Microsoft Learn
To learn more about Azure Schema Registry REST API, see the Schema Registry REST API overview.
To learn more about Azure Event Hubs, visit the documentation at Azure Event Hubs documentation | Microsoft Learn
Announcing general availability for Kafka compression in Azure Event Hubs
Today, we’re excited to announce that compression for Kafka clients is generally available in Azure Event Hubs.
Azure Event Hubs is a cloud native streaming service enabling you to build scalable, durable and low-latency workflows with massive volumes of event data with ease. As you onboard more workloads to Azure and build them around Azure Event Hubs, your bandwidth and storage requirements may scale exponentially.
Kafka compression can help with this by reducing the data payloads that are stored and transmitted across your architecture, thus reducing network bandwidth and storage requirements and costs, while still keeping the programming model simple.
To utilize Kafka compression, you must set the following in the Kafka producer configuration properties:
compression.type = gzip
No changes are required on the Kafka consumer side, since the compression information is made available in the message header and the consumer can automatically decompress the messages and make them available for processing.
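As a minimal sketch, a Python producer using the confluent-kafka package against an Event Hubs Kafka endpoint would look roughly like this; the namespace, connection string, and event hub name are placeholders.

```python
# Minimal sketch: a Kafka producer sending gzip-compressed batches to Event Hubs.
# Placeholders: the namespace, connection string, and event hub (topic) name.
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "<your-namespace>.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "$ConnectionString",
    "sasl.password": "<event-hubs-connection-string>",
    "compression.type": "gzip",   # the producer-side setting discussed above
})

for i in range(100):
    producer.produce("my-event-hub", key=str(i), value=f"event-{i}")
producer.flush()
```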
Learn more
To learn more about Kafka compression in Azure Event Hubs, please refer to the documentation.
outlook repeatedly asks for 365 credentials
We have a new RDS solution, so user profiles are newly created, etc. It is an AD environment with two session hosts, profile disks, and a broker server. Since we started using this solution, Outlook repeatedly asks for 365 credentials. This is happening for all users at different intervals: some get asked every time they log on, while others can go several days without incident. I have run Office repair on both servers, recreated profiles, and tried several registry entries, but nothing seems to change this behaviour. Sometimes we also get the 1001 “something went wrong” error and the user has to sign out of RDS, log back in, and re-enter credentials. Any help would be greatly appreciated.
Format for time
Dear all,
I transferred my CSV file to Excel, but the time format is different (last column).
Could you please help me change the time to the same format?
Thank you very much
Azure Arc enabled Servers unable to assess Updates
Starting yesterday, several of my Arc-enabled Win 2019 and 2022 Servers are unable to assess Windows Updates anymore.
Error: “Assessment failed due to this reason: Not able to complete assessment within specified time.”
Is there anything I can do to reinstall “WindowsPatchExtension” as it won’t automatically install itself after removing it from the Extensions?
(It’s not available for manual install, at least not via “Install extension” GUI)
Build 2024 companion guide: Windows developer security resources
Ready to learn more about the topics discussed in our sessions on “Unleash Windows App Security & Reputation with Trusted Signing” and “The Latest in Windows Security for Developers” at Microsoft Build 2024? Here are some resources and tools to help you get started:
Dive deeper into:
Passkeys in Windows – (1 min.) Get a quick overview of passkeys, how they are used in Windows, and how they compare to passwords.
Virtualization-based security (VBS) key protection – (5 min.) Learn how to create, import, and protect your keys using VBS.
NTLM-less – (4 min.) Find the syntax, parameters, return value, and remarks for the AcquireCredentialsHandle (Negotiate) function.
Personal Data Encryption (PDE) – (5 min.) Get information on prerequisites, protection levels, and more for this security feature that provides file-based data encryption capabilities to Windows.
Virtualization-based security (VBS) Enclave – (1 min.) Explore the functions used by System Services and Secure Enclaves.
Trusted Platform Module attestation – (8 min.) Explore key TPM attestation concepts and capabilities supported by Azure Attestation.
Zero Trust DNS – (4 min.) Learn more about Zero Trust DNS (ZTDNS), currently in development for a future version of Windows to help support those trying to lock down devices so that they can access approved network destinations only.
Win32 app isolation repo – Access the documentation and tools you need to help you isolate your applications.
MSIX app packaging – (3 min.) Learn how to use the MSIX Packaging Tool to repackage your existing desktop applications to the MSIX format.
Trusted Signing – Access how-to guides, quickstart tutorials, and other documentation to help you utilize this Microsoft fully managed end-to-end signing solution for third party developers.
Smart App Control – (3 min.) Get to know the requirements and stages for Smart App Control, plus get answers to frequently asked questions.
Coming soon:
Making admins more secure
Granular privacy controls for all Win32 apps
Continue the conversation. Find best practices. Join us on the Windows security discussion board.
Submittable accelerates growth and AI innovation with Microsoft
Submittable’s social impact platform enables foundations to manage the end-to-end process for grants, corporate giving, and awards. With Submittable, customers collect and review applications, award funds, track change, and report on results. “We enable the social impact sector to work more efficiently and more equitably,” says Sam Caplan, Vice President of Social Impact at Submittable. “We focus on creating a trust-based environment and deepening relationships between funders, nonprofit organizations, and the communities they serve.”
To multiply its impact, Submittable joined the Digital Natives Partner Program, a select group of innovative ISV partners that Microsoft Tech for Social Impact (TSI) has chosen to invest in and grow with. The Digital Natives program helps cloud-first SaaS companies connect with Microsoft’s vast community of nearly 400,000 nonprofits and continuously innovate their Azure-based solutions.
“We’re continually looking for ways to grow and help our customers serve their communities,” Caplan says. “We get a huge jumpstart by working on the Azure platform, taking advantage of the work Microsoft has done around AI, and being closely partnered with TSI.”
“The partnership with Submittable is built on shared values and a shared commitment to bring the very best capabilities our two organizations have to offer,” explains Jeremy Pitman, Director of the Digital Natives Partner Program at Microsoft TSI. He adds, “By working together on AI and modern technology solutions, we are able to accelerate the impact mission-driven organizations are having in their communities.”
About a year ago, Submittable decided to leave its previous cloud platform and move its workloads to Microsoft Azure with the support of Redapt, an Azure partner. With this strategic move, Submittable can expand its reach to more mission-driven organizations, develop industry-leading AI-powered features, and lead in the impact technology landscape alongside Microsoft. “Microsoft’s TSI team has some of the world’s deepest knowledge of the nonprofit sector—a level of expertise we couldn’t get anywhere else,” Caplan says. “It made perfect sense for us to be closely connected so we can continue to learn and contribute to this amazing work.”
Submittable always seeks to streamline the grantmaking and giving process for its customers and applicants. They recently worked with a Microsoft Partner Technical Strategist to develop and launch a series of AI-enabled tools that reduce busy work. The tools, which run on Azure, help applicants fill out forms quickly, reviewers understand and synthesize the most vital information from applications, and administrators easily build new application forms using natural language prompts—no coding expertise required.
“Our AI features are becoming some of our most appreciated benefits on the platform,” Caplan says. “I don’t think we could have introduced them as fast or as well as we did without this partnership.”
The AI tools don’t replace but rather complement the people who make decisions and craft creative solutions. By freeing up their time, these features enable Submittable’s customers and grant applicants to focus on work that moves the needle on their most ambitious goals.
“As nonprofits harness the potential of artificial intelligence, Submittable is mindful that technology alone is not the destination—it’s the vehicle,” says Justin Spelhaug, Corporate Vice President of Tech for Social Impact at Microsoft Philanthropies. “In the realm of AI in social impact, Submittable is a leading example of the future where nonprofits can achieve more than ever before.”
Download the full case study to learn how Submittable is boosting social impact with the Digital Natives Partner Program.
Announcing new pub-sub capabilities in Azure Event Grid
Azure Event Grid is a highly scalable, fully managed publish-subscribe message distribution service that offers flexible message consumption patterns using the MQTT and HTTP protocols. Our recent efforts have been dedicated to enhancing MQTT compliance, simplifying security for IoT and event-driven solutions, and facilitating seamless integrations. Today, we announce the newest features in these critical areas and their potential impact on your solutions.
Event Grid’s MQTT Broker capability
The MQTT broker capability leverages standard MQTT features and secure authentication methods to enable your clients to communicate in a compliant, secure, and flexible manner. This capability is vital for IoT solutions where efficient communication is essential for seamless operations and where security is critical to protect sensitive data and maintain device integrity. We are excited to announce the release of the following features, reinforcing our commitment to these goals.
Last Will and Testament (LWT): is now generally available (GA), enabling MQTT clients to notify other MQTT clients of their abrupt disconnections through a will message. You can use LWT to ensure a predictable and reliable flow of communication among MQTT clients during unexpected disconnections, which is valuable for scenarios where real-time communication, system reliability, and coordinated actions are critical. You can now also set a will delay interval to reduce the noise from transient disconnections; see the first sketch after this list.
OAuth 2.0 authentication: is now public preview, allowing clients to authenticate and connect with the MQTT broker using JSON Web Tokens (JWT) issued by any third-party OpenID Connect (OIDC) identity provider, in addition to Microsoft Entra ID. MQTT clients can get their token from their identity provider (IDP) and provide it in the MQTTv5 or MQTTv3.1.1 CONNECT packet to authenticate with the MQTT broker. This authentication method provides a lightweight, secure, and flexible option for MQTT clients that are not provisioned in Azure; see the second sketch after this list.
Custom domain names support: is now public preview, allowing users to assign their own domain names to Event Grid namespace’s MQTT and HTTP endpoints, enhancing security and simplifying client configuration. This feature helps enterprises meet their security and compliance requirements and eliminates the need to modify clients already linked to the domain. Assigning a custom domain name to multiple namespaces can also help enhance availability, manage capacity, and handle cross-region client mobility.
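As a concrete illustration of the LWT feature, here is a minimal sketch using the paho-mqtt Python client over MQTT v5. The hostname, client ID, topic, payload, and credential setup are placeholders to adapt to how your namespace clients are provisioned; this is an illustrative sketch, not an official Event Grid sample.

# Minimal LWT sketch with paho-mqtt 1.6.x over MQTT v5 (for paho-mqtt 2.x,
# pass mqtt.CallbackAPIVersion.VERSION2 as the first Client argument).
# Hostname, client ID, topic, and credentials are placeholders.
import paho.mqtt.client as mqtt
from paho.mqtt.packettypes import PacketTypes
from paho.mqtt.properties import Properties

will_props = Properties(PacketTypes.WILLMESSAGE)
will_props.WillDelayInterval = 30  # wait 30 s before publishing the will, so
                                   # brief reconnects don't generate noise

client = mqtt.Client(client_id="device-01", protocol=mqtt.MQTTv5)
client.will_set(
    "clients/device-01/status",    # topic that interested clients subscribe to
    payload="abruptly disconnected",
    qos=1,
    properties=will_props,
)
client.tls_set()                    # plus certificate or token auth, as configured
client.connect("<namespace-mqtt-hostname>", 8883)
client.loop_forever()

If the client reconnects to its session before the delay elapses, the broker does not publish the will message, which is exactly the noise reduction described above.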
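For the new OAuth 2.0 option, the hedged sketch below presents a third-party JWT in the MQTT v5 CONNECT packet through enhanced authentication. The get_token_from_idp helper, the hostname, and the "OAUTH2-JWT" authentication method string are assumptions for illustration only; confirm the exact authentication method name and token requirements in the Event Grid documentation.

# Hedged sketch: authenticate to the MQTT broker with a JWT from a
# third-party OIDC identity provider using MQTT v5 enhanced authentication.
import paho.mqtt.client as mqtt
from paho.mqtt.packettypes import PacketTypes
from paho.mqtt.properties import Properties

def get_token_from_idp() -> str:
    # Hypothetical helper: fetch a JWT from your OIDC identity provider.
    raise NotImplementedError

connect_props = Properties(PacketTypes.CONNECT)
connect_props.AuthenticationMethod = "OAUTH2-JWT"   # assumed method name; check the docs
connect_props.AuthenticationData = get_token_from_idp().encode("utf-8")

client = mqtt.Client(client_id="service-42", protocol=mqtt.MQTTv5)
client.tls_set()
client.connect("<namespace-mqtt-hostname>", 8883, properties=connect_props)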
Event Grid Namespace Topic
The namespace topic offers flexible consumption of messages through HTTP Push and HTTP Pull delivery, enabling seamless integration of cloud applications in an asynchronous and decoupled manner. Enterprise applications rely on distributed and asynchronous messaging to scale and evolve independently. Using Event Grid, publishers can send messages to the namespace topic, which subscribers can consume using push or pull delivery. Additionally, you can also configure the MQTT broker to route MQTT messages to the namespace topic to integrate your IoT data with Azure services and your backend applications.
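For example, publishing a CloudEvent from Python follows the pattern sketched below with the azure-eventgrid SDK; the endpoint, access key, source, and event type are placeholders, and namespace topics may require a recent SDK version or the namespace-specific client described in the Event Grid documentation.

# Hedged sketch: publish a CloudEvent to an Event Grid topic endpoint.
# Endpoint, key, source, and type are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.core.messaging import CloudEvent
from azure.eventgrid import EventGridPublisherClient

client = EventGridPublisherClient(
    "https://<your-topic-endpoint>",
    AzureKeyCredential("<access-key>"),
)

event = CloudEvent(
    source="/factories/line-1/sensor-7",
    type="Contoso.Telemetry.Reading",
    data={"temperature": 21.5, "unit": "C"},
)
client.send(event)  # subscribers then consume it via push or pull delivery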
We are thrilled to announce the release of the following features aimed at enhancing integration with Azure services, providing flexibility in consuming messages in any format, and offering a versatile authentication method.
Push delivery to Azure Event Hubs: is now GA, allowing you to configure event subscriptions on namespace topics to send messages to Azure Event Hubs at scale. Event Hubs is a cloud native data streaming service that can stream millions of events per second, with low latency, from any source to any destination.
Push delivery to Webhooks: is now public preview, allowing you to configure event subscriptions on namespace topics to send messages to your application’s public endpoint using a simple, scalable, and reliable delivery mechanism. The webhook doesn’t need to be hosted in Azure to receive events from the namespace topic, and you can also use an Azure Automation runbook or an Azure logic app as an event handler via webhooks. With the support of these push delivery destinations, we are offering more options for you to build integrated solutions and data pipelines using namespace topics; see the first sketch after this list.
CloudEvents 1.0 Binary Content Mode: is now GA, offering the ability to produce messages whose payload is encoded in any media type. With this namespace topic feature, you can publish events using the encoding format of your choice, such as Avro, Protobuf, XML, or even your own proprietary encoding; see the second sketch after this list.
Shared Access Signature (SAS) tokens authentication: is now public preview, allowing you to publish or receive (pull delivery) messages using SAS tokens for authentication. SAS token authentication is a simple mechanism to delegate and enforce access control when sending or receiving messages scoped to a specific namespace, namespace topic, or event subscription. While Microsoft Entra ID offers exceptional authentication and access control features, you may still want to use SAS for scenarios where the publisher or subscriber is not protected by Microsoft Entra ID; for example, your client is hosted on another cloud provider, or uses another identity provider.
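To make the push-to-webhook option above concrete, here is a minimal Flask handler sketch. It assumes the subscription delivers events as CloudEvents JSON and that endpoint validation uses the CloudEvents v1.0 abuse-protection handshake (an HTTP OPTIONS request); verify the handshake details for your configuration in the Event Grid documentation.

# Hedged sketch: a minimal webhook endpoint for a push event subscription.
from flask import Flask, request

app = Flask(__name__)

@app.route("/events", methods=["OPTIONS", "POST"])
def events():
    if request.method == "OPTIONS":
        # Assumed CloudEvents v1.0 abuse-protection handshake: echo the origin.
        origin = request.headers.get("WebHook-Request-Origin", "")
        return "", 200, {"WebHook-Allowed-Origin": origin}

    body = request.get_json(force=True)
    batch = body if isinstance(body, list) else [body]
    for event in batch:
        print(event.get("type"), event.get("id"))   # handle each event here
    return "", 200

if __name__ == "__main__":
    app.run(port=8080)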
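And for the binary content mode feature, the sketch below posts an XML payload using the CloudEvents HTTP binding in binary mode, where the ce-* headers carry the event attributes and the HTTP Content-Type carries the data encoding. The publish URL, API version, and Authorization value are placeholders; substitute the values from your namespace configuration and chosen authentication method.

# Hedged sketch: publish in CloudEvents binary content mode over HTTP.
# PUBLISH_URL and the Authorization value are placeholders.
import uuid
import requests

PUBLISH_URL = "https://<namespace-hostname>/topics/<topic-name>:publish?api-version=<version>"

headers = {
    "ce-specversion": "1.0",
    "ce-id": str(uuid.uuid4()),
    "ce-source": "/factories/line-1/sensor-7",
    "ce-type": "Contoso.Telemetry.Reading",
    "Content-Type": "application/xml",      # the payload's own media type
    "Authorization": "<credential per your auth method>",
}

payload = "<reading><temperature>21.5</temperature></reading>"
response = requests.post(PUBLISH_URL, data=payload, headers=headers)
response.raise_for_status()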
Event Grid Basic
Event Grid basic tier enables you to build event-driven solutions by sending events to a diverse set of Azure services or webhooks using push event delivery through custom, system, domain, and partner topics. Event sources include your custom applications, Azure services, and partner (SaaS) services that publish events announcing system state changes (also known as “discrete” events). In turn, Event Grid delivers those events to your subscribers, allowing you to filter events and control delivery settings. We are excited to announce the release of the following features to enhance integration among Event Grid resources, Azure services, and partners.
Namespace Topic as a destination: is now GA, enabling you to create an event subscription on custom, system, domain, and partner topics (Event Grid Basic) that forwards events to namespace topics. This feature enables you to create data integrations using a diverse set of Event Grid resources. Forwarding events to the namespace topic allows you to take advantage of its pull delivery support and flexibility in consumption.
Microsoft Graph API events: is now GA, enabling you to react to resource changes in Microsoft Entra ID, Microsoft Teams, Outlook, SharePoint, etc. This feature is key for enterprise scenarios such as auditing, onboarding, and policy enforcement, to name a few. Now, you can subscribe to Microsoft Entra ID events through a new simplified Azure portal experience.
Sending Azure Resource Notifications health resources events to Azure Monitor alerts: is now public preview, notifying you when your workload is impacted so you can act quickly. Azure Resource Notifications events in Event Grid provide reliable and thorough information on the status of your virtual machines, including single-instance VMs, Virtual Machine Scale Set VMs, and Virtual Machine Scale Sets. With this feature, you can get a better understanding of any service issues that may be affecting your resources.
API Center system topic: is now public preview, enabling you to receive real-time updates when an API definition is added or updated. This means you can keep track of your APIs and ensure they are always up to date, making it easier for stakeholders throughout your organization to discover, reuse, and govern APIs. With this new integration, Event Grid is now even more powerful and versatile, giving you the tools you need to build modern, event-driven applications.
Summary
Event Grid continues to invest in MQTT compliance to ensure interoperability and to support non-Azure providers, giving IoT and event-driven solutions more flexibility. Additionally, Event Grid is adding more integrations among Event Grid resources, Azure services, and partners, and providing flexible consumption of messages in any format. We are excited to have you try these new capabilities. To learn more about Event Grid, go to the Event Grid documentation. If you have questions or feedback, you can contact us at askgrid@microsoft.com or askmqtt@microsoft.com.
Microsoft Tech Community – Latest Blogs –Read More
Introducing GenAI Gateway Capabilities in Azure API Management
We are thrilled to announce GenAI Gateway capabilities in Azure API Management – a set of features designed specifically for GenAI use cases.
The Azure OpenAI Service offers a diverse set of tools, providing access to advanced models such as GPT-3.5 Turbo, GPT-4, and GPT-4 Vision, enabling developers to build intelligent applications that can understand, interpret, and generate human-like text and images.
One of the main resources you manage in Azure OpenAI is tokens. Azure OpenAI assigns quota to your model deployments, expressed in tokens-per-minute (TPM), which is then distributed across your model consumers: different applications, developer teams, departments within the company, and so on.
For a single application, the integration is straightforward: your intelligent application connects to Azure OpenAI directly, using an API key and a TPM limit configured at the model deployment level. However, as your application portfolio grows, you end up with multiple apps calling one or more Azure OpenAI endpoints deployed as Pay-as-you-go or Provisioned Throughput Units (PTUs) instances. That comes with certain challenges:
How can we track token usage across multiple applications? How can we do cross charges for multiple applications/teams that use Azure OpenAI models?
How can we make sure that a single app does not consume the whole TPM quota, leaving other apps with no option to use Azure OpenAI models?
How can we make sure that the API key is securely distributed across multiple applications?
How can we distribute load across multiple Azure OpenAI endpoints? How can we make sure that PTUs are used first before falling back to Pay-as-you-go instances?
To tackle these operational and scalability challenges, Azure API Management has built a set of GenAI Gateway capabilities:
Azure OpenAI Token Limit Policy
Azure OpenAI Emit Token Metric Policy
Load Balancer and Circuit Breaker
Import Azure OpenAI as an API
Azure OpenAI Semantic Caching Policy (in public preview)
Azure OpenAI Token Limit Policy
Azure OpenAI Token Limit policy allows you to manage and enforce limits per API consumer based on the usage of Azure OpenAI tokens. With this policy you can set limits, expressed in tokens-per-minute (TPM).
This policy provides flexibility to assign token-based limits on any counter key, such as Subscription Key, IP Address, or any other arbitrary key defined through a policy expression. The Azure OpenAI Token Limit policy also enables pre-calculation of prompt tokens on the Azure API Management side, minimizing unnecessary requests to the Azure OpenAI backend if the prompt already exceeds the limit.
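As a hedged, client-side illustration, the call below goes through an API Management gateway rather than directly to Azure OpenAI, so the token limit policy can count usage against the caller's APIM subscription key. The gateway URL, API suffix, deployment name, and the assumption that the API accepts the subscription key in the api-key header are placeholders to adapt to your own configuration.

# Hedged sketch: call Azure OpenAI through an API Management gateway so the
# token limit policy can meter usage per APIM subscription key.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-apim-gateway>.azure-api.net/<api-suffix>",
    api_key="<apim-subscription-key>",  # assumes the API reads the key from the api-key header
    api_version="2024-02-01",
)

try:
    completion = client.chat.completions.create(
        model="<deployment-name>",
        messages=[{"role": "user", "content": "Summarize this quarter's support tickets."}],
    )
    print(completion.choices[0].message.content)
except Exception as exc:
    # If the configured tokens-per-minute limit is exceeded, the gateway rejects
    # the request (typically with HTTP 429); back off and retry later.
    print(f"Rejected by the gateway: {exc}")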
Learn more about this policy here.
Azure OpenAI Emit Token Metric Policy
The Azure OpenAI Emit Token Metric policy enables you to send token usage metrics to Azure Application Insights, providing an overview of the utilization of Azure OpenAI models across multiple applications or API consumers.
This policy captures prompt, completions, and total token usage metrics and sends them to Application Insights namespace of your choice. Moreover, you can configure or select from pre-defined dimensions to split token usage metrics, enabling granular analysis by Subscription ID, IP Address, or any custom dimension of your choice.
Learn more about this policy here.
Load Balancer and Circuit Breaker
Load Balancer and Circuit Breaker features allow you to spread the load across multiple Azure OpenAI endpoints.
With support for round-robin, weighted (new), and priority-based (new) load balancing, you can now define your own load distribution strategy according to your specific requirements.
Define priorities within the load balancer configuration to ensure optimal utilization of specific Azure OpenAI endpoints, particularly those purchased as PTUs. In the event of any disruption, a circuit breaker mechanism kicks in, seamlessly transitioning to lower-priority instances based on predefined rules.
Our updated circuit breaker now features dynamic trip duration, leveraging values from the retry-after header provided by the backend. This ensures precise and timely recovery of the backends, maximizing the utilization of your priority backends to their fullest.
Learn more about load balancer and circuit breaker here.
Import Azure OpenAI as an API
The new Import Azure OpenAI as an API experience in Azure API Management provides an easy, single-click way to import your existing Azure OpenAI endpoints as APIs.
We streamline the onboarding process by automatically importing the OpenAPI schema for Azure OpenAI and setting up authentication to the Azure OpenAI endpoint using managed identity, removing the need for manual configuration. Additionally, within the same user-friendly experience, you can pre-configure Azure OpenAI policies, such as token limit and emit token metric, enabling swift and convenient setup.
Learn more about Import Azure OpenAI as an API here.
Azure OpenAI Semantic Caching policy
Azure OpenAI Semantic Caching policy empowers you to optimize token usage by leveraging semantic caching, which stores completions for prompts with similar meaning.
Our semantic caching mechanism leverages Azure Cache for Redis Enterprise or any other external cache that is compatible with RediSearch and onboarded to Azure API Management. By leveraging the Azure OpenAI Embeddings model, this policy identifies semantically similar prompts and stores their respective completions in the cache. This approach enables reuse of completions, reducing token consumption and improving response performance.
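Conceptually, the lookup performed by semantic caching resembles the sketch below: embed the incoming prompt, compare it against cached prompt embeddings, and return the stored completion when similarity crosses a threshold. The embed and call_model callables and the 0.95 threshold are hypothetical stand-ins; the actual policy performs this inside the gateway using RediSearch vector queries.

# Conceptual sketch of a semantic cache lookup; embed and call_model are
# hypothetical callables supplied by the surrounding application.
import numpy as np

_cache: list[tuple[np.ndarray, str]] = []   # (prompt embedding, cached completion)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def complete(prompt: str, embed, call_model, threshold: float = 0.95) -> str:
    query = np.asarray(embed(prompt), dtype=float)
    for cached_embedding, cached_completion in _cache:
        if cosine_similarity(query, cached_embedding) >= threshold:
            return cached_completion          # semantic cache hit: no model tokens spent
    completion = call_model(prompt)           # cache miss: call the model
    _cache.append((query, completion))
    return completion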
Learn more about semantic caching policy here.
Get Started with GenAI Gateway Capabilities in Azure API Management
We’re excited to introduce these GenAI Gateway capabilities in Azure API Management, designed to empower developers to efficiently manage and scale their applications leveraging Azure OpenAI services. Get started today and bring your intelligent application development to the next level with Azure API Management.
Microsoft Tech Community – Latest Blogs –Read More
Your Guide to Surface at Microsoft Build
Join the Surface team as we kick off Microsoft Build! Here are all the details to check out the latest developer news and announcements happening this week (May 21-23, 2024) live from Seattle and streaming online worldwide.
Whether you’re tuning in online (for free) or joining us in person in Seattle, prepare to be immersed in the latest innovations in Microsoft developer tools and technologies. Microsoft Build also offers unparalleled opportunities to network and create valuable connections with industry leaders and like-minded professionals.
During the keynote and throughout the show, the Surface team will be showcasing the all-new Surface Laptop and Surface Pro designed to accelerate AI in the workplace. The latest Laptop and Pro devices announced this week are designed to revolutionize PC productivity and will be available alongside the Surface Pro 10 and Surface Laptop 6 announced earlier this year. By adding to the diversity of hardware within the Surface portfolio, we’re giving customers more choice than ever to select the right devices for the unique needs of their organization.
This year’s lineup promises an array of engaging sessions. While this blog post focuses on the Surface presence and experience, there will be a lot more to discover at the conference!
Register for Build
Wherever you are, we’re coming to you! Get ready to connect with Microsoft experts, technology professionals, and developers from around the world.
When: May 21-23 in Seattle & online
Why: Check out Microsoft’s latest developer news & announcements
Link to register: Microsoft Build registration
Build Keynote live from Seattle
Check out the latest Microsoft news and announcements for all developers. Join Microsoft CEO Satya Nadella, as well as Rajesh Jha, Mustafa Suleyman, and Kevin Scott at the opening keynote to learn how this era of AI will unlock new opportunities, transform how developers work, and drive business productivity across industries. Don’t miss the demos from Surface and Windows!
Surface sessions at Build
Join us during the sessions below for an exclusive look into the latest advancements in Microsoft Surface technology led by subject matter experts. (All times Pacific)
Session Code | Session Date / Time | Session Title | Speakers
STUDIO67 | Tuesday, May 21, 11-11:10 a.m. | Microsoft Surface Innovation (plays immediately after keynote) | Sam Morton, Malex Guinand (Microsoft)
DEM780 | Tuesday, May 21, 1:45-2 p.m. | Microsoft Surface Innovation in Action | Frank Buchholz, Jacob Rhoades (Microsoft)
DEM781 | Wednesday, May 22, 3:45-4 p.m. | Microsoft Surface Innovation in Action | Frank Buchholz, Jacob Rhoades (Microsoft)
Seattle experience
For those joining us in Seattle, this is your opportunity to be among the first to get hands-on with the latest devices and meet with Surface experts at the Expert Meet-up, located on the top floor (Floor 5) of the Seattle Convention Center Summit Building, in the Ballroom to the right of the Microsoft Build Stage. Explore the possibilities of how AI experiences in Windows can enhance productivity with the latest Surface PCs built for the new era of AI.
Expert Meet-up hours
May 21: 11 a.m.-6:30 p.m.
May 22: 10 a.m.-6:45 p.m.
May 23: 8:30 a.m.-5 p.m.
Social Media
Be sure to amplify Microsoft Surface content using the hashtags #MicrosoftSurface and #MSBuild, and watch the Surface LinkedIn page for Microsoft Build posts.
Windows at Build
And for even more exciting updates happening at Build for Windows developers, be sure to check out the Windows Developer blog post.
Microsoft Tech Community – Latest Blogs –Read More