Tag Archives: microsoft
SharePoint file locking alternative
Good evening!
I manage a fleet of laptops using Microsoft Intune.
Our Microsoft 365 Business Premium subscriptions include SharePoint Online, and I’d like to use it as a file server.
Everything works fine except for one specific point: locking files during editing.
With a Windows Server file share, when someone wants to access a file being edited, a pop-up appears, inviting the user to open it in read-only mode, notify the editor, etc., which avoids the risk of conflict.
With SharePoint, it’s more complicated.
Here’s the scenario:
SharePoint files are synchronized to PCs via OneDrive, with the Intune settings "Configure team site libraries to sync automatically", "Use OneDrive Files on-demand", and "Convert synced team site files to online-only files" enabled.
User 1 double-clicks one of the files in the OneDrive folder, and a local copy is downloaded.
User 1 edits the file in locally installed Word while the Internet connection is lost.
At the same time, User 2 edits the file and saves it; their version ends up on SharePoint.
A few minutes later, once User 1's connection is re-established, their version is saved to SharePoint.
User 2’s changes will go completely unnoticed, creating confusion and wasted time.
The check-out/check-in system offered by SharePoint does exist, but it is very restrictive for users.
Is there a way to make SharePoint files appear in the OneDrive folder in Windows File Explorer while only allowing modifications online, to avoid this type of problem?
I thought the "Convert synced team site files to online-only files" option would meet this need, but that does not seem to be the case.
Thanks for your help!
Jo
DLP Exception for “Permission Controlled” Not Working (Microsoft Purview | RMS Template | Encrypt)
Hello,
We are in the process of moving some of our mail-flow / transport rules over to Microsoft Purview.
We don’t want the DLP policy to apply when people click their “Encrypt” or “Do not Forward” buttons (RMS templates; OME encryption.)
Putting “Permission Controlled” in the exceptions group should theoretically let the emails go through. The exception we have for when people put “Encrypt” in the subject line works (we have a mail-flow rule that encrypts those emails.)
But actually clicking “Options” > “Set permissions on this item” > “Encrypt” doesn’t remove the policy tip on an email draft, and people are unable to send the emails.
Can someone verify this rule is constructed properly? If so, we may have to reach out to Microsoft Support. Thank you so much for your time and help!
SharePoint List need multi-line column with timestamp and allow updates
I’m setting up a SharePoint list to support attestation by owners for different business virtual assets. This will support a review to ensure we aren’t retaining stale virtual assets.
When the owners review SharePoint list rows where they are listed as an owner, I’d like them to provide a note detailing their rationale for why the asset should be retained but have that note timestamped so the next time they perform a review they can see the previous rationale and the date they added that note.
What is the best way to do this? Calculated, multi-line column? If so, what formula would work? Or would this need to be run through Power Automate and have a flow that updates any input to that column to concatenate it with a date?
Dynamic Multi-Cloud Networking: Configuring a BGP-Enabled VPN Between Azure and AWS
Introduction
In my previous blog post, I demonstrated how to set up a basic VPN connection between Azure and AWS. This updated guide builds on that foundation by incorporating BGP (Border Gateway Protocol) to enable dynamic routing and redundancy across two VPN tunnels. By following this configuration, you can establish a more resilient multi-cloud VPN connection that supports automatic route exchanges between Azure VPN Gateway and AWS Virtual Private Gateway over IPsec tunnels. This approach ensures reliable connectivity and helps simplify network management between Azure and AWS environments.
Step 1: Set Up Your Azure Environment
1.1. Create a Resource Group
Go to Azure Portal > Resource groups > Create.
Select your subscription and region, and give the resource group a name like RG-AzureAWSVPN-BGP.
1.2. Create a Virtual Network (VNet) and Subnet
In the Azure Portal, go to Virtual Networks > Create.
Name the VNet AzureVNetBGP and specify an address space of 172.16.0.0/16.
Under Subnets, create a subnet named Subnet-AzureVPN with the address range 172.16.1.0/24.
Add a GatewaySubnet with a /27 address block (e.g., 172.16.254.0/27) for the VPN gateway.
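If you prefer to script these portal steps, the Azure CLI commands below are a rough equivalent. The resource names and address ranges mirror this walkthrough, and the region (eastus) is only an example, so adjust them to your environment.
bash
# Resource group
az group create --name RG-AzureAWSVPN-BGP --location eastus
# Virtual network with the workload subnet
az network vnet create \
  --name AzureVNetBGP \
  --resource-group RG-AzureAWSVPN-BGP \
  --address-prefix 172.16.0.0/16 \
  --subnet-name Subnet-AzureVPN \
  --subnet-prefix 172.16.1.0/24
# GatewaySubnet required by the VPN gateway
az network vnet subnet create \
  --name GatewaySubnet \
  --vnet-name AzureVNetBGP \
  --resource-group RG-AzureAWSVPN-BGP \
  --address-prefix 172.16.254.0/27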
1.3. Set Up the Azure VPN Gateway
Go to +Create a resource, search for Virtual Network Gateway, and select Create.
Fill in the details:
Name: AzureVPNGatewayBGP
Gateway Type: VPN
SKU: VpnGw1 (or higher for redundancy/performance).
Public IP Address: Create a new one and name it AzureVPNGatewayPublicIP.
Enable BGP: Yes.
ASN: Use an Autonomous System Number (ASN) for Azure, e.g., 65010.
Azure APIPA BGP IP Address: Use 169.254.21.2 for the first tunnel with AWS and 169.254.22.2 for the second tunnel with AWS.
Note: For this example, we’ll create an active-standby setup, so Active-Active Mode will not be enabled. If you want to change from active-standby to active-active later, follow this guide: Configure active-active VPN gateways: Azure portal – Azure VPN Gateway | Microsoft Learn.
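For reference, here is a rough Azure CLI equivalent of this step, assuming the names used above. The APIPA custom BGP addresses (169.254.21.2 and 169.254.22.2) are easiest to configure in the portal as described, so this sketch covers only the gateway itself; gateway provisioning typically takes 30-45 minutes.
bash
# Public IP for the VPN gateway
az network public-ip create \
  --name AzureVPNGatewayPublicIP \
  --resource-group RG-AzureAWSVPN-BGP \
  --sku Standard \
  --allocation-method Static
# Route-based VPN gateway with BGP enabled and ASN 65010
az network vnet-gateway create \
  --name AzureVPNGatewayBGP \
  --resource-group RG-AzureAWSVPN-BGP \
  --vnet AzureVNetBGP \
  --public-ip-address AzureVPNGatewayPublicIP \
  --gateway-type Vpn \
  --vpn-type RouteBased \
  --sku VpnGw1 \
  --asn 65010 \
  --no-wait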
Step 2: Set Up Your AWS Environment with BGP
2.1. Create a VPC and Subnet in AWS
In the AWS Console, go to VPC > Create VPC.
Use an address space (e.g., 10.0.0.0/16) for the AWS VPC.
Under Subnets, create a subnet with a name like Subnet-AWSVPN and the address space 10.0.1.0/24 for your subnet.
2.2. Create an AWS Virtual Private Gateway (VGW)
In the AWS VPC Console, go to Virtual Private Gateway and create a new VGW named AWS-VPN-VGW-BGP.
Attach the VGW to the VPC.
During the VGW creation, set the ASN for AWS. AWS will assign one by default (e.g., 64512), but you can customize this if needed.
2.3. Set Up a Customer Gateway (CGW)
In the AWS Console, go to Customer Gateway, and create a CGW using the public IP address of the Azure VPN Gateway (obtained during the Azure VPN Gateway setup). Name it Azure-CGW-BGP.
Set the BGP ASN for the Customer Gateway to 65010, the same ASN as set in Azure.
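The AWS side of sections 2.1 to 2.3 can also be scripted with the AWS CLI. The sketch below assumes you capture the IDs returned by each call (shown here as <vpc-id>, <vgw-id>, and so on) and that the Azure VPN gateway public IP is already known.
bash
# VPC and subnet
aws ec2 create-vpc --cidr-block 10.0.0.0/16
aws ec2 create-subnet --vpc-id <vpc-id> --cidr-block 10.0.1.0/24
# Virtual Private Gateway with the AWS-side ASN, attached to the VPC
aws ec2 create-vpn-gateway --type ipsec.1 --amazon-side-asn 64512
aws ec2 attach-vpn-gateway --vpn-gateway-id <vgw-id> --vpc-id <vpc-id>
# Customer Gateway pointing at the Azure VPN gateway public IP, using Azure's ASN
aws ec2 create-customer-gateway --type ipsec.1 --public-ip <azure-vpn-gateway-public-ip> --bgp-asn 65010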
2.4. Create the Site-to-Site VPN Connection with BGP setting
In AWS Console, go to Site-to-Site VPN Connections > Create VPN Connection.
Select the Virtual Private Gateway created earlier.
Select the Customer Gateway created earlier.
Routing Options: Select Dynamic (requires BGP) to enable dynamic routing with BGP.
Tunnels: AWS will automatically create two tunnels for redundancy.
2.4.1. Tunnel Configuration – Optional Settings
Under the Optional Tunnel Settings, configure the Inside IPv4 CIDR for each tunnel:
For Tunnel 1: Set the Inside IPv4 CIDR to 169.254.21.0/30.
For Tunnel 2: Set the Inside IPv4 CIDR to 169.254.22.0/30.
This ensures proper BGP peering between Azure and AWS for both tunnels.
2.4.2. Download the VPN Configuration File
After the VPN is set up, download the configuration file.
Select Generic for the platform and Vendor agnostic for the software.
Select IKEv2 for the IKE version.
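If you are scripting the AWS side, the VPN connection with dynamic routing and the tunnel inside CIDRs used above can be created roughly as follows; the tunnel public IPs and pre-shared keys can then be read back from the CLI in addition to (or instead of) downloading the configuration file.
bash
# Site-to-Site VPN connection with BGP (dynamic) routing and per-tunnel inside CIDRs
aws ec2 create-vpn-connection \
  --type ipsec.1 \
  --vpn-gateway-id <vgw-id> \
  --customer-gateway-id <cgw-id> \
  --options '{"StaticRoutesOnly":false,"TunnelOptions":[{"TunnelInsideCidr":"169.254.21.0/30"},{"TunnelInsideCidr":"169.254.22.0/30"}]}'
# Inspect tunnel addresses and pre-shared keys
aws ec2 describe-vpn-connections --vpn-connection-ids <vpn-connection-id>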
Step 3: Finish the Azure Side Configuration with the two tunnels and BGP setup
3.1. Create Two Local Network Gateways
To support two tunnels, you will need to create two Local Network Gateways on Azure, one for each tunnel.
In the Azure Portal, go to Local Network Gateway > Create.
Local Network Gateway 1 (for the first tunnel):
Name: AWSLocalNetworkGatewayBGP-Tunnel1
Public IP Address: Enter the public IP for the first AWS VPN tunnel (from the configuration file).
BGP Settings: Go to the Advanced tab, select Yes for Configure BGP Settings, then:
ASN: Set to 64512 (AWS ASN).
BGP Peer IP Address: Enter 169.254.21.1 (AWS BGP peer IP for the first tunnel).
Note: You do not need to specify an address space when creating the Local Network Gateway. Only the public IP and BGP settings are required.
Local Network Gateway 2 (for the second tunnel):
Name: AWSLocalNetworkGatewayBGP-Tunnel2
Public IP Address: Enter the public IP for the second AWS VPN tunnel.
BGP Settings: Go to the Advanced Tab, select Yes for Configure BGP Settings, then:
ASN: Set to 64512 (AWS ASN).
BGP Peer IP Address: Enter 169.254.22.1 (AWS BGP peer IP for the second tunnel).
Note: Enter the ASN first, followed by the BGP Peer IP Address in this order.
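A rough Azure CLI equivalent for both local network gateways, assuming the tunnel outside IP addresses from the AWS configuration file:
bash
# Local network gateway for AWS tunnel 1
az network local-gateway create \
  --name AWSLocalNetworkGatewayBGP-Tunnel1 \
  --resource-group RG-AzureAWSVPN-BGP \
  --gateway-ip-address <aws-tunnel1-outside-ip> \
  --asn 64512 \
  --bgp-peering-address 169.254.21.1
# Local network gateway for AWS tunnel 2
az network local-gateway create \
  --name AWSLocalNetworkGatewayBGP-Tunnel2 \
  --resource-group RG-AzureAWSVPN-BGP \
  --gateway-ip-address <aws-tunnel2-outside-ip> \
  --asn 64512 \
  --bgp-peering-address 169.254.22.1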
3.2. Create the VPN Connection for Both Tunnels
Go to Azure Portal > Virtual Network Gateway > Connections > + Add.
For the first tunnel:
Name: AzureAWSVPNConnectionBGP-Tunnel1
Connection Type: Site-to-site (IPsec).
Virtual Network Gateway: Select AzureVPNGatewayBGP.
Local Network Gateway: Select AWSLocalNetworkGatewayBGP-Tunnel1.
Shared Key (PSK): Use the shared key from the AWS VPN configuration file for tunnel 1.
IKE Protocol: Ensure that IKEv2 is selected.
Enable BGP: Mark the checkbox to enable.
After selecting Enable BGP, check the box for Enable Custom BGP Addresses and set:
Primary Custom BGP Address: Enter 169.254.21.2 for Tunnel 1.
IPsec/IKE Policy: Set this to Default.
Use Policy-Based Traffic Selector: Set to Disabled.
DPD (Dead Peer Detection) Timeout: Set the timeout to 45 seconds.
Connection Mode: Leave this as Default (no need to change to initiator-only or responder-only).
After about three minutes, you can verify that the VPN connection has been established.
Repeat the same process for the second tunnel:
Name: AzureAWSVPNConnectionBGP-Tunnel2
Local Network Gateway: Select AWSLocalNetworkGatewayBGP-Tunnel2.
Shared Key (PSK): Use the shared key from the AWS VPN configuration file for tunnel 2.
Enable BGP: Mark the checkbox to enable.
Check the box for Enable Custom BGP Addresses and set:
Primary Custom BGP Address: Enter 169.254.22.2 for Tunnel 2.
After about three minutes, you can verify that the second VPN connection has been established.
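For reference, a minimal Azure CLI sketch of the first connection is shown below (repeat it with the Tunnel2 names, pre-shared key, and BGP address for the second tunnel). The custom APIPA BGP address and DPD timeout are easiest to set in the portal as described above.
bash
az network vpn-connection create \
  --name AzureAWSVPNConnectionBGP-Tunnel1 \
  --resource-group RG-AzureAWSVPN-BGP \
  --vnet-gateway1 AzureVPNGatewayBGP \
  --local-gateway2 AWSLocalNetworkGatewayBGP-Tunnel1 \
  --shared-key "<tunnel1-pre-shared-key>" \
  --enable-bgp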
3.3. Ensure the VPN is established
From Site-to-Site VPN Connections in the AWS Console, go to Tunnel details and check that Tunnel 1 is UP.
From the Azure side, check that the status of the VPN connections is Connected.
Under BGP peers, you can see the BGP peers and the BGP-learned routes.
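You can also verify the BGP session from the command line; these commands assume the gateway name and resource group used earlier.
bash
# BGP peer status and routes learned from AWS
az network vnet-gateway list-bgp-peer-status --name AzureVPNGatewayBGP --resource-group RG-AzureAWSVPN-BGP --output table
az network vnet-gateway list-learned-routes --name AzureVPNGatewayBGP --resource-group RG-AzureAWSVPN-BGP --output table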
Step 4: Add Routes and Configure Security
4.1. AWS Route Table Configuration
In AWS Console, go to Route Tables and select the route table for your AWS VPC.
Navigate to Route Propagation and select Edit Route Propagation.
Enable route propagation to ensure that BGP dynamically propagates the routes between AWS and Azure, removing the need for manual static route entries. Almost immediately after enabling route propagation, you will be able to see the new routes in the route table.
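If you are working from the command line, route propagation can be enabled with a single AWS CLI call; the route table and gateway IDs are placeholders for the ones in your account.
bash
aws ec2 enable-vgw-route-propagation --route-table-id <route-table-id> --gateway-id <vgw-id>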
4.2. Add an Internet Gateway (IGW)
Note: An Internet Gateway (IGW) is required for the EC2 instance to be accessible via its public IP address. Without the IGW, the EC2 instance won’t be reachable over the public internet, preventing you from logging in to it using its public IP address. This is the sole purpose of deploying the IGW.
4.3. Set Security Group and NSG Rules
AWS Security Group: Ensure that the Security Group for the AWS EC2 instance allows ICMP (ping), SSH, and any other necessary protocols.
Azure NSG (Network Security Group): Ensure that the NSG attached to the Azure VM’s NIC allows inbound traffic from AWS for the required protocols, such as ICMP and SSH.
Step 5: Test Connectivity Between Azure and AWS VMs
To test connectivity between Azure and AWS, first deploy a virtual machine in the appropriate subnet on each cloud provider—an EC2 instance on AWS and a VM on Azure. Once both machines are running, connect to each VM using their respective public IP addresses. After logging in, use the private IP addresses of both instances to run a ping test and verify private network connectivity between them.
If you decided not to create the IGW to make the EC2 instance accessible over the internet, you can simply log in to the Azure VM using its public IP address and test connectivity one way by running the ping command against the private IP of the EC2 instance.
5.1. Ensure ICMP Traffic Is Allowed
Both the AWS Security Group and the Azure NSG (Network Security Group) should allow ICMP (ping) traffic for proper testing of connectivity between the virtual machines.
5.2. Test Connectivity with ping
From the Azure VM, ping the AWS VM using its private IP.
From the AWS VM, ping the Azure VM using its private IP.
Ensure that the pings are successful in both directions to verify that the VPN tunnels are functioning correctly.
Troubleshooting Common Issues
BGP Not Establishing
Double-check that the BGP peer IP addresses and ASNs are correctly configured for both tunnels.
Ensure that BGP is enabled on both the Azure Virtual Network Gateway and the AWS VPN connection.
Ensure that route propagation is enabled on AWS, allowing dynamic routes to be exchanged through BGP.
No Inbound Traffic on Azure VPN Gateway
Verify that AWS route propagation is enabled and that the Azure routes are correctly learned from AWS.
Check the NSG rules on Azure to ensure inbound traffic is allowed from AWS.
Dead Peer Detection (DPD) Issues
Mismatched DPD settings may cause tunnels to drop. Ensure that both Azure and AWS have consistent DPD configurations. The recommended DPD Timeout for both Azure and AWS is 45 seconds.
Tunnel Status Showing as Down
If one or both tunnels show as down, ensure that the IKEv2/IPsec policies match on both sides. Double-check the encryption algorithms, hashing functions, and Diffie-Hellman group settings between Azure and AWS for Phase 1 and Phase 2.
Restart the VPN connection on both Azure and AWS to re-initiate the tunnels.
Conclusion
By following this guide, you’ve successfully set up a VPN connection between Azure and AWS using BGP with two tunnels for redundancy. This configuration ensures robust and reliable connectivity between the two clouds, with dynamic route propagation handled by BGP. The use of managed services minimizes operational overhead and simplifies management.
For more advanced configurations, such as custom IPsec/IKE policies, enabling failover, or using BGP with Active-Active Mode, refer to the official documentation for Azure VPN Gateway.
Part 2 – Multichannel notification system using Azure Communication Services and Azure Functions
In the first part of this topic, we set up all the Azure resources, such as Azure Communication Services for Email, SMS, and WhatsApp for Business, and developed the Azure Functions code for the email trigger. In this second part, we will complete the remaining Azure Functions triggers and then deploy the multichannel notification system to Azure Functions, testing the Email, SMS, and WhatsApp triggers with OpenAI GPTs. Let’s get started!
Prerequisite
To follow this tutorial, ensure you have completed the first part of this topic.
Coding the SMSTrigger
Enhancing the SMSTrigger Azure Function from the default template involves a series of steps. These steps will transform the basic Function into one that can send SMS messages using Azure Communication Services. Below is a guide to get you from the default HTTP triggered function to the finished SMSTrigger.
Step 1: Set Up the Function Template
Follow the instructions for setting up the function template from the Email section and name the trigger as ‘SMSTrigger’ or any other string you prefer.
Step 2: Add Azure Communication Services SMS Reference
Add a using directive for Azure.Communication.Sms, then create a field in the SMSTrigger class to hold an instance of SmsClient and a field to hold the sender phone number.
csharp
private readonly SmsClient _smsClient;
private string? sender = Environment.GetEnvironmentVariable("SENDER_PHONE_NUMBER");
Step 3: Read Configuration and Initialize SmsClient
In the constructor of the SMSTrigger class, read the Azure Communication Services connection string from the environment variables using the Environment.GetEnvironmentVariable() method and initialize the SmsClient instance.
Be sure to check if the connection string is null, and if so, throw an exception to indicate that the environment variable is missing:
csharp
string? connectionString = Environment.GetEnvironmentVariable("COMMUNICATION_SERVICES_CONNECTION_STRING");
if (connectionString is null)
{
    throw new InvalidOperationException("COMMUNICATION_SERVICES_CONNECTION_STRING environment variable is not set.");
}
_smsClient = new SmsClient(connectionString);
Step 4: Define the Request Model
Create a request model class within the SMSTrigger class called SmsRequest. This model should contain properties for the message text and the phone number to which the message will be sent.
csharp
public class SmsRequest
{
    public string Message { get; set; } = string.Empty;
    public string PhoneNumber { get; set; } = string.Empty;
}
Step 5: Parse the Request Body
Change the Run function to be async as we will perform asynchronous operations. Use a StreamReader to read the request body as a string and deserialize it into an SmsRequest object using JsonSerializer.
csharp
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
If the request body fails to deserialize into SmsRequest, return a BadRequestResult:
csharp
SmsRequest? data = JsonSerializer.Deserialize<SmsRequest>(requestBody, new JsonSerializerOptions() {
PropertyNamingPolicy = JsonNamingPolicy.CamelCase
});
if (data is null)
{
return new BadRequestResult();
}
Step 6: Define the Sender and Send an SMS
Retrieve the sender’s phone number from the environment variables with Environment.GetEnvironmentVariable(). Then, attempt to send the SMS with a try-catch block, handling any RequestFailedException that may occur and logging the relevant information:
csharp
try
{
    _logger.LogInformation("Sending SMS...");
    SmsSendResult smsSendResult = await _smsClient.SendAsync(
        sender,
        data.PhoneNumber,
        data.Message
    );
    _logger.LogInformation($"SMS Sent. Successful = {smsSendResult.Successful}");
    _logger.LogInformation($"SMS operation id = {smsSendResult.MessageId}");
}
catch (RequestFailedException ex)
{
    _logger.LogInformation($"SMS send operation failed with error code: {ex.ErrorCode}, message: {ex.Message}");
    // Return an appropriate error response if needed
}
Step 7: Return a Success Response
If sending the SMS is successful, return an OkObjectResult to the caller indicating that the SMS has been sent.
csharp
return new OkObjectResult("SMS sent successfully!");
Final Code
The final SMSTrigger Azure Function, with the steps implemented, should look as follows:
csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using Azure;
using Azure.Communication.Sms;
using System.Text.Json;
using System.IO;
using System.Threading.Tasks;

namespace ACSGPTFunctions
{
    public class SMSTrigger
    {
        private readonly ILogger<SMSTrigger> _logger;
        private readonly SmsClient _smsClient;
        private string? sender = Environment.GetEnvironmentVariable("SENDER_PHONE_NUMBER");

        public SMSTrigger(ILogger<SMSTrigger> logger)
        {
            _logger = logger;
            string? connectionString = Environment.GetEnvironmentVariable("COMMUNICATION_SERVICES_CONNECTION_STRING");
            if (connectionString is null)
            {
                throw new InvalidOperationException("COMMUNICATION_SERVICES_CONNECTION_STRING environment variable is not set.");
            }
            _smsClient = new SmsClient(connectionString);
        }

        public class SmsRequest
        {
            public string Message { get; set; } = string.Empty;
            public string PhoneNumber { get; set; } = string.Empty;
        }

        [Function("SMSTrigger")]
        public async Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req)
        {
            _logger.LogInformation("Processing request.");

            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            SmsRequest? data = JsonSerializer.Deserialize<SmsRequest>(requestBody, new JsonSerializerOptions() {
                PropertyNamingPolicy = JsonNamingPolicy.CamelCase
            });
            if (data is null)
            {
                return new BadRequestResult();
            }

            try
            {
                _logger.LogInformation("Sending SMS...");
                SmsSendResult smsSendResult = await _smsClient.SendAsync(
                    sender,
                    data.PhoneNumber,
                    data.Message
                );
                _logger.LogInformation($"SMS Sent. Successful = {smsSendResult.Successful}");
                _logger.LogInformation($"SMS operation id = {smsSendResult.MessageId}");
            }
            catch (RequestFailedException ex)
            {
                _logger.LogInformation($"SMS send operation failed with error code: {ex.ErrorCode}, message: {ex.Message}");
                return new ObjectResult(new { error = ex.Message }) { StatusCode = 500 };
            }

            return new OkObjectResult("SMS sent successfully!");
        }
    }
}
This completed SMSTrigger Azure Function can now facilitate SMS as part of your multichannel notification system.
Coding the WhatsAppTrigger
Creating a functional WhatsAppTrigger Azure Function involves iterating on the default HTTP-triggered function template provided by Azure Functions for C#. We will modify this template to integrate Azure Communication Services for sending WhatsApp messages via template messages. Follow the steps below to transform this template into a complete WhatsAppTrigger function:
Step 1: Set Up the Function Template
Follow the instructions from the first step of the SMS trigger section and name the function WhatsAppTrigger. Set the authorization level to anonymous or function, depending on your security preference.
Step 2: Reference the Azure Communication Services Messages Package
Ensure the Azure.Communication.Messages NuGet package is included in your project to enable messaging features needed for WhatsApp. Install the package with the following command in Visual Studio Code’s terminal:
bash
dotnet add package Azure.Communication.Messages --prerelease
Add a using directive for Azure.Communication.Messages, then create a field in the WhatsAppTrigger class to hold an instance of NotificationMessagesClient and a field to hold the WhatsApp sender identifier.
csharp
private readonly NotificationMessagesClient _messagesClient;
private string? sender = Environment.GetEnvironmentVariable("WHATSAPP_NUMBER");
Step 3: Read Configuration and Initialize NotificationMessagesClient
Update the WhatsAppTrigger class constructor to read the Azure Communication Services connection string from environment variables using Environment.GetEnvironmentVariable() and initialize NotificationMessagesClient with this connection string:
csharp
string? connectionString = Environment.GetEnvironmentVariable("COMMUNICATION_SERVICES_CONNECTION_STRING");
if (connectionString is null)
{
    throw new InvalidOperationException("COMMUNICATION_SERVICES_CONNECTION_STRING environment variable is not set.");
}
_messagesClient = new NotificationMessagesClient(connectionString);
Step 4: Define the Request Model
Create a request model class named WhatsAppRequest within the WhatsAppTrigger class, containing properties for the destination phone number, template name, language, and template parameters:
csharp
public class WhatsAppRequest
{
    public string PhoneNumber { get; set; } = string.Empty;
    public string TemplateName { get; set; } = "appointment_reminder";
    public string TemplateLanguage { get; set; } = "en";
    public List<string> TemplateParameters { get; set; } = new List<string>();
}
Step 5: Parse the Request Body
Convert the Run function to be async to enable asynchronous work. Use StreamReader to read the request body and deserialize it to a WhatsAppRequest instance using System.Text.Json.JsonSerializer with JsonNamingPolicy.CamelCase.
csharp
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
Handle potential deserialization failure by returning BadRequestResult:
csharp
WhatsAppRequest? data = JsonSerializer.Deserialize<WhatsAppRequest>(requestBody, new JsonSerializerOptions() {
PropertyNamingPolicy = JsonNamingPolicy.CamelCase
});
if (data is null)
{
return new BadRequestResult();
}
Step 6: Prepare Template Message and Send WhatsApp Message
Modify the try-catch block to construct a SendMessageOptions object using MessageTemplateWhatsAppBindings and MessageTemplate, and then call _messagesClient.SendMessageAsync(sendTemplateMessageOptions):
csharp
try
{
    _logger.LogInformation("Sending WhatsApp message...");
    List<string> recipientList = new List<string> { data.PhoneNumber };
    List<MessageTemplateText> values = data.TemplateParameters
        .Select((parameter, index) => new MessageTemplateText($"value{index + 1}", parameter))
        .ToList();
    MessageTemplateWhatsAppBindings bindings = new MessageTemplateWhatsAppBindings(
        body: values.Select(value => value.Name).ToList()
    );
    MessageTemplate template = new MessageTemplate(data.TemplateName, data.TemplateLanguage, values, bindings);
    SendMessageOptions sendTemplateMessageOptions = new SendMessageOptions(sender, recipientList, template);

    Response<SendMessageResult> templateResponse = await _messagesClient.SendMessageAsync(sendTemplateMessageOptions);
    _logger.LogInformation("WhatsApp message sent successfully!");
}
catch (RequestFailedException ex)
{
    _logger.LogError($"WhatsApp send operation failed with error code: {ex.ErrorCode}, message: {ex.Message}");
    return new ObjectResult(new { error = ex.Message }) { StatusCode = 500 };
}
Step 7: Return Success Response
After sending the WhatsApp message successfully, return an OkObjectResult stating “WhatsApp sent successfully!”.
csharp
return new OkObjectResult("WhatsApp sent successfully!");
Final Code
Following the described steps, the final WhatsAppTrigger Azure Function should look like this:
csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using Azure;
using Azure.Communication.Messages;
using System.Text.Json;
using System.IO;
using System.Threading.Tasks;
using System.Linq;
using System.Collections.Generic;

namespace ACSGPTFunctions
{
    public class WhatsAppTrigger
    {
        private readonly ILogger<WhatsAppTrigger> _logger;
        private readonly NotificationMessagesClient _messagesClient;
        private string? sender = Environment.GetEnvironmentVariable("WHATSAPP_NUMBER");

        public WhatsAppTrigger(ILogger<WhatsAppTrigger> logger)
        {
            _logger = logger;
            string? connectionString = Environment.GetEnvironmentVariable("COMMUNICATION_SERVICES_CONNECTION_STRING");
            if (connectionString is null)
            {
                throw new InvalidOperationException("COMMUNICATION_SERVICES_CONNECTION_STRING environment variable is not set.");
            }
            _messagesClient = new NotificationMessagesClient(connectionString);
        }

        public class WhatsAppRequest
        {
            public string PhoneNumber { get; set; } = string.Empty;
            public string TemplateName { get; set; } = "appointment_reminder";
            public string TemplateLanguage { get; set; } = "en";
            public List<string> TemplateParameters { get; set; } = new List<string>();
        }

        [Function("WhatsAppTrigger")]
        public async Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req)
        {
            _logger.LogInformation("Processing request.");

            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            WhatsAppRequest? data = JsonSerializer.Deserialize<WhatsAppRequest>(requestBody, new JsonSerializerOptions() {
                PropertyNamingPolicy = JsonNamingPolicy.CamelCase
            });
            if (data is null)
            {
                return new BadRequestResult();
            }

            var recipientList = new List<string> { data.PhoneNumber };
            var values = data.TemplateParameters
                .Select((parameter, index) => new MessageTemplateText($"value{index + 1}", parameter))
                .ToList();
            var bindings = new MessageTemplateWhatsAppBindings(
                body: values.Select(value => value.Name).ToList()
            );
            var template = new MessageTemplate(data.TemplateName, data.TemplateLanguage, values, bindings);
            var sendTemplateMessageOptions = new SendMessageOptions(sender, recipientList, template);

            try
            {
                Response<SendMessageResult> templateResponse = await _messagesClient.SendMessageAsync(sendTemplateMessageOptions);
                _logger.LogInformation("WhatsApp message sent successfully!");
            }
            catch (RequestFailedException ex)
            {
                _logger.LogError($"WhatsApp send operation failed with error code: {ex.ErrorCode}, message: {ex.Message}");
                return new ObjectResult(new { error = ex.Message }) { StatusCode = 500 };
            }

            return new OkObjectResult("WhatsApp sent successfully!");
        }
    }
}
The WhatsAppTrigger Azure Function is now ready to send WhatsApp template messages. Be sure to test it extensively, handle input validation issues, and confirm that it communicates with the Azure Communication Services API correctly.
Deployment and Testing
After developing the multichannel notification system using Azure Functions, the next step is to deploy and test the functions. This section will guide you through deploying your Azure Function to the cloud and testing the Email, SMS, and WhatsApp triggers.
Deploying the Azure Function
Deployment of your Azure Function can be done right from Visual Studio Code with the Azure Functions extension.
Publish the Function App: In Visual Studio Code, sign in to Azure if you haven’t already. In the Azure Functions extension tab, find the ‘Deploy to Function App…’ button and select it.
Choose Your Function App: You can either create a new Function App or deploy it to an existing one. If it’s the first time you are deploying, choose ‘Create New Function App in Azure…’.
Set the Configuration: Provide a unique name for your Function App, select a runtime stack (.NET Core in this case), choose the appropriate region, and confirm your selections.
Wait for Deployment: The deployment process will take a few minutes. Monitor the output window for completion status and any potential errors.
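As an alternative to the VS Code extension, you can publish from the terminal with Azure Functions Core Tools; the Function App name below is a placeholder for the one you created.
bash
func azure functionapp publish <your-function-app-name>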
Set Up Application Settings
After deployment, you need to configure the application settings (environment variables) in Azure.
Open the Function App: Navigate to the Azure Portal, and find your Function App under ‘All Resources’ or by searching the name you provided.
Access Application Settings: In the Function App’s menu, go to ‘Configuration’ under the ‘Settings’ section.
Add the Settings: Click on ‘New application setting’ and add the key-value pairs for the environment variables specified in your local.settings.json: COMMUNICATION_SERVICES_CONNECTION_STRING, SENDER_EMAIL_ADDRESS, SENDER_PHONE_NUMBER, WHATSAPP_NUMBER, and so on:
json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "COMMUNICATION_SERVICES_CONNECTION_STRING": "<<connection string>>",
    "SENDER_PHONE_NUMBER": "<<phone number>>",
    "SENDER_EMAIL_ADDRESS": "<<email address>>",
    "WHATSAPP_NUMBER": "<<WhatsApp id>>"
  }
}
Save and Restart: After adding the required settings, make sure to save the configurations and restart the Function App to ensure the new settings take effect.
Alternatively, when the Function has finished deploying, you can click on ‘Upload settings’ to upload your settings from local.settings.json. Don’t forget to restart the Function App after uploading the settings.
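If you prefer the command line, the same application settings can be applied with the Azure CLI; the app name, resource group, and values below are placeholders.
bash
az functionapp config appsettings set \
  --name <your-function-app-name> \
  --resource-group <your-resource-group> \
  --settings "COMMUNICATION_SERVICES_CONNECTION_STRING=<connection string>" \
    "SENDER_EMAIL_ADDRESS=<email address>" \
    "SENDER_PHONE_NUMBER=<phone number>" \
    "WHATSAPP_NUMBER=<WhatsApp id>"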
Testing the Function
With the deployment complete and the environment configured, it’s time to verify that your function works as intended through each communication channel.
Testing Email Notifications
To test the EmailTrigger function:
Send an HTTP POST Request: Use a tool like Postman to send a POST request to the Function App’s URL suffixed with /api/EmailTrigger. The body should contain JSON with keys for subject, htmlContent, and recipient.
Verify Email Receipt: Check the recipient’s email inbox for the message. Ensure that the subject and content match what you sent through the POST request.
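If you prefer the command line over Postman, a request along these lines works too; the Function App URL and recipient address are placeholders, and you would append ?code=<function-key> if the function uses function-level authorization.
bash
curl -X POST "https://<your-function-app-name>.azurewebsites.net/api/EmailTrigger" \
  -H "Content-Type: application/json" \
  -d '{"subject":"Test email","htmlContent":"<p>Hello from Azure Functions!</p>","recipient":"someone@example.com"}'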
Testing SMS Notifications
To test the SMSTrigger function:
Send an HTTP POST Request: Using Postman, send a POST request to the Function App’s URL with /api/SMSTrigger at the end. The body of your request should contain JSON with message and phoneNumber keys.
Check for SMS: Ensure that the specified phone number receives the SMS and the message content matches the request.
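An equivalent curl request for the SMS trigger might look like this, with placeholder values for the app name, function key, and destination number:
bash
curl -X POST "https://<your-function-app-name>.azurewebsites.net/api/SMSTrigger?code=<function-key>" \
  -H "Content-Type: application/json" \
  -d '{"message":"Your appointment is tomorrow at 10am.","phoneNumber":"+15551234567"}'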
Testing WhatsApp Notifications
To test the WhatsAppTrigger function:
Send an HTTP POST Request: Use Postman again to POST to the Function URL, this time ending with /api/WhatsAppTrigger. Include a JSON body with keys for phoneNumber, templateName, templateLanguage, and templateParameters.
Confirm WhatsApp Message: Verify that the WhatsApp message reaches the intended recipient with correct template filling.
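And a sample curl request for the WhatsApp trigger, again with placeholder values; the template parameters must match the placeholders defined in your approved WhatsApp template:
bash
curl -X POST "https://<your-function-app-name>.azurewebsites.net/api/WhatsAppTrigger?code=<function-key>" \
  -H "Content-Type: application/json" \
  -d '{"phoneNumber":"+15551234567","templateName":"appointment_reminder","templateLanguage":"en","templateParameters":["Alice","3pm"]}'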
Integrate with OpenAI GPTs
In the OpenAI GPTs editor, click ‘New GPT’ and then ‘Configure’. Name it “Email Sender” and set the description and instructions as shown below.
Help author short and delightful emails. Ask for details on the nature of the email content and include creative ideas for topics. Compose the email with placeholders for the sender’s name and receiver’s name. You do not need a full name. Share a draft of the email and ask for the sender’s name, and the receiver’s name and email address. Provide a draft of the final email and confirm the user is happy with it. When the user provides a recipient’s email address ask if it is correct before sending. Do not send the email until you provide a final draft and you have a confirmed recipient email address.
Add Actions and JSON Schema
Click ‘Create new action’ in your GPT configuration. Enter the following JSON:
json
{
  "openapi": "3.1.0",
  "info": {
    "title": "Send Message API",
    "description": "API for sending a message to a specified email address.",
    "version": "v1.0.0"
  },
  "servers": [
    {
      "url": "https://<<function-app-url>>.azurewebsites.net"
    }
  ],
  "paths": {
    "/api/emailtrigger": {
      "post": {
        "description": "Send a message to a given email address",
        "operationId": "SendMessage",
        "requestBody": {
          "required": true,
          "content": {
            "application/json": {
              "schema": {
                "type": "object",
                "properties": {
                  "recipient": {
                    "type": "string",
                    "format": "email",
                    "description": "Email address of the recipient"
                  },
                  "subject": {
                    "type": "string",
                    "description": "The message subject"
                  },
                  "htmlContent": {
                    "type": "string",
                    "description": "The body content of the email encoded as escaped HTML"
                  }
                },
                "required": [
                  "recipient",
                  "subject",
                  "htmlContent"
                ]
              }
            }
          }
        },
        "deprecated": false
      }
    }
  },
  "components": {
    "schemas": {}
  }
}
Leave Authentication set to None and the Privacy Policy blank.
Test Your GPT
Finally, try out your GPT in the preview pane to see it in action!
By following these steps, you can easily integrate Azure Communication Services with OpenAI GPTs to send emails effortlessly.
Conclusion and Further Reading
We have successfully walked through the journey of building a serverless multichannel notification system using Azure Functions and Azure Communication Services. This system can send timely and personalized notifications across multiple channels, such as Email, SMS, and WhatsApp. In addition, we have explored how to enhance our system with sophisticated content generation capabilities using OpenAI GPTs.
The modular nature of the Azure Functions framework allows your application to scale and adapt easily to changing requirements and traffic demands. Meanwhile, Azure Communication Services enrich the user experience by meeting customers on their preferred platforms, contributing to a seamless and cohesive communication strategy.
As developers, there’s always room to expand our knowledge and add robust features to our applications. Here are some suggestions for further exploration and resources that can assist you in taking your applications to the next level:
Azure Communication Services AI samples: One stop shop for GitHub samples for AI-powered communication solutions.
Azure Functions Best Practices: Learn about best practices for designing and implementing Azure Functions by visiting Azure Functions best practices.
Azure Communication Services Documentation: Explore the full capabilities of Azure Communication Services including chat, phone numbers, video calling, and more on the Azure Communication Services documentation.
Security and Compliance in Azure: Understand the best practices for security and compliance in Azure applications, particularly relevant for handling sensitive user communication data. Check the Microsoft Azure Trust Center.
OpenAI GPT Documentation: For more insight into using and customizing OpenAI GPTs, refer to the OpenAI API documentation.
Azure AI Services: Azure offers a range of AI services beyond just communication. Explore Azure AI services for more advanced scenarios such as speech recognition, machine translation, and anomaly detection at Azure AI services documentation.
Handling Large-scale Data: To handle a large amount of data and improve the performance of communication systems, consider learning about Azure’s data-handling services like Azure Cosmos DB, Azure SQL Database, and Azure Cache for Redis. Start with the Azure Data storage documentation.
Monitoring and Diagnostics: Improve the reliability of your applications by implementing robust monitoring and diagnostics tools. Azure offers several tools such as Azure Monitor and Application Insights. Dive into Application Insights for Azure Functions.
Serverless Workflow Automation with Azure Logic Apps: Enhance your serverless applications using Azure Logic Apps to automate and simplify workflows. Learn more about Azure Logic Apps at What is Azure Logic Apps?.
Happy coding!
Part 1 – Multichannel Notification System with Azure Communication Services and Azure Functions
In the interconnected digital era, it’s crucial for businesses and services to communicate effectively with their audience. A robust notification system that spans various communication channels can greatly enhance user engagement and satisfaction.
This blog post is part 1 of the two-part tutorial for a step-by-step guide on building such a multichannel notification system with Azure Functions and Azure Communication Services.
Leveraging serverless architecture and the reach of Azure Communication Services, your application can dynamically generate and send messages via SMS, Email, and WhatsApp. By incorporating OpenAI GPTs, the system can create content that is not only relevant and timely but personalized, making communication more impactful.
Example email
Architecture diagram
Here are some practical scenarios where a multichannel notification system is valuable:
Financial Alerts: Banks and financial services can send fraud alerts, transaction confirmations, and account balance updates.
Healthcare Reminders: Clinics and pharmacies can notify patients about appointment schedules, vaccinations, or prescription refills.
Security Verification: Services requiring secure authentication can utilize two-factor authentication prompts sent via SMS or WhatsApp.
Marketing and Promotions: Retailers can craft and distribute targeted marketing messages and promotions, driving customer engagement.
The foundation of this solution is Azure Functions, an event-driven platform for running scalable applications, and Azure Communication Services, which provides reliable Email, SMS, and WhatsApp messaging. To generate content, we use OpenAI GPTs, which enable the creation of sophisticated, context-aware text for notifications.
Now, let’s get started on your path to building a serverless messaging system on Azure.
Prerequisites
Before we dive into building our multichannel notification system with Azure Functions and Azure Communication Services, ensure that you have the following tools and accounts set up:
Azure Account: You need a Microsoft Azure account to create and manage resources on Azure. If you haven’t got one yet, you can create a free account here.
Visual Studio Code: We use Visual Studio Code (VS Code) as our Integrated Development Environment (IDE) for writing and debugging our code. Download and install it from here.
Azure Functions Extension for Visual Studio Code: This extension provides you with a seamless experience for developing Azure Functions. It can be installed from the VS Code marketplace.
C# Dev Kit: Since we write our Azure Functions in C#, this extension is necessary for getting C# support in VS Code. You can install it from the VS Code marketplace.
Azure CLI: The Azure Command-Line Interface (CLI) will be used to create and manage Azure resources from the command line. For installation instructions, visit the Azure CLI installation documentation page.
Postman: Although not strictly necessary, Postman is a handy tool for testing our HTTP-triggered Azure Functions without having to write a front-end application. You can download Postman from getpostman.com.
With the prerequisites in place, you’re ready to set up your development environment, which we will cover in the following section.
Creating Resources
To get started with building a multichannel notification system, we’ll need to create several resources within Azure. This section will walk you through setting up your Azure environment using the Azure CLI. Ensure that you have the Azure CLI installed on your machine and that you’re logged into your Azure account.
Azure Communication Services
Azure Communication Services (ACS) provides the backbone for our notification system, allowing us to send SMS, Email, and WhatsApp messages. Follow these steps to create resources for all three communication channels. However, you can choose one or more depending upon your preference. Log in to Azure:
az login
Create a Resource Group (if necessary): This groups all your resources in one collection.
az group create --name <YourResourceGroupName> --location <PreferredLocation>
Replace <YourResourceGroupName> with a name for your new resource group and <PreferredLocation> with the Azure region you prefer, such as eastus.
Create ACS Resource: This will be the main ACS resource where we manage communications capabilities.
az communication create --name <YourACSResourceName> --location Global --data-location UnitedStates --resource-group <YourResourceGroupName>
Replace <YourACSResourceName> with a unique name for your ACS resource and <YourResourceGroupName> with the name of your resource group.
After creating the resource, retrieve the connection string as you will need it to connect your Azure Function to ACS. Copy the one marked as primary.
az communication list-key --name <YourACSResourceName> --resource-group <YourResourceGroupName>
Azure Communication Services for Email
To set up Azure Communication Services Email, you’ll need to follow a few steps in the Azure Portal:
Create the Email Communications Service resource using the portal: Provision a new Email Communication Services resource in Azure portal using the instructions here. Make sure to select the same resource group as your ACS resource.
Configure the Email Communications Service: You will need to configure domains and sender authentication for email. Provision an Azure Managed Domain or set up your Custom Verified Domain depending on your use case.
Azure Communication Services for SMS
To send SMS messages, you will need to acquire a phone number through ACS. You will have to submit a phone number verification application for enabling the number for sending or receiving SMS. This may take a couple of weeks. You can choose to skip SMS and continue the tutorial with Email and WhatsApp.
Get a Phone Number: Navigate to the Phone Numbers blade in your ACS resource on the Azure portal and follow the steps to get a phone number that’s capable of sending and receiving SMS.
Toll Free verification: Apply for verification of your number using Apply for toll-free verification.
Note the Phone Number: After acquiring a phone number, record it to use when sending SMS messages from your Azure Function.
WhatsApp for Business
Sending WhatsApp messages requires setting up a WhatsApp Business account.
Set up a WhatsApp Business Account: Follow the instructions for connecting a WhatsApp business account with Azure Communication Services.
Note the WhatsApp Configuration: Once set up, make a note of the necessary configuration details such as the phone number and WhatsApp Business API credentials, as they will be needed in your Azure Function.
By following these steps, you create the resources needed to build a multichannel notification system that can reach users through SMS, Email, and WhatsApp. Next, we’ll set up your Azure Function and integrate these services into it.
Setting Up The Environment
With the prerequisites out of the way, let’s prepare our environment to develop our multichannel notification system using Azure Functions and Azure Communication Services.
Creating the Function App Project
Open Visual Studio Code and follow these steps to create a new Azure Functions project:
Click on the Azure icon in the Activity Bar on the side of Visual Studio Code to open the Azure Functions extension.
In the Azure Functions extension, click on the Create New Project icon, choose a directory for your project, and select Create New Project Here.
Choose the language for your project. We select C# for this tutorial.
Select the template for your first function. For this project, an HTTP-triggered function is a good starting point since we want to receive HTTP requests to send out notifications.
Provide a function name, such as EmailTrigger, and set the authorization level to anonymous or function, depending on your security preference.
After you have completed these steps, your Azure Functions project is set up with all the necessary files in the chosen directory.
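If you prefer the terminal over the VS Code wizard, the same project can be scaffolded with Azure Functions Core Tools; the project name here is just an example, and you can use --authlevel function instead of anonymous.
bash
func init ACSGPTFunctions --worker-runtime dotnet-isolated
cd ACSGPTFunctions
func new --template "HTTP trigger" --name EmailTrigger --authlevel anonymous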
Installing the Necessary Packages
Now it’s time to add the packages necessary for integrating Azure Communication Services:
Open the integrated terminal in Visual Studio Code by clicking on ‘Terminal’ in the top menu and then selecting ‘New Terminal’.
Add the Azure Communication Services packages to your project:
bash
dotnet add package Azure.Communication.Email
dotnet add package Azure.Communication.Sms
dotnet add package Azure.Communication.Messages --prerelease
Setting Up Environment Variables
You should store configuration details like connection strings and phone numbers as environment variables instead of hardcoding them into your functions. To do so in Azure Functions, add them to the local.settings.json file, which is used for local development.
Edit the local.settings.json file to include your Azure Communication Services (ACS) connection string and phone numbers:
json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "COMMUNICATION_SERVICES_CONNECTION_STRING": "<acs_connection_string>",
    "SENDER_PHONE_NUMBER": "<acs_sms_phone_number>",
    "WHATSAPP_NUMBER": "<acs_whatsapp_number>",
    "SENDER_EMAIL_ADDRESS": "<acs_email_address>"
  }
}
Be sure to replace <acs_connection_string>, <acs_sms_phone_number>, <acs_whatsapp_number>, and <acs_email_address> with your actual Azure Communication Services connection string, SMS phone number, WhatsApp number, and sender email address.
Remember not to commit the local.settings.json file to source control if it contains sensitive information. Configure similar settings in the Application Settings for your Azure Function when you deploy to Azure.
Coding the EmailTrigger
Creating a functional EmailTrigger Azure Function involves starting from the default template provided by Azure Functions for C# and enhancing it with the necessary logic and services to handle email sending. In this section, we guide you through the steps to transform the default template into the finished EmailTrigger function.
Step 1: Set Up the Function Template
Start by using the default HTTP triggered function template provided by Visual Studio Code for creating an Azure Functions project. It will have the necessary usings, function name attribute, and a simple HTTP trigger that returns a welcome message. Select your project in the Workspace pane and click on the ‘Create Function’ button in the Azure Functions extension. Choose ‘HTTP trigger’ as the template and provide a name for the function, such as EmailTrigger. Set the authorization level to anonymous or function, depending on your security preference.
Step 2: Add Azure Communication Services Email Reference
Add a using directive for Azure.Communication.Email, then create a field in the EmailTrigger class to hold an instance of EmailClient and a field to hold the sender email address.
csharp
private readonly EmailClient _emailClient;
private string? sender = Environment.GetEnvironmentVariable("SENDER_EMAIL_ADDRESS");
Step 3: Read Configuration and Initialize EmailClient
Within the EmailTrigger class constructor, read the Azure Communication Services connection string from the environment variables using Environment.GetEnvironmentVariable() method and initialize an instance of EmailClient with the connection string.
Make sure to handle the possibility that the environment variable may be null and throw an appropriate exception if it is not set.
csharp
string? connectionString = Environment.GetEnvironmentVariable("COMMUNICATION_SERVICES_CONNECTION_STRING");
if (connectionString is null)
{
    throw new InvalidOperationException("COMMUNICATION_SERVICES_CONNECTION_STRING environment variable is not set.");
}
_emailClient = new EmailClient(connectionString);
Step 4: Define the Request Model
Create a request model class EmailRequest inside the EmailTrigger class to represent the expected payload. This model includes the subject, HTML content, and recipient email address.
csharp
public class EmailRequest
{
    public string Subject { get; set; } = string.Empty;
    public string HtmlContent { get; set; } = string.Empty;
    public string Recipient { get; set; } = string.Empty;
}
Step 5: Parse the Request Body
Modify the Run function to be async since we’ll be performing asynchronous operations.
csharp
public async Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Anonymous, "post")] HttpRequest req)
Use StreamReader to read the request body and deserialize it into the EmailRequest object using System.Text.Json.JsonSerializer.
Handle the case where the deserialization fails by returning a BadRequestResult.
csharp
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
EmailRequest? data = JsonSerializer.Deserialize<EmailRequest>(requestBody, new JsonSerializerOptions() {
    PropertyNamingPolicy = JsonNamingPolicy.CamelCase
});
if (data is null)
{
    return new BadRequestResult();
}
Step 6: Define the Sender and Send the Email
Instantiate a sender email address string that will be passed to the SendAsync method of the EmailClient instance. Replace the static email ‘DoNotReply@effaa622-a003-4676-b27e-6b9e7a783581.azurecomm.net‘ with your configured sender address in the actual implementation.
Use a try-catch block to send the email using the SendAsync method and catch any RequestFailedException to log any errors.
csharp
EmailSendOperation emailSendOperation = await _emailClient.SendAsync(
Azure.WaitUntil.Completed,
sender,
data.Recipient,
data.Subject,
data.HtmlContent
);
_logger.LogInformation($"Email Sent. Status = {emailSendOperation.Value.Status}");
_logger.LogInformation($"Email operation id = {emailSendOperation.Id}");
Step 7: Return a Success Response
Once the email send operation is completed, return an OkObjectResult indicating the success of the operation.
csharp
return new OkObjectResult("Email sent successfully!");
Final Code
After completing all the above steps, your EmailTrigger Azure Function should look as follows:
csharp
using System.IO;
using System.Text.Json;
using System.Threading.Tasks;
using Azure;
using Azure.Communication.Email;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

namespace ACSGPTFunctions
{
    public class EmailTrigger
    {
        private readonly ILogger<EmailTrigger> _logger;
        private readonly EmailClient _emailClient;

        public EmailTrigger(ILogger<EmailTrigger> logger)
        {
            _logger = logger;
            string? connectionString = Environment.GetEnvironmentVariable("COMMUNICATION_SERVICES_CONNECTION_STRING");
            if (connectionString is null)
            {
                throw new InvalidOperationException("COMMUNICATION_SERVICES_CONNECTION_STRING environment variable is not set.");
            }
            _emailClient = new EmailClient(connectionString);
        }

        public class EmailRequest
        {
            public string Subject { get; set; } = string.Empty;
            public string HtmlContent { get; set; } = string.Empty;
            public string Recipient { get; set; } = string.Empty;
        }

        [Function("EmailTrigger")]
        public async Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Anonymous, "post")] HttpRequest req)
        {
            _logger.LogInformation("Processing request.");

            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            EmailRequest? data = JsonSerializer.Deserialize<EmailRequest>(requestBody, new JsonSerializerOptions() {
                PropertyNamingPolicy = JsonNamingPolicy.CamelCase
            });
            if (data is null)
            {
                return new BadRequestResult();
            }

            var sender = "DoNotReply@effaa622-a003-4676-b27e-6b9e7a783581.azurecomm.net";

            try
            {
                _logger.LogInformation("Sending email...");
                EmailSendOperation emailSendOperation = await _emailClient.SendAsync(
                    Azure.WaitUntil.Completed,
                    sender,
                    data.Recipient,
                    data.Subject,
                    data.HtmlContent
                );
                _logger.LogInformation($"Email Sent. Status = {emailSendOperation.Value.Status}");
                _logger.LogInformation($"Email operation id = {emailSendOperation.Id}");
            }
            catch (RequestFailedException ex)
            {
                _logger.LogInformation($"Email send operation failed with error code: {ex.ErrorCode}, message: {ex.Message}");
                return new ObjectResult(new { error = ex.Message }) { StatusCode = 500 };
            }

            return new OkObjectResult("Email sent successfully!");
        }
    }
}
This completed EmailTrigger Azure Function is now ready to be part of a multichannel notification system, handling the email communication channel.
Next Steps
Continue to the next part of this topic to further explore building, deploying and testing your intelligent app for a multichannel notification system.
Boosting Power BI Performance with Azure Databricks through Automatic Aggregations
This post is authored in conjunction with Yatish Anand, Senior Solutions Architect at Databricks, and Andrey Mirskiy, Senior Specialist Solutions Architect at Databricks.
Figure 1. Power BI automatic aggregations overview, source.
Introduction
In today’s fast-paced data landscape, timely insights can make all the difference. Power BI’s Automatic Aggregations feature is a breakthrough, designed to push performance boundaries by delivering low-latency query results on large datasets. It harnesses AI-powered caching to streamline DirectQuery models, combining the best-in-class performance of Power BI on Azure Databricks with low-latency BI for modern reporting needs. Used with DirectQuery mode, Automatic Aggregations do not face data volume limitations and allow you to scale regardless of data size without compromising on BI performance.
This innovation makes it easy for Power BI users of all skill levels to tap into advanced performance without worrying about backend strain or complex data modeling. Imagine your reports updating in real-time, even with billions of records in play. With this approach, you can deliver actionable insights faster than ever, freeing up time to focus on your business, not your data infrastructure. In this blog, we will showcase the integration of Power BI Automatic Aggregation with Azure Databricks and how this integration will help improve the performance of your Power BI reports.
What Are Automatic Aggregations?
Automatic aggregations streamline the process of improving BI query performance by maintaining an in-memory cache of aggregated data. This means that a substantial portion of report queries can be served directly from this in-memory cache instead of relying on the backend data sources. Power BI automatically builds these aggregations using AI based on your query patterns and then intelligently decides which queries can be served from the in-memory cache and which are routed to the data source through DirectQuery, resulting in faster visualizations and reduced load on the backend systems.
Key Benefits of Automatic Aggregations
Faster Report Visualizations: Automatic aggregations optimize most report queries by caching aggregated query results in advance, including those generated when users interact with reports. Only outlier queries that cannot be resolved via the cache are directed to the data source.
Balanced Architecture: Compared to using pure DirectQuery mode, automatic aggregations enable a more balanced approach. Most frequently used queries are served from Power BI query in-memory cache, which reduces the processing load on data sources during peak reporting times, improves scalability, and decreases costs.
Simplified Setup: Model owners can easily activate automatic aggregations and schedule regular refreshes. Once the initial training and refresh are complete, the system autonomously develops an aggregation framework tailored to the specific queries and data patterns.
Configuring Automatic Aggregations
Setting up automatic aggregations is straightforward. Users can enable the feature in the model settings and schedule one or more refresh operations. It is essential to review comprehensive guidelines on how automatic aggregations function to ensure they are suitable for your specific environment.
Figure 2. Enabling automatic aggregations, source.
Once configured, Power BI will utilize a query log to track user interactions and optimize the aggregations cache over time. The training operation, which evaluates query patterns, occurs during the first scheduled refresh, allowing Power BI to adapt to changing usage patterns.
Requirements for Automatic Aggregations
Automatic aggregations are compatible with several Power BI plans, including:
Power BI Premium per capacity
Fabric F SKU capacity
Power BI Premium per user
Power BI Embedded models
Automatic aggregations are specifically designed for DirectQuery models, including composite models that utilize both import tables and DirectQuery connections.
Automatic Aggregation Walkthrough with Azure Databricks Integration
In this example we will showcase how to enable Automatic Aggregations on Power BI semantic models and train Automatic Aggregations in order to enhance the performance of reports using Azure Databricks as a data source.
Pre-requisites
Before you begin, ensure you have the following:
An Azure Databricks account, access to an Azure Databricks workspace, and a Databricks SQL Warehouse.
Power BI Desktop installed on your machine. The latest version is highly recommended.
Power BI workspace
DAX Studio or any other DAX parser tool
Step by Step Instructions
1. Create an initial Power BI semantic model based on the samples catalog, tpch schema. Add tables and relationships as shown in the screenshot below. The dimension tables customer and nation should be set to Dual storage mode. The fact tables orders and lineitem should be set to DirectQuery storage mode. Below is the data model for the sample report.
For best practices around Power BI storage modes, please refer to this Git repo.
2. Create a simple tabular report showing the count of orders, the minimum shipment date, the sum of discounts, and the sum of quantities. Also add a slicer with nation names, as shown below.
3. Now publish this report to a Power BI workspace.
4. As shown below, when we run the report, Power BI takes ~20 seconds to run the query. Below is a snapshot from the network trace:
The screenshot below also shows that the query hit the Databricks SQL Warehouse and read 38M records.
5. Enable Automatic Aggregations in the semantic model settings. You can set the Query coverage according to your needs. A higher Query coverage percentage means more user queries are analyzed and considered for performance improvement, and hence higher potential benefits; however, aggregation training will take longer.
6. For Power BI to be able to create aggregations, we need to populate the Power BI query log, which stores the internal queries created by Power BI when users interact with a report. You can either open the deployed Power BI report and interact with it by selecting different nation names in the slicer, or open DAX Studio and run the sample DAX query mentioned below.
Please note that for better model training you should use different values for the slicer, or for the filter in the DAX query, and run it multiple times.
TREATAS({"BRAZIL"}, 'nation'[n_name])
One guideline for populating the query log is that, before making a report available to users, the report publisher should open the report and try different slicer filters. In our scenario, as mentioned above, we populated the query log by selecting different nation names in the report slicer. This step helps end users get faster report rendering.
7. You can now start the model training manually or schedule it.
8. Once the model is trained, Power BI will hold aggregated values in the in-memory cache. The next time you interact with the report using similar patterns (dimensions, measures, filters), Power BI will serve the queries from the cached aggregations and will not send queries to the Databricks SQL Warehouse. Hence, you may expect sub-second report refresh performance.
As shown in the screenshot below, after enabling Automatic Aggregations the report visual now renders in ~1.6 seconds, compared with ~20 seconds earlier, because the data is now being read from the in-memory aggregations cache.
As also shown below, no SQL query is fired against the Databricks SQL Warehouse.
Monitoring and Managing Automatic Aggregations
Power BI continuously refines the in-memory aggregations cache through scheduled refreshes. Semantic model owners can choose to trigger training operations on demand if necessary. It’s also crucial to monitor the refresh history to ensure operations complete successfully and to identify any potential issues.
Power BI provides detailed refresh history logs that display the performance of each operation, enabling users to keep track of memory usage and other critical metrics.
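If you prefer to monitor refreshes programmatically rather than in the portal, the Power BI REST API exposes a refresh-history endpoint for semantic models. Below is a minimal sketch in Python; it assumes you have already acquired an Azure AD access token for the Power BI service, and the workspace and dataset IDs are placeholders:

```python
import requests

# Placeholders - substitute your workspace (group) ID, semantic model (dataset) ID, and token.
WORKSPACE_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"
ACCESS_TOKEN = "<azure-ad-access-token-for-power-bi>"

# Get Refresh History returns recent refresh operations, including scheduled and on-demand runs.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes?$top=10"
)
response = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}, timeout=30)
response.raise_for_status()

for refresh in response.json().get("value", []):
    # Each entry reports the refresh type, its status, and start/end times.
    print(refresh.get("refreshType"), refresh.get("status"),
          refresh.get("startTime"), refresh.get("endTime"))
```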
Conclusion
In today’s data-driven world, the integration of Azure Databricks and Power BI Automatic Aggregations is a game-changer, delivering unparalleled performance for even the most demanding data environments. While Azure Databricks excels at processing multi-terabyte datasets, Automatic Aggregations use AI on your query patterns to intelligently cache aggregates, dramatically accelerating performance and reducing costs. This combination addresses the limitations of Import and Direct Lake modes, which struggle with very large data volumes, while enhancing the efficiency of DirectQuery models. As shown in this blog, with Automatic Aggregations on DirectQuery models you can now get sub-second report performance without constantly querying the underlying data source. With this innovative approach, you can focus on delivering lightning-fast BI reports at any scale rather than manually tuning your semantic model.
Microsoft Tech Community – Latest Blogs –Read More
IP-based redirection
Hello!
I am running a Linux VM on Azure (IaaS) which is providing an SFTP service to the Internet. Sadly, many customers are connecting to this service via public IP address (as opposed to FQDN).
I am migrating this service back to on-premises, through a firewall on a different public IP address.
Linux VM has public IP 1.1.1.1 right on its NIC.
Firewall’s IP is 2.2.2.2.
I want to redirect traffic to the on-premises firewall.
Is there an Azure service/resource that can take inbound connections to 1.1.1.1, then NAT the destination IP to 2.2.2.2 and then also NAT the source IP to 1.1.1.1 or another public IP (like 3.3.3.3) on that service/resource?
Thanks!
File Size vs Size on Disk Info
Why would a JPG file from 24 years ago show that the Size is 365KB but Size on Disk shows 128Mb. How can I see what makes up all that empty space? Is there any negative impact from a volume full of files like this?
server Microsoft
Hello, I want to configure a server with Microsoft and create a storage repository. Could you tell me which option I should select? I want to configure the device's data.
License for Multi Tenant Setup
Scenario: User R is part of Tenant A and has an M365 license. Tenants A and B are set up for cross-tenant synchronization. Would User R need an M365 license from Tenant B to work on files stored in Tenant B?
Scenario: User M is an external guest in Tenant B. Would User M need an M365 license from Tenant B to work on files stored in Tenant B?
GETPIVOTDATA Dynamic Referencing from Sliced Pivot Table
Hi all,
I’m using a sliced Pivot Table to generate dynamic charts and only realized, after working on the rest of my workbook, that changing the reference week does not update the formula, leading to a #REF error.
Here is an example of the formulas I’m using, with reference week 42. When I change the reference week with the slicer, the table updates, but the formula keeps "[NF].[Week].&[42]" and returns the error message:
=GETPIVOTDATA("[Measures].[Count of Branch 3]",'PIVOT NF'!$Q$3,"[NF].[Week]","[NF].[Week].&[42]","[NF].[Dep]","[NF].[Dep].&[EXP]")
How can I ensure that the formula updates along with the table? Am I not using GETPIVOTDATA functions correctly?
Coming in late December: MB-7007: Deploy and configure Microsoft 365 Copilot for Sales
Course Name: MB-7007: Deploy and configure Microsoft 365 Copilot for Sales
Release Date: December 20th, 2024 (Release dates are subject to change)
Duration: 1-Day ILT
Solution Area: Business Applications
Credential: Applied Skills Assessment
Course Description:
Audience:
Students should be familiar with Copilot for Sales, Microsoft 365, Microsoft Dynamics 365 Sales, and Power BI, and should have experience administering Microsoft Teams and Microsoft Outlook.
Please note: This is not a support forum. Only comments related to this specific blog post content are permitted and responded to.
If you have ILT questions not related to this blog post, please reach out to your program forums & resources for additional support.
For Training Services Partners:
aka.ms/TSP_Learn_Resources
partner.microsoft.com/support
For Microsoft Certified Trainers:
Microsoft Tech Community – Latest Blogs –Read More
Force change password at next login on-premise and MS online
Hi
Currently, I have a hybrid environment with AD on-premise, Azure AD sync (with password hash & SSPR), and Exchange Online.
My goal is to force change the password at the next login from on-premise AD to MS online and vice versa.
It works in one direction: when I change the password in on-premises AD, MS Online prompts me to change the password. It does not work the other way: when I set the account from the admin center to force a password change at the next login, it does not sync to on-premises AD, and the domain computer will not prompt for a password change.
Thanks in advance
MS recommends trying the following:
Install-Module -Name Microsoft.Graph
Connect-MgGraph -Scopes "OnPremDirectorySynchronization.ReadWrite.All"
Then run these commands:
$OnPremSync = Get-MgDirectoryOnPremiseSynchronization
$OnPremSync.Features.UserForcePasswordChangeOnLogonEnabled = $true
Update-MgDirectoryOnPremiseSynchronization -OnPremisesDirectorySynchronizationId $OnPremSync.Id -Features $OnPremSync.Features
Coming in December: SC-5004: Defend against cyberthreats with Microsoft Defender XDR
Course Name: SC-5004: Defend against cyberthreats with Microsoft Defender XDR
Release Date: December 10th, 2024 (Release dates are subject to change)
Duration: 1-Day ILT
Solution Area: Security
Credential: Applied Skills Assessment
Course Description:
Configure a Microsoft Defender XDR environment
Manage devices by using Microsoft Defender for Endpoint
Manage incidents in Microsoft Defender XDR
Manage investigations on an endpoint
Perform Advanced Hunting with KQL to detect unique threats
Audience:
Security Operations Analysts
Please note: This is not a support forum. Only comments related to this specific blog post content are permitted and responded to.
If you have ILT questions not related to this blog post, please reach out to your program forums & resources for additional support.
For Training Services Partners:
aka.ms/TSP_Learn_Resources
partner.microsoft.com/support
For Microsoft Certified Trainers:
Microsoft Tech Community – Latest Blogs –Read More
How to import a pst.file in the New Outlook version ?
I’ve changed jobs and computers. To keep my e-mails, I exported a number of them to a .pst file. How do I import this file into Outlook on my new PC? I’m using the new version of Outlook and the File/Import function doesn’t exist.
Introducing the Open Targets Dataset: Now Available on Genomics Data Lake on Azure
Introducing the Open Targets Dataset: Now Available on Genomics Data Lake on Azure for Advanced Biomedical Research
Introduction
Biomedical research is accelerating at an unprecedented pace, driven by the vast amounts of data generated from genetic studies, drug development, and disease research. Today, we would like to announce that critical datasets from Open Targets are now available on Azure Genomics Open Data Lake. This data can be seamlessly integrated into your research workflows, providing a rich resource for exploring gene-disease associations, drug targets, and biomedical mechanisms.
With Azure’s cloud-based solutions, these datasets are not only easier to access, but they can also be combined with machine learning, analytics, and AI-powered tools to drive deeper insights and foster innovation in areas like drug discovery, personalized medicine, and genomics.
Dataset Overview
The Open Targets consortium is a collaborative public-private research partnership that aims to systematically identify and prioritize drug targets. Its flagship informatics platform integrates genetic and molecular evidence associating targets and diseases, and includes extensive data on the genetic basis of diseases, drugs, and the identification of potential therapeutic targets.
Open Targets provides crucial data that integrates genetics, genomics, and drug information, enabling researchers to identify and prioritize drug targets for complex diseases. By offering insights into gene-disease associations, molecular interactions, and drug mechanisms, it supports drug discovery and development. This dataset enhances the understanding of the genetic basis of diseases and the effects of drugs, fostering better therapeutic strategies and precision medicine. It is widely used for target validation, drug repositioning, and understanding adverse drug events.
Key Datasets
This dataset offers comprehensive access to 25 different datasets, provided as JSON and other file formats, which can be seamlessly integrated into your analysis workflows. These datasets fall into the following categories:
Drug data: mechanism of action, indications, pharmacovigilance, and pharmacogenetics
Target-disease associations: curated data linking specific genes to diseases, helping researchers better understand disease pathways and mechanisms
Target, disease, and drug annotations: core annotations for molecular targets, diseases, and drugs
Molecular interactions: target interactions and supporting evidence
Expression and phenotypes: baseline expression, animal model phenotypes, and gene ontology
Pathways and essentiality: Reactome pathways and DepMap essentiality for targets
Significance in research and drug development:
This rich dataset opens numerous opportunities for researchers in a variety of fields. Below are a few use cases:
Identification of Drug Targets for Alzheimer’s Disease
Publication: “Genome-wide association study identifies new loci and functional pathways influencing Alzheimer’s disease risk” by Kunkle, B.W. et al. (2019) in Nature Genetics.
Summary: The researchers used the Open Targets dataset to integrate genetic association data with functional genomics, which helped them prioritize genes and pathways linked to Alzheimer’s disease. This approach led to the identification of potential therapeutic targets that could be further investigated for developing treatments for Alzheimer’s disease.
Understanding Genetic Basis of Inflammatory Bowel Disease (IBD)
Publication: “Genetic risk factors for inflammatory bowel disease” by de Lange, K.M. et al. (2017) in Nature Genetics.
Summary: This study leveraged the Open Targets dataset to identify and prioritize genetic variants associated with IBD. By linking genetic associations to specific genes and pathways, the researchers gained valuable insights into the mechanisms underlying the disease, which could inform the development of new therapeutic strategies.
Drug Repurposing for COVID-19
Publication: “Drug repurposing for COVID-19: a systematic review” by Zhou, Y. et al. (2020) in Nature Reviews Drug Discovery.
Summary: The researchers used the Open Targets dataset to analyze and prioritize drug targets for COVID-19. This analysis helped them identify existing drugs that could be repurposed for treating COVID-19, providing a list of potential candidates for clinical trials.
Availability
How to Access the Dataset on Azure
Please note:
We are enabling public access to all Genomics Data Lake containers. The existing “signed URLs” (shared access signatures) will be retired at 2024-11-04T00:00:00Z. After this time, the URLs without a query string will continue to work; however, the “signed URLs” will no longer work and will return a 403 HTTP status code. Please plan accordingly and access the public URLs without a query string after this date (remove the ‘?’ and trailing characters).
Accessing this dataset on Azure is straightforward, and the data can be integrated into a variety of Azure services for analysis and visualization. Here’s how you can get started:
Using AzCopy
Prerequisites:
AzCopy must be installed on your machine. Download AzCopy here.
Steps:
Get the SAS URL of the blob container or file you want to download. The URL can be found here.
Open your command line (e.g., Command Prompt, Terminal, or PowerShell).
Run the following command to download data from the Azure Blob storage:
azcopy copy "https://datasetopentargets.blob.core.windows.net/dataset/17.02/17.02_association_data.json.gz" "C:\Users\YourUser\Downloads"
This will copy the blob "17.02_association_data.json.gz" to your Downloads directory.
Using Python SDK
# Install the Azure Storage Blob library for Python:
pip install azure-storage-blob
# Import the necessary libraries.
from azure.storage.blob import BlobClient
import os
# Download the blob by specifying the SAS URL and the local file path.
sas_url = "https://datasetopentargets.blob.core.windows.net/dataset/17.02/17.02_association_data.json.gz?sv=2023-01-03&st=2024-10-24T21%3A20%3A22Z&se=2026-10-25T21%3A20%3A00Z&sr=c&sp=rl&sig=9EI4PbUvTkT%2F0jUCg5aNLP5CBlu1bUDsyK6TDFzZacw%3D"
local_path = "path/to/save/file"
# Create a BlobClient from the SAS URL.
blob_client = BlobClient.from_blob_url(sas_url)
# Download the blob content to a local file.
with open(local_path, "wb") as download_file:
    download_stream = blob_client.download_blob()
    download_file.write(download_stream.readall())
Run the script. The blob will be downloaded and saved to the specified location.
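Since public (anonymous) access is being enabled on the Genomics Data Lake containers (see the note above), you can also skip SAS URLs entirely. Below is a minimal sketch, assuming the container remains at the same public URL and that the association file is newline-delimited JSON with one record per line; both assumptions are worth verifying against the dataset documentation:

```python
import gzip
import json

from azure.storage.blob import ContainerClient

# Anonymous access to the public container - no SAS token or credential required.
container = ContainerClient(
    account_url="https://datasetopentargets.blob.core.windows.net",
    container_name="dataset",
)

# List blobs under the 17.02 release prefix.
for blob in container.list_blobs(name_starts_with="17.02/"):
    print(blob.name)

# Download the association file used in the examples above and preview a few records.
blob_name = "17.02/17.02_association_data.json.gz"
local_path = "17.02_association_data.json.gz"

with open(local_path, "wb") as f:
    f.write(container.download_blob(blob_name).readall())

with gzip.open(local_path, "rt", encoding="utf-8") as f:
    for i, line in enumerate(f):
        record = json.loads(line)
        # Field names here are illustrative; inspect a record to see the actual schema.
        print(record.get("target", {}).get("id"), record.get("disease", {}).get("id"))
        if i >= 4:
            break
```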
Using Azure Storage Explorer
Prerequisites:
Download and install Azure Storage Explorer.
Steps:
Open Azure Storage Explorer.
Connect to your Azure account by clicking “Add an Account”, or use “Connect to Azure Storage” to attach a blob container directly.
Choose “Use a shared access signature (SAS) URI” and paste the SAS URL for your blob container, or choose “Anonymously (my blob container allows public access)” (after 11/19/2024, since public access will be enabled on all dataset containers).
Navigate to the Blob Container in the left-hand panel where your data is stored.
Right-click on the blob or folder you want to download and select “Download”.
Select the destination folder on your local machine.
The blob data will be downloaded to the specified location
We encourage researchers to explore the Open Targets dataset to accelerate breakthroughs in target prioritisation!
Acknowledgements:
We would like to thank Annalisa Buniello, Manuel Bernal Llinares, Roberto LLeras, and Matt Mcloughlin for helping us make the data available on Azure, and Helena Cornu for help with the blog.
Microsoft Tech Community – Latest Blogs –Read More
New SharePoint Lists ‘Copy link’ to New Form – superfluous “New Item” text
Good Afternoon,
When I create a link to the New List Item Form:-
I get this superfluous “New Item” text:-
Note that I’ve already changed the JSON header txtContent to “Submit Your Improvement Idea”, otherwise it would state “New Item” twice.
Is there a way to remove or change the “New Item” text?
Calendar invite sender address?
I have a Microsoft account in the form <name> at <myowndomain.com>. If I send email in Outlook, the sender address is <name> at <myowndomain.com>, as expected. However, when I send a calendar invite, the invite shows the sender as outlook_<lotsofcharacters>@outlook.com, and it is impossible to respond to this address (it bounces back). How can I change the calendar invites to show <name> at <myowndomain.com> as the sender, like I can do in email?