Category: Microsoft
Firewall considerations for gMSA on Azure Kubernetes Service
This week I spent some time helping a customer with a gMSA environment in which they were running into issues deploying their app. The trouble started when they tried to figure out why a Kerberos ticket was not being issued for a Windows pod with gMSA configured on AKS. I decided to write this blog post to list some firewall considerations for the different scenarios in which security rules might block the authentication process.
gMSA and its moving parts
To use gMSA on AKS, you must understand that there are many moving parts in play. First, your Kubernetes cluster on AKS is composed of both Linux and Windows nodes. Your nodes will all be part of a virtual network, but only the Windows nodes will try to reach the Domain Controller (DC).
The DC itself might be in another virtual network, in the same virtual network, or even outside of Azure. Then you have Azure Key Vault (AKV), in which the secret (username and password) is securely stored. Your AKV should be accessible only to the appropriate Windows nodes, and to no one else.
The problem, though, comes when your Windows nodes on AKS and your DCs run on different networks or even different sites, and you need to open the proper ports between the Windows nodes and the Active Directory DC.
Ports to open for Active Directory and gMSA
We have had documentation on which ports to open for Active Directory for a while. That is relatively well known and can be leveraged here.
The thing to understand is that when using gMSA on AKS, not all of these ports need to be opened, and allowing unnecessary traffic might expose you to threats needlessly. For gMSA, no computer or user account is used interactively, so we can narrow it down to the following list:
- TCP and UDP 53: DNS
- TCP and UDP 88: Kerberos
- TCP 139: NetLogon
- TCP and UDP 389: LDAP
- TCP 636: LDAP SSL
Keep in mind that this list does not take into account any ports your application might need in order to query AD or perform other actions against the DC. You might need to check for those with the application owner.
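Before digging into firewall rules, it can help to confirm which of these ports are actually reachable from a Windows node (or a debug pod). The following is a minimal Python sketch of such a probe; the DC hostname is a placeholder, the port list mirrors the one above, and the script itself is illustrative rather than any official tooling. UDP is connectionless, so only the TCP side is probed here.

```python
import socket

# Ports required for gMSA authentication against AD (from the list above).
REQUIRED_TCP_PORTS = {
    53: "DNS",
    88: "Kerberos",
    139: "NetLogon",
    389: "LDAP",
    636: "LDAP SSL",
}

def probe_tcp(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def check_dc(host: str) -> dict:
    """Probe every required TCP port on the DC and report the results."""
    return {port: probe_tcp(host, port) for port in REQUIRED_TCP_PORTS}

if __name__ == "__main__":
    # "dc01.contoso.local" is a placeholder; use your DC's name or IP.
    for port, is_open in check_dc("dc01.contoso.local").items():
        label = REQUIRED_TCP_PORTS[port]
        print(f"{label} (TCP {port}): {'open' if is_open else 'BLOCKED'}")
```

A port showing as blocked here usually points at an NSG rule or an on-premises firewall between the node's subnet and the DC.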
Domain Controllers in Azure
You can mitigate a lot of firewall issues by simply adding one (or more) DCs to Azure as VMs. By doing that, you have two things playing in your favor:
You keep the authentication process within Azure. Your Windows pods and nodes don’t need to reach out to an on-premises environment, unless the DC(s) in Azure are down.
It is easier to reason about the ports to open between NSGs in Azure than about traffic between workloads on Azure and DCs on-premises.
On the other hand, you must consider that the DCs in Azure do need to replicate with the DCs on-premises. However, this is still a preferred scenario, because you know which machines the DCs are, versus workload machines that might scale out, or new workloads/clusters that might be added in the future. At the end of the day, the scope for opening ports is smaller, which minimizes exposure. Please refer to the documentation to understand the ports for AD replication as well.
Hopefully this will help you fix any issues you might be having with gMSA caused by blocked traffic. Keep in mind that the ports listed above might not be the full list of ports you need to open, but rather the minimal set of ports and traffic for proper authentication. As always, let us know in the comments what your thoughts are and whether you have a different scenario.
Microsoft Tech Community – Latest Blogs
ADX Continuous Export to Delta Table – Preview
We’re excited to announce that continuous export to Delta table is now available in Preview.
Continuous export in ADX allows you to export data from Kusto to an external table with a periodically run query. The results are stored in the external table, which defines the destination, such as Azure Blob Storage, and the schema of the exported data. This process guarantees that all records are exported “exactly once”, with some exceptions. Continuous export previously supported the CSV, TSV, JSON, and Parquet formats.
Starting today, you can continuously export to a delta table.
To define continuous export to a delta table:
Create an external delta table, as described in Create and alter delta external tables on Azure Storage.
(.create | .alter | .create-or-alter) external table TableName [(Schema)] kind = delta (StorageConnectionString) [with (Property [, …])]
Define continuous export to this table using the commands described in Create or alter continuous export.
.create-or-alter continuous-export continuousExportName [over (T1, T2)] to table externalTableName [with (propertyName = propertyValue [, …])] <| query
A few things to note:
If the schema of the delta table isn’t provided when defining the external table, Kusto will try to infer it automatically based on the delta table defined in the target storage container.
If the schema is provided when defining the external table and there is no delta table defined in the target storage container, continuous export will create one during the first export.
The schema of the delta table must be in sync with the continuous export query. If the underlying delta table changes, the export might start failing with unexpected behavior.
Delta table partitioning is not supported today.
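Putting the two commands together, a hypothetical end-to-end definition might look like the following. The table names, the storage URI, and the export interval are made up for illustration; replace them with your own.

```kusto
// Create an external delta table pointing at a storage container.
.create external table ExportedLogs (Timestamp: datetime, Level: string, Message: string)
kind = delta
(
    h@'https://mystorageaccount.blob.core.windows.net/exported-logs;<secret>'
)

// Continuously export matching records from the source table into it.
.create-or-alter continuous-export LogsExport
over (RawLogs)
to table ExportedLogs
with (intervalBetweenRuns = 5m)
<| RawLogs
| where Level != 'Debug'
| project Timestamp, Level, Message
```

Note that the query's output columns must stay in sync with the delta table schema, per the notes above.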
Read more: Continuous data export – Azure Data Explorer & Real-Time Analytics | Microsoft Learn
As always, we’d love to hear your feedback and comments.
CoPilot for genomics report interpretation!
Contributors – Manoj Kumar, Sam Bhatia, Kelly Johnson, Roberto Llaras, Luis Apellaniz Fontanez, Sean Hamill, Mary Burke, Anukool A, Christina Tillbrook
Imagine having a special friend who is a genomics expert, who can read your genetic profile and help you understand it in a simple, easy-to-understand way. Well, that’s exactly what we built during the recent Microsoft Hackathon in September 2024.
Picture this: You visit the doctor’s office, and they give you a bunch of papers filled with big, fancy words and numbers as part of your recent genomic analysis. Whoa, that can be confusing, right?
Physicians today use genomic reports to diagnose and treat a wide range of genetic conditions. These reports typically provide information about an individual’s genetic makeup, including any mutations or variations that may be associated with specific diseases or disease conditions.
Interpreting genomic reports can be complex and requires specialized training and expertise. Physicians typically work with genetic counselors and other specialists to help interpret and understand the information contained in these reports. In general, physicians will look for any genetic mutations or variations that are known to be associated with the patient’s symptoms or family history. They will also consider the patient’s overall health, medical history, and other factors when interpreting genomic reports.
The interpretation of genomic reports is constantly evolving as new research and discoveries are made in the field of genetics. As a result, physicians must stay up to date with the latest advances in genomics and work closely with other experts to ensure that they are providing the most accurate and effective care possible.
That’s where the CoPilot for genomic report interpretation jumps in to save the day! It’s like having a secret decoder that turns all those complex things into simple stories just for you.
What is a CoPilot?
A copilot is an intelligent virtual assistant that uses large language models to facilitate natural, human-like conversational interactions. It can assist users in various tasks and decision-making processes across multiple domains within an enterprise environment. By leveraging these models, AI copilots have the capability to understand, analyze, and process vast amounts of data. Microsoft, for instance, has rebranded all of its AI-powered companions as Copilot, adding specialized monikers to each of them. These AI copilots can provide support in areas such as coding, writing, learning, and more. They are designed to understand context, provide relevant suggestions, and even generate content, making them a valuable tool in many professional settings.
This Copilot takes the report or genomic impression from tertiary analysis as input and creates a narrative on the impression in the report, in layman’s terms, for non-experts to understand.
Watch the following video to understand how it actually works.
Following is the architecture for this hackathon project.
Health copilot was a Power Apps application that different personas interacted with. The reports were stored in Azure Storage, and their references were kept in a SharePoint list. Power Automate was used to connect to Azure Form Recognizer to convert the PDFs to text. This text was then sent to the Azure OpenAI GPT-3.5 model with a specific prompt to generate the response. Text Analytics for health was also used to highlight medical terms that were part of the report for easier reading.
The following prompt was sent to the large language model in order to produce a consistent and understandable summary report at a 5th-grade reading level.
#### **Introduction**
Please create a report that will help patients with their
diagnosis based on the lab and genomics results. Use simple language to explain
the findings and what they mean. Be sure to explain any scientific terms you
use in a glossary at the end of the report.
#### **Section Breakdown**
1. **Your Test**
– Describe the type of test carried out in a manner easy
for a patient to understand. Mention that a small sample of their blood was
taken for the test.
2. **What We Found**
– Summarize the findings, including lab parameters seen
in the report if any. Briefly explain what was looked at in the genes without
using technical jargon.
3. **What It Means**
– Break down what the findings mean in simple terms,
focusing on mapping lab results to what was found in the genetic tests.
4. **Your Health Team’s Advice**
– Provide advice on potential lifestyle changes like diet
and physical activity in a patient-friendly way, encouraging healthy habits
rather than instilling fear.
5. **Questions You Might Have**
– Anticipate and answer a couple of basic questions a
patient might have, framing it as a collaborative effort to maintain health.
6. **Glossary**
– Define any technical terms used in the report in
simple, easy-to-understand language.
#### **Guidance on Tone**
– Maintain a friendly, reassuring tone throughout the
report.
– Avoid using scary words or phrases that might cause
anxiety.
– Use encouraging language that promotes a positive
outlook and emphasizes the proactive steps the patient can take to maintain
their health.
#### **Instructions**
1. Start with a friendly greeting to make the patient
feel seen and respected.
2. For each section, provide clear, concise information
without overwhelming the reader with too much detail.
3. Use analogies or comparisons where possible to explain
complex concepts in a way that a patient can understand.
4. Include colorful illustrations or diagrams to help
explain the concepts visually (if possible).
5. Review the report to ensure it is free of medical
jargon, and replace any complex terms with simpler alternatives.
6. Conclude with a positive note, reassuring the patient
that their health team is there to support them.
#### **Review**
After drafting the report, please review it to ensure:
– The language is appropriate for a 5th-grade reading
level.
– The report conveys all necessary information without
being too detailed or technical.
– It maintains a friendly, positive tone throughout.
Thank you for helping to create a report that respects
and empowers our young patients by providing them with the information they
need in a way they can understand!
—
[Start of text]
Text extracted from complex genomics reports
[End of text]
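As a sketch of how the extracted report text and the prompt above might be combined into a chat request, here is a minimal Python example. The hackathon project used Power Automate rather than a script, so the function and variable names here are hypothetical stand-ins, and the prompt constant is truncated for brevity.

```python
# Hypothetical sketch: wrap the instruction prompt and the extracted report
# text into a chat-completions message list. PROMPT stands in for the full
# prompt reproduced above (truncated here).
PROMPT = "Please create a report that will help patients ..."

def build_messages(report_text: str) -> list:
    """Combine the instruction prompt with the text extracted from the PDF."""
    return [
        {"role": "system", "content": PROMPT},
        {"role": "user", "content": f"[Start of text]\n{report_text}\n[End of text]"},
    ]
```

The resulting list is what would be posted to an Azure OpenAI chat-completions deployment of GPT-3.5 to generate the patient-friendly narrative.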
Sample output from the model using the above prompt.
Dear Sarah Taylor,
We hope this report finds you well. We have conducted a test to help diagnose your condition and provide you with valuable information. Below, we have summarized the findings and explained what they mean in simple terms. If you have any questions, please don’t hesitate to reach out to your health team.
Section 1: Your Test
We conducted a test called FoundationOne®Liquid, which involved taking a small sample of your blood. This test uses next-generation sequencing to identify genomic alterations related to your condition.
Section 2: What We Found
Based on the test results, we found some biomarker and genomic findings related to your diabetes type 1. Autoantibodies to other antigens have been reported, but they occur infrequently and are not used for prediction. We also found some genomic alterations, including HLA-DR3 & HLA-DR4 combined, CTLA4 +49G>A and CT60, and IL2RA rs2104286, rs12722489, and rs11594656. These variations were found in a certain percentage of the population.
Section 3: What It Means
The presence of HLA-DR3 and HLA-DR4 alleles is associated with an increased risk of type 1 diabetes. Treatment strategies for type 1 diabetes typically involve insulin therapy, diet management, regular exercise, and monitoring of blood glucose levels. Immunomodulatory agents can also be used to prevent or delay the onset of type 1 diabetes.
For patients with CTLA4 alterations, immunomodulatory agents can be used to prevent or delay the onset of type 1 diabetes. These agents work by modulating the immune system to prevent the destruction of insulin-producing beta cells in the pancreas. Other potential treatment strategies include standard therapies for type 1 diabetes, such as insulin therapy, diet management, regular exercise and monitoring of blood glucose levels.
Patients with IL2RA alterations can also benefit from immunomodulatory agents to prevent or delay the onset of type 1 diabetes. These agents work by modulating the immune system to prevent the destruction of insulin-producing beta cells in the pancreas.
Section 4: Your Health Team’s Advice
We advise you to work closely with your healthcare provider to develop a personalized plan for managing your risk of developing type 1 diabetes. These include insulin therapy, diet management, regular exercise, and monitoring your blood glucose levels. These proactive steps can contribute to better overall health and help you maintain stable blood sugar levels. Your healthcare provider can also discuss the potential use of immunomodulatory agents and other treatment options based on your specific situation. Remember, your health team is here to support you every step of the way.
Section 5: Questions You Might Have
You may have some questions about your diagnosis and the recommended lifestyle changes. We anticipate a few common questions you might have:
Q: What lifestyle changes can I make to manage my risk of developing type 1 diabetes?
A: Maintaining a healthy diet, engaging in regular physical activity, and monitoring your blood glucose levels can help manage your risk of developing type 1 diabetes. It’s important to work with your healthcare provider to develop a personalized plan that suits your needs.
Q: Are there any experimental treatments available for type 1 diabetes?
A: Cell transplantation is an experimental treatment that involves transplanting insulin-producing cells from a donor pancreas into a patient with type 1 diabetes. However, this procedure is not widely available and should be discussed with your healthcare provider.
Q: How can I manage my diet effectively?
A: Your healthcare team can provide you with a personalized diet plan that takes into account your specific needs and preferences. They can guide you on portion sizes, food choices, and meal timings to help you maintain stable blood sugar levels.
Q: What types of physical activity are recommended?
A: Engaging in regular physical activity, such as walking, swimming, or cycling, can have a positive impact on your overall health and help manage diabetes. Your healthcare team can provide guidance on the appropriate level of activity for you and any precautions you should take.
Section 6: Glossary
To ensure you understand all the terms used in this report, we have included a glossary at the end. It provides simple, easy-to-understand definitions of any technical terms used.
– Autoantibodies: Antibodies produced by the immune system that mistakenly target and attack the body’s own cells or tissues.
– Genomic alterations: Changes or variations in the DNA sequence that can affect the function of genes.
– HLA-DR3 & HLA-DR4: Specific genetic variations associated with an increased risk of type 1 diabetes.
– CTLA4: A gene involved in regulating the immune response.
– IL2RA: A gene that encodes a protein involved in the immune response.
– Immunomodulatory agents: Medications that modify or regulate the immune system’s response.
– Insulin therapy: Treatment involving the use of insulin to manage blood glucose levels in individuals with diabetes.
– Diet management: Making healthy food choices and following a balanced diet to maintain blood glucose levels.
– Regular exercise: Engaging in physical activity to improve overall health and manage blood glucose levels.
– Monitoring of blood glucose levels: Regularly checking blood glucose levels to ensure they are within a healthy range.
We hope this report helps you better understand your diagnosis and the steps you can take to manage your health. Remember, your health team is always available to answer any further questions you may have. Stay positive and take proactive steps towards a healthier lifestyle.
Best regards,
[Your Name]
Semantic Kernel introduces SQLPlugin and UploadPlugin for efficient data insights.
Semantic Kernel Bot in-a-box
The era of information has made it incredibly valuable to interact with and gain insights from data. With the emergence of advanced AI technologies, integrating natural language processing with data has opened up new possibilities for data interaction. Semantic Kernel is a technology that combines artificial intelligence with software development to bring together large language models and software functionality. It is designed to show how AI can work together with human expertise and technical knowledge to create a solution where we can use intelligent tools that blend seamlessly into our digital world.
In this blog post, we are thrilled to discuss two new features for Semantic Kernel’s AI-powered assistant – SQLPlugin and UploadPlugin. SQLPlugin uses SQL to extract insights that can transform the way professionals interact with data, while UploadPlugin lets users upload documents and retrieve knowledge. These cutting-edge advancements are transforming how businesses operate, making them more efficient and productive.
1. Architecture
The process for message flow is as follows:
End-users connect to a messaging channel where your bot is published, for example, Web, a Power BI dashboard, or Teams.
Messages are processed through Azure Bot Services, which communicates with a .NET application running on App Services.
The .NET application has a Semantic Kernel Stepwise Planner at its core, which elaborates a series of steps to process the user’s request and then executes it.
Each plan step is formulated through Azure OpenAI and then executed against Cognitive Search (traditional RAG pattern) or Azure SQL (structured data RAG).
Cognitive search simplifies data retrieval from multiple cloud sources by ingesting and indexing the data, while Azure SQL contains customer data from the AdventureWorksLT sample.
Azure OpenAI decides which data source each question gets routed to. Questions may also span multiple data sources.
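The routing decision described above can be illustrated with a deliberately simplified sketch. The real accelerator relies on Semantic Kernel's Stepwise Planner in .NET, where Azure OpenAI chooses the plugin for each step; the keyword rules and names below are hypothetical Python stand-ins for that LLM-driven choice, kept naive on purpose so the flow is easy to follow.

```python
# Naive illustration of routing a question to a data source. In the real
# bot, Azure OpenAI (via the Stepwise Planner) makes this choice.
STRUCTURED_HINTS = ("sales", "customers", "products", "price", "total")

def route_question(question: str) -> set:
    """Return the set of data sources a question should be sent to."""
    sources = set()
    q = question.lower()
    if any(hint in q for hint in STRUCTURED_HINTS):
        sources.add("azure-sql")          # structured data RAG
    if not sources or "document" in q or "policy" in q:
        sources.add("cognitive-search")   # traditional RAG over documents
    return sources
```

A question can land in both sets, which matches the note above that questions may span multiple data sources.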
2. Pre-requisites
For running locally:
Install .NET;
Install Bot Framework Emulator;
For deploying to Azure:
Install Azure CLI
Install Azure Developer CLI
Log into your Azure subscription
azd auth login
3. Recommended deployment parameters
For this tutorial, you will deploy Document Intelligence, SQL, and Cognitive Search resources using the Semantic Kernel Bot-in-a-box accelerator. This will ensure a seamless and efficient setup process with optimal deployment parameters.
Azure_location = “East US 2”
gptModel = “gpt-4”
gptVersion = “1106-preview”
deployDocIntel = true
deploySql = true
deploySearch = true
publicNetworkAccess = true
4. Deploy to Azure
Clone this repository locally:
git clone https://github.com/Azure/AI-in-a-Box
Then, navigate to the semantic-kernel-bot-in-a-box folder (under gen-ai in the repository).
Deploy resources:
Run the following command in your terminal:
azd up
Add the parameters following the recommended values above to enable the SQL assistant and document upload for retrieval-augmented generation.
5. Use Case 1: SQL-powered GenAI assistant.
Consider the task of extracting the top 10 products with their prices and categories. With the SQL Assistant plugin, the analyst can do this easily: the GPT-4 model interprets the request and translates it into SQL, providing a structured representation of the data sourced directly from the database.
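To make that concrete, here is a sketch of the kind of SQL such a request translates to, run against a toy two-table schema in sqlite3. The real accelerator queries the AdventureWorksLT sample on Azure SQL; the table names, column names, and sample rows below are simplified stand-ins, not the actual AdventureWorksLT schema.

```python
import sqlite3

# Toy stand-in for AdventureWorksLT's product/category tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE category (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE product  (id INTEGER PRIMARY KEY, name TEXT,
                           price REAL, category_id INTEGER REFERENCES category(id));
    INSERT INTO category VALUES (1, 'Bikes'), (2, 'Helmets');
    INSERT INTO product VALUES
        (1, 'Road-150', 3578.27, 1),
        (2, 'Sport-100', 34.99, 2),
        (3, 'Touring-1000', 2384.07, 1);
""")

# The kind of query the model might generate for "top 10 products,
# their prices, and the categories they belong to".
TOP_PRODUCTS_SQL = """
    SELECT p.name, p.price, c.name AS category
    FROM product AS p
    JOIN category AS c ON c.id = p.category_id
    ORDER BY p.price DESC
    LIMIT 10;
"""

rows = conn.execute(TOP_PRODUCTS_SQL).fetchall()
for name, price, category in rows:
    print(f"{name:<14} {price:>8.2f}  {category}")
```

The bot then renders the result set back to the user as a structured answer rather than raw SQL output.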
Select your Azure Bot resource from the menu on the Azure portal for this first use case.
Test the bot by asking some sample questions:
Can you provide the top 10 list of all products, their prices, and the categories they belong to?
How much in total sales do we have?
What are my biggest customers?
What are my top-selling products?
6. Use Case 2: AI Search retrieval-augmented generation.
Envision a scenario where users effortlessly extract valuable information from uploaded documents. Integrated with GPT-4, a document processing system adeptly analyzes diverse document types, including PDFs. Simultaneously, the system collaborates with HR processes, extracting insights from employee data for strategic workforce planning. This seamless integration enhances efficiency and reduces manual effort in extracting information from various document formats.
To retrieve the information, kindly select the document you wish to use.
After uploading, you can enter your prompt by typing it in.
Sample prompt :
How does HRM contribute to achieving an organization’s strategic goals through its functional objectives?
7. Embed the web chat.
In the context of communication applications, a channel refers to a means of establishing a connection between a bot and its users. By registering with Azure, a bot can leverage channels to facilitate effective communication with its users.
To configure a bot to connect to a channel, complete the following steps:
Select the bot that you want to configure.
In the left pane, select Channels under Settings.
Select the channel icon you want to add to your bot in the right pane. You may need to scroll down to see the list of all Available Channels.
Embed the web chat control on a web page.
Under Settings, select Channels. Then select Web Chat.
The Web Chat page will open. Select the Default Site from the list of Sites.
Copy the first Secret key and the Embed code.
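Rather than shipping the Secret key to browsers, it is generally safer to exchange it server-side for a short-lived token and hand only the token to the page. A minimal Python sketch of that exchange is below; it assumes the Direct Line channel is enabled and uses the standard Direct Line v3 token-generation endpoint, while the function names and error handling are illustrative.

```python
import json
import urllib.request

# Standard Direct Line v3 token-generation endpoint.
TOKEN_URL = "https://directline.botframework.com/v3/directline/tokens/generate"

def token_request(secret: str) -> urllib.request.Request:
    """Build the POST request that exchanges the channel secret for a token."""
    return urllib.request.Request(
        TOKEN_URL,
        method="POST",
        headers={"Authorization": f"Bearer {secret}"},
    )

def fetch_webchat_token(secret: str) -> str:
    """Perform the exchange server-side and return the short-lived token."""
    with urllib.request.urlopen(token_request(secret), timeout=10) as resp:
        return json.loads(resp.read())["token"]
```

The returned token can then be used in the embed code in place of the secret, limiting what a page visitor can do with it.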
8. Integrate the Bot in a Power BI Dashboard
To enrich your reports and dashboards, you can easily incorporate HTML content by either manually entering the embedded code or copying and pasting it from popular websites like YouTube, Twitter, and embed.ly. This feature provides a convenient way to enhance your reports and dashboards’ visual appeal and functionality.
From the Add tile window, select Web content > Next.
Add information to the Add web content tile window:
To display a title above the tile, select Display title and subtitle and enter a Title and an optional Subtitle.
Enter the embed code. In this example, we’re copying and pasting a Twitter feed.
Select Apply.
On the dashboard, resize and move the web content tile as needed.
Congratulations on exploring the Semantic Kernel Bot in-a-box, the innovative solution revolutionizing data interaction. This advanced platform features the SQL-Powered GenAI Assistant and the Upload document Plugin, which can easily manage complex analyses and document processing. Azure’s infrastructure ensures a secure and scalable environment that’s easy to set up. With Power BI dashboards, you’ll get real-time insights and analytics. The Semantic Kernel Bot in-a-box makes data interaction more user-friendly and accessible, and it’s a great example of how AI can transform how we work with data.
9. Resources
Create AI agents with Semantic Kernel | Microsoft Learn
How to quickly start with Semantic Kernel | Microsoft Learn
Azure OpenAI Service models – Azure OpenAI | Microsoft Learn
AI-in-a-Box/gen-ai/semantic-kernel-bot-in-a-box at main · Azure/AI-in-a-Box (github.com)
ADX Continuous Export to Delta Table – Preview
We’re excited to announce that continuous export to Delta table is now available in Preview.
Continuous export in ADX allows you to export data from Kusto to an external table with a periodically run query. The results are stored in the external table, which defines the destination, such as Azure Blob Storage, and the schema of the exported data. This process guarantees that all records are exported “exactly once”, with some exceptions. Continous export previously supported CSV, TSV, JSON and Parquet formats.
Starting today, you can continuously export to a delta table.
To define continuous export to a delta table:
Create an external delta table, as described in Create and alter delta external tables on Azure Storage.
(.create | .alter | .create-or-alter) external table TableName [(Schema)] kind = delta (StorageConnectionString ) [with (Property [, …])]
Define continuous export to this table using the commands described in Create or alter continuous export.
.create-or-alter continuous-export continuousExportName [over (T1, T2 )] to table externalTableName [with (propertyName = propertyValue [, …])] <| query
Few things to note:
If the schema of delta table while defining the external table isn’t provided, Kusto will try to infer it automatically based on the delta table defined in the target storage container.
If the schema of delta table while deining the external table is provided and there is no delta table defined in the target storage container, continous export will create a delta table during the first export.
The schema of the delta table must be in sync with the continuous export query. If the underlying delta table changes, the export might start failing with unexpected behavior.
Delta table partitioning is not supported today.
Read more : Continuous data export – Azure Data Explorer & Real-Time Analytics | Microsoft Learn
As always, we’d love to hear your feedback and comments.
Microsoft Tech Community – Latest Blogs –Read More
AI for Developers
The era of AI is here, and today’s developer needs the skills and tools to build intelligent apps. This month, we’re exploring resources to help developers modernize their applications and get started with AI. Join a Hack Together event, complete a Cloud Skills Challenge, work through guided tutorials, and register for upcoming events. These resources will help you build intelligent chat apps, extend Microsoft Copilot or create a custom copilot, learn about Microsoft Fabric, and much more.
Cloud Skills Challenge: Build Intelligent Apps
Join a Cloud Skills Challenge to compete against peers, showcase your talents, and learn new skills. Combine AI, cloud-scaled data, and cloud-native app development to create intelligent apps. Join a challenge today.
Hack Together: The AI Chat App Hack
It’s not too late to join the AI Chat App Hack! This Hack Together event (January 29 – February 12) offers a playground for experimenting with RAG chat apps and a chance to learn from Microsoft experts.
Azure Cosmos DB Conf Call for Proposals
Want to give a presentation at the Azure Cosmos DB Conference 2024? Submit proposals for presentations on AI integration, innovative use cases, and other topics emphasizing practical insights and hands-on experiences. Submit by February 15, 2024.
Hack Together: The Microsoft Fabric Global AI Hack
Join the Microsoft Fabric Global AI Hack February 19 – March 1 for hands-on learning and find out why Microsoft Fabric is the data platform of choice for AI.
Official Collection: Learn how to build intelligent apps with .NET
Explore a collection of Microsoft Learn modules, videos, and samples on GitHub that will help you build intelligent apps with .NET.
Microsoft Fabric Community Conference
Register for the first annual Microsoft Fabric Community Conference—a live, in-person event taking place March 26 – 28 in Las Vegas. Immerse yourself in data and AI, get hands-on experience with the latest technologies, and connect with other experts.
Playwright Testing and GitHub Actions tutorial: How to run Playwright tests on every code commit
Set up continuous, end-to-end testing for your web apps with Microsoft Playwright and GitHub Actions. Watch this tutorial to see how you can run tests on every code commit and validate that your app works across different browsers and operating systems.
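A commit-triggered setup like the one in the tutorial can be sketched as a GitHub Actions workflow. This is a minimal illustration assuming a Node.js project with Playwright already configured; the file path and commands are assumptions, not taken from the tutorial:

```yaml
# .github/workflows/playwright.yml (illustrative)
name: Playwright Tests
on: [push, pull_request]   # run the suite on every code commit
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci                              # install project dependencies
      - run: npx playwright install --with-deps  # install browsers and OS deps
      - run: npx playwright test                 # run the end-to-end tests
```

Because the workflow triggers on both `push` and `pull_request`, every commit gets validated across whichever browsers the Playwright config lists.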
The future of collaboration and AI
Build the next era of AI apps with the Teams AI Library, now generally available. Combined with Azure OpenAI Service, you have everything you need to build your own AI apps and copilots. Learn more about extending your app to the Copilot ecosystem.
Azure Cosmos DB Conf 2024
Sign up for Azure Cosmos DB Conf, a free virtual developer event. Tune into the live show on April 16 to learn why Azure Cosmos DB is the leading database for AI and modern app development. Then explore more sessions on demand.
POSETTE Call for Presentations
Every great event starts with great speakers. Do you have Postgres tips, tricks, stories, or expertise to share? Submit your presentation proposals to be considered for POSETTE (formerly Citus Con), a free, virtual developer event organized by the Postgres team at Microsoft.
Build and modernize AI apps with new solution accelerators
Build intelligent apps on Azure with new tools that bring top use cases to life. Explore demos, GitHub repos, and Hackathon content to help you get started building AI-powered apps, such as a copilot using your own data.
New Azure AI Advantage offer
There’s a new Azure AI Advantage offer that lets Azure AI and GitHub Copilot customers save when using Azure Cosmos DB.
Build a production RAG chat using Azure AI Studio and Prompt Flow
Learn how to build a production-level RAG app for a customer support agent – and integrate it with your web-based product catalog. Streamline your end-to-end app development from prompt engineering to LLMOps with prompt flow in Azure AI Studio.
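The RAG pattern itself is simple to state, whatever the SDK: retrieve the passages most relevant to a question, then ground the model's prompt in them. Below is a minimal, dependency-free sketch; a toy word-overlap score stands in for a real vector index, and the catalog entries are made up:

```python
import re

# Toy product catalog; a real app would pull passages from a search index.
CATALOG = {
    "tent-001": "Four-person camping tent, waterproof, easy setup.",
    "boot-007": "Hiking boots with ankle support and grippy soles.",
    "lamp-003": "Rechargeable LED camping lantern, 300 lumens.",
}

def _tokens(text: str) -> set[str]:
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(query: str, docs: dict[str, str], k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query (stand-in for vector search)."""
    q = _tokens(query)
    return sorted(docs, key=lambda d: -len(q & _tokens(docs[d])))[:k]

def build_prompt(query: str, docs: dict[str, str]) -> str:
    """Assemble a prompt grounded in the retrieved passages."""
    context = "\n".join(f"- {docs[d]}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("waterproof tent for camping", CATALOG))
```

A production app swaps the retrieval step for an embeddings-based search and sends the assembled prompt to an LLM, but the grounding logic stays the same shape.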
Train a machine learning model and debug it with the Responsible AI dashboard
Ready to build a machine learning model or integrate one into your app? Learn how to debug your model to assess it for Responsible AI practices using the Azure Responsible AI Dashboard.
How to Convert Audio to .WAV for Speech Service Using MoviePy
Azure Speech Service requires audio files to adhere to specific standards. Find out how to use MoviePy to easily convert your audio files to make them compatible with Azure Speech Service.
Build it with AI video series
Ready to get started with AI? Check out the Build it with AI video series from Microsoft Reactor. Deepen your engagement, grow your AI-driven solutions, and start building your business on AI technology.
How to build a custom copilot using Azure AI Studio and Microsoft Copilot Studio
Want to build your own copilot? Explore options in the Microsoft ecosystem for building a copilot. This blog post looks into low-code tools and out-of-the-box features. A follow-up post will focus on code-heavy and extensible options.
Build an AI Powered Image App
Use AI image technologies to build and deploy an AI-powered image web app. A new Microsoft Learn challenge module steps you through a bite-sized project to give you a taste of the latest tools.
Microsoft JDConf 2024
Get ready for JDConf 2024—a free virtual event for Java developers. Explore the latest in tooling, architecture, cloud integration, frameworks, and AI. It all happens online March 27-28. Learn more and register now.
Step-by-step guide: Build a recommender full stack app using OpenAI and Azure SQL
Check out this step-by-step guide for creating an intelligent web app with Azure OpenAI Service. This blog post shows you how to build a full-stack recommender app with OpenAI and Azure SQL.
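At the heart of such a recommender is embedding similarity: each item is represented as a vector, and the vectors closest to the query win. The sketch below shows only that ranking step, with tiny hand-made vectors standing in for real embeddings; all names and numbers are illustrative, not from the guide:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy 3-dimensional "embeddings"; a real app would generate these with an
# embeddings model and store them alongside the product rows in the database.
PRODUCTS = {
    "mountain bike": [0.9, 0.1, 0.2],
    "road bike": [0.8, 0.2, 0.1],
    "coffee maker": [0.1, 0.9, 0.3],
}

def recommend(query_vec: list[float], products: dict[str, list[float]], k: int = 2) -> list[str]:
    """Return the k products whose vectors are closest to the query vector."""
    return sorted(products, key=lambda p: cosine(query_vec, products[p]), reverse=True)[:k]

print(recommend([0.85, 0.15, 0.15], PRODUCTS))
```

Swapping the toy vectors for model-generated embeddings (and the dict for a database query) turns this into the core of the full-stack app the post describes.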
Official collection: AI Kick-off Projects
Put your AI skills to the test and start building innovative solutions. This collection of AI Challenge Projects provides modules that will teach you how to build various intelligent solutions, such as a minigame and a speech translator.
Register now: Microsoft Fabric Community Conference
Join us at the first ever Microsoft Fabric Community Conference—a live, in-person event. Discover how Microsoft data and AI services accelerate innovation and prepare you for the era of AI. Use discount code MSCUST to save $100.
ICYMI | Microsoft 365 Blog: Introducing the new Microsoft 365 Document Collaboration Partner Program
If you’re an independent software vendor (ISV) that provides a cloud communication and collaboration platform, you may want to offer customers a collaboration experience inside and outside meetings. That’s why we are excited to introduce the Microsoft 365 Document Collaboration Partner Program (MDCPP), a new opportunity for eligible platform providers to integrate Microsoft 365 apps into their platforms. Whether it’s a presentation, a spreadsheet, or a document, the program can enable users to share, edit, and coauthor, without switching between apps or losing context.
Continue reading in our Microsoft 365 blog