Category: Microsoft
How to Fix “QuickBooks Desktop has Stopped Working” After the Latest Updates?
I recently updated my QuickBooks Desktop software, but now it keeps crashing with a ‘QuickBooks Desktop has Stopped Working’ message. I’m unable to access my financial data or complete any tasks. What could be causing this issue, and how can I resolve it quickly?
Teams Adds Slash Commands to the Message Compose Box
Teams has added the ability to use slash commands (shortcuts) to the message compose box. Although the feature seems useful, I wonder about its potential usage. The fact is that people are pretty accustomed to how they compose message text and other options are available to add Loop or code blocks or set their online status, so why would they use the slash commands in the message compose box?
https://office365itpros.com/2024/05/16/teams-slash-commands/
Lesson Learned #488: A severe error occurred on the current command. Operation cancelled by user.
Today, I worked on a service request where our customer got this error message: “A severe error occurred on the current command. The results, if any, should be discarded. Operation cancelled by user.” This cancellation happens before the CommandTimeout duration is met in the SQL Client application; it normally occurs in asynchronous database operations when the CancellationToken timeout is reached. Below I would like to share my lessons learned.
The customer application was running asynchronous database operations, and two primary types of cancellation can occur:
A CommandTimeout cancellation typically indicates that the query is taking longer than expected, possibly due to database performance issues or query complexity. On the other hand, a cancellation triggered by a CancellationToken may be due to application logic deciding to abort the operation, often in response to user actions or to maintain application responsiveness.
Error Handling and Connection Resilience:
Errors during query execution, such as syntax errors or references to non-existent database objects, necessitate immediate attention and are not suitable for retry logic. The application must distinguish these errors from transient faults, where retry logic with exponential backoff can be beneficial. Moreover, connection resilience is paramount, and implementing a retry mechanism for establishing database connections ensures that transient network issues do not disrupt application functionality.
Measuring Query Execution Time:
Gauging the execution time of queries is instrumental in identifying performance bottlenecks and optimizing database interactions. The example code demonstrates using a Stopwatch to measure and log the duration of query execution, providing valuable insights for performance tuning.
Adaptive Timeout Strategy:
The code snippet illustrates an adaptive approach to handling query cancellations due to timeouts. By dynamically adjusting the CommandTimeout and CancellationToken timeout values upon encountering a timeout-related cancellation, the application attempts to afford the query additional time to complete in subsequent retries, where feasible.
Tests and Results:
I conducted a series of tests to understand the behavior under different scenarios and the corresponding exceptions thrown by the .NET application. Here are the findings:
Cancellation Prior to Query Execution:
Scenario: The cancellation occurs before the query gets a chance to execute, potentially due to reasons such as application overload or a preemptive cancellation policy.
Exception Thrown: TaskCanceledException
Internal Error Message: “A task was canceled.”
Explanation: This exception is thrown when the operation is canceled through a CancellationToken, indicating that the asynchronous task was canceled before it could begin executing the SQL command. It reflects the application’s decision to abort the operation, often to maintain responsiveness or manage workload.
Cancellation Due to CommandTimeout:
Scenario: The cancellation is triggered by reaching the CommandTimeout of SqlCommand, indicating that the query’s execution duration exceeded the specified timeout limit.
Exception Thrown: SqlException with an error number of -2
Internal Error Message: “Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding.”
Explanation: This exception occurs when the query execution time surpasses the CommandTimeout value, prompting SQL Server to halt the operation. It suggests that the query may be too complex, the server is under heavy load, or there are network latency issues.
Cancellation Before CommandTimeout is Reached:
Scenario: The cancellation happens before the CommandTimeout duration is met, not due to the CommandTimeout setting but possibly due to an explicit cancellation request or an unforeseen severe error during execution.
Exception Thrown: General Exception (or a more specific exception depending on the context)
Internal Error Message: “A severe error occurred on the current command. The results, if any, should be discarded. Operation cancelled by user.”
Explanation: This exception indicates an abrupt termination of the command, potentially due to an external cancellation signal or a critical error that necessitates aborting the command. Unlike the TaskCanceledException, this may not always originate from a CancellationToken and can indicate more severe issues with the command or the connection.
using System;
using System.Diagnostics;
using System.Data;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

namespace CancellationToken
{
    class Program
    {
        private static string ConnectionString = "Server=tcp:servername.database.windows.net,1433;User Id=username;Password=pwd!;Initial Catalog=dbname;Persist Security Info=False;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;Pooling=true;Max Pool size=100;Min Pool Size=1;ConnectRetryCount=3;ConnectRetryInterval=10;Application Name=ConnTest";
        private static string Query = "waitfor delay '00:00:20'";

        static async Task Main(string[] args)
        {
            SqlConnection connection = await EstablishConnectionWithRetriesAsync(3, 2000);
            if (connection == null)
            {
                Console.WriteLine("Failed to establish a database connection.");
                return;
            }
            await ExecuteQueryWithRetriesAsync(connection, 5, 1000, 10000, 15);
            connection.Close();
        }

        private static async Task<SqlConnection> EstablishConnectionWithRetriesAsync(int maxRetries, int initialDelay)
        {
            SqlConnection connection = null;
            int retryDelay = initialDelay;
            for (int attempt = 1; attempt <= maxRetries; attempt++)
            {
                try
                {
                    connection = new SqlConnection(ConnectionString);
                    await connection.OpenAsync();
                    Console.WriteLine("Connection established successfully.");
                    return connection;
                }
                catch (SqlException ex)
                {
                    Console.WriteLine($"Failed to establish connection: {ex.Message}. Attempt {attempt} of {maxRetries}.");
                    if (attempt == maxRetries)
                    {
                        Console.WriteLine("Maximum number of connection attempts reached. The application will terminate.");
                        return null;
                    }
                    Console.WriteLine($"Waiting {retryDelay / 1000} seconds before the next connection attempt...");
                    await Task.Delay(retryDelay);
                    retryDelay *= 2; // Exponential backoff between connection attempts
                }
            }
            return null;
        }

        private static async Task ExecuteQueryWithRetriesAsync(SqlConnection connection, int maxRetries, int initialDelay, int cancellationTokenTimeout, int commandSqlTimeout)
        {
            int retryDelay = initialDelay;
            for (int attempt = 1; attempt <= maxRetries; attempt++)
            {
                using (var cts = new CancellationTokenSource())
                {
                    // Adaptive timeout: scale the CancellationToken timeout (milliseconds) with each attempt
                    cts.CancelAfter(cancellationTokenTimeout * attempt);
                    try
                    {
                        using (SqlCommand command = new SqlCommand(Query, connection))
                        {
                            // Adaptive timeout: scale the CommandTimeout (seconds) with each attempt
                            command.CommandTimeout = commandSqlTimeout * attempt;
                            Stopwatch stopwatch = Stopwatch.StartNew();
                            await command.ExecuteNonQueryAsync(cts.Token);
                            stopwatch.Stop();
                            Console.WriteLine($"Query executed successfully in {stopwatch.ElapsedMilliseconds} milliseconds.");
                            return;
                        }
                    }
                    catch (TaskCanceledException)
                    {
                        Console.WriteLine($"Query execution was canceled by the CancellationToken. Attempt {attempt} of {maxRetries}.");
                    }
                    catch (SqlException ex) when (ex.Number == -2)
                    {
                        Console.WriteLine($"Query execution was canceled due to CommandTimeout. Attempt {attempt} of {maxRetries}.");
                    }
                    catch (SqlException ex) when (ex.Number == 207 || ex.Number == 208 || ex.Number == 2627)
                    {
                        // Non-transient errors (invalid column, invalid object, PK violation): do not retry
                        Console.WriteLine($"SQL error preventing retries: {ex.Message}");
                        return;
                    }
                    catch (Exception ex)
                    {
                        Console.WriteLine($"An exception occurred: {ex.Message}");
                        return;
                    }
                    Console.WriteLine($"Waiting {retryDelay / 1000} seconds before the next query attempt...");
                    await Task.Delay(retryDelay);
                    retryDelay *= 2;
                }
            }
        }
    }
}
Disclaimer:
The example code provided in this article is intended for educational and informational purposes only. It is strongly recommended to test this code in a controlled and secure environment before implementing it in any production system. The user assumes full responsibility for any risks, damages, or losses incurred by using the code. The author and the blog are not liable for any issues that may arise from the use or misuse of the provided code.
Microsoft Tech Community – Latest Blogs
How to enable parental controls in Windows 11?
Does Windows 11 have a native parental control feature, or do I have to download a third-party app for this?
How to Fix an Unrecoverable Error in QuickBooks Desktop
I’m facing an unrecoverable error in QuickBooks Desktop, disrupting my workflow. It’s frustrating not being able to access crucial financial data. How can I troubleshoot this issue effectively?
Conditional Access Policy – Register security information
Hi,
In response to a security incident, we’ve created a conditional access policy to block registration of MFA methods from other countries, based on User Actions = Register security information.
We noticed that users who are working in other countries using VPN sometimes can’t log in because of this conditional access.
It seems like after a number of logins, over a period of time, there is a registration triggered which causes this conditional access policy to be hit.
Excluding the affected user from this policy solves the issue, and after removing the exclusion the user can keep on working for a period of time without issues.
Is it correct to assume that an automatic registration of security information is being triggered? What are the conditions for this to happen?
Kind regards,
Ivan
PIM Groups prevent permanent assignment
Hi,
I am designing a PIM implementation and was planning on leveraging PIM groups for most privileged access management scenarios. I created a group and PIM-enabled it and configured the settings to prevent permanent assignment.
However, I find I can still assign permanent members via the normal Entra ID Groups section where you add members to a normal group. Then when I check the PIM section I see a permanent assignment.
Is there a way of preventing this?
Cheers,
Jeremy.
How to Resolve QuickBooks Payroll Error PS032?
I’m encountering QuickBooks Payroll Error PS032 when trying to download updates. How can I fix this issue and update my QuickBooks Payroll successfully?
Why Am I Having Issues with QuickBooks Data Conversion Services?
I’m encountering difficulties with QuickBooks data conversion services. How can I efficiently convert my data to QuickBooks format without errors or data loss?
How Do I Fix Unrecoverable Error in QuickBooks Desktop Windows 11?
Encountering an unrecoverable error in QuickBooks Desktop on Windows 11? It’s disrupting my workflow. What could be causing this issue, and how can I fix it to ensure smooth operation of my accounting software?
How to Resolve QuickBooks Data Migration Issues?
I’m encountering challenges with QuickBooks data migration services. How can I ensure a smooth transition of my data to QuickBooks, and what are the best practices to overcome migration issues?
Why can’t I send payroll data in QuickBooks Desktop?
I’m encountering difficulties sending payroll data in QuickBooks Desktop. What could be causing this issue, and how can I fix it to ensure smooth payroll processing?
Group Managed Service Accounts for SQL Services – Microsoft Best Practices
Hello everyone
I would like to address the SQL community with the following question:
We use Group Managed Service Accounts for all SQL services on our new SQL server.
I have now been told by a software supplier that Microsoft recommends either keeping the predefined SQL service accounts for the services or using a service account (an AD user) with a username and password, rather than Group Managed Service Accounts.
Is this correct? Should I not use Group Managed Service Accounts for the SQL services? What is official best practice from Microsoft?
My thought for using Group Managed Service Accounts is from a security perspective.
Thanks for your support
Greetings
Oliver
How to resolve SQL Server installation failures on Windows 11 / Windows Server 2022
Hello, this is the SQL Server support team.
This article explains what to do when installing SQL Server on Windows 11 fails because the sector size of the installation disk is larger than 4 KB.
SQL Server does not support sector sizes larger than 4 KB, so attempting to install SQL Server on a disk with a sector size larger than 4 KB causes the installation to fail.
The details are documented in the following article:
Troubleshoot errors related to system disk sector sizes greater than 4 KB
https://learn.microsoft.com/ja-jp/troubleshoot/sql/database-engine/database-file-operations/troubleshoot-os-4kb-disk-sector-size
Based on that information, this article also explains how to check whether you are affected and how to work around the issue.
◆ Symptom
When installing SQL Server on Windows 11, the installation fails if the target disk has a sector size larger than 4 KB.
◆ Affected scenarios
This issue occurs when installing SQL Server on Windows 11 and the sector size of the SQL Server installation disk is larger than 4 KB.
The issue does not occur on Windows 10, even with the same disk.
Therefore, even in an environment where SQL Server ran without errors on Windows 10, updating to Windows 11 can cause the SQL Server service to fail to start,
because Windows 11 recognizes disk sector sizes larger than 4 KB.
◆ How to check whether you are affected
If SQL Server installation fails on Windows 11, or the SQL Server service fails to start after updating to Windows 11,
check the following, and if it applies, carry out the workaround described later.
[How to check]
Check the SQL Server installation drive with the fsutil command and verify whether PhysicalBytesPerSectorForAtomicity is larger than 4096. If it is, carry out the workaround.
==============================
Check procedure
==============================
1. Click the magnifying glass icon on the taskbar and type “command” in the search box.
2. Command Prompt appears as the best match; click [Run as administrator].
3. When the “Do you want to allow this app to make changes to your device?” dialog appears, click [Yes].
4. Copy the following command into the Command Prompt and run it:
fsutil fsinfo sectorinfo C:
* If SQL Server is installed on a volume other than C:, change the command to target that volume.
For example, for the E: volume, run:
fsutil fsinfo sectorinfo E:
5. In the output, check whether PhysicalBytesPerSectorForAtomicity is larger than 4096.
If it is larger than 4096, this issue applies, so carry out the following workaround.
◆ Workaround
Work around the issue by changing a registry setting.
Incorrectly modifying the registry can cause serious problems, so be very careful when running the commands that change it.
Before changing the registry setting, also take a backup of the target registry key.
==============================
Registry backup
==============================
1. Click the magnifying glass icon on the taskbar and type “command” in the search box.
2. Command Prompt appears as the best match; click [Run as administrator].
3. When the “Do you want to allow this app to make changes to your device?” dialog appears, click [Yes].
4. Run the following four commands in sequence in the Command Prompt to create the backup folder C:\temp and save a backup of the target registry key there:
cd \
mkdir temp
cd temp
REG SAVE "HKLM\SYSTEM\CurrentControlSet\Services\stornvme\Parameters\Device" reg-backup.hiv
==============================
Workaround procedure
==============================
1. Click the magnifying glass icon on the taskbar and type “command” in the search box.
2. Command Prompt appears as the best match; click [Run as administrator].
3. When the “Do you want to allow this app to make changes to your device?” dialog appears, click [Yes].
4. Copy the following command into the Command Prompt and run it:
REG ADD "HKLM\SYSTEM\CurrentControlSet\Services\stornvme\Parameters\Device" /v "ForcedPhysicalSectorSizeInBytes" /t REG_MULTI_SZ /d "* 4095" /f
5. Next, copy the following command into the Command Prompt and run it:
REG QUERY "HKLM\SYSTEM\CurrentControlSet\Services\stornvme\Parameters\Device" /v "ForcedPhysicalSectorSizeInBytes"
6. Confirm that the output of step 5 is the following:
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\stornvme\Parameters\Device
    ForcedPhysicalSectorSizeInBytes    REG_MULTI_SZ    * 4095
7. That completes the workaround. Restart Windows and install SQL Server.
We hope this article is helpful.
* The content of this article (including attachments and links) is current as of the date of writing and is subject to change without notice.
Create your own copilot using Azure Prompt flow and Streamlit
LLMs such as GPT have certain limitations. They may not have up-to-date information due to their knowledge cutoff date for training. This poses a significant challenge when we want our AI models to provide accurate, context-aware, and timely responses. Imagine asking an LLM about the latest technology trends or seeking real-time updates on a breaking news event; traditional language models might fall short in these scenarios.
In this blog, we will introduce you to a game-changing technique called retrieval-augmented generation (RAG). This unique approach empowers language models such as GPT to bridge the gap between their static knowledge and the dynamic real world. With RAG, we’ll show you how to equip your generative AI applications with the ability to pull in fresh information, ground your organizational data, cross-reference facts to address hallucinations and stay contextually aware, all in real-time.
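The core loop of RAG can be stated in two steps: retrieve documents relevant to the query, then fold them into the prompt before generation. A minimal Python sketch of the idea (the toy keyword retriever and prompt wording are illustrative stand-ins, not a specific Azure API):

```python
def retrieve(query, corpus, top_k=2):
    # Toy keyword retriever: rank documents by word overlap with the query.
    # Real systems use vector embeddings and a search index instead.
    def score(doc):
        return len(set(query.lower().split()) & set(doc.lower().split()))
    return sorted(corpus, key=score, reverse=True)[:top_k]

def build_rag_prompt(query, corpus):
    # Ground the model by prepending the retrieved context to the question.
    context = "\n".join(f"- {doc}" for doc in retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "The cardiovascular system circulates blood through the heart and vessels.",
    "The nervous system transmits signals between the brain and body.",
]
prompt = build_rag_prompt("How does the nervous system work?", corpus)
```

Because the model answers only from the supplied context, its output stays anchored to current, organization-specific data rather than its training cutoff.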
Generative AI technology has the potential to greatly enhance education in the health sector, particularly in fields like anatomy and physiology. This is because AI platforms can create highly detailed and interactive models of the human body, making complex systems like the cardiovascular or nervous systems easier to understand than with traditional methods.
Another benefit of generative AI is its ability to personalize the learning experience. By analyzing a student’s performance, the AI can identify areas where the student needs improvement and generate customized practice questions to target those areas. Additionally, generative AI can simulate patient interactions, which is essential in enhancing diagnostic skills.
This blog will show how generative AI using Azure AI studio prompt flow with Multi-Round Q&A on Your Data chat can make anatomy and physiology education more interactive, engaging, and effective and help students prepare for their healthcare careers.
1. Architecture
2. Create an Azure AI Search resource
You need an Azure AI Search resource to index your data for your copilot solution. This will let you use custom data in a prompt flow.
In a web browser, open the Azure portal at https://portal.azure.com and sign in using your Azure credentials.
On the home page, select + Create a resource and search for Azure AI Search. Then create a new Azure AI Search resource with the following settings:
Subscription: Select your Azure subscription
Resource group: Select or create a resource group
Service name: Enter a unique service name
Location: Choose any of the following regions*
Australia East
Canada East
East US
East US 2
France Central
Japan East
North Central US
Sweden Central
Switzerland
Pricing tier: Standard
3. Wait for your Azure AI Search resource deployment to be completed.
3. Create an Azure AI project
Now you’re ready to create an Azure AI Studio project and the Azure AI resources to support it.
In a web browser, open Azure AI Studio at https://ai.azure.com and sign in using your Azure credentials.
On the Manage page, select + New AI hub. Then, in the Getting started wizard, create a project with the following settings:
AI Hub: Create a new resource with the following settings:
AI Hub name: A unique name
Azure Subscription: Your Azure subscription
Resource group: Select the resource group containing your Azure AI Search resource
Location: The same location as your Azure AI Search resource
Azure OpenAI: (New) Autofills with your selected hub name
Project name: Create a new project with the following settings:
The project name
Choose the hub created earlier
3. Wait for your project to be created.
4. Deploy models
To implement your solution, you will require two models.
An embedding model that turns text data into vectors for easy indexing and processing.
A model that can produce responses in natural language to queries using your data.
In the Azure AI Studio, in your project, in the navigation pane on the left, under Components, select the Deployments page.
Create a new deployment (using a real-time endpoint) of the text-embedding-ada-002 model with the following settings:
Deployment name: text-embedding-ada-002
Model version: Default
Advanced options:
Content filter: Default
Tokens per minute rate limit: 5K
Repeat the previous steps to deploy a gpt-35-turbo model with the deployment name gpt-35-turbo.
5. Add data to your project
The data for your copilot consists of a set of Essentials of Anatomy and Physiology in PDF format designed to provide a comprehensive introduction to human anatomy and physiology. Let’s add it to the project.
In Azure AI Studio, in your project, select the Data page in the navigation pane on the left under Components.
Select + New data.
Expand the drop-down menu to select Upload files/folders in the Add your data wizard.
Select Upload files/folder and select Upload files.
Set the data name to “xxxxxxx”.
6. Create an index for your data
Now that you’ve added a data source to your project, you can use it to create an index in your Azure AI Search resource.
In Azure AI Studio, in your project, select the Indexes page in the navigation pane on the left under Components.
Add a new index with the following settings:
Source data:
Data source: Use existing project data
Select the “xxxxxx” data source
Index storage:
Select the AzureAISearch connection to your Azure AI Search resource
Search settings:
Vector settings: Add vector search to this search resource
Azure OpenAI Resource: Default_AzureOpenAI
Acknowledge that an embedding model will be deployed if not already there
Index settings:
Index name: “xxxxxxx”
Virtual machine: Auto select
Wait for the indexing process to be completed, which can take several minutes. The index creation operation consists of the following jobs:
Crack, chunk, and embed the text tokens in your data.
Update Azure AI Search with the new index.
Register the index asset.
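The “crack, chunk, and embed” job splits each document into overlapping text windows before vectorizing them, so that a sentence cut at one boundary still appears whole in a neighboring chunk. A rough illustration of fixed-size chunking with overlap (the sizes here are illustrative, not the service’s actual defaults):

```python
def chunk_text(text, chunk_size=100, overlap=20):
    # Slide a fixed-size window over the text; each step advances by
    # chunk_size - overlap so consecutive chunks share `overlap` characters.
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks

# 250 characters with a 100-char window and 20-char overlap -> 4 chunks
chunks = chunk_text("A" * 250, chunk_size=100, overlap=20)
```

Each chunk is then embedded and written to the Azure AI Search index so the retriever can return passages, not whole documents.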
7. Examine the index
Before using your index in a RAG-based prompt flow, let’s verify that it can be used to affect generative AI responses.
In the navigation pane on the left, under Tools, select the Playground page.
On the Setup panel, select the Add your data tab, and then add the brochures-index project index and select the hybrid (vector + keyword) search type.
After the index has been added and the chat session has restarted, resubmit the prompt What can you recommend for beginners?
Review the response, which should be based on data in the index.
8. Use the index in your prompt flow
In this example, I’ll use a ready-made template from the Prompt Flow page. I’ll choose the “Multi-Round Q&A on Your Data” template, which applies an augmented generation method to the retrieved Essentials of Anatomy and Physiology book, based on a user’s question.
You can easily clone this Flow and start exploring it.
Let us analyze the individual parts that comprise it and delve into their unique characteristics:
Inputs: the user’s query and the chat history
Output: the model’s augmented response
modify_query_with_history
Before the question gets embedded, modify_query_with_history adds context to the question by including instructions and the chat history. This is the default prompt:
Examples of chat history:
Input: Hi, I’m struggling to understand how the nervous system interacts with other body systems. Can you help?
Output: Absolutely! The nervous system interacts with other body systems primarily through nerve impulses that control and coordinate actions. For example, the nervous system sends signals to the muscular system to initiate movement.
Input: That makes sense. Can you explain how the nervous system affects the digestive system?
Output: Sure! The nervous system regulates the speed at which food is digested through signals that increase or decrease the rate of muscle contractions in the gastrointestinal tract. This is crucial for efficient digestion and nutrient absorption.
Input: Thanks! How about the cardiovascular system? How does the nervous system interact with that?
Output: The nervous system controls the heart rate and blood pressure. It can send signals to the heart to speed up or slow down based on the body’s needs, like during exercise or rest.
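A simplified version of what modify_query_with_history accomplishes: fold the prior turns into the new question so that a follow-up like “How about the cardiovascular system?” carries its antecedent into the retrieval step. This is a sketch of the idea, not the template’s actual prompt:

```python
def modify_query_with_history(chat_history, new_question):
    # Concatenate prior turns with the new question so the embedding
    # (and thus the index lookup) reflects the whole conversation context.
    lines = []
    for turn in chat_history:
        lines.append(f"User: {turn['input']}")
        lines.append(f"Assistant: {turn['output']}")
    lines.append(f"User: {new_question}")
    return "\n".join(lines)

history = [{"input": "Can you explain the nervous system?",
            "output": "It transmits signals between the brain and body."}]
standalone = modify_query_with_history(history, "How about the cardiovascular system?")
```

Without this step, the bare follow-up question would embed poorly and retrieve chunks unrelated to the ongoing topic.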
lookup
We use lookup to link our data to the model. The data must be indexed first, which is done in the Azure AI Studio ‘Indexes’ component on the side bar.
In the lookup section, set the following parameter values:
mlindex_content: Select the empty field to open the Generate pane
index_type: Registered Index
mlindex_asset_id: brochures-index:1
queries: ${modify_query_with_history.output}
query_type: Hybrid (vector + keyword)
top_k: 2
To get the right path, go to Build / your_project_name / Data / your_index_name, click on Index Data and copy the Data connection URI from the Data links section
generate_prompt_context
generate_prompt_context receives a list of search result entities as input and turns them into a single string carrying the content and source information for each document. Adding these pertinent details to the prompt enables more knowledgeable and context-sensitive responses.
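In spirit, the tool flattens the retrieved entities into one context string, tagging each chunk with its source so the model can attribute its answer. The field names below are assumptions for illustration; the actual tool works with the SDK’s entity schema:

```python
def generate_prompt_context(search_results):
    # Join each retrieved chunk with its source so the model can ground
    # its answer and point back to where the information came from.
    parts = []
    for doc in search_results:
        parts.append(f"Content: {doc['content']}\nSource: {doc['source']}")
    return "\n\n".join(parts)

results = [
    {"content": "The heart has four chambers.", "source": "anatomy.pdf#p12"},
    {"content": "Neurons communicate via synapses.", "source": "anatomy.pdf#p47"},
]
context = generate_prompt_context(results)
```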
Prompt_variants
With prompt_variants, you can make different versions of prompts to get more variety in the questions you ask.
chat_with_context
chat_with_context uses the context created by generate_prompt_context to improve the conversation. It takes into account the previous context and the related document chunks, which helps it to reply more logically and correctly.
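Put together, chat_with_context effectively assembles a chat-completion payload: a system message carrying the retrieved context, followed by the prior turns and the new question. A hypothetical sketch of that assembly (the message shape mirrors the common chat-completions format, not the template’s exact internals):

```python
def build_messages(context, chat_history, user_input):
    # System message carries the retrieved document chunks; replaying the
    # history keeps the conversation coherent across rounds.
    messages = [{"role": "system",
                 "content": f"Answer using this context:\n{context}"}]
    for turn in chat_history:
        messages.append({"role": "user", "content": turn["input"]})
        messages.append({"role": "assistant", "content": turn["output"]})
    messages.append({"role": "user", "content": user_input})
    return messages

msgs = build_messages("The heart has four chambers.",
                      [{"input": "Hi", "output": "Hello!"}],
                      "How many chambers does the heart have?")
```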
Let’s test the chat to see how it reacts.
After creating the flow, we can deploy it as a managed endpoint, which can be consumed through REST API by clicking the “Deploy” button on the flow page.
After that, you will need to choose a virtual machine that will be used to facilitate the deployment process:
Note that there is a feature available that you can opt to enable, called Inferencing data collection (currently in preview). When enabled, it automatically collects inputs and outputs as a data asset within your Azure AI Studio. This can be used later as a test dataset.
9. Consuming your Prompt Flow
After deploying your flow in Azure AI Studio, you can consume it as a managed endpoint. To access this feature, simply navigate to the “Deployments” tab and click on your flow’s name. From there, you can also test your flow to ensure it’s working properly before consumption.
We can use streamlit in VS Code to write the code that will view your Endpoint and Keys. Go to the consume tab and copy and paste them into your code.
import streamlit as st
import urllib.request
import urllib.error
import json
import os
import ssl
from dotenv import load_dotenv

# Load environment variables
load_dotenv()
AZURE_ENDPOINT_KEY = os.environ['AZURE_ENDPOINT_KEY'] = 'xxxxxxxxxxxxxxxxxxxxxx'

def allowSelfSignedHttps(allowed):
    # Bypass the server certificate verification on the client side
    if allowed and not os.environ.get('PYTHONHTTPSVERIFY', '') and getattr(ssl, '_create_unverified_context', None):
        ssl._create_default_https_context = ssl._create_unverified_context

# Streamlit UI components
st.image("education.png", width=600)
st.title('Welcome to your Essentials of Anatomy and Physiology Assistant!')
st.sidebar.title("Copilot for Anatomy and Physiology!")
st.sidebar.caption("Made by Pascal Burume")
st.sidebar.info("""
Generative AI technology has the potential to greatly enhance education in the health sector, particularly in fields like anatomy and physiology. This is because AI platforms can create highly detailed and interactive models of the human body, making complex systems like the cardiovascular or nervous systems easier to understand than with traditional methods.
""")

def main():
    allowSelfSignedHttps(True)

    # Initialize chat history
    if "chat_history" not in st.session_state:
        st.session_state.chat_history = []

    # Display chat history
    for interaction in st.session_state.chat_history:
        if interaction["inputs"]["chat_input"]:
            with st.chat_message("user"):
                st.write(interaction["inputs"]["chat_input"])
        if interaction["outputs"]["chat_output"]:
            with st.chat_message("assistant"):
                st.write(interaction["outputs"]["chat_output"])

    # React to user input
    if user_input := st.chat_input("Ask me anything..."):
        # Display user message in chat message container
        st.chat_message("user").markdown(user_input)

        # Query the managed endpoint
        data = {"chat_history": st.session_state.chat_history, "chat_input": user_input}
        body = json.dumps(data).encode('utf-8')
        url = 'https://xxxxxxxxxxxxxxxxxxxxx.ml.azure.com/score'
        headers = {
            'Content-Type': 'application/json',
            'Authorization': f'Bearer {AZURE_ENDPOINT_KEY}',
            'azureml-model-deployment': 'xxxxxxxxxx-1'
        }
        req = urllib.request.Request(url, body, headers)
        try:
            response = urllib.request.urlopen(req)
            response_data = json.loads(response.read().decode('utf-8'))
            # Check if 'chat_output' key exists in the response_data
            if 'chat_output' in response_data:
                with st.chat_message("assistant"):
                    st.markdown(response_data['chat_output'])
                st.session_state.chat_history.append(
                    {"inputs": {"chat_input": user_input},
                     "outputs": {"chat_output": response_data['chat_output']}}
                )
            else:
                st.error("The response data does not contain a 'chat_output' key.")
        except urllib.error.HTTPError as error:
            st.error(f"The request failed with status code: {error.code}")

if __name__ == "__main__":
    main()
Sample prompts:
What can you recommend for me today?
Give me a plan of study for today
I want to dive into the muscular system
You can also view metrics for your flow in the “Monitoring” tab, which is backed by Azure Monitor.
10. Conclusion
As we wrap up this exploration into the transformative capabilities of generative AI technologies, particularly within the realms of education and healthcare, it’s clear that the potential for innovation is immense. By leveraging retrieval-augmented generation (RAG), we have unlocked a path that bridges the gap between static data and the dynamic needs of real-world applications. This blog has outlined not just the theoretical possibilities but also practical steps to implement these technologies using Azure AI Studio.
Thank you for joining us on this insightful journey through the capabilities of modern AI technologies. We are excited about the future possibilities as we continue to push the boundaries of what AI can achieve in educational contexts. Let’s move forward into a future where technology and education merge to create enriching, empowering learning experiences.
11. Resources
Streamlit • A faster way to build and share data apps
Deploy a flow in prompt flow as a managed online endpoint for real-time inference – Azure Machine Learning | Microsoft Learn
Get started in prompt flow – Azure Machine Learning | Microsoft Learn
Microsoft Tech Community – Latest Blogs –Read More
Troubleshooting Common Custom Policy Issues in Policy Development
We develop Azure custom policies when we need flexibility and advanced capabilities that cannot be achieved with the built-in policies provided by Azure. We can add multiple custom policies for any Azure product, and they can be modified and changed at any time.
In summary, Azure custom policies provide a powerful and flexible framework to express tailored governance rules, integrate with external systems, and meet advanced security and compliance requirements.
Reference link on how to develop custom policies can be found here- Tutorial: Create a custom policy definition – Azure Policy | Microsoft Learn
Common Issues which can be seen while developing Custom Policy
Issue #1: An incorrect or nonexistent alias is used in a policy definition.
While adding a policy definition, always make sure the correct policy alias is used; otherwise you will get an error stating that the alias is incorrect or not available. There are multiple ways to check policy aliases, described below.
Use the PowerShell cmdlet Get-AzPolicyAlias.
For example, if you are developing a policy for the resource type Microsoft.Storage, use the following command:
(Get-AzPolicyAlias -NamespaceMatch 'storage').Aliases
Once you run the above PowerShell command, you will get the full list of available aliases that can be used to develop policies.
Reference Link: Get-AzPolicyAlias (Az.Resources) | Microsoft Learn
Deploy the resource, add the configuration you expect your custom policy to check, and inspect the GET/PUT calls in the browser’s developer tools to see which property/alias is passed in the backend while adding the configuration. For example, if you want a custom policy to verify that Entra ID authentication is enabled on a web app, deploy the web app, enable Entra ID authentication, and watch the values passed in the backend while enabling it. The screenshots below show how to check these values.
We are adding authentication for the App Service here:
Now, before clicking the Add button, open developer tools:
Come back and add the authentication for the web app:
Now go back to developer tools, stop recording, and check the GET/PUT calls:
Click on batchapiversion and, under “Response”, check the values that were updated when you added authentication for your App Service. You will see the appropriate alias value and the properties inside it, which will help you identify the correct values needed in the custom policy to check whether Entra ID is enabled for the resource.
You can also use the Azure Policy extension for Visual Studio Code to find the correct policy alias for your custom policy.
Issue #2: No Resources Found under Compliance Report.
It is possible that, after you add your custom policy, your resources are not visible in the compliance report. To fix this, make sure you have provided the correct resource type in your policy definition and that a read operation is available for that resource type. For example, for the resource type Microsoft.DBforPostgreSQL/flexibleServers, if the read operation Microsoft.DBforPostgreSQL/flexibleServers/read is not available, you won’t be able to see resources in the compliance report. Note that if the read operation is not available, raise a request with the product group (PG) team to add it for that product.
All operations available for a particular product can be checked using this link: Azure permissions – Azure RBAC | Microsoft Learn
Issue #3: Incorrect Compliance Report
There can be a case where a custom policy is added but the compliance report is not correct, e.g. your resource is marked compliant when it is actually non-compliant, or vice versa. To solve this, make sure the policy rule is correct. Try reversing the policy rule, use different combinations of the “if” and “then” blocks with “allOf”/“anyOf” conditions, and test your policy to see whether the compliance report changes.
Reference for Policy Rule can be found here- Details of the policy definition structure policy rules – Azure Policy | Microsoft Learn
Also make sure the policy effect is working properly; if it is not working as expected, it can also skew the compliance report. To investigate, check the PUT and GET requests for the particular product and see whether they work with the chosen effect. Sometimes the “Deny” effect does not support the resource type for a particular policy rule, and changing the effect to “Audit” makes the rule work and produces a correct compliance report.
Issue #4: Unable to Develop Custom Policy- Alias Not Available
There can be a case where the alias needed for your custom policy is not available at all. In this case, report it directly to the PG team to check whether the alias can be added in a future feature release.
Note that there can be cases where, for security reasons, a particular alias cannot be added by the PG team. In those cases, check for alternative properties or values that can be used indirectly to implement the custom policy.
Issue #5: Incorrect Mode Usage
It is very important that the “mode” value in the policy definition structure is set correctly; otherwise the resource compliance results will be incorrect.
As a best practice, use mode “All” in your policy definitions. If mode “Indexed” is used, evaluation is limited to resource types that support tags and location, and you won’t see a correct compliance report for other resources.
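To illustrate how the mode and the “if”/“then” rule fit together, here is a minimal sketch of a policy definition skeleton, built as a Python dict and serialized to JSON. The storage-account alias and TLS values below are illustrative examples only (not from this walkthrough); substitute the alias you verified with Get-AzPolicyAlias.

```python
import json

# Illustrative only: the resource type, alias, and values are placeholders.
policy_definition = {
    "mode": "All",  # evaluate all resource types, not just tag-supporting ones
    "policyRule": {
        "if": {
            "allOf": [
                {"field": "type", "equals": "Microsoft.Storage/storageAccounts"},
                {"anyOf": [
                    {"field": "Microsoft.Storage/storageAccounts/minimumTlsVersion",
                     "exists": "false"},
                    {"field": "Microsoft.Storage/storageAccounts/minimumTlsVersion",
                     "notEquals": "TLS1_2"},
                ]},
            ]
        },
        # Start with "Audit" while testing; switch to "Deny" only once the
        # compliance report looks correct (see Issue #3 above).
        "then": {"effect": "audit"},
    },
}

print(json.dumps(policy_definition, indent=2))
```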
A Step-by-Step Guide to Datadog Integration with Linux App Service via Sidecars
In this blog post, we dive into the realms of observability and monitoring, taking advantage of the latest advancements in Azure’s Linux App Service. If you’ve been following App Service updates, you might have caught wind of the Public Preview for the Sidecar Pattern for Linux App Service announced recently. Leveraging this development, we’re here to guide you through integrating Datadog, an Azure Native ISV service partner that provides a powerful observability platform, with your .NET custom container application hosted on Linux App Service. Whether you’re eager to streamline log management, track application traces, or enhance request monitoring, we’ve got you covered.
Setting up your .NET application
To get started, you’ll need to containerize your .NET application. This tutorial walks you through the process step by step.
Once your application is containerized, you can integrate the Datadog tracer. To do that, you will need to add the following lines to the Dockerfile for your main application.
# Datadog specific
RUN mkdir -p /datadog/tracer
RUN mkdir -p /home/LogFiles/dotnet
ADD https://github.com/DataDog/dd-trace-dotnet/releases/download/v2.49.0/datadog-dotnet-apm-2.49.0.tar.gz /datadog/tracer
RUN cd /datadog/tracer && tar -zxf datadog-dotnet-apm-2.49.0.tar.gz
This ensures that the Datadog tracer is properly installed and configured within your application container.
Below is a sample Dockerfile incorporating Datadog integration:
# Stage 1: Build the application
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /app
# Copy the project file and restore dependencies
COPY *.csproj ./
RUN dotnet restore
# Copy the remaining source code
COPY . .
# Build the application
RUN dotnet publish -c Release -o out
# Stage 2: Create a runtime image
FROM mcr.microsoft.com/dotnet/aspnet:8.0 AS runtime
WORKDIR /app
# Copy the build output from stage 1
COPY --from=build /app/out ./
# Datadog specific
RUN mkdir -p /datadog/tracer
RUN mkdir -p /home/LogFiles/dotnet
ADD https://github.com/DataDog/dd-trace-dotnet/releases/download/v2.49.0/datadog-dotnet-apm-2.49.0.tar.gz /datadog/tracer
RUN cd /datadog/tracer && tar -zxf datadog-dotnet-apm-2.49.0.tar.gz
# Set the entry point for the application
ENTRYPOINT ["dotnet", "<your dotnet app>.dll"]
You’re now ready to build the image and push it to your preferred container registry, be it Azure Container Registry, Docker Hub, or a private registry.
Create your Linux Web App
Create a new Linux Web App from the portal and choose the options for Container and Linux.
On the Container tab, make sure that Sidecar support is Enabled.
Specify the details of your application image.
Note: Typically, .NET uses port 8080 but you can change it in your project.
Setup your Datadog
If you don’t have a Datadog account, you can create an instance of Datadog on the Azure portal by following this QuickStart.
Create Datadog – Azure Native ISV Services | Microsoft Learn
Alternatively, you can also create a service account on Datadog by following the steps in this tutorial.
Service Accounts (datadoghq.com)
Datadog offers a 14-day free trial if you would like to try out the service.
AppSettings for the Datadog Integration
You need to set the following AppSettings.
DD_API_KEY – If you have created the Datadog resource on the Azure portal, you can manage your API keys like this.
Alternatively, you can create your API Key by following the steps here API and Application Keys (datadoghq.com).
We would encourage you to store sensitive information like API keys in Azure Key Vault: Use Key Vault references – Azure App Service | Microsoft Learn.
DD_SITE – Datadog offers you different sites for your data. You can use us3.datadoghq.com as this site is hosted in Azure. Therefore, the Observability data for your application stays in Azure. You can find more information about Datadog sites here.
DD_SERVICE: The name of the service that would be displayed in your Datadog Service Catalog.
DD_ENV: This is used to set the global environment, which allows you to differentiate data coming from various environments like staging or production.
DD_SERVERLESS_LOG_PATH: This is the path where you write your application logs. Typically, this will be /home/LogFiles/*.log. If you have changed the location of your application logs, specify it in this setting.
DD_DOTNET_TRACER_HOME: /datadog/tracer
DD_TRACE_LOG_DIRECTORY: /home/LogFiles/dotnet
CORECLR_ENABLE_PROFILING: 1
CORECLR_PROFILER: {846F5F1C-F9AE-4B07-969E-05C26BC060D8}
CORECLR_PROFILER_PATH: /datadog/tracer/Datadog.Trace.ClrProfiler.Native.so
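If you prefer to script these AppSettings rather than enter them in the portal, the list above can be captured as a JSON payload for `az webapp config appsettings set --settings @appsettings.json`. A sketch, assuming placeholder values for the API key, service name, and environment:

```python
import json

# Placeholder values -- substitute your own API key, service name, and
# environment. DD_API_KEY is better referenced from Azure Key Vault than
# stored directly in AppSettings.
datadog_settings = {
    "DD_API_KEY": "<your-datadog-api-key>",
    "DD_SITE": "us3.datadoghq.com",
    "DD_SERVICE": "my-dotnet-app",
    "DD_ENV": "production",
    "DD_SERVERLESS_LOG_PATH": "/home/LogFiles/*.log",
    "DD_DOTNET_TRACER_HOME": "/datadog/tracer",
    "DD_TRACE_LOG_DIRECTORY": "/home/LogFiles/dotnet",
    "CORECLR_ENABLE_PROFILING": "1",
    "CORECLR_PROFILER": "{846F5F1C-F9AE-4B07-969E-05C26BC060D8}",
    "CORECLR_PROFILER_PATH": "/datadog/tracer/Datadog.Trace.ClrProfiler.Native.so",
}

# The JSON array format accepted by:
#   az webapp config appsettings set -g <rg> -n <app> --settings @appsettings.json
payload = [{"name": k, "value": v, "slotSetting": False}
           for k, v in datadog_settings.items()]

with open("appsettings.json", "w") as f:
    json.dump(payload, f, indent=2)
```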
To know more about these Datadog settings, you can refer to the documentation.
Add the Datadog Sidecar
Go to the Deployment Center for your application and add a sidecar container with the following details:
Image Source: Docker Hub and other registries
Image type: Public
Registry server URL: svlsddagent.azurecr.io
Image and tag: serverless-sidecar:latest
Port: 8126
Disclaimer: Datadog Image Usage
It’s important to note that the Datadog image used here is sourced directly from Datadog and is provided ‘as-is.’ Microsoft does not own or maintain this image. Therefore, its usage is subject to the terms of use outlined by Datadog, which can be found here.
Visualizing Your Observability Data in Datadog
You are all set! You can now see your observability data flow to the Datadog backend. Take a look at the Azure Serverless page for a complete view of your App Services.
The Service Catalog gives you an overview of each service, such as the number of requests, latency, and more.
You can see your application logs by going to Logs -> Explorer
Your application traces will be under APM->Traces->Explorer
To learn more about Datadog dashboards, you can refer to the documentation.
Next steps
In this guide, we’ve explored the seamless integration of Datadog with your .NET custom container application hosted on Linux App Service. By leveraging the Sidecar Pattern and Datadog’s powerful observability platform, you can now unlock actionable insights and enhance the monitoring capabilities of your applications.
It’s important to note that Datadog, as an Azure Native ISV Services partner, offers robust support for Azure services and environments. Our collaboration with Datadog is aimed at providing you with even closer and simplified integration experiences in the future.
Stay tuned for upcoming guides where we’ll delve into integrating Datadog with code-based web applications and other language stacks like NodeJS and Python.
What are the common causes for QuickBooks Migration Failed Unexpectedly and how can it be resolved?
Can someone please explain why QuickBooks migration failed unexpectedly? Seeking urgent assistance to resolve this halt and ensure a smooth transition of data. Any guidance or troubleshooting steps would be heartily appreciated.
Make email notifications not go to junk on Outlook.com
It’s not the first time I’ve noticed that I missed a lot of messages from posts I subscribe to, only to find all of them in the Junk folder on Outlook.com. C’mon, aren’t you the same company? 🙂 It seems to have started happening around 5/7. Every time I mark them as not junk, but it keeps happening.
How to Unfreeze QuickBooks Desktop
Facing trouble unfreezing QuickBooks Desktop? My software froze unexpectedly, and I can’t access any data. How can I troubleshoot this issue and restore functionality? I need a step-by-step guide to unfreeze QuickBooks Desktop and ensure my data remains intact. Any advice or solution would be greatly appreciated.