Month: October 2024
Platform API Catalog for Azure API Center
Azure API Center enables tracking all of the APIs within an organization in a centralized location for discovery, reuse, and governance. Enterprise developers across the organization need seamless access to consume the APIs in an Azure API Center. However, not all enterprise developers have full access to Azure API Center (for example, permission to add new APIs to the API center). All enterprise developers do need access to view API definitions, export API specification documents, and generate API clients.
Platform API Catalog enables enterprise developers across the organization to get seamless access to APIs using the Visual Studio Code extension for Azure API Center. It promotes API discovery, consumption and reuse across the enterprise.
Using the platform API catalog, developers can discover APIs in the Azure API center, view API definitions, and optionally generate API clients when they don’t have access to manage the API center itself or add APIs to the inventory. Access to the platform API catalog is managed using Microsoft Entra ID and Azure role-based access control. The platform API catalog helps enterprise developers discover API details and start API client development.
Platform API Catalog enables developers to:
Export API specification document – Export an API specification from a definition and then download it as a file
Generate API client – Use the Microsoft Kiota extension to generate an API client for their favorite language
Generate Markdown – Generate API documentation in Markdown format
OpenAPI documentation – View the documentation for an API definition and try operations in a Swagger UI
For more details visit: Enable platform API catalog – Azure API Center – VS Code extension – Azure API Center | Microsoft Learn
Math formulas center on themselves
Hello,
When writing long math formulas, the text always centers on itself, even when I set “align with this character”. I would like it to be aligned to the right, as shown in the picture.
Managing your marketplace community subscriptions
As the marketplace community continues to grow, I recommend managing your subscriptions to get the content most relevant to your top areas of interest, at the right time.
About Labels and Subscriptions:
You may have noticed that posts within this discussion forum contain labels such as “Build,” “Publish,” “Grow,” and “ISV Success.” These labels are applied to help you identify content focused on specific marketplace topics (building apps for the marketplace, publishing apps, growing your sales of apps on the marketplace, etc.).
There are similar labels applied to all blogs on the marketplace blog:
“Thought Leadership” – includes content on best practices, interviews with marketplace partners, etc.
“Roadmap & Functionality” – any content related to the marketplace roadmap and new features/functionality
You can “subscribe” to certain labels based on your areas of interest, which allows for a more customized community experience.
For example, on the marketplace blog site, if you are most interested in Roadmap & Functionality content, you can adjust your settings to be notified immediately when new content with this label is published. And if you are interested in other topics, such as ISV Success, you can still subscribe, but just elect to receive notifications less frequently.
Starting with subscriptions
Refer to this general help article on subscriptions and notifications: Managing Preferences, Subscriptions and Notifications – Microsoft Community Hub
If you’re not a member of the Tech Community already, create an account at techcommunity.microsoft.com
Manage your subscription settings so that you can receive email updates on the content you are interested in. Manage these settings by selecting your profile icon in the top right corner of the community, selecting “My Settings,” and then selecting “Subscriptions & Notifications.”
You can also subscribe to the entire marketplace community, which will provide you with regular updates on all marketplace discussion topics happening in this forum. To do this, click “subscribe” in the upper right corner of the marketplace community homepage.
Getting Started in the Community: General information about starting on the community can be found via these resources:
Getting Started on the Tech Community – Microsoft Community Hub
Contributing to the Community – Microsoft Community Hub
Accessing new MS Planner plan via MS Loop
I am new to MS Loop. All documentation refers to the capabilities of linking loop component tasks to a specific plan in MS Planner. I want to be able to link action items resulting from meeting notes directly into the specific plan. All I seem to be able to link to is the plans created in the old MS Planner app. I do not see any plans (task lists) created via MS Project for the Web/Project Online. How do I access those tasks? Did Microsoft forget to add this functionality into the new version of MS Planner or am I just missing something?
Thanks, Jason
Task with choice between 2 (or more) resources
In my project I have 2 resources, I’ll call them A and B. Some tasks can only be performed by resource A, but other tasks can be done by either A or B, but never A and B together.
I tried to solve this by assigning both A and B to the tasks that either of them can do and putting them at 50%. This way 2 tasks can be performed concurrently. However, if resource A is working on a task which only A can perform, B is also idle because A is 100% occupied.
Is there a way to tell a task to use “Resource A” or “Resource B”? Or is there another workaround?
To make my question more concrete: resources A and B in my project are cranes that have to lift certain loads. Crane A is bigger than crane B, so some loads can only be lifted by crane A, and some can be lifted by crane B or A, whichever happens to be available.
IF Function not working on desktop app but OK online
I’ve been using Excel on and off for over 20 years. My problem is that a simple IF statement works online but not in my desktop app.
=IF(C4=C5, "True", "False") was created online without errors and works, with C4=1, C5=2.
When I create the same on the desktop app I get this
I have the problem on both of my PC’s. If I use the formula wizard I get the same error.
Any help would be appreciated.
Dave.
Domain Controller and AD FS Upgrade from Windows Server 2008 R2
My site, a community college, is planning to upgrade our domain controllers and AD FS server from Windows Server 2008 R2 Datacenter. We have 2 domain controllers and 1 AD FS server, and I am looking for advice on how to stage this upgrade.
Here are more details on our current configuration.
PS C:\Windows\system32> Get-ADForest
ApplicationPartitions : {DC=ForestDnsZones,DC=sullivan,DC=suny,DC=edu, DC=DomainDnsZones,DC=sullivan,DC=suny,DC=edu}
CrossForestReferences : {}
DomainNamingMaster : DC01.sullivan.suny.edu
Domains : {sullivan.suny.edu}
ForestMode : Windows2003Forest
GlobalCatalogs : {DC01.sullivan.suny.edu, DC02.sullivan.suny.edu}
Name : sullivan.suny.edu
PartitionsContainer : CN=Partitions,CN=Configuration,DC=sullivan,DC=suny,DC=edu
RootDomain : sullivan.suny.edu
SchemaMaster : DC01.sullivan.suny.edu
Sites : {Default-First-Site-Name}
SPNSuffixes : {}
UPNSuffixes : {}
Primary Domain Controller dc01
OS Version – Windows Server 2008 R2 Data Center
Roles
Active Directory Certificate Services
Active Directory Domain Services
DHCP Server
DNS Server
2nd Domain Controller dc02
OS Version – Windows Server 2008 R2 Data Center
Roles
Active Directory Certificate Services
Active Directory Domain Services
Network Policy and Access Services
ADFS SERVER
OS Version – Windows Server 2008 R2 Data Center
ADFS version 2.0
Roles
Web Server (IIS)
Features
Remote Server Administration Tools
VBA Split Data by Columns + Formatting
I’m looking for VBA macro code that can split a table into numerous worksheets by column value and keep formatting.
Have an input to select which column needs to be split by the data in that column
Each new worksheet should be named after the value from the original column that is being split
Copy the formatting of the table into the new sheets (borders, column width, etc.)
The table does not always start in cell A1. For example, if the table starts in C4 (would require some input, hopefully selecting the cell), copy the columns to the left and the rows above to paste on all new pages (information not included in, but surrounding, the table)
I posted an example of me doing the process manually below. Original is the original table with Result1 and Result2 being the worksheets created after the macro is run.
Cannot Get Gmail Account to Work on Outlook 2016
I have two Windows 10 computers each running Office 2016. Outlook has four identical accounts in both. On my laptop the Gmail account stopped communicating with Google about a year ago. I have had no problems using my desktop machine. It’s time to solve this.
It cannot be a Gmail setup issue because one computer works just fine using the same Gmail account. I have tried deleting the non-functional Gmail account on the laptop but cannot reinstall it. I get a message which says: “Something went wrong .. We couldn’t log on to the incoming IMAP server. Please check your e-mail address and password and try again.” Of course I have tried that with no success. I have tried to set up a new profile. That works until I try to add the Gmail account. No success. I have made sure I am using the latest update to Office 2016. It is the current version.
Any other ideas are more than welcome. I have hit a wall.
Ranking Names in Excel
Hi all,
Looking for a way to attach a ranking to a name based on birthday. For example, if PersonA was born 1/1/1991 and PersonB was born 1/2/1991 then PersonA would rank 1 and PersonB would rank 2.
However, I also want to be able to remove someone from the list and have their rank replaced. So, if PersonA was removed, then PersonB would automatically move from rank 2 to rank 1.
Currently, I am using the =MAX(A$2:A2)+1 formula, where the ranking cell starts in A3, but I can’t figure out how to “tie” the rank to the rest of the information in the filtered table. Right now, when I sort the table, the rank does not tie in despite being part of the table filter.
Outlook for Mac with 365 Home
Hello, I would like to know if anyone has the same issue. On an up-to-date Mac, Office 365 is installed and Outlook for Mac is downloaded. When using it I keep losing emails, and if, for example, I create a folder where I store information, after a few days I come back and it is empty, with no way to recover the content. Honestly, I am considering going back to Outlook 2016; these problems do not give me confidence. If anyone has a suggestion, I would be very grateful.
Azure PostgreSQL with Azure OpenAI to Innovate Banking Apps: Unlocking the Power of the AI Extension
The financial sector is evolving rapidly, with artificial intelligence (AI) playing a pivotal role in transforming how banks handle critical functions. From automating customer interactions to identifying patterns in vast datasets, AI is becoming a key enabler for operational efficiency and customer satisfaction. This article delves into how financial institutions can integrate AI capabilities into their banking applications using Azure PostgreSQL Flex and Azure OpenAI, focusing on practical solutions like automating customer complaint management and enhancing data-driven decision-making.
This blog will guide you through how to use Azure PostgreSQL Flex integrated with Azure AI services, such as Azure OpenAI, to make existing applications intelligent and build a solution that automatically processes and retrieves customer complaints, provides semantic search capabilities, and delivers faster resolutions.
Why AI for Banking Applications?
Banks are uniquely positioned to benefit from AI due to the massive volumes of structured and unstructured data they generate daily. Whether it’s processing customer complaints, identifying fraud, or personalizing services, AI can streamline and enhance these processes significantly. This blog outlines how Azure PostgreSQL Flex, a scalable cloud database, combined with Azure OpenAI services, can modernize existing banking applications by adding intelligent, AI-powered features.
AI Use Cases for Banking
Automated Customer Complaint Management: AI can revolutionize how banks handle customer complaints by automating categorization, generating responses for common issues, and escalating more complex cases. This not only speeds up the resolution process but also improves overall customer satisfaction.
Fraud Detection and Prevention: By leveraging machine learning models, banks can detect fraudulent transactions in real-time. These AI models can analyze patterns and anomalies in transaction data, helping prevent fraud before it impacts the customer.
Predictive Analytics for Credit Risk: AI models integrated with Azure PostgreSQL Flex can assess a customer’s creditworthiness based on historical data, enabling banks to offer personalized loans and credit terms while minimizing risk.
Key Azure AI Capabilities with Azure PostgreSQL Flex
To enable these use cases, Azure PostgreSQL Flex integrates seamlessly with several AI technologies, providing a flexible and secure backend for your banking applications:
PgVector Extension: Enables similarity search with AI-generated vector embeddings. This is particularly useful for tasks like semantic search in complaint management systems.
Generative AI & Retrieval-Augmented Generation (RAG): Integrates private data into AI responses, allowing banks to leverage domain-specific knowledge.
In-Database Embedding: Supports real-time embedding generation for low-latency applications, making it ideal for handling dynamic workloads in banking.
When combined with Azure OpenAI, it becomes possible to create intelligent systems capable of automating customer interactions, such as classifying and responding to complaints. By leveraging AI-powered services, banks can automatically categorize complaints, generate automated responses for simpler issues, and escalate more complex cases to the appropriate departments for further review and response.
This integration delivers significant benefits, including improved efficiency by reducing manual intervention, faster complaint resolution, and enhanced customer satisfaction. Additionally, the scalability of Azure PostgreSQL Flex ensures that banks can handle varying complaint volumes without performance issues, while maintaining regulatory compliance. Together, these technologies offer a powerful solution for modernizing banking applications, helping financial institutions optimize operations and better serve their customers.
But first, let’s look at some of the Azure AI capabilities and how they fit seamlessly into PostgreSQL.
Azure AI Capabilities with Azure PostgreSQL Flex
Summary:
Capability | Description | Key Benefits
Pgvector Extension | Stores, indexes, and queries vectors for AI-driven similarity searches. | Multiple distance functions, seamless OLTP integration, hybrid search, secure
Generative AI & RAG | Builds dynamic AI responses and integrates private data into LLM outputs. | Enriches AI with domain-specific knowledge
Azure AI Integration (azure_ai) | SQL-based interface for integrating AI services like OpenAI, NLP, Translator, and Machine Learning. | No complex re-architecture, supports advanced AI capabilities
In-Database Embedding (azure_local_ai) | Generates embeddings within PostgreSQL for low-latency and real-time workloads. | Faster creation, no external setup, data stays local
Vector Generation | Supports both remote and in-database embedding generation. | Flexible embedding models
Pgvector Extension for Seamless Integration with AI Solutions
The pgvector extension in PostgreSQL allows the storage, indexing, and querying of vectors, which are fundamental for AI-driven similarity search scenarios. Currently, pgvector version 0.7 is supported, which provides additional measurement options and better performance. This extension supports various vector distance functions, making it ideal for applications like recommendation systems and semantic search.
Key Benefits:
Multiple Vector Distance Functions: Supports different similarity measures for comparing vectors.
Seamless Integration: Allows AI-powered solutions to integrate directly into existing OLTP (Online Transaction Processing) PostgreSQL applications without needing to export data to specialized systems.
Enterprise-Ready Features: Includes access control, encryption, high availability, and disaster recovery, ensuring a secure and scalable infrastructure.
Hybrid Search: Combines vector search with row filtering and full-text search, ideal for complex query scenarios.
DiskANN Vector Index: Enables accurate and fast query results using WHERE clauses, and reduces the memory footprint and cost of vector workloads by leveraging SSD storage with DiskANN. It applies compression and quantization techniques to improve speed, accuracy, and latency for vector searches without consuming excessive resources.
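As a minimal sketch of how pgvector is used (the items table, its columns, and the 3-dimensional vectors are illustrative; real embeddings typically have hundreds or thousands of dimensions):

```sql
-- Enable the extension (it must be allow-listed on the Flexible Server instance first)
CREATE EXTENSION IF NOT EXISTS vector;

-- A small table with an embedding column (3 dimensions for readability)
CREATE TABLE items (
    id bigserial PRIMARY KEY,
    body text,
    embedding vector(3)
);

INSERT INTO items (body, embedding) VALUES
    ('refund request',            '[0.9, 0.1, 0.0]'),
    ('card declined at checkout', '[0.2, 0.8, 0.1]');

-- Top match by cosine distance (<=>); pgvector also offers <-> (L2) and <#> (inner product)
SELECT body
FROM items
ORDER BY embedding <=> '[0.85, 0.15, 0.05]'
LIMIT 1;
```

The same pattern scales to production embeddings by raising the vector dimension and adding an approximate index (e.g., HNSW or DiskANN) over the embedding column.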
Integration with Generative AI development Frameworks
Generative AI applications can be built to respond to queries with dynamic responses, based on vector search results. Integrating generative AI development frameworks like LangChain with Azure PostgreSQL Flex allows for seamless query generation and advanced data retrieval by leveraging AI models for natural language processing. This integration enables the use of AI-driven tools to generate intelligent insights from complex datasets in PostgreSQL, automating tasks like query optimization, data summarization, and enhancing user interaction through conversational interfaces. RAG apps can retrieve private data and integrate it into LLM (Large Language Model) responses, enriching the AI’s outputs with domain-specific knowledge stored in the database.
Key Benefits:
Enhanced Data Retrieval: AI-powered, natural language-driven queries make it easier to retrieve relevant information from large datasets.
Improved User Experience: Conversational AI interfaces allow for more intuitive interaction with data, providing users with actionable insights.
Automated Insights Generation: Streamlines query optimization, data summarization, and other complex tasks, improving operational efficiency.
Domain-Specific Knowledge Integration: Enriches LLM responses with specialized data from private databases, enabling more accurate and relevant outputs for specific business needs.
Another key benefit of RAG (Retrieval-Augmented Generation) is the improved relevance of LLM responses due to grounding in factual data. By retrieving and incorporating real-time, relevant information from external sources, RAG ensures that the generated responses are not only more contextually accurate but also based on up-to-date, factual data, enhancing the quality and reliability of the response.
An advanced form of RAG, known as GraphRAG, can also be leveraged; it utilizes LLM-generated knowledge graphs to significantly enhance question-and-answer performance, particularly when analyzing complex documents. By structuring information into a graph format, GraphRAG allows the system to identify and understand intricate relationships between entities, concepts, and data points, leading to more accurate and contextually relevant answers. This approach is especially beneficial in handling complex information, as it enables deeper insights and more effective retrieval of relevant knowledge.
Azure AI Integration via azure_ai Extension
The azure_ai extension offers a SQL-based interface to interact with AI services directly from PostgreSQL, making it easy for developers to integrate AI capabilities without the need for complex re-architecture. It also gives you the ability to invoke Azure AI Language services, such as sentiment analysis, right from within the database.
Key Benefits:
Azure OpenAI for embedding models and semantic searches.
Azure AI Language Services provide natural language processing (NLP) features such as real-time translation with support for 100+ languages, sentiment analysis, and abstractive and extractive summarization.
Azure AI Translator for multilingual support.
Azure Machine Learning for more advanced predictive models.
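For instance, invoking sentiment analysis from SQL might look like the following sketch. The endpoint and key values are placeholders, and the azure_cognitive.analyze_sentiment call assumes the azure_ai extension is installed and wired to an Azure AI Language resource; the customer_complaints table is the one built later in this post.

```sql
-- One-time configuration (placeholder endpoint and key)
SELECT azure_ai.set_setting('azure_cognitive.endpoint',
       'https://<your-language-resource>.cognitiveservices.azure.com');
SELECT azure_ai.set_setting('azure_cognitive.subscription_key', '<your-key>');

-- Score the sentiment of each stored complaint directly in SQL
SELECT complaint_id,
       azure_cognitive.analyze_sentiment(complaint_description, 'en') AS sentiment
FROM customer_complaints;
```

A negative sentiment score could then drive routing logic, for example escalating complaints below a threshold to a human agent.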
In-Database Embedding Models
The azure_local_ai extension enables in-database embedding generation, powered by Microsoft’s open-source E5 embedding model. This extension allows for low-latency embedding creation directly within PostgreSQL, ideal for real-time OLTP workloads.
Key Benefits:
Faster Creation: Embedding generation is significantly faster with local processing, making it ideal for frequently changing data.
No External Setup: There’s no need for external services, reducing costs and maintenance.
Data Stays Local: All data remains within PostgreSQL, ensuring compliance and security.
Azure PostgreSQL offers the flexibility to generate embeddings either remotely using Azure OpenAI or locally within the database using azure_local_ai.
Remote:

SELECT * FROM <table>
ORDER BY
  database_description <->
  azure_openai.create_embeddings('text-embedding-ada-002', 'Databases with vector support');

In-Database:

SELECT * FROM <table>
ORDER BY
  recipe_embedding <#>
  azure_local_ai.create_embeddings('multilingual-e5-small:v1', 'Databases with vector support');
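Note that the remote call above assumes the azure_ai extension has already been pointed at an Azure OpenAI resource. A configuration sketch, with placeholder values, might look like:

```sql
-- Configure the azure_ai extension for Azure OpenAI (placeholder values)
SELECT azure_ai.set_setting('azure_openai.endpoint',
       'https://<your-openai-resource>.openai.azure.com');
SELECT azure_ai.set_setting('azure_openai.subscription_key', '<your-key>');

-- Verify the stored setting
SELECT azure_ai.get_setting('azure_openai.endpoint');
```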
Why Use Azure PostgreSQL and AI for Banking Intelligent Applications?
Azure PostgreSQL Flex provides scalability and high availability, making it ideal for large-scale banking applications. By combining it with Azure AI, banks can leverage natural language processing (NLP) to automatically analyze, categorize, and resolve customer complaints. This not only improves complaint resolution times but also enhances customer experience by using AI to identify patterns and offer personalized solutions within their existing applications or new systems.
How to Integrate AI into Banking Applications Using Azure PostgreSQL Flex
A smart complaint management system for banking, powered by Azure OpenAI integrated with Azure PostgreSQL, revolutionizes how financial institutions handle customer complaints. This system can automate the entire process from complaint intake to resolution, providing faster, more accurate responses, enhancing customer satisfaction, and reducing operational bottlenecks. By leveraging the power of Azure OpenAI’s natural language processing (NLP) models, such a system can understand and categorize customer complaints in real-time, while Azure PostgreSQL serves as the scalable, secure backend for storing and retrieving customer data, complaint history, and resolution actions.
Let’s walk through a step-by-step process, including sample SQL queries and their results, to implement a smart complaint management system.
Creating a Customer Complaints Table in Azure PostgreSQL
First, we create a table to store customer complaints, capturing key information such as complaint ID, title, description, and status.
SQL Query:
-- Create a customer_complaints table
CREATE TABLE customer_complaints(
  complaint_id int PRIMARY KEY GENERATED BY DEFAULT AS IDENTITY,
  complaint_title text,
  complaint_description text,
  complaint_status text,
  submitted_at timestamp DEFAULT now()
);
Sample Result:
| complaint_id | complaint_title | complaint_description | complaint_status | submitted_at |
|---|---|---|---|---|
| 1 | Unauthorized transaction | A transaction of $500 was made without my approval. | Pending | 2024-09-25 10:15:00 |
| 2 | Loan denied | My loan was denied despite my good credit score. | Open | 2024-09-25 10:16:00 |
This basic table structure organizes the core details of customer complaints. The complaint_id serves as the primary key, while the complaint_title and complaint_description capture the issue details. The submitted_at timestamp logs when the complaint was filed. This foundation supports adding AI capabilities to enhance how complaints are managed.
Adding AI-Generated Embeddings for Advanced Search
Next, we add an embedding column to store AI-generated vector representations of each complaint. This enables semantic search capabilities, allowing the system to find similar complaints based on meaning rather than keywords.
SQL Query:
-- Add an embeddings column to store AI-generated vector data for complaint_title and complaint_description
ALTER TABLE customer_complaints
ADD COLUMN complaint_embedding vector(1536)                -- creates a vector column with 1536 dimensions
GENERATED ALWAYS AS                                        -- automatically generated on inserts
(azure_openai.create_embeddings('text-embedding-ada-002',  -- calls the Azure OpenAI deployment
complaint_title || complaint_description)::vector) STORED; -- stores the vector
Sample Result:
| complaint_id | complaint_title | complaint_description | complaint_status | submitted_at | complaint_embedding |
|---|---|---|---|---|---|
| 1 | Unauthorized transaction | A transaction of $500 was made without my approval. | Pending | 2024-09-25 10:15:00 | [0.01, -0.03, …] |
| 2 | Loan denied | My loan was denied despite my good credit score. | Open | 2024-09-25 10:16:00 | [0.08, -0.04, …] |
The complaint_embedding column automatically stores a 1536-dimensional vector, generated by Azure OpenAI, for each complaint. The embeddings are AI-generated vector representations of the complaint details, combining the complaint_title and complaint_description. The 1536-dimensional vectors encode the semantic meaning of the text, enabling the system to understand the context behind each complaint. This AI capability allows for more intelligent searches where similar complaints can be retrieved based on the underlying meaning, even if the wording differs.
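To build intuition for how these vectors enable matching on meaning, here is a minimal, self-contained Python sketch. It uses toy 4-dimensional vectors in place of the real 1536-dimensional embeddings; the values are illustrative only, not actual model output:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional "embeddings" standing in for the 1536-dimensional
# vectors that azure_openai.create_embeddings would return.
complaints = {
    "Unauthorized transaction": [0.9, 0.1, 0.0, 0.2],
    "Loan denied": [0.1, 0.8, 0.3, 0.0],
}
# A query vector pointing in roughly the same direction as the first complaint.
query = [0.85, 0.15, 0.05, 0.25]

best = max(complaints, key=lambda title: cosine_similarity(query, complaints[title]))
print(best)  # -> Unauthorized transaction
```

Complaints about the same kind of issue end up as vectors pointing in similar directions, which is exactly what the index and the `<#>` / `<->` operators exploit at database scale.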
Creating a Vector Index for Faster Complaint Search
DiskANN is unique to PostgreSQL on Azure. This index technology stands out for speed and accuracy, surpassing industry-standard indexes like IVFFlat and HNSW. DiskANN enhances speed by utilizing fast quantized vectors and reduces memory usage by efficiently storing the vector graph on SSD. It maintains high accuracy even as the underlying data grows and evolves, making it a strong choice for large-scale vector search applications.
SQL Query:
-- Create a DiskANN index for fast vector similarity search on the embeddings
CREATE INDEX customer_complaints_embedding_diskann_idx
ON customer_complaints
USING diskann (complaint_embedding vector_cosine_ops);
Inserting Complaints with AI-Generated Embeddings
Let’s insert some sample complaints into the database. As the complaints are added, embeddings will be automatically generated and stored.
SQL Query:
-- Insert sample customer complaints
INSERT INTO customer_complaints (complaint_title, complaint_description, complaint_status)
VALUES
('Unauthorized transaction on savings account',
 'I noticed a transaction I did not authorize on my savings account.',
 'Pending'),
('Loan application denied without reason',
 'I have been a customer for 10 years, and my loan application was denied without an explanation.',
 'Open');
Sample Result:
| complaint_id | complaint_title | complaint_description | complaint_status | submitted_at | complaint_embedding |
|---|---|---|---|---|---|
| 1 | Unauthorized transaction on savings account | I noticed a transaction I did not authorize on my savings account. | Pending | 2024-09-25 10:30:00 | [0.12, -0.02, …] |
| 2 | Loan application denied without reason | I have been a customer for 10 years, and my loan application was denied without explanation. | Open | 2024-09-25 10:35:00 | [0.09, -0.01, …] |
The embeddings are generated and stored as the rows are inserted, making the complaint data ready for AI-powered searches. As complaints are added to the table, Azure AI automatically generates embeddings based on the text of the complaint title and description. This enables the system to store not just raw data, but also AI-generated vectors that capture the meaning of each complaint. This forms the basis for powerful, AI-driven search capabilities.
Performing a Semantic Search Using Vector Similarity
Now that complaints have been inserted and stored with their vector embeddings, we can perform a semantic search to find the most similar complaints based on a query. For instance, a bank representative may want to search for complaints related to “unauthorized credit card transactions.”
SQL Query:
-- Perform a vector similarity search to find the most similar customer complaint to the query
SELECT
    complaint_id, complaint_title, complaint_description
FROM customer_complaints c
ORDER BY
    c.complaint_embedding <#> azure_openai.create_embeddings('text-embedding-ada-002', 'Unauthorized credit card transaction')::vector
LIMIT 1;
Sample Result:
| complaint_id | complaint_title | complaint_description |
|---|---|---|
| 1 | Unauthorized transaction on savings account | I noticed a transaction I did not authorize on my savings account. |
The system performs a semantic search using vector similarity to find the complaint that best matches the query “Unauthorized credit card transaction,” allowing the bank to quickly retrieve and handle similar complaints. Although the most relevant complaint in the table mentions a savings account rather than a credit card, the semantic similarity between the two issues (both unauthorized transactions) allows the AI model to find the best match. This demonstrates the model’s ability to understand context and meaning rather than relying solely on keywords.
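One detail worth noting: pgvector’s `<#>` operator returns the negative inner product, so an ascending ORDER BY puts the highest inner product (the best match) first. This client-side Python sketch mirrors that ordering with made-up embedding values:

```python
# pgvector's <#> returns the NEGATIVE inner product, so sorting
# ascending by it surfaces the highest inner product first.
def neg_inner_product(a, b):
    return -sum(x * y for x, y in zip(a, b))

# (complaint_id, complaint_title, toy embedding) -- values are illustrative only.
rows = [
    (1, "Unauthorized transaction on savings account", [0.12, -0.02, 0.30]),
    (2, "Loan application denied without reason", [0.09, -0.01, -0.40]),
]
query_embedding = [0.11, -0.03, 0.28]  # stand-in for the query phrase's embedding

# Equivalent of: ORDER BY complaint_embedding <#> query LIMIT 1
top = min(rows, key=lambda r: neg_inner_product(r[2], query_embedding))
print(top[1])  # -> Unauthorized transaction on savings account
```

In the database, of course, the DiskANN index performs this ranking approximately and at scale instead of scanning every row.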
Reference Architecture: Azure OpenAI Integration with Azure PostgreSQL Flex
This architecture diagram illustrates a robust banking complaint management system using Azure PostgreSQL Flex integrated with Azure AI.
Users interact with the system via a browser, which communicates through an Azure Application Gateway to an Intelligent App Service.
The app service handles complaint submissions and interacts with Azure Database for PostgreSQL Flex, where customer complaints are stored as structured data and DiskANN vector index embeddings can be efficiently searched at any scale using the pg_diskann extension, which is based on an approximate nearest neighbor search algorithm designed for scalable vector search.
These embeddings enable semantic searches to find similar complaints through SQL queries.
The Azure AI Extension (with services like azure_ai and azure_local_ai) provides AI capabilities, such as vector creation and integration with built-in AI services like Text Analytics, Language Services, and Vision.
Custom AI services, including AI Document Intelligence and Azure Machine Learning, offer advanced analytics, and all data is secured using Disk Encryption Sets.
The architecture ensures data security, AI-powered insights, and efficient management of customer complaints with support for DiskANN Vector to enhance retrieval and search functionalities.
Benefits for Innovative Applications in the Banking Sector
By integrating Azure PostgreSQL Flex and Azure AI, for example, banks can realize several key benefits:
Faster Complaint Resolution: AI-powered semantic search allows bank representatives to find similar complaints quickly and resolve them faster.
Proactive Customer Service: Identifying patterns in customer complaints enables banks to address issues proactively, leading to improved customer satisfaction.
Scalability: Azure PostgreSQL Flex provides the necessary scalability for managing large volumes of customer data.
Advanced Insights: AI models like Azure OpenAI offer deeper insights into customer behavior and complaint trends, helping banks make informed decisions.
Cost Efficiency: Automating the complaint management process reduces operational costs and allows bank representatives to focus on high-value tasks.
Conclusion
Integrating Azure PostgreSQL Flex with Azure AI provides the banking sector with a powerful tool for managing customer complaints. By leveraging AI-generated embeddings and vector similarity searches, banks can streamline the complaint handling process, improve resolution times, and enhance the overall customer experience.
This AI-driven solution not only helps in addressing customer complaints more efficiently but also enables banks to gain valuable insights from recurring issues, ultimately leading to more proactive and personalized customer service.
Learn More
Azure PostgreSQL RAG with Azure OpenAI
Azure AI Extension GitHub Repository
Updated Fabric GitHub Repo for 250M rows of CMS Healthcare data
Last year I teamed up with my colleague Inder Rana to build and release a GitHub repo for using CMS Medicare Part D data within Microsoft Fabric. The repo is intended to provide an example of an end-to-end analytics solution in Fabric that can be easily deployed by anyone with a Fabric environment. We have updated the analytics solution with some valuable improvements:
The ELT (extract, load, and transform) process, end-to-end from CMS to the Gold layer of the Lakehouse, now takes less than 20 minutes to run with increased automation.
The repo now contains logic to import new data for the year 2022 so that the solution contains 10 years of data (2013-2022) and nearly 250 million rows.
There are two simple options to move the data from the CMS servers to the Gold layer in less than 20 minutes: 1) Spark Notebooks orchestrated with a Pipeline, or 2) Spark Notebooks and SQL Stored Procedures to move the data to the Gold layer.
Option 2 lands the Gold layer in the Fabric Warehouse for those of you who come from a SQL rather than a Python background.
The updated GitHub repo can be found at this link; please give us a “Star” if you find it useful: fabric-samples-healthcare/analytics-bi-directlake-starschema at main · isinghrana/fabric-samples-healthcare (github.com)
The first option, using three Spark Notebooks with a single Pipeline, is reviewed in the video below. A video reviewing the SQL Stored Procedure version is coming soon:
Here is a diagram reviewing the new and updated process:
Name Not Updating in Power Query When Overwriting Source File
I’m trying to rerun my Power Query after updating a source file but whenever I run it, I get this error:
Steps to update source file:
Downloaded an updated source file from an internal dashboard to my Downloads folder. The file type is CSV and the file name is saved as ContentAuthorReport ([Version #]).csv. Opened the CSV and saved it as an XLSX file, overwriting the pre-existing ContentAuthorReport.xlsx file in my Source Documents folder.
I looked in the Source step and see it has the Name and Item columns set to ContentAuthorReport (3).
When I try to create a new query with the updated file just to see what comes up, ContentAuthorReport (4) appears.
I’ve checked the Info tab on the source XLSX document and the properties dialogue box but I can’t figure out why the version number is still attached to it or how to remove it.
I attempted to edit the code to remove the (3) but that didn’t do anything. Unedited code below:
Source = Excel.Workbook(File.Contents("Source Documents\ContentAuthorReport.xlsx"), null, true),
#"ContentAuthorReport (3)_Sheet" = Source{[Item="ContentAuthorReport (3)",Kind="Sheet"]}[Data],
I would like to be able to update this by just saving over the old source file so any help is much appreciated.
Groups and Sites option is disabled with E3 license
Hi There,
When creating sensitivity labels, I see the “Groups and Sites” option is disabled with an E3 license. Is a specific license needed for Groups and Sites?
Following is what I see in the Purview portal documentation. I’m wondering whether a Microsoft Entra ID needs to be created as part of the steps below, or is not needed (currently we do not have a Microsoft Entra ID).
How to enable sensitivity labels for containers and synchronize labels
If you haven’t yet enabled sensitivity labels for containers, do the following set of steps as a one-time procedure:
Because this feature uses Microsoft Entra functionality, follow the instructions from the Microsoft Entra documentation to enable sensitivity label support: Assign sensitivity labels to Microsoft 365 groups in Microsoft Entra ID.
You now need to synchronize your sensitivity labels to Microsoft Entra ID. First, connect to Security & Compliance PowerShell.
For example, in a PowerShell session that you run as administrator, sign in with a global administrator account.
Then run the following command to ensure your sensitivity labels can be used with Microsoft 365 groups:
Execute-AzureAdLabelSync
Wondering if anyone has resolved the above issue; could you please provide the fix or a step-by-step solution?
Upon researching further, I found the following article addressing this issue, but it does not specify whether a Microsoft Entra ID needs to be created before executing the steps. If anyone has resolved the issue following this article, could you confirm whether it worked?
Thanks!
IndexNow, POST or GET or does it matter?
I have an IndexNow script for my website currently using the GET method. Should I use POST to notify search engines with IndexNow, or does it matter? Microsoft should really document stuff like this on indexnow.org, right?
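For context: the IndexNow protocol accepts both a GET request with url and key query parameters (one URL at a time) and a POST of a JSON body (host, key, urlList) for batch submissions; either form notifies the participating engines. A hedged Python sketch of building the batch POST body, where the host and key values are placeholders:

```python
import json

def indexnow_payload(host, key, urls):
    """Build the JSON body for a batch IndexNow POST submission.

    A real submission would POST this body to the IndexNow endpoint
    with Content-Type: application/json. host and key are placeholders.
    """
    return json.dumps({"host": host, "key": key, "urlList": urls})

body = indexnow_payload("example.com", "your-indexnow-key", [
    "https://example.com/page-1",
    "https://example.com/page-2",
])
print(body)
```

The practical difference is batching: GET submits a single URL per request, while the POST form can carry many URLs in one call.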
Locking a computer from use
Our company is going completely remote. We will be providing a hardware setup for our employees to be used at home (laptop, monitor, printer, scanner, etc.). When an employee leaves the company, we would like to render the laptop “inoperable”. I don’t necessarily want to remove the OS; I would just like to keep the user from logging on and remove any data.
The computer will be managed by Intune and Entra joined. Would disabling the user logon and doing a “wipe” of the computer be sufficient (the users will not have any admin privileges.)? Most of the work will be done on Azure Virtual Desktop.
Thanks,
Eric
Building an AI Dev Space With a Little Assistance from Aspire
public class chatDialog
{
public string? systemMessage;
public string? inputText;
public string? outputText;
public int maxTokens = 400;
public float temperature = 0.7f;
}
//This view is hardwired to use the simulator so we can adjust accordingly
private string oaiEndpoint = string.Empty;
private string oaiDeploymentName = string.Empty;
private string oaiKey = string.Empty;
public static chatDialog dialog = new();
protected override void OnInitialized()
{
oaiEndpoint = "http://localhost:8000";
oaiDeploymentName = Configuration["oaiDeploymentName"] ?? "gpt-4o";
oaiKey = Configuration["oaiKey"] ?? string.Empty;
dialog = new()
{
systemMessage = "I am a hiking enthusiast named Forest who helps people discover hikes in their area. If no area is specified, I will default to near Rainier National Park. I will then provide three suggestions for nearby hikes that vary in length. I will also share an interesting fact about the local nature on the hikes when making a recommendation.",
inputText = "Can you recommend some good hikes in the Redmond area?",
outputText = string.Empty,
temperature = 0.7f,
maxTokens = 400,
};
}
protected async Task chat()
{
AzureOpenAIClient client = new AzureOpenAIClient(new Uri(oaiEndpoint), new System.ClientModel.ApiKeyCredential(oaiKey));
OpenAI.Chat.ChatClient chatClient = client.GetChatClient(oaiDeploymentName);
OpenAI.Chat.ChatCompletionOptions chatCompletionOptions = new()
{
MaxOutputTokenCount = dialog.maxTokens,
Temperature = dialog.temperature,
};
OpenAI.Chat.ChatCompletion completion = await chatClient.CompleteChatAsync(
[
new OpenAI.Chat.SystemChatMessage(dialog.systemMessage),
new OpenAI.Chat.UserChatMessage(dialog.inputText),
],chatCompletionOptions);
var response = $"Response:\r\n{completion.Content[0].Text} \r\nOutput tokens: {completion.Usage.OutputTokenCount}\r\nTotal tokens: {completion.Usage.TotalTokenCount}";
dialog.outputText = response;
}
}
.WithHttpEndpoint(port: 8000, targetPort: oaiSimulatorPort)
.WithEnvironment("SIMULATOR_MODE", "generate")
.WithEnvironment("SIMULATOR_API_KEY", localOaiKey)
.ExcludeFromManifest();
name: "AI",
bicepFile: "../infra/ai.bicep")
.WithParameter(AzureBicepResource.KnownParameters.KeyVaultName);
var cloudEndpoint = azaoai.GetOutput("endpoint");
var accountName = azaoai.GetOutput("accountName");
var cloudKey = azaoai.GetSecretOutput("accountKey");
var cloudDeployment = "gpt-4o";
builder.AddDockerfile("aoai-simulator-record", "../AOAI_API_Simulator")
.WithBindMount("recordings", "/app/.recording")
.WithHttpEndpoint(port: 8001, targetPort: oaiSimulatorPort)
.WithEnvironment("SIMULATOR_API_KEY", localOaiKey)
.WithEnvironment("SIMULATOR_MODE", "record")
.WithEnvironment("AZURE_OPENAI_ENDPOINT", cloudEndpoint)
.WithEnvironment("AZURE_OPENAI_KEY", cloudKey)
.WithEnvironment("AZURE_OPENAI_DEPLOYMENT", cloudDeployment)
.WithEnvironment("AZURE_OPENAI_EMBEDDING_DEPLOYMENT", cloudDeployment)
.ExcludeFromManifest();
.WithBindMount("recordings", "/app/.recording")
.WithHttpEndpoint(port: 8002, targetPort: oaiSimulatorPort)
.WithEnvironment("SIMULATOR_API_KEY", localOaiKey)
.WithEnvironment("SIMULATOR_MODE", "replay")
{
OpenAI.Chat.ChatCompletion completion = await chatClient.CompleteChatAsync(
[
new OpenAI.Chat.SystemChatMessage(dialog.systemMessage),
new OpenAI.Chat.UserChatMessage(dialog.inputText),
], chatCompletionOptions);
var response = $"Response:\r\n{completion.Content[0].Text} \r\nOutput tokens: {completion.Usage.OutputTokenCount}\r\nTotal tokens: {completion.Usage.TotalTokenCount}";
dialog.outputText = response;
}
catch (Exception)
{
dialog.outputText = "I don't know what you are talking about.";
}
name: "APIM",
bicepFile: "../infra/apim.bicep")
.WithParameter(AzureBicepResource.KnownParameters.KeyVaultName)
.WithParameter("apimResourceName", "apim")
.WithParameter("apimSku", "Basicv2")
.WithParameter("openAIAccountName", accountName);
var apimEndpoint = apimai.GetOutput("apimResourceGatewayURL");
var apimKey = apimai.GetSecretOutput("subscriptionKey");
builder.AddDockerfile("aoai-simulator-record", "../AOAI_API_Simulator")
.WithBindMount("recordings", "/app/.recording")
.WithHttpEndpoint(port: 8001, targetPort: oaiSimulatorPort)
.WithEnvironment("SIMULATOR_API_KEY", localOaiKey)
.WithEnvironment("SIMULATOR_MODE", "record")
.WithEnvironment("AZURE_OPENAI_ENDPOINT", apimEndpoint)
.WithEnvironment("AZURE_OPENAI_KEY", apimKey)
.WithEnvironment("AZURE_OPENAI_DEPLOYMENT", cloudDeployment)
.WithEnvironment("AZURE_OPENAI_EMBEDDING_DEPLOYMENT", cloudDeployment)
.ExcludeFromManifest();
"Logging": {
  "LogLevel": {
    "Default": "Information",
    "Microsoft.AspNetCore": "Warning",
    "Aspire.Hosting.Dcp": "Warning"
  }
},
"Parameters": {
  "TenantId": "guid",
  "ClientId": "guid",
  "ClientSecret": "secret"
},
"Azure": {
  "SubscriptionId": "<Your subscription id>",
  "AllowResourceGroupCreation": true,
  "ResourceGroup": "<Valid resource group name>",
  "Location": "<Valid Azure location>"
}
}