Category: Microsoft
SQL job fails while inserting records into an Oracle database
Hi,
We have created an SSIS package to insert some table records from a SQL Server database into an Oracle database. To insert the records we created a custom script and automated it as a SQL Agent job, and it worked fine for quite a few months. But for the last week we have been getting the error "unable to acquire connection".
Tried the below steps:
1. Both ports are open in Oracle and SQL Server.
2. Tested connectivity, and it was working fine.
3. Tried creating an Export Data task from the SQL database and running it with the custom script to insert the data, and it worked fine.
Only when run as a SQL Agent job does the data fail to transfer, with the "unable to acquire connection" error.
SQL Server version: 2019 RTM CU25, Enterprise Edition.
Can someone please help with this?
Thanks,
Sujay
How to fix QuickBooks Error 1402 after an update?
I’m encountering QuickBooks Error 1402, and it’s preventing me from installing or updating the software. The error message states that a certain key could not be opened. I have tried restarting my computer and running the installation as an administrator, but the issue persists. How can I resolve this error and successfully install or update QuickBooks?
How to update the “DeviceID” of a PC already in TeamViewer
My problem is the following:
I can’t find a way to update, in (near) real time, the name of a PC registered in TeamViewer using Intune. It doesn’t have to be exactly real time; checking several times a day (for example, every 3 hours) would be enough.
I sometimes rename PCs in Intune.
For example, old name: desktop-002
New name: dsktp-2024
TeamViewer keeps the old name, and I’d like it to update regularly.
I’ve tried several scripts but nothing works.
If you have an idea, thank you in advance 🙂
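One approach worth trying is a scheduled script (for example, deployed via Intune as a remediation script, or run as a scheduled task every few hours) that pushes the current hostname to TeamViewer through its Web API. The sketch below is a rough, untested illustration: it assumes a TeamViewer script token with device-edit permission, and the endpoint path and field names (devices, alias) should be verified against TeamViewer's current Web API documentation. The environment variable names are hypothetical.

```python
import json
import os
import socket
import urllib.request

# TeamViewer Web API base URL; verify against the current TeamViewer docs
API_BASE = "https://webapi.teamviewer.com/api/v1"


def build_rename_request(device_id: str, new_alias: str):
    """Return the URL and JSON payload for renaming a device (pure helper)."""
    return f"{API_BASE}/devices/{device_id}", {"alias": new_alias}


def sync_alias(token: str, device_id: str) -> None:
    """Push the local hostname to TeamViewer as the device alias."""
    url, payload = build_rename_request(device_id, socket.gethostname())
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        method="PUT",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    urllib.request.urlopen(req, timeout=30)


# Example invocation (requires real credentials; TEAMVIEWER_TOKEN and
# TEAMVIEWER_DEVICE_ID are hypothetical names):
# sync_alias(os.environ["TEAMVIEWER_TOKEN"], os.environ["TEAMVIEWER_DEVICE_ID"])
```

Running something like this on an Intune remediation schedule would approximate the "every 3 hours" requirement, since the alias would be corrected shortly after any rename.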
Recommend a good YouTube video downloader for Windows 11 PCs
I am preparing a project report and need to download some experimental videos from YouTube as material for the opening remarks. I have tried some online video download services, but the download speed is very slow and the pages are full of ads.
I am therefore looking for an efficient YouTube video downloader for Windows 11: software that is easy to operate, downloads quickly, and preserves video quality. If you have used a good download tool, please recommend it. Thank you for your help!
Azure Blogs – Articles from 8-July-2024 to 14-July-2024
AI + Machine Learning
Covering: Anomaly Detector, Azure Bot Services, Azure Cognitive Search, Azure ML, Azure Open Datasets, Azure Cognitive Services, Azure Video Indexer, Computer Vision, Content Moderator, Custom Vision, Data Science VM, Face API, Azure Form Recognizer, Azure Immersive Reader, Kinect DK, Language Understanding (LUIS), Microsoft Genomics, Personalizer, Project Bonsai, QnA Maker, Speaker recognition, Speech to Text, Speech translation, Cognitive Service for Language, Text to Speech, Translator, Azure Metrics Advisor, Health Bot, Azure Percept, Azure Applied AI Services, Azure OpenAI Service
Introducing the Azure AI Model Inference API
Azure OpenAI Extension for Function Apps Hands-on Experience
Running Open AI Whisper Model on Azure
Deploy a Phi-3 model in Azure AI, and consume it with C# and Semantic Kernel
Fine-Tune and Integrate Custom Phi-3 Models with Prompt Flow: Step-by-Step Guide
Four steps to expanding your AI skills with Python and Microsoft Learn
Building the Ultimate Nerdland Podcast Chatbot with RAG and LLM: Step-by-Step Guide
Comprehensive AI Safety and Security with defense in depth for Enterprises
From Principles to Practice: Developer Resources for Responsible AI Innovation
Streamlining SAP Processes with Azure OpenAI, Copilot Studio, and Power Platform
Accelerate data democratization in era of generative AI using Denodo Platform and Microsoft Fabric
Use WebGPU + ONNX Runtime Web + Transformer.js to build RAG applications by Phi-3-mini
Azure Video Indexer & Phi-3 introduce Textual Video Summary on Edge: Better Together story
Fast Transcription Public Preview in Azure AI Speech
Supply chain AI for the new era of value realization
GenAI Mastery: Crafting Robust Enterprise Solutions with PromptFlow and LangChain
Analytics
Covering: Azure Analysis Services, Azure Data Explorer, Azure Data Factory, Azure Data Lake Storage, Azure Data Share, Azure Databricks, Azure Stream Analytics, Azure Synapse Analytics, Data Catalog, Data Lake Analytics, HDInsight, Power BI Embedded, R Server for HDInsight, Microsoft Purview, Microsoft Graph Data Connect, Azure Chaos Studio
Accelerate data democratization in era of generative AI using Denodo Platform and Microsoft Fabric
Compute
Covering: Azure CycleCloud, Azure Quantum, Azure Spot Virtual Machines, Azure VMware Solution, Batch, Linux Virtual Machines, Virtual Machine Scale Sets, Virtual Machines, Azure Dedicated Host, Azure VM Image Builder, Azure Functions, Service Fabric
NVMe-enabled Ebsv5 VMs offering 400K IOPS and 10GBps throughput now generally available
General Availability Announcement: Azure VM Regional to Zonal Move
Where Does One Machine End and the Next Begin?
Microsoft Virtualization Migration Options
Azure OpenAI Extension for Function Apps Hands-on Experience
Containers
Covering: Azure Kubernetes Service (AKS), Azure Red Hat OpenShift, Azure Container Apps, Web App for Containers, Azure Container Instances, Azure Container Registry
Public Preview of the Windows Server Annual Channel for Containers on Azure Kubernetes Service
IBM Cloud Pak for Integration on Azure Red Hat OpenShift Now Generally Available
Microsoft Copilot in Azure Series – Kubectl
Databases
Covering: Azure Cache for Redis, Azure Cosmos DB, Azure Database for MariaDB, Azure Database for MySQL, Azure Database for PostgreSQL, Azure SQL, Azure SQL Database, Azure SQL Edge, Azure SQL Managed Instance, SQL Server on Azure VM, Table Storage, Azure Managed Instance for Apache Cassandra, Azure Confidential Ledger
Say hello to the Talking Postgres podcast
Update: Security hotfix released for OLE DB driver for SQL Server
Announcing SSMS 20.2 … and getting feedback for SSMS 21
Security Update for SQL Server 2016 SP3 Azure Connect Feature Pack
Security Update for SQL Server 2016 SP3 GDR
Security Update for SQL Server 2017 RTM CU31
Security Update for SQL Server 2017 RTM GDR
Security Update for SQL Server 2019 RTM CU27
Security Update for SQL Server 2019 RTM GDR
Security Update for SQL Server 2022 RTM CU13
Security Update for SQL Server 2022 RTM GDR
SQL Server 2022 分散型可用性グループにおける同期失敗
Azure Backup for SQL Server in Azure VM: Tips and Tricks from the Field
Increasing Security for SQL Server Enabled by Azure Arc
Azure Database for MySQL – June 2024 updates and latest feature roadmap
Developer Tools
Covering: App Configuration, Azure DevTest Labs, Azure Lab Services, SDKs, Visual Studio, Visual Studio Code, Azure Load Testing
Unlocking the Potential of Phi-3 and C# in AI Development
C# 13: Explore the latest preview features
.NET and .NET Framework July 2024 servicing releases updates
Why and How to Execute GraphQL Queries in .NET
.NET 9 Preview 6 is now available!
DevOps
Covering: Azure Artifacts, Azure Boards, Azure DevOps, Azure Pipelines, Azure Repos, Azure Test Plans, DevOps tool integrations, Azure Load Testing
Azure DevOps Server 2022.2 RTW now available
GitHub Availability Report: June 2024
Hybrid
Covering: Microsoft Azure Stack, Azure Arc
Supercharge your datacenters with Hyper-V and virtualized GPUs
Apply critical update for Azure Stack HCI VMs to maintain Azure verification
Increasing Security for SQL Server Enabled by Azure Arc
Identity
Covering: Azure Active Directory, Multi-factor Authentication, Azure Active Directory Domain Services, Azure Active Directory External Identities
Microsoft Entra certificate-based authentication enhancements
Simplified Zero Trust security with the Microsoft Entra Suite and unified security operations platform, now generally available
Microsoft Entra Suite now generally available
Integration
Covering: API Management, Event Grid, Logic Apps , Service Bus
Integrating Logic App with Semantic Kernel: A Detailed Guide and Demo
Azure API Center – The ultimate service to streamline API Governance across your organization.
Internet Of Things
Covering: Azure IoT Central, Azure IoT Edge, Azure IoT Hub, Azure RTOS, Azure Sphere, Azure Stream Analytics, Azure Time Series Insights, Microsoft Defender for IoT, Azure Percept, Windows for IoT
Azure Sphere – Image signing certificate update
Management and Governance
Covering: Automation, Azure Advisor, Azure Backup, Azure Blueprints, Azure Lighthouse, Azure Monitor, Azure Policy, Azure Resource Manager, Azure Service Health, Azure Site Recovery, Cloud Shell, Cost Management, Azure Portal, Network Watcher, Azure Automanage, Azure Resource Mover, Azure Chaos Studio, Azure Managed Grafana
What’s the difference between Azure savings plans for compute and Azure reservations?
New on Azure Marketplace: June 27-30, 2024
Public Preview Announcement: Azure Policy Built-in Versioning
Using Azure Automation to perform Azure Site Recovery post failover tasks in virtual machines
Govern your Azure Firewall configuration with Azure Policies
Azure Backup for SQL Server in Azure VM: Tips and Tricks from the Field
Azure Monitor: How To Stop Log-Based Alerts for Specific Resources
Introducing Agent and Gateway Extensions in Azure Monitor SCOM MI
Information protection: Auto labelling policy vs Information protection: Label Policy
Azure Verified Modules – Monthly Update [June]
Media
Covering: Azure Media Player, Content Protection, Encoding, Live and On-Demand Streaming, Media Services
No New Articles
Migration
Covering: Azure Database Migration Service, Azure Migrate, Data Box, Azure Site Recovery
Microsoft Virtualization Migration Options
Mixed Reality
Covering: Digital Twins, Kinect DK, Spatial Anchors, Remote Rendering, Object Anchors
No New Articles
Mobile
Covering: Azure Maps, MAUI, Notification Hubs, Visual Studio App Center, Xamarin, Azure Communication Services
Anywhere365 integrates Azure Communication Services into their Dialogue Cloud Platform
Networking
Covering: Application Gateway, Bastion, DDoS Protection, DNS, Azure ExpressRoute, Azure Firewall, Load Balancer, Firewall Manager, Front Door, Internet Analyzer, Azure Private Link, Content Delivery Network, Network Watcher, Traffic Manager, Virtual Network, Virtual WAN, VPN Gateway, Web Application Firewall, Azure Orbital, Route Server, Network Function Manager, Virtual Network Manager, Azure Private 5G Core
Dual-region deployments using Secure Virtual WAN Hub with Routing-Intent without Global Reach
Single-region deployment without Global Reach, using Secure Virtual WAN Hub with Routing-Intent
Azure WAF Public Preview: JavaScript Challenge
Save Costs with Basic SKU Application Gateway for more features and less fixed costs
Govern your Azure Firewall configuration with Azure Policies
Security
Covering: Defender for Cloud, DDoS Protection, Dedicated HSM, Azure Information Protection, Microsoft Sentinel, Key Vault, Microsoft Defender for Cloud, Microsoft Defender for IoT, Microsoft Azure Attestation, Azure Confidential Ledger
Microsoft Security Service Edge now generally available
Unified Security Operations Platform – Technical FAQ!
Guidance for handling “regreSSHion” (CVE-2024-6387) using Microsoft Security capabilities
Storage
Covering: Archive Storage, Avere vFXT for Azure, Azure Data Lake Storage, Azure Data Share, Files, FXT Edge Filer, HPC Cache, NetApp Files, Blob Storage, Data Box, Disk Storage, Queue Storage, Storage Accounts, Storage Explorer, StorSimple
Web
Covering: App Configuration, App Service, Azure Cognitive Search, Azure Maps, Azure SignalR Service, Static Web Apps, Azure Communication Services, Azure Web PubSub, Azure Fluid Relay, Web App for Containers
Memory Dump Collection using Procdump.exe for App Service (Windows)
Anywhere365 integrates Azure Communication Services into their Dialogue Cloud Platform
Azure Virtual Desktop
Covering: Windows Virtual Desktop, VMware Horizon Cloud on Microsoft Azure, Citrix Virtual Apps and Desktops for Azure
No New Articles
Tracing LangChain Code on Azure with OpenTelemetry and Application Insights
As AI and machine learning applications grow more complex, ensuring their observability becomes crucial. Tracing helps identify and resolve performance bottlenecks and errors, providing insights into the internal workings of your applications. LangChain has become a popular framework for building applications with large language models. When deploying LangChain apps to production, tracing and monitoring are crucial for understanding performance and troubleshooting issues. In this blog, we will explore how to trace LangChain code on Azure using OpenTelemetry and Application Insights. We’ll leverage tools and libraries such as OpenInference, Azure’s OpenTelemetry exporter, and Application Insights.
Why Tracing Matters for LangChain Apps
LangChain applications often involve complex chains of operations – prompting language models, calling external APIs, accessing vector stores, etc. Tracing helps developers visualize these operations, identify bottlenecks, and debug errors. It’s especially important for AI apps that may have non-deterministic behavior.
Prerequisites
Before we dive into the implementation, ensure you have the following installed:
Python 3.7+
Azure account
Basic knowledge of Python and LangChain
OpenAI API key
Step 1: Setting Up OpenInference LangChain Instrumentation
OpenInference provides auto-instrumentation for LangChain, making it compatible with OpenTelemetry. Let’s start by installing the necessary packages:
requirements.txt
azure-monitor-opentelemetry-exporter
openinference-instrumentation-langchain
langchain
opentelemetry-sdk
opentelemetry-exporter-otlp
openai
Now install the required packages by running pip install -r requirements.txt.
Step 2: Set Up the Azure Monitor Exporter
Azure Monitor provides powerful tools for monitoring applications, including Application Insights. We’ll use the Azure Monitor OpenTelemetry Exporter to send trace data to Application Insights.
import os
from dotenv import load_dotenv
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter

# Load the Application Insights connection string from azure.env
load_dotenv("azure.env")

exporter = AzureMonitorTraceExporter.from_connection_string(
    os.environ["APPLICATIONINSIGHTS_CONNECTION_STRING"]
)
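For reference, the azure.env file loaded above would contain the Application Insights connection string. The value below is a placeholder, not a real key; copy the actual string from the Overview blade of your Application Insights resource in the Azure portal:

```text
APPLICATIONINSIGHTS_CONNECTION_STRING=InstrumentationKey=00000000-0000-0000-0000-000000000000;IngestionEndpoint=https://<region>.in.applicationinsights.azure.com/
```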
Step 3: Integrating Azure Monitor as the LangChain Instrumentor
The code below sets up OpenTelemetry tracing for a LangChain application, configuring it to batch and export spans every 60 seconds and to automatically instrument LangChain operations. This allows you to collect detailed telemetry data about your LangChain application’s performance and behavior.
from openinference.instrumentation.langchain import LangChainInstrumentor

# Create a tracer provider and register it globally
tracer_provider = TracerProvider()
trace.set_tracer_provider(tracer_provider)
tracer = trace.get_tracer(__name__)

# Batch and export spans to Application Insights every 60 seconds
span_processor = BatchSpanProcessor(exporter, schedule_delay_millis=60000)
tracer_provider.add_span_processor(span_processor)

# Auto-instrument LangChain operations
LangChainInstrumentor().instrument()
Step 4: Create a LangChain LLM Chain
Now let’s set up a LangChain application to generate jokes using Azure’s OpenAI service. It begins by importing the necessary classes from the langchain_openai and langchain.chains modules. A PromptTemplate is created with a template that asks for a joke based on the provided adjective. The AzureChatOpenAI class is then instantiated with the API key, endpoint, API version, and model name, all retrieved from environment variables. This configuration enables the LangChain application to interact with your Azure OpenAI model deployment to generate responses based on the specified prompt template.
from langchain_openai import AzureChatOpenAI
from langchain.chains import LLMChain
from langchain_core.prompts import PromptTemplate

prompt_template = "Tell me a {adjective} joke"
prompt = PromptTemplate(input_variables=["adjective"], template=prompt_template)

llm = AzureChatOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version="2024-06-01",
    model=os.environ["AZURE_OPENAI_GPT_DEPLOYMENT"],
)
Step 5: Viewing Traces in Azure Monitor
Let’s invoke the LangChain chain before viewing the traces.
chain = LLMChain(llm=llm, prompt=prompt, metadata={"category": "jokes"})
completion = chain.predict(adjective="funny", metadata={"variant": "funny"})
print(completion)
After integrating the Azure Monitor exporter, your LangChain application will send traces to Application Insights. You can view these traces:
Navigate to the Azure portal.
Select your Application Insights resource.
Go to the “Transactions” section to view the traces.
Conclusion
By following these steps, you can effectively trace your LangChain applications using OpenTelemetry and view these traces in Azure Monitor’s Application Insights. This setup not only enhances observability but also helps in identifying and resolving performance issues efficiently. For more detailed information, refer to the official documentation:
OpenInference LangChain Instrumentation
Azure Monitor OpenTelemetry Exporter
Sample Trace Configuration
Happy tracing!
80 Teacher Trainers from NTT Vocational High Schools to Enhance Education through Generative AI
Kupang, 18 July 2024 – The development of artificial intelligence (AI) should benefit everyone in Indonesia, including teachers from vocational high schools (SMK) in the East Nusa Tenggara (NTT) region. By providing SMK teachers and students in the region with access to generative AI, we can also give them a better chance of entering the job market and competing with the other 149.38 million people of productive age in Indonesia (Central Agency of Statistics Indonesia (BPS), 2024).
Yayasan Plan International Indonesia (Plan Indonesia) held a hybrid training of trainers on Thursday (18/07/2024), joined by 80 vocational high school teachers from five regions in NTT: Kupang City and the South Timor Tengah, Lembata, Nagekeo, and Manggarai regencies. The training was part of Plan Indonesia’s Youth Employment and Entrepreneurship initiative, the AI TEACH program supported by Microsoft.
Dini Arifah, AI TEACH Project Manager at Plan Indonesia, explained that the training of trainers was a continuation of Plan Indonesia’s support for improving access to digital employment for people in the NTT region. “As an organization that has been working for more than 50 years in the NTT region, which is our main implementation area, Plan Indonesia hoped to use this chance to enhance the ability and work readiness of SMK teachers and students in NTT, particularly to enable them to compete within the 4.0 digital industry landscape,” Dini said at the opening of the AI TEACH ToT in Kupang on Thursday (18/07/2024).
The AI TEACH training of trainers was held in cooperation between Plan Indonesia and NTT’s local Department of Education and Culture. Both institutions will work together to reach 1,000 SMK teachers, who will then cascade their generative AI knowledge to approximately 60,000 SMK students by 2024.
Ambrosius Kodo, the current Head of the Department of Education and Culture in NTT, said that the government appreciated Plan Indonesia’s initiative to enhance education quality in the region, particularly to reduce the open unemployment rate in NTT, which stood at 3.17 per cent in 2024.
“We are given the aptitude and the space to make good use of technology, especially to improve education in NTT. AI can actually help make things easier. Teachers and students will need to understand how to leverage AI to advance their knowledge and careers, instead of viewing it as a threat,” Ambrosius said.
Meanwhile, Microsoft ASEAN Philanthropies Lead Supahrat Juramongkol said, “In line with Microsoft’s mission to empower every person and every organization on the planet to achieve more, we are excited to accelerate the implementation of AI TEACH program in collaboration with Plan Indonesia. Through the AI Generative Toolkit we have prepared, we aim to enhance career and educational opportunities for participants, promote equitable access to digital education, and foster inclusive digital economic growth in NTT.”
Among the subjects offered through the AI TEACH training are generative AI for education, soft skills (work readiness), basic digital skills, gender equality and social inclusion, and awareness of risky behaviour. All of the training materials can be accessed online through Plan Indonesia’s e-learning platform, kitakerja.id, accompanied by additional materials from LinkedIn Learning. Aside from providing initial training to 80 teachers who will become the trainers in NTT, the AI TEACH program by Plan Indonesia and Microsoft will also reach out to 5,000 teachers, cascading to 300,000 vocational high school students in the country. The teachers will then support at least 60,000 students to graduate from the program and receive certifications from Microsoft and LinkedIn by the end of December 2024.
How to get access token for Graph API in Teams bot-based message extension?
I’m developing a Teams bot-based message extension application using the Teams Toolkit in TypeScript. I need to retrieve all the replies for a message in the current channel. According to the documentation, I need to use the Graph API to get the replies. However, to use the Graph API, I need an access token.
My questions are:
How can I implement OAuth to get the token in a bot-based message extension?
Are there any specific permissions or configurations needed in the Azure portal to enable this?
Is there an alternative way to get the access token or retrieve the replies without using the Graph API?
I have an Issue in calculate depreciation formulas
=AMORLINC(G54;I54;DATE(2023;12;31);0;0;H54;0)
The purchase date is the same as the first period date,
but the result is not correct.
OneDrive (Android)
Why is it that when you want to move files in OneDrive for Android, you can only select 100 files? And even when you do select only 100 files, the app crashes as soon as you try to move them to a folder! I just don’t understand…
Windows Server upgrades
Morning All.
We are in the process of replacing dated Windows Server hardware and OS. I would like to hear what you think of in-place upgrades from 2012 to 2022, and the pros and cons of going this route.
An additional method is to run a restore from a backup on the new server and then do an in-place upgrade to 2022. Will it then be necessary to re-join the user machines to the domain?
Thanks in advance
Fine-Tune and Integrate Custom Phi-3 Models with Prompt Flow in Azure AI Studio
This blog series has several versions, each covering different aspects and techniques. Check out the following resources:
– Fine-Tune and Integrate Custom Phi-3 Models with Prompt Flow: Step-by-Step Guide: Detailed instructions for fine-tuning and integrating custom Phi-3 models with Prompt flow using a code-first approach.
– Code-first approach: End-to-end (E2E) sample on Phi-3CookBook: An end-to-end (E2E) sample on Phi-3CookBook, developed based on the “Fine-Tune and Integrate Custom Phi-3 Models with Prompt Flow: Step-by-Step Guide” for a code-first approach.
– Fine-Tune and Integrate Custom Phi-3 Models with Prompt Flow in Azure AI Studio: Detailed instructions for fine-tuning and integrating custom Phi-3 models with Prompt Flow in Azure AI / ML Studio using a low-code approach.
Introduction
Phi-3 is a family of small language models (SLMs) developed by Microsoft that delivers exceptional performance and cost-effectiveness. In this tutorial, you will learn how to fine-tune the Phi-3 model and integrate the custom Phi-3 model with Prompt Flow in Azure AI Studio. By leveraging Azure AI / ML Studio, you will establish a workflow for deploying and utilizing custom AI models. This tutorial is divided into three series:
Series 1: Set up Azure resources and Prepare for fine-tuning
Create Azure Machine Learning Workspace: You start by setting up an Azure Machine Learning workspace, which serves as the hub for managing machine learning experiments and models.
Request GPU Quotas: Since Phi-3 model fine-tuning typically benefits from GPU acceleration, you request GPU quotas in your Azure subscription.
Add Role Assignment: You set up a User Assigned Managed Identity (UAI) and assign it necessary permissions (Contributor, Storage Blob Data Reader, AcrPull) to access resources like storage accounts and container registries.
Set up the Project: You create a local environment, set up a virtual environment, install required packages, and create a script (download_dataset.py) to download the dataset (ULTRACHAT_200k) required for fine-tuning.
Series 2: Fine-tune and Deploy the Phi-3 model in Azure ML Studio
Create Compute Cluster: In Azure ML Studio, you create a dedicated GPU compute cluster (Standard_NC24ads_A100_v4) for fine-tuning the Phi-3 model.
Fine-tune the Phi-3 Model: Using the Azure ML Studio interface, you fine-tune the Phi-3 model by specifying training and validation datasets, and configuring parameters like learning rate.
Deploy the Fine-tuned Model: Once fine-tuning is complete, you register the model, create an online endpoint, and deploy the model to make it accessible for real-time inference.
Series 3: Integrate the custom Phi-3 model with Prompt Flow in Azure AI Studio
Create Azure AI Studio Hub and Project: You create a Hub (similar to a resource group) and a Project within Azure AI Studio to manage your AI-related work.
Add a Custom Connection: To integrate the fine-tuned Phi-3 model with Prompt Flow, you create a custom connection in Azure AI Studio, specifying the endpoint and authentication key generated during model deployment in Azure ML Studio.
Create Prompt Flow: You create a new Prompt flow within the Azure AI Studio Project, configure it to use the custom connection, and design the flow to interact with the Phi-3 model for tasks like chat completion.
Note
Unlike the previous tutorial, Fine-Tune and Integrate Custom Phi-3 Models with Prompt Flow: Step-by-Step Guide, which involved running code locally, this tutorial focuses entirely on fine-tuning and integrating your model within the Azure AI / ML Studio environment.
Here is an overview of this tutorial.
Note
For more detailed information and to explore additional resources about Phi-3, please visit the Phi-3CookBook.
Prerequisites
Python
Azure subscription
Visual Studio Code
Table of Contents
Series 1: Set Up Azure resources and Prepare for fine-tuning
Create Azure Machine Learning workspace
Request GPU quotas in Azure subscription
Add role assignment
Set up the project
Prepare dataset for fine-tuning
Series 2: Fine-tune and Deploy the Phi-3 model in Azure ML Studio
Fine-tune the Phi-3 model
Deploy the fine-tuned Phi-3 model
Series 3: Integrate the custom Phi-3 model with Prompt flow in Azure AI Studio
Integrate the custom Phi-3 model with Prompt flow
Chat with your custom Phi-3 model
Congratulations!
Series 1: Set up Azure resources and Prepare for fine-tuning
Create Azure Machine Learning workspace
In this exercise, you will:
Create an Azure Machine Learning Workspace.
Create an Azure Machine Learning Workspace
Type azure machine learning in the search bar at the top of the portal page and select Azure Machine Learning from the options that appear.
Select + Create from the navigation menu.
Select New workspace from the navigation menu.
Perform the following tasks:
Select your Azure Subscription.
Select the Resource group to use (create a new one if needed).
Enter Workspace Name. It must be a unique value.
Select the Region you’d like to use.
Select the Storage account to use (create a new one if needed).
Select the Key vault to use (create a new one if needed).
Select the Application insights to use (create a new one if needed).
Select the Container registry to None.
Select Review + Create.
Select Create.
Request GPU Quotas in Azure Subscription
In this tutorial, you will learn how to fine-tune and deploy a Phi-3 model, using GPUs. For fine-tuning, you will use the Standard_NC24ads_A100_v4 GPU, which requires a quota request. For deployment, you will use the Standard_NC6s_v3 GPU, which also requires a quota request.
Note
Only Pay-As-You-Go subscriptions (the standard subscription type) are eligible for GPU allocation; benefit subscriptions are not currently supported.
For those using benefit subscriptions (such as Visual Studio Enterprise Subscription) or those looking to quickly test the fine-tuning and deployment process, this tutorial also provides guidance for fine-tuning with a minimal dataset using a CPU. However, it is important to note that fine-tuning results are significantly better when using a GPU with larger datasets.
In this exercise, you will:
Request GPU Quotas in your Azure Subscription
Request GPU Quotas in Azure Subscription
Visit Azure ML Studio.
Perform the following tasks to request Standard NCADSA100v4 Family quota:
Select Quota from the left side tab.
Select the Virtual machine family to use. For example, select Standard NCADSA100v4 Family Cluster Dedicated vCPUs, which includes the Standard_NC24ads_A100_v4 GPU.
Select the Request quota from the navigation menu.
Inside the Request quota page, enter the New cores limit you’d like to use. For example, 24.
Inside the Request quota page, select Submit to request the GPU quota.
Perform the following tasks to request Standard NCSv3 Family quota:
Select Quota from the left side tab.
Select the Virtual machine family to use. For example, select Standard NCSv3 Family Cluster Dedicated vCPUs, which includes the Standard_NC6s_v3 GPU.
Select the Request quota from the navigation menu.
Inside the Request quota page, enter the New cores limit you’d like to use. For example, 24.
Inside the Request quota page, select Submit to request the GPU quota.
Add role assignment
To fine-tune and deploy your models, you must first create a User Assigned Managed Identity (UAI) and assign it the appropriate permissions. This UAI will be used for authentication during deployment, so it is critical to grant it access to the storage accounts, container registry, and resource group.
In this exercise, you will:
Create User Assigned Managed Identity(UAI).
Add Contributor role assignment to Managed Identity.
Add Storage Blob Data Reader role assignment to Managed Identity.
Add AcrPull role assignment to Managed Identity.
Create User Assigned Managed Identity(UAI)
Type managed identities in the search bar at the top of the portal page and select Managed Identities from the options that appear.
Select + Create.
Perform the following tasks:
Select your Azure Subscription.
Select the Resource group to use (create a new one if needed).
Select the Region you’d like to use.
Enter the Name. It must be a unique value.
Select Review + create.
Select + Create.
Add Contributor role assignment to Managed Identity
Navigate to the Managed Identity resource that you created.
Select Azure role assignments from the left side tab.
Select +Add role assignment from the navigation menu.
Inside Add role assignment page, perform the following tasks:
Select the Scope to Resource group.
Select your Azure Subscription.
Select the Resource group to use.
Select the Role to Contributor.
Select Save.
Add Storage Blob Data Reader role assignment to Managed Identity
Type azure storage accounts in the search bar at the top of the portal page and select Storage accounts from the options that appear.
Select the storage account that is associated with the Azure Machine Learning workspace. For example, finetunephistorage.
Perform the following tasks to navigate to Add role assignment page:
Navigate to the Azure Storage account that you created.
Select Access Control (IAM) from the left side tab.
Select + Add from the navigation menu.
Select Add role assignment from the navigation menu.
Inside Add role assignment page, perform the following tasks:
Inside the Role page, type Storage Blob Data Reader in the search bar and select Storage Blob Data Reader from the options that appear.
Inside the Role page, select Next.
Inside the Members page, select Assign access to Managed identity.
Inside the Members page, select + Select members.
Inside Select managed identities page, select your Azure Subscription.
Inside Select managed identities page, set Managed identity to User-assigned managed identity.
Inside Select managed identities page, select the managed identity that you created. For example, finetunephi-managedidentity.
Inside Select managed identities page, select Select.
Select Review + assign.
Add AcrPull role assignment to Managed Identity
Type container registries in the search bar at the top of the portal page and select Container registries from the options that appear.
Select the container registry that is associated with the Azure Machine Learning workspace. For example, finetunephicontainerregistries.
Perform the following tasks to navigate to Add role assignment page:
Select Access Control (IAM) from the left side tab.
Select + Add from the navigation menu.
Select Add role assignment from the navigation menu.
Inside Add role assignment page, perform the following tasks:
Inside the Role page, type AcrPull in the search bar and select AcrPull from the options that appear.
Inside the Role page, select Next.
Inside the Members page, select Assign access to Managed identity.
Inside the Members page, select + Select members.
Inside Select managed identities page, select your Azure Subscription.
Inside Select managed identities page, set Managed identity to User-assigned managed identity.
Inside Select managed identities page, select the managed identity that you created. For example, finetunephi-managedidentity.
Inside Select managed identities page, select Select.
Select Review + assign.
Set up the project
To download the datasets needed for fine-tuning, you will set up a local environment.
In this exercise, you will
Create a folder to work inside it.
Create a virtual environment.
Install the required packages.
Create a download_dataset.py file to download the dataset.
Create a folder to work inside it
Open a terminal window and type the following command to create a folder named finetune-phi in the default path.
mkdir finetune-phi
Type the following command inside your terminal to navigate to the finetune-phi folder you created.
cd finetune-phi
Create a virtual environment
Type the following command inside your terminal to create a virtual environment named .venv.
python -m venv .venv
Type the following command inside your terminal to activate the virtual environment.
.venv\Scripts\activate.bat
Note
If it worked, you should see (.venv) before the command prompt.
Install the required packages
Type the following commands inside your terminal to install the required packages.
pip install datasets==2.19.1
Create download_dataset.py
Note
Complete folder structure:
└── YourUserName
    └── finetune-phi
        └── download_dataset.py
Open Visual Studio Code.
Select File from the menu bar.
Select Open Folder.
Select the finetune-phi folder that you created, which is located at C:\Users\yourUserName\finetune-phi.
In the left pane of Visual Studio Code, right-click and select New File to create a new file named download_dataset.py.
Prepare dataset for fine-tuning
In this exercise, you will run the download_dataset.py file to download the ultrachat_200k dataset to your local environment. You will then use this dataset to fine-tune the Phi-3 model in Azure Machine Learning.
In this exercise, you will:
Add code to the download_dataset.py file to download the datasets.
Run the download_dataset.py file to download datasets to your local environment.
Download your dataset using download_dataset.py
Open the download_dataset.py file in Visual Studio Code.
Add the following code into download_dataset.py.
import json
import os
from datasets import load_dataset

def load_and_split_dataset(dataset_name, config_name, split_ratio):
    """
    Load and split a dataset.
    """
    # Load the dataset with the specified name, configuration, and split ratio
    dataset = load_dataset(dataset_name, config_name, split=split_ratio)
    print(f"Original dataset size: {len(dataset)}")
    # Split the dataset into train and test sets (80% train, 20% test)
    split_dataset = dataset.train_test_split(test_size=0.2)
    print(f"Train dataset size: {len(split_dataset['train'])}")
    print(f"Test dataset size: {len(split_dataset['test'])}")
    return split_dataset

def save_dataset_to_jsonl(dataset, filepath):
    """
    Save a dataset to a JSONL file.
    """
    # Create the directory if it does not exist
    os.makedirs(os.path.dirname(filepath), exist_ok=True)
    # Open the file in write mode
    with open(filepath, 'w', encoding='utf-8') as f:
        # Iterate over each record in the dataset
        for record in dataset:
            # Dump the record as a JSON object and write it to the file
            json.dump(record, f)
            # Write a newline character to separate records
            f.write('\n')
    print(f"Dataset saved to {filepath}")

def main():
    """
    Main function to load, split, and save the dataset.
    """
    # Load and split the ULTRACHAT_200k dataset with a specific configuration and split ratio
    dataset = load_and_split_dataset("HuggingFaceH4/ultrachat_200k", 'default', 'train_sft[:1%]')
    # Extract the train and test datasets from the split
    train_dataset = dataset['train']
    test_dataset = dataset['test']
    # Save the train dataset to a JSONL file
    save_dataset_to_jsonl(train_dataset, "data/train_data.jsonl")
    # Save the test dataset to a separate JSONL file
    save_dataset_to_jsonl(test_dataset, "data/test_data.jsonl")

if __name__ == "__main__":
    main()
Type the following command inside your terminal to run the script and download the dataset to your local environment.
python download_dataset.py
Verify that the datasets were saved successfully to your local finetune-phi/data directory.
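Beyond eyeballing the files, JSONL is simple to inspect programmatically: each line is one self-contained JSON object. A minimal sketch of that check, using a hypothetical in-memory sample in place of the real data/train_data.jsonl:

```python
import json

# Hypothetical sample in the same shape as the saved files: one JSON object per line.
sample_jsonl = '{"prompt": "Hi"}\n{"prompt": "Hello"}\n'

# Count and parse records exactly as you would for data/train_data.jsonl.
records = [json.loads(line) for line in sample_jsonl.splitlines() if line.strip()]
print(len(records))  # → 2
```

For the real files, replace the sample string with `open("data/train_data.jsonl", encoding="utf-8")` and iterate over its lines.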
Note
Note on dataset size and fine-tuning time
In this tutorial, you use only 1% of the dataset (split='train_sft[:1%]'). This significantly reduces the amount of data, speeding up both the upload and fine-tuning processes. You can adjust the percentage to find the right balance between training time and model performance. Using a smaller subset of the dataset reduces the time required for fine-tuning, making the process more manageable for a tutorial.
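To see how the slice percentage and the 80/20 split interact, here is a quick back-of-the-envelope sketch in plain Python (the 200,000-row figure is illustrative, not the exact size of ultrachat_200k):

```python
def split_sizes(total_rows: int, slice_pct: int, test_size: float):
    """Rows kept by a 'train_sft[:N%]' slice, then split into train/test."""
    sliced = total_rows * slice_pct // 100   # rows kept by the percentage slice
    test = int(sliced * test_size)           # rows sent to the test split
    return sliced - test, test               # (train rows, test rows)

print(split_sizes(200_000, 1, 0.2))  # → (1600, 400)
```

Raising the slice to, say, 10% multiplies both splits tenfold, and the fine-tuning time grows accordingly.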
Series 2: Fine-tune and Deploy the Phi-3 model in Azure ML Studio
Fine-tune the Phi-3 model
In this exercise, you will fine-tune the Phi-3 model in Azure Machine Learning Studio.
In this exercise, you will:
Create compute cluster for fine-tuning.
Fine-tune the Phi-3 model in Azure Machine Learning Studio.
Create compute cluster for fine-tuning
Visit Azure ML Studio.
Select Compute from the left side tab.
Select Compute clusters from the navigation menu.
Select + New.
Perform the following tasks:
Select the Region you’d like to use.
Select the Virtual machine tier to Dedicated.
Select the Virtual machine type to GPU.
Select the Virtual machine size filter to Select from all options.
Select the Virtual machine size to Standard_NC24ads_A100_v4.
Select Next.
Perform the following tasks:
Enter Compute name. It must be a unique value.
Select the Minimum number of nodes to 0.
Select the Maximum number of nodes to 1.
Select the Idle seconds before scale down to 120.
Select Create.
Fine-tune the Phi-3 model
Visit Azure ML Studio.
Select the Azure Machine Learning workspace that you created.
Perform the following tasks:
Select Model catalog from the left side tab.
Type phi-3-mini-4k in the search bar and select Phi-3-mini-4k-instruct from the options that appear.
Select Fine-tune from the navigation menu.
Perform the following tasks:
Select Select task type to Chat completion.
Select + Select data to upload Training data.
Select the Validation data upload type to Provide different validation data.
Select + Select data to upload Validation data.
Tip
You can select Advanced settings to customize configurations such as learning_rate and lr_scheduler_type to optimize the fine-tuning process according to your specific needs.
Select Finish.
In this exercise, you successfully fine-tuned the Phi-3 model using Azure Machine Learning. Please note that the fine-tuning process can take a considerable amount of time. After running the fine-tuning job, you need to wait for it to complete. You can monitor the status of the fine-tuning job by navigating to the Jobs tab on the left side of your Azure Machine Learning Workspace. In the next series, you will deploy the fine-tuned model and integrate it with Prompt flow.
Deploy the fine-tuned model
To integrate the fine-tuned Phi-3 model with Prompt flow, you need to deploy the model to make it accessible for real-time inference. This process involves registering the model, creating an online endpoint, and deploying the model.
In this exercise, you will:
Register the fine-tuned model in the Azure Machine Learning workspace.
Create an online endpoint.
Deploy the registered fine-tuned Phi-3 model.
Register the fine-tuned model
Visit Azure ML Studio.
Select the Azure Machine Learning workspace that you created.
Select Models from the left side tab.
Select + Register.
Select From a job output.
Select the job that you created.
Select Next.
Select Model type to MLflow.
Ensure that Job output is selected; it should be automatically selected.
Select Next.
Select Register.
You can view your registered model by navigating to the Models menu from the left side tab.
Deploy the fine-tuned model
Navigate to the Azure Machine Learning workspace that you created.
Select Endpoints from the left side tab.
Select Real-time endpoints from the navigation menu.
Select Create.
Select the registered model that you created.
Select Select.
Perform the following tasks:
Select Virtual machine to Standard_NC6s_v3.
Select the Instance count you’d like to use. For example, 1.
Select the Endpoint to New to create an endpoint.
Enter Endpoint name. It must be a unique value.
Enter Deployment name. It must be a unique value.
Select Deploy.
Warning
To avoid additional charges to your account, make sure to delete the created endpoint in the Azure Machine Learning workspace.
Check deployment status in Azure Machine Learning Workspace
Navigate to Azure Machine Learning workspace that you created.
Select Endpoints from the left side tab.
Select the endpoint that you created.
On this page, you can manage the endpoints during the deployment process.
Note
Once the deployment is complete, ensure that Live traffic is set to 100%. If it is not, select Update traffic to adjust the traffic settings. Note that you cannot test the model if the traffic is set to 0%.
Series 3: Integrate the custom Phi-3 model with Prompt flow in Azure AI Studio
Integrate the custom Phi-3 model with Prompt flow
After successfully deploying your fine-tuned model, you can now integrate it with Prompt Flow to use your model in real-time applications, enabling a variety of interactive tasks with your custom Phi-3 model.
In this exercise, you will:
Create Azure AI Studio Hub.
Create Azure AI Studio Project.
Create Prompt flow.
Add a custom connection for the fine-tuned Phi-3 model.
Set up Prompt flow to chat with your custom Phi-3 model
Note
You can also integrate with Prompt flow using Azure ML Studio. The same integration process can be applied to Azure ML Studio.
Create Azure AI Studio Hub
You need to create a Hub before creating the Project. A Hub acts like a Resource Group, allowing you to organize and manage multiple Projects within Azure AI Studio.
Visit Azure AI Studio.
Select All hubs from the left side tab.
Select + New hub from the navigation menu.
Perform the following tasks:
Enter Hub name. It must be a unique value.
Select your Azure Subscription.
Select the Resource group to use (create a new one if needed).
Select the Location you’d like to use.
Select the Connect Azure AI Services to use (create a new one if needed).
Select Connect Azure AI Search to Skip connecting.
Select Next.
Create Azure AI Studio Project
In the Hub that you created, select All projects from the left side tab.
Select + New project from the navigation menu.
Enter Project name. It must be a unique value.
Select Create a project.
Add a custom connection for the fine-tuned Phi-3 model
To integrate your custom Phi-3 model with Prompt flow, you need to save the model’s endpoint and key in a custom connection. This setup ensures access to your custom Phi-3 model in Prompt flow.
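At runtime, Prompt flow exposes the key-value pairs of this custom connection to your tool code as attributes of a CustomConnection object. A minimal sketch of that access pattern, simulated here with a plain SimpleNamespace (the endpoint and key values are placeholders, not real credentials):

```python
from types import SimpleNamespace

# Stand-in for the CustomConnection object Prompt flow passes to your tool;
# "endpoint" and "key" match the key names you add in the portal below.
connection = SimpleNamespace(
    endpoint="https://<your-endpoint>.<region>.inference.ml.azure.com/score",
    key="<primary-key>",
)

# The tool code later reads these attributes to authenticate its requests.
headers = {"Authorization": f"Bearer {connection.key}"}
print(connection.endpoint.endswith("/score"))  # → True
```

This is why the key names you enter in the connection must match the attribute names the tool code expects.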
Set the API key and endpoint URI of the fine-tuned Phi-3 model
Visit Azure ML Studio.
Navigate to the Azure Machine learning workspace that you created.
Select Endpoints from the left side tab.
Select endpoint that you created.
Select Consume from the navigation menu.
Copy your REST endpoint and Primary key.
Add the Custom Connection
Visit Azure AI Studio.
Navigate to the Azure AI Studio project that you created.
In the Project that you created, select Settings from the left side tab.
Select + New connection.
Select Custom keys from the navigation menu.
Perform the following tasks:
Select + Add key value pairs.
For the key name, enter endpoint and paste the endpoint you copied from Azure ML Studio into the value field.
Select + Add key value pairs again.
For the key name, enter key and paste the key you copied from Azure ML Studio into the value field.
After adding the keys, select is secret to prevent the key from being exposed.
Select Add connection.
Create Prompt flow
You have added a custom connection in Azure AI Studio. Now, let’s create a Prompt flow using the following steps. Then, you will connect this Prompt flow to the custom connection so that you can use the fine-tuned model within the Prompt flow.
Navigate to the Azure AI Studio project that you created.
Select Prompt flow from the left side tab.
Select + Create from the navigation menu.
Select Chat flow from the navigation menu.
Enter Folder name to use.
Select Create.
Set up Prompt flow to chat with your custom Phi-3 model
In the Prompt flow, perform the following tasks to rebuild the existing flow:
Select Raw file mode.
Delete all existing code in the flow.dag.yml file.
Add the following code to the flow.dag.yml file:
inputs:
  input_data:
    type: string
    default: "Who founded Microsoft?"
outputs:
  answer:
    type: string
    reference: ${integrate_with_promptflow.output}
nodes:
- name: integrate_with_promptflow
  type: python
  source:
    type: code
    path: integrate_with_promptflow.py
  inputs:
    input_data: ${inputs.input_data}
Select Save.
Add the following code to the integrate_with_promptflow.py file to use the custom Phi-3 model in Prompt flow.
import logging

import requests
from promptflow import tool
from promptflow.connections import CustomConnection

# Logging setup
logging.basicConfig(
    format="%(asctime)s - %(levelname)s - %(name)s - %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S",
    level=logging.DEBUG
)
logger = logging.getLogger(__name__)


def query_phi3_model(input_data: str, connection: CustomConnection) -> str:
    """
    Send a request to the Phi-3 model endpoint with the given input data using the Custom Connection.
    """
    # "connection" is the Custom Connection; "endpoint" and "key" are the keys stored in it
    endpoint_url = connection.endpoint
    api_key = connection.key
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}"
    }
    data = {
        "input_data": {
            "input_string": [
                {"role": "user", "content": input_data}
            ],
            "parameters": {
                "temperature": 0.7,
                "max_new_tokens": 128
            }
        }
    }
    try:
        response = requests.post(endpoint_url, json=data, headers=headers)
        response.raise_for_status()

        # Log the full JSON response
        logger.debug(f"Full JSON response: {response.json()}")

        result = response.json()["output"]
        logger.info("Successfully received response from the Azure ML endpoint.")
        return result
    except requests.exceptions.RequestException as e:
        logger.error(f"Error querying the Azure ML endpoint: {e}")
        raise


@tool
def my_python_tool(input_data: str, connection: CustomConnection) -> str:
    """
    Tool function to process input data and query the Phi-3 model.
    """
    return query_phi3_model(input_data, connection)
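If you want to sanity-check the payload and parsing logic without invoking the deployed endpoint, the request and response handling above can be exercised offline. This is a minimal sketch using only the standard library; it assumes, as `query_phi3_model` does, that the endpoint returns a JSON body with an `output` field (the helper names here are illustrative, not part of Prompt flow):

```python
import json


def build_phi3_payload(input_data: str,
                       temperature: float = 0.7,
                       max_new_tokens: int = 128) -> dict:
    """Build the request body in the shape the endpoint above expects."""
    return {
        "input_data": {
            "input_string": [
                {"role": "user", "content": input_data}
            ],
            "parameters": {
                "temperature": temperature,
                "max_new_tokens": max_new_tokens,
            },
        }
    }


def parse_phi3_response(body: str) -> str:
    """Extract the answer, assuming a JSON body like {"output": "..."}."""
    return json.loads(body)["output"]


# Offline round-trip: build the payload, then parse a sample response body.
payload = build_phi3_payload("Who founded Microsoft?")
answer = parse_phi3_response('{"output": "Bill Gates and Paul Allen founded Microsoft."}')
```

Testing these two functions locally makes it easier to tell an authentication or networking failure apart from a malformed request body when the real endpoint call fails.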
Note
For more detailed information on using Prompt flow in Azure AI Studio, you can refer to Prompt flow in Azure AI Studio.
Select Chat input, Chat output to enable chat with your model.
Now you are ready to chat with your custom Phi-3 model. In the next exercise, you will learn how to start Prompt flow and use it to chat with your fine-tuned Phi-3 model.
Note
The rebuilt flow should look like the image below:
Chat with your custom Phi-3 model
Now that you have fine-tuned and integrated your custom Phi-3 model with Prompt flow, you are ready to start interacting with it. This exercise will guide you through the process of setting up and initiating a chat with your model using Prompt flow. By following these steps, you will be able to fully utilize the capabilities of your fine-tuned Phi-3 model for various tasks and conversations.
Chat with your custom Phi-3 model using Prompt flow.
Start Prompt flow
Select Start compute sessions to start Prompt flow.
Select Validate and parse input to renew parameters.
Set the Value of the connection to the custom connection you created. For example, connection.
Chat with your custom Phi-3 model
Select Chat.
Now you can chat with your custom Phi-3 model. It is recommended to ask questions based on the data used for fine-tuning. Here’s an example of the results:
Congratulations!
You’ve completed this tutorial
Congratulations! You have successfully completed the tutorial on fine-tuning and integrating custom Phi-3 models with Prompt flow in Azure AI Studio. This tutorial introduced the process of fine-tuning, deploying, and integrating the custom Phi-3 model with Prompt flow using Azure ML Studio and Azure AI Studio.
Clean Up Azure Resources
Clean up your Azure resources to avoid additional charges to your account. Go to the Azure portal and delete the following resources:
The Azure Machine Learning resource.
The Azure Machine Learning model endpoint.
The Azure AI Studio Project resource.
The Azure AI Studio Prompt flow resource.
Next Steps
Documentation
microsoft/Phi-3CookBook
Azure/azure-llm-fine-tuning
Azure Machine Learning documentation
Azure AI Studio documentation
Prompt flow documentation
Training Content
Prompt flow tutorials
Introduction to Azure AI Studio
Reference
Microsoft Tech Community – Latest Blogs –Read More
New Video Course: Generative AI for Beginners
It’s hard to deny how big a deal AI has been in recent years. It’s everywhere, from medicine to the entertainment industry. With all the new tech, AI is now accessible to everyone, not just experts.
It’s important for everyone to know something about generative AI. This is a very promising area with some big wins already.
If you want to work in different areas, it’s really important to understand Generative AI. With that in mind, the Microsoft Advocacy Team has put together a Generative AI for Beginners Course to teach you the basics and then show you how to use it to create practical applications.
In this article, we’ll dive a bit deeper into the course and what you can expect from it.
Generative AI for Beginners v.2 – What to Expect?
The Generative AI for Beginners course is a totally free online course that teaches you the basics, all the way to creating your own Generative AI applications.
The course has 18 lessons and lots of practical projects. The Microsoft Advocacy team put it together.
The course includes two types of lessons: Learn lessons, which break down the fundamental concepts of a topic, and Build lessons, which go over those concepts and then show examples made in Python and TypeScript.
Who is the course for?
The course is designed for anyone who wants to learn about generative AI, from beginners to experts. So, you don’t need any prior knowledge or programming expertise to take the course.
What will you learn?
By the end of the course, you will be able to:
Understand the fundamental concepts of Generative AI
Understand the Lifecycle of a Generative AI Application
Create Generative AI applications with Python and TypeScript
Understand Prompt Engineering
Learn about LLMs and GPTs
Use Vector Databases for Generative AI application creation
Use No Code/Low Code applications for Generative AI application creation
Learn about Agents
Fine-Tune LLMs
RAG (Retrieval Augmented Generation)
And much more!
Generative AI Learning Outcomes
The course’s 18 lessons include:
Introduction to Generative AI and LLMs
Exploring and comparing different LLMs
Using Generative AI Responsibly
Understanding Prompt Engineering Fundamentals
Building Text Generation Applications
Building Search Apps with Vector Databases
Building Image Generation Applications
Building Low Code AI Applications
Integrating External Applications with Function Calling
Designing UX for AI Applications
Securing Your Generative AI Applications
The Generative AI Application Lifecycle
Retrieval Augmented Generation (RAG) and Vector Databases
Open Source Models and Hugging Face
What if I have any questions? How can I get to the bottom of them?
Don’t worry! You won’t be on this journey alone! The course includes a discussion forum on Discord, where you can ask questions, share knowledge, interact with the course creators and AI specialists from Microsoft, and other students.
Join the Azure AI Community Discord
I love it! I’m ready to get started. How do I go about doing it?
To start the course, just go to: Generative AI for Beginners and follow the instructions.
The course is available in several languages, including:
English
Chinese/Mandarin
Brazilian Portuguese
Japanese
If you prefer learning through videos, the Microsoft Advocacy team has put together a series of videos about the Generative AI for Beginners course, which you can watch below.
Conclusion
Generative AI is a hot topic that’s becoming accessible to everyone, and it’s increasingly important for professionals to learn about it.
Take advantage of this opportunity and start the Generative AI for Beginners course now and become an expert in Generative AI!
Additional Resources
Here are a couple more resources that might be useful for you. Just a heads-up: the resources below go hand-in-hand with the main Generative AI for Beginners course.
Free Course: Get started with Azure OpenAI Service
Free Course: Fundamentals of Generative AI
Collection: Generative AI for Beginners
We hope you enjoyed the article and that you’re interested in taking the Generative AI for Beginners course. If you have any questions, please don’t hesitate to ask!
See you next time!
Protection Label’s watermarks are editable?
Hi,
We are rolling out information protection labels using Purview.
I noticed the limited number of colours for watermarking content within the label setting. This was intriguing as I don’t understand why a colour hex value couldn’t be supported… Especially as the yellow is extremely hard to read, if I was to change it to readable orange that would be better.
I hope I am wrong, but are watermarks just editable text boxes within the header and footer of a Word doc? Can I change the text and colour, save the document, and the label’s watermarks are tampered with just like that?
I was hoping that the label’s watermarking would not be editable in a Word doc and would be embedded. I understand the label can still provide technical controls, but tamper-proof watermarks would be useful.
Connect with Application Insights in ‘not Local auth mode’ using OpenTelemetry
TOC
What is it
How to use it
References
What is it
Azure Web Apps or Azure Function Apps frequently communicate with Application Insights to log various levels of data, which can later be reviewed and filtered in the Log Analytics Workspace.
Taking Python as an example, the official documentation mentions that the OpenCensus package will no longer be supported after 2024-09-30.
The article suggests OpenTelemetry as the latest alternative. In response to the growing cybersecurity awareness among many companies, many users have disabled the ‘Local Authentication’ feature in Application Insights to enhance security.
Therefore, this article will focus on how Web Apps/Function Apps can use Managed Identity to communicate with Application Insights and utilize the latest OpenTelemetry package to avoid the predicament of unsupported packages.
How to use it
According to Microsoft Entra authentication for Application Insights – Azure Monitor | Microsoft Learn, the sample code using “OpenCensus” reaches end of support after 2024-09-30, which means this method is deprecated from now on (shown as method 1 in the code snippets below).
Currently, Microsoft officially suggests users apply OpenTelemetry as the new method (shown as method 2 in the code snippets below).
Step 1:
The Function App should use a system- or user-assigned managed identity to issue credentials for accessing AI (i.e., Application Insights); I chose a system-assigned managed identity in this sample.
Under Role assignment, add the Monitoring Metrics Publisher role on the target AI resource; in this experiment I added it at the parent RG (i.e., resource group) of that AI.
Step 2:
At the code level, I use the Function App Python V1 programming model, but V1 and V2 should achieve the same goal.
[requirements.txt]
# Method 2: opentelemetry
azure-monitor-opentelemetry
azure-identity
[<TriggerName>/__init__.py]
# Method 2: opentelemetry
from azure.monitor.opentelemetry import configure_azure_monitor
from logging import INFO, getLogger
from azure.identity import ManagedIdentityCredential

credential = ManagedIdentityCredential()
configure_azure_monitor(
    connection_string='InstrumentationKey=XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX;IngestionEndpoint=https://XXXXXX-X.in.applicationinsights.azure.com/;LiveEndpoint=https://XXXXXX.livediagnostics.monitor.azure.com/;ApplicationId=XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX',
    credential=credential
)

# Method 2: opentelemetry
logger2 = getLogger(__name__)
logger2.setLevel(INFO)
logger2.info("Method 2: opentelemetry")
logger2.handlers.clear()
The connection_string mentioned in the code can be obtained through the AI’s overview page.
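The connection string is a semicolon-separated list of key=value pairs. As an illustrative helper (not part of the official SDK), a small parser like the following can confirm that the string you pasted from the overview page contains the fields the exporter needs before you deploy:

```python
def parse_ai_connection_string(conn_str: str) -> dict:
    """Split an Application Insights connection string into key=value parts."""
    parts = {}
    for segment in conn_str.split(";"):
        if segment:
            # Split only on the first "=", since values are URLs or GUIDs
            key, _, value = segment.partition("=")
            parts[key] = value
    return parts


def validate_ai_connection_string(conn_str: str) -> None:
    """Raise if commonly required fields are missing."""
    parts = parse_ai_connection_string(conn_str)
    for required in ("InstrumentationKey", "IngestionEndpoint"):
        if required not in parts:
            raise ValueError(f"Missing {required} in connection string")
```

Catching a malformed connection string locally is cheaper than diagnosing silently missing telemetry after deployment.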
Step 3:
After deploying to the Function App, we can use the online Code + Test feature in the Azure portal.
And the corresponding AI resource will receive the log.
References:
azure-monitor-opentelemetry · PyPI
Where to complain against OneCard?
compromised, report this immediately from the OneCard app or via phone on 08093-158-918 or email us on email address removed for privacy reasons.
Microsoft 365 Copilot
I tried to use an AI command to convert a Word document (information source) into a PowerPoint presentation; however, it didn’t give any result. Has anyone seen the same scenario, and do you have any suggestions you can share?
Your network access has been interrupted – MS Access application on Remote Desktop Web Access
We are in the process of moving a Citrix-provided MS Access application to Remote Desktop Web Access. After the application has been launched for about an hour, it bugs out with “Your network access has been interrupted …”. We lose access to our SQL tables, but even more significantly, even local tables within the front end itself throw the same error when attempting to open them – that is not even attaching to an external table.