Category Archives: Microsoft
What's new in security for Azure SQL and SQL Server | Data Exposed
Check out this episode to learn the newest information on security for Azure SQL and SQL Server!
View/share our latest episodes on Microsoft Learn and YouTube!
Microsoft Tech Community – Latest Blogs – Read More
Add LLM Prompts to Reports using Power BI Copilot for Microsoft Fabric
Interested in learning more about Power BI Copilot for Microsoft Fabric? I’ve published a new video walking through the Power BI Narrative visual with Copilot that provides a no-code (SaaS) mechanism for report developers to embed Azure OpenAI (Copilot) prompts into their reports.
There are a few great videos out there on the web for building and editing reports using Power BI Copilot, but the new Copilot narrative visual (still in preview at the time of recording) deserves more attention. LLM prompts can be added to the visual and re-run every time an end user filters the report. Switch your filters from “Florida in December” to “Maine in January” and want to enhance the report with external demographic data tied to the data in your Power BI semantic model? All you need to do is push a button for a new narrative.
Also, because report developers can store prompts in the visual, you can instruct the Azure OpenAI LLM that powers Copilot to add URLs and citations for the data used in the response.
The demo in the video uses over 220 million rows of data from the GitHub repo that I put together with Inderjit Rana for customers to try out Microsoft Fabric and the Power BI Direct Lake connector, and you can recreate it yourself at this link: https://lnkd.in/gRavJURT
Lesson Learned #473: Harnessing the Synergy of Linked Server, Python, and sp_execute_external_script
In an era where data management transcends individual database systems, SQL Server offers a sophisticated feature set that includes Linked Server integration, Python scripting, and the powerful sp_execute_external_script function. The main objective of this approach is to leverage a Python script within SQL Server, using sp_execute_external_script to connect to a database outside of the on-premises SQL Server, for example Azure SQL Database or Azure SQL Managed Instance, as an alternative to employing the pyodbc library. This method not only streamlines processes but also addresses key concerns in security and network configuration, such as opening ports, which are prevalent when using external libraries for database connections. By focusing on querying a Linked Server, we can achieve seamless data integration and manipulation while maintaining a secure and efficient environment.
Section 1: Unpacking Linked Servers in SQL Server
Linked Servers act as bridges, enabling SQL Server to execute commands and access data across different database systems. This capability is crucial for enterprises managing data across multiple platforms, offering a unified approach to data interaction. Utilizing Linked Servers, SQL Server can effectively communicate with various data sources, ensuring flexibility and scalability in data management.
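As a brief illustration, once a Linked Server is defined, remote objects can be queried with four-part names; the table name below is a placeholder, while the server and database names match the definition later in this article:

```sql
-- Query a remote table through a linked server using four-part naming:
-- [linked server].[database].[schema].[table]
SELECT TOP (10) *
FROM [MYSERVER2].[dotnetexample].[dbo].[SomeTable];
```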
Section 2: The Power of Python in SQL Server
The integration of Python into SQL Server, particularly through the sp_execute_external_script function, marks a significant advancement in data processing capabilities. This integration allows for the utilization of Python’s comprehensive libraries and analytical prowess directly within the SQL Server environment. It opens doors to sophisticated data analysis, complex transformations, and advanced machine learning applications, all while leveraging the robust security and performance features of SQL Server.
Section 3: Preparing the Groundwork
To embark on this integration, certain prerequisites must be met. This includes enabling SQL Server Machine Learning Services for Python support and configuring a Linked Server for external data access. Detailed steps guide you through this setup process, ensuring a smooth integration.
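The configuration step can be sketched in T-SQL; this assumes SQL Server Machine Learning Services with Python support is already installed on the instance:

```sql
-- Enable external scripts so sp_execute_external_script can run Python.
EXEC sp_configure 'external scripts enabled', 1;
RECONFIGURE;

-- Verify the setting: run_value should be 1 (a restart of the SQL Server
-- service may be required on older versions).
EXEC sp_configure 'external scripts enabled';
```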
Section 4: Executing a Practical Use-case
We present a practical scenario where sp_execute_external_script is employed to query data from a Linked Server. The walkthrough covers creating a stored procedure that harnesses Python’s prowess to access and process data from an external database, illustrating the script’s development and execution.
Definition of Linked Server
USE [master]
GO
/****** Object: LinkedServer [MYSERVER2] Script Date: 11/01/2024 19:07:42 ******/
EXEC master.dbo.sp_addlinkedserver @server = N'MYSERVER2', @srvproduct=N'', @provider=N'MSOLEDBSQL', @datasrc=N'tcp:servername.database.windows.net,1433', @catalog=N'dotnetexample'
/* For security reasons the linked server remote logins password is changed with ######## */
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'MYSERVER2',@useself=N'False',@locallogin=NULL,@rmtuser=N'username',@rmtpassword='########'
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'collation compatible', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'data access', @optvalue=N'true'
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'dist', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'pub', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'rpc', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'rpc out', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'sub', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'connect timeout', @optvalue=N'0'
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'collation name', @optvalue=NULL
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'lazy schema validation', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'query timeout', @optvalue=N'0'
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'use remote collation', @optvalue=N'true'
GO
EXEC master.dbo.sp_serveroption @server=N'MYSERVER2', @optname=N'remote proc transaction promotion', @optvalue=N'true'
GO
Stored procedure definition
CREATE PROCEDURE FetchDataFromLinkedServer
AS
BEGIN
EXEC sp_execute_external_script
@language = N'Python',
@script = N'
import pandas as pd
customer_data = my_input_data
OutputDataSet = customer_data
',
@input_data_1 = N'SELECT TOP 50 ID, TextToSearch FROM [MYSERVER2].[dotnetexample].[dbo].[PErformanceVarcharNvarchar]',
@input_data_1_name = N'my_input_data'
WITH RESULT SETS ((ID INT NOT NULL, TextToSearch VARCHAR(200) NOT NULL));
END
We just need to call our stored procedure to obtain the data from the other data source.
EXEC dbo.FetchDataFromLinkedServer
The Intrinsic Value of DevOps for the US Department of Defense
DevOps is defined as the union of people, process, and technology, removing the silos between development, IT operations, quality engineering, and security so that teams coordinate and collaborate to produce better, more reliable products. Every major cloud provider, Independent Software Vendor (ISV), and software consultancy has promoted this approach to reduce time to market, eliminate bugs, introduce new features rapidly, implement governance, and streamline the software development lifecycle. Microsoft has outlined the benefits of DevOps in the following online post: “By adopting a DevOps culture along with DevOps practices and tools, teams gain the ability to better respond to customer needs, increase confidence in the applications they build, and achieve business goals faster.”
The challenge of cultivating a DevOps culture is that it requires deep changes in the way people work and collaborate. The bureaucratic nature, culture, and traditional software process management of the Department of Defense (DoD) can inhibit adoption. There are signs, however, that the various branches of the DoD are starting to introduce DevOps as a path forward, not only with the adoption of cloud but also for traditional on-premises programs. Programs such as Iron Bank (dso.mil) and DSOP (af.mil) are positive signs that the DoD is moving away from the traditional Waterfall process, but the change is not reflected in every program. The adoption and implementation of DevOps practices are ever more crucial as we stand at the intersection of technology and geopolitical world events.
In the last three years we have witnessed the conflict in Ukraine advance technical capabilities, using automation and computer engineering to execute military objectives. The very nature of conventional war has changed; the term “transparent battlefield” describes an environment in which drones play a greater role in providing real-time intelligence updates as well as first-strike capabilities.
At the start of the war, various weapon systems were introduced and heralded as “game changing,” with measurable impact in shaping the battlefield. Take the example of the modern rocket systems provided to Ukraine. Their introduction had an immediate effect: Russian air defense missile batteries and radar systems could not properly intercept incoming attacks against forward operating bases or military positions. Initial indicators pointed to the Russian systems not recognizing the signatures of incoming rocket attacks. Soon after, there was a slow but gradual degradation in the effectiveness of Ukrainian attacks, and interception of launches became more common. What changed? DevOps. The Russians learned the common technical signs of an incoming strike, developed patches for their air defense systems, evaluated and assessed the code, and deployed the necessary patches to their systems. They fielded an effective countermeasure during an active conflict, performing out-of-band updates across distributed systems within Ukraine and Russia to mitigate Ukrainian attacks. The Ukrainians are doing the same and paving the way in this methodology. This is the ethos of DevOps. When we think of DevOps, we tend to visualize a developer deploying code against a system on-premises or in the cloud, but the same technical methodology can be implemented to support future warfighter efforts and advanced weapon systems.
As the DoD starts to introduce more complex systems, including space systems, the need for continuous system updates over low-bandwidth communication takes on new significance. Military planners will need to account for a streamlined process for continuous code improvement, resiliency, and self-healing software on an active battlefield. Engineering rapid recovery at the macro and micro level, with systems that fail fast and heal quickly, is essential in software design, delivery, and long-term sustainment. DevOps can also be a force multiplier: it optimizes total cost of ownership, lowers risk, introduces capability enhancements faster, shortens lead time, and increases return. Although the DoD budget presently represents a significant percentage of the US federal budget, leaders may face additional pressure now and in the near future to streamline their software development and sustainment processes. This makes it imperative that organizations within the DoD begin to train staff and implement DevOps as part of their overall strategic plan. The traditional Waterfall model, change management, and promotion of code through various environments before it reaches production will need to go through a radical change across the department.
Recommended Readings and Videos
Recoding America: Why Government Is Failing in the Digital Age and How We Can Do Better
by Jennifer Pahlka
The DevOps Handbook, Second Edition: How to Create World-Class Agility, Reliability, & Security in Technology Organizations
by Gene Kim
The Phoenix Project: A Novel about IT, DevOps, and Helping Your Business Win 5th Anniversary Edition
by Gene Kim, Kevin Behr, George Spafford, and Chris Ruen
War and Peace and DevOps – Mark Schwartz
https://youtu.be/2BM0xYfcexY
What the Military Taught Me about DevOps – Chris Short
https://youtu.be/TIE1rKkJWyY
Acknowledgements
I would like to thank Chris Ayers and Erik Munson for reviewing and providing edits in the formulation of this article.
Lesson Learned #472: Why It’s Important to Add the TCP Protocol When Connecting to Azure SQL Database
In certain service requests, our customers encounter an error while connecting to the database, similar to this one: “Connection failed: (‘08001’, ‘[08001] [Microsoft][ODBC Driver 17 for SQL Server]Named Pipes Provider: Could not open a connection to SQL Server [65]. (65) (SQLDriverConnect); [08001] [Microsoft][ODBC Driver 17 for SQL Server]Login timeout expired (0); [08001] [Microsoft][ODBC Driver 17 for SQL Server]A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if the instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online. (65)’”. I would like to give some insights about this.
The crucial point to mention is that Azure SQL Database only responds to TCP, and any attempt to use Named Pipes will result in an error.
1. Understanding the Error Message:
The error message encountered by our customer is typically associated with attempts to connect using the Named Pipes protocol, which Azure SQL Database does not support. It signifies a network-related or instance-specific error in establishing a connection to SQL Server, often caused by incorrect protocol usage.
2. Azure SQL Database’s Protocol Support:
Azure SQL Database is designed to work exclusively with the TCP protocol for network communication. TCP is a reliable, standard network protocol that ensures the orderly and error-checked transmission of data between the server and client.
3. Why Specify TCP in Connection Strings:
Specifying “TCP:” in the server name within your connection strings ensures that the client application directly attempts to use the TCP protocol. This bypasses any default attempts to use Named Pipes, leading to a more straightforward and faster connection process.
4. Error Diagnosis and Efficiency:
By using TCP, any connectivity issues encountered will return errors specific to the TCP protocol, making diagnosis more straightforward. This direct approach eliminates the time spent on protocol negotiation and reduces the time to connect.
5. Recommendations for Azure SQL Database Connectivity:
Always use TCP in your connection strings when connecting to Azure SQL Database.
Ensure that your client and network configuration are optimized for TCP/IP connectivity.
Regularly update your ODBC drivers and client software to the latest versions to benefit from improved performance and security features.
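As a minimal sketch of these recommendations, the following Python snippet builds an ODBC connection string with the “tcp:” prefix on the server name; the server, database, and credential values are placeholders:

```python
# Build an ODBC connection string for Azure SQL Database that forces the
# TCP protocol by prefixing the server name with "tcp:".
server = "tcp:yourserver.database.windows.net,1433"  # placeholder server name
database = "yourdatabase"                            # placeholder database

conn_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    f"Server={server};"
    f"Database={database};"
    "Uid=youruser;Pwd=<password>;"                   # placeholder credentials
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
)

print(conn_str)
```

With pyodbc, this string would simply be passed to pyodbc.connect(conn_str); because the server name carries the “tcp:” prefix, the driver never attempts the Named Pipes provider.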
6. Prioritizing TCP to Avoid Unnecessary Delays in Connectivity:
An important aspect to consider in database connectivity is the order in which different protocols are attempted by the client or application. Depending on the configuration, the client may try to connect using Named Pipes before or after TCP in the event of a connectivity issue. This can lead to unnecessary delays in the validation process.
When Named Pipes is attempted first and fails (as it is unsupported in Azure SQL Database), the client then falls back to TCP, thereby wasting valuable time. This scenario is particularly common when default settings are left unchanged in client applications or drivers.
To mitigate this, it is strongly recommended to explicitly use “TCP:” in the server name within your connection strings. This directive ensures that the TCP protocol is prioritized from the outset, facilitating a more direct and efficient connection attempt.
By doing so, not only do we avoid the overhead of an unsuccessful attempt with Named Pipes, but we also gain clarity in error reporting. If a connectivity issue arises, the error returned will be specific to TCP, allowing for a more accurate diagnosis and faster resolution.
Additionally, this approach can significantly reduce the time taken to establish a connection. In high-performance environments or situations where rapid scaling is required, this efficiency can have a substantial impact on overall system responsiveness and resource utilization.
In summary, explicitly specifying the TCP protocol in your connection strings is a best practice for Azure SQL Database connectivity. It ensures a more streamlined connection process, clearer error diagnostics, and can contribute to overall system efficiency.
Enjoy!