Querying Watchlists
Special thanks to @Ofer_Shezaf for showing me the new function call.
Watchlists
Watchlists are a feature of Microsoft Sentinel that provides great flexibility and usability. They act as user-defined tables that can be referenced in KQL queries to supply additional data. By uploading CSV files, users control the data in their watchlists, and that data can be modified or extended with new rows as needed.
Watchlists provide a "SearchKey" field that is unique across the watchlist and can be used to reference an individual row, as discussed below.
Querying Watchlists
The Microsoft Sentinel portal UI provides a way to query a watchlist from Logs. Select the watchlist and click the "View in Logs" button. This transfers you to Logs (or Advanced Hunting, if you are using the unified SOC platform) and executes the following command:
_GetWatchlist('<Watchlist Name>')
This loads the entire watchlist and returns all of its rows. While this works fine for most applications, a very large watchlist can cause a timeout, depending on the rest of the KQL query.
For those cases, use the "_ASIM_GetWatchlistRaw" function to return specific rows. It takes two parameters: the watchlist name and a dynamic list of values for the "SearchKey" column that determines which rows to return.
The example below demonstrates how to use this function:
let Users = dynamic(["User1@contoso.com", "user1@contoso.com"]);
_ASIM_GetWatchlistRaw('VIP', Users)
| evaluate bag_unpack(WatchlistItem)
Note the call to "evaluate bag_unpack(WatchlistItem)" at the end. It is needed to expand the raw information stored in the watchlist rows into the same format you get from "_GetWatchlist".
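A watchlist is most useful as a lookup inside a larger query. Here is a minimal sketch, assuming a 'VIP' watchlist whose SearchKey column holds user principal names; SigninLogs and its columns are standard Sentinel names, but adapt them to your environment:

```kql
// Sketch: restrict sign-in events to users found in the 'VIP' watchlist.
// Assumes the SearchKey column holds user principal names.
let vipUsers = _GetWatchlist('VIP') | project SearchKey;
SigninLogs
| where UserPrincipalName in (vipUsers)
| project TimeGenerated, UserPrincipalName, ResultType
```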
Microsoft Tech Community – Latest Blogs –Read More
Lesson Learned #477: NEXT VALUE FOR function cannot be used if ROWCOUNT option has been set
In the realm of SQL Server, certain combinations of commands and functions can lead to unexpected conflicts and errors. A notable example is the conflict between the NEXT VALUE FOR function and the ROWCOUNT setting. This article aims to dissect the nature of this error, explaining why it occurs, its implications, and how to effectively capture and analyze it using Extended Events in Azure SQL Database. For example, we got the following error message: Msg 11739, Level 15, State 1, Line 11 – NEXT VALUE FOR function cannot be used if ROWCOUNT option has been set, or the query contains TOP or OFFSET. – NEXT VALUE FOR (Transact-SQL) – SQL Server | Microsoft Learn
Section 1: Understanding the Error
What is NEXT VALUE FOR? The NEXT VALUE FOR function in SQL Server is a crucial tool for generating sequential values from a defined sequence. It’s commonly used for auto-generating unique identifiers, like primary keys.
Conflict with ROWCOUNT: The error arises when NEXT VALUE FOR is used in conjunction with the ROWCOUNT option. ROWCOUNT, when set, limits the number of rows affected by a query. However, NEXT VALUE FOR expects to operate without such limitations, leading to a conflict. This issue can also manifest when using TOP or OFFSET clauses, which similarly restrict the result set.
Error Scenario: Imagine a scenario where a developer attempts to retrieve the next value from a sequence while ROWCOUNT is set to a specific limit. This operation triggers an error, as SQL Server cannot reconcile the sequence’s need for unbounded operation with the imposed row count restriction.
Section 2: Capturing the Error with Extended Events
Introduction to Extended Events: Extended Events are a lightweight, highly configurable system for monitoring and troubleshooting in SQL Server and Azure SQL Database.
Setting up an Extended Event Session: To capture this specific error, create an Extended Event session focused on the error_reported event and configure it to target the ring buffer for data collection.
Querying the Ring Buffer: The sample query below retrieves and analyzes the error information from the ring buffer. This helps in understanding the occurrence and frequency of the error in a live environment.
You can reproduce the issue with the following syntax:
CREATE SEQUENCE TestSequence
AS INT
START WITH 1
INCREMENT BY 1;
SET ROWCOUNT 1;
SELECT NEXT VALUE FOR TestSequence, * FROM MyTable;
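If the row limit was set earlier in the session, one way to avoid the error (a sketch, not discussed in the article itself) is to reset ROWCOUNT before the sequence call, or to take the sequence value into a variable first and limit rows with TOP:

```sql
-- Reset the session-level row limit before calling NEXT VALUE FOR.
SET ROWCOUNT 0;
SELECT NEXT VALUE FOR TestSequence, * FROM MyTable;

-- Or: assign the sequence value to a variable first, then limit rows with TOP.
DECLARE @NextId INT = NEXT VALUE FOR TestSequence;
SELECT TOP (1) @NextId AS NextId, * FROM MyTable;
```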
To capture this information, we can create an Extended Event session:
CREATE EVENT SESSION [CaptureError] ON DATABASE
ADD EVENT sqlserver.error_reported(
    ACTION(sqlserver.sql_text)
)
ADD TARGET package0.ring_buffer
WITH (MAX_MEMORY=4096 KB, EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS, MAX_DISPATCH_LATENCY=30 SECONDS, MAX_EVENT_SIZE=0 KB, MEMORY_PARTITION_MODE=NONE, TRACK_CAUSALITY=OFF, STARTUP_STATE=OFF)
GO
ALTER EVENT SESSION [CaptureError] ON DATABASE STATE = START;
SELECT
    event_data.value('(@timestamp)[1]', 'DATETIME2') AS TimeStamp,
    event_data.value('(data[@name="error_number"]/value)[1]', 'INT') AS ErrorNumber,
    event_data.value('(data[@name="message"]/value)[1]', 'VARCHAR(MAX)') AS ErrorMessage,
    event_data.value('(action[@name="sql_text"]/value)[1]', 'VARCHAR(MAX)') AS SqlText
FROM
(
    SELECT CAST(target_data AS XML) AS target_data
    FROM sys.dm_xe_database_session_targets AS t
    INNER JOIN sys.dm_xe_database_sessions AS s
        ON t.event_session_address = s.address
    WHERE s.name = 'CaptureError'
      AND t.target_name = 'ring_buffer'
) AS tab
CROSS APPLY tab.target_data.nodes('RingBufferTarget/event') AS q(event_data)
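Once troubleshooting is complete, the session can be stopped and dropped so it no longer consumes resources:

```sql
-- Stop and remove the event session when it is no longer needed.
ALTER EVENT SESSION [CaptureError] ON DATABASE STATE = STOP;
DROP EVENT SESSION [CaptureError] ON DATABASE;
```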
Enjoy!
Lesson Learned #476: Identifying Sleeping Sessions with Open Transactions in Azure SQL Database
In SQL Server environments, managing session states and transactions is key to ensuring optimal database performance. A particular challenge arises with sessions in a ‘sleeping’ state holding open transactions for extended periods. These sessions, while seemingly inactive, can hold locks on resources, leading to potential deadlocks or performance degradation.
Our focus is on a SQL query designed to pinpoint such sessions. The query utilizes SQL Server’s dynamic management views: sys.dm_exec_sessions, sys.dm_exec_requests, and sys.dm_tran_session_transactions. These views provide real-time data about active sessions, their current requests, and associated transaction details.
The heart of the query lies in its ability to filter sessions based on specific criteria: sessions must be in a 'sleeping' state, have an open transaction, and have been inactive for over 5 minutes. This precise filtering allows database administrators to quickly identify and address sessions that might contribute to resource locking and overall performance issues.
SELECT
ses.session_id,
ses.login_name,
req.start_time,
req.total_elapsed_time,
req.command,
req.status,
trans.transaction_id,
ses.status,
ses.total_elapsed_time,
ses.last_request_start_time,
ses.last_request_end_time,
ses.login_time,
DATEDIFF(minute, ses.last_request_end_time, GETDATE()) AS InactiveTime
FROM sys.dm_exec_sessions ses
LEFT JOIN sys.dm_exec_requests req ON req.session_id = ses.session_id
LEFT JOIN sys.dm_tran_session_transactions trans ON ses.session_id = trans.session_id
WHERE trans.transaction_id IS NOT NULL
AND DATEDIFF(minute, ses.last_request_end_time, GETDATE()) > 5
AND ses.status = 'sleeping'
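Once an offending session has been identified, a common follow-up (not part of the query above, and to be used with care) is to terminate it, which rolls back its open transaction:

```sql
-- 71 is a placeholder: substitute the session_id returned by the query above.
-- KILL rolls back any open transaction held by that session.
KILL 71;
```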
Enjoy!
Python Data Science Day 2024: Unleashing the Power of Python in Data Analysis
Hello, fellow data enthusiasts!
We’re excited to share that the Python Data Science Day is set to take place on March 14th, 2024, aligning with the mathematical constant Pi Day (3.14). This event is a fantastic opportunity for Python developers, entrepreneurs, data scientists, students, and researchers to come together and explore modern solutions for data pipelines and complex queries.
What to Expect?
The Python Data Science Day will feature a variety of sessions and lightning talks from experts in the field. Whether you’re interested in high-level programming topics or diving deep into specific features, there’s something for everyone.
Sessions: These 25-minute presentations, either pre-recorded or live, will be delivered by up to two people and cover a range of programming stories, approaches, and solutions.
Lightning Talks: If you’re new to public speaking or have a concise idea to share, these 5 to 7-minute talks are perfect for you. They focus on a single idea and are designed to inspire further learning.
Call for Speaker Proposals
The call for speaker proposals is open until January 25th, 2024. If you have a cool tool, product, or skill to discuss, we encourage you to submit your proposal and join the list of amazing speakers. https://aka.ms/Python/DataScienceDay/CFP
More Ways to Engage
Microsoft Fabric Global AI Hack Together: Join us from February 15th to March 4th, 2024, for live streams and real-world problem-solving with an AI-powered analytics platform.
14 Days of Python Data Science Series: Leading up to the event, we’ll release articles and recipes for using Data Science on Microsoft tools.
Data Science Cloud Skills Challenge: Participate in self-paced learning until April 15th, 2024.
Special Guest Speakers
You could be among the special guest speakers, joining the ranks of Sarah Kaiser, PhD, and Soojin Choi. Don’t miss this chance to contribute to the Python Data Science community.
Join the Conversation
Connect with us on Discord to continue the discussion and share your experiences with like-minded individuals.
We can’t wait to see you there and witness the innovative ideas and projects you’ll bring to the table. Mark your calendars for March 14th, 2024, and prepare for a day filled with Python and data science wonders! https://aka.ms/python-discord
More Data Science at Microsoft…
Data Scientist Certifications
Data Scientist Training Path
Data Science for Beginners – GitHub Repo
Expanding availability of Copilot for Microsoft 365
Today, we announced that we are expanding Copilot for Microsoft 365 to a much broader set of organizations, available across more channels, and without a minimum seat required. We are also extending our data residency commitments for Copilot for Microsoft 365 and bringing Microsoft Copilot Graph-grounded chat to Copilot in Windows. Join the upcoming AMA and Tech Accelerator event and engage with experts from Microsoft to learn more about Copilot for Microsoft 365.
Expanding availability of Copilot for Microsoft 365
Starting today, we have removed the 300-seat minimum purchase for Copilot for Microsoft 365 commercial plans. We have also extended support so that Office 365 E3 and E5 customers are eligible to purchase Copilot, and we’re extending Semantic Index for Copilot to Office 365 users with a paid Copilot license. Finally, we have announced that Copilot for Microsoft 365 is generally available for businesses of all sizes, supported on Microsoft 365 Business Standard or Business Premium. This follows a successful early access program focused specifically on small and medium businesses, as well as the previously announced availability for staff and faculty of education institutions with Microsoft 365 A3 or A5 licenses. Commercial customers—including small and medium-sized businesses—can now purchase Copilot for Microsoft 365 through our network of Cloud Solution Provider (CSP) partners, and you can learn more about them here.
We still recommend that customers start by giving Copilot to a critical mass of their information workers. We learned during the early access program that this creates a flywheel of interest and adoption, accelerating time to value and an organization’s ability to measure impact in a meaningful way. Copilot for Microsoft 365 licenses will be capped by the total number of eligible base licenses that a customer has. That is, customers must have a product license of one of the prerequisite base SKUs for each seat of Copilot for Microsoft 365 they purchase. You can review requirements for Copilot here.
Updating data residency commitments
We’ve heard feedback from our Enterprise customers that they need assurances that Copilot data is managed appropriately across geographically diverse teams. To support this, Copilot for Microsoft 365 upholds the residency commitments outlined in the Microsoft Product Terms and Data Protection Addendum.
We’re pleased to share that later this year Copilot for Microsoft 365 will be added as a covered workload under the data residency commitments in Microsoft Product Terms and the Microsoft Advanced Data Residency (ADR) and Multi-Geo Capabilities add-ons. For additional information on Copilot for Microsoft 365 privacy and data storage please visit Data, Privacy, and Security for Microsoft Copilot for Microsoft 365. To learn more about our commitments to data residency, see Microsoft 365 Data Residency Overview and Definitions.
Microsoft Copilot capabilities coming to Windows desktop
Organizations will soon be able to experience Copilot for Microsoft 365 integrated in Windows desktop, bringing Graph-grounded chat capabilities to Copilot in Windows for users with a Copilot for Microsoft 365 license. This update will be available to organizations with Copilot for Microsoft 365 and Copilot in Windows enabled, beginning the week of February 4th. This adds a new, simple way for users to access Copilot, in addition to the current surfaces in Teams, Edge, and copilot.microsoft.com. For information on managing Copilot in Windows, review this article.
Start preparing your organization for Copilot for Microsoft 365
There are steps you can take today to get your tenant prepared for Copilot:
Prepare your data and assess that all relevant data security, privacy, and compliance controls are in place. Copilot inherits your existing permissions and policies, so having these in place helps ensure a seamless deployment. Conduct access reviews for SharePoint sites, documents, and tenant data; use sensitivity labels to protect important data; and validate policies for data loss prevention, retention, and compliance.
Review prerequisites for Copilot by reviewing overview and requirements for Microsoft 365 Copilot to position your tenant to seamlessly deploy Copilot. This setup guide also provides a simple walkthrough of the process.
Learn more about Copilot for Microsoft 365, how it works, benefits to your organizations, how your data is handled and protected.
Familiarize yourself with the admin controls available to manage Copilot in the Microsoft 365 admin center Copilot page.
Develop your adoption strategy by leveraging the resources available on our adoption site, including this adoption kit and user onboarding toolkit.
Check readiness and measure adoption and impact through the Microsoft Copilot Dashboard (in preview) in Viva Insights or Power BI, which helps organizations maximize the value of Copilot for Microsoft 365. It provides actionable insights to help your organization get ready to deploy AI, drive adoption based on how AI is transforming workplace behavior, and measure the impact of Copilot.
To learn more about Copilot, you can review our documentation hub, requirements, setup, and information about privacy, security, and compliance. You can also watch our sessions at this past Ignite on getting ready for Copilot.
For small and medium business customers, join our discussion forum to collaborate with other Copilot for Microsoft 365 users, take part in community calls with open Q&A, hear directly from Microsoft Copilot engineers, and access exclusive resources. Also, check out the resources available on the small and medium business Copilot adoption site.
For a comprehensive introduction and deep dive into Copilot for Microsoft 365, join us during our Copilot for Microsoft 365 Tech Accelerator, February 28 and 29, right here in the Tech Community. Listen in as experts from Microsoft delve into preparing for Copilot with recommendations and best practices, share strategies on driving adoption, and discuss measuring and maximizing value for your organization. There will also be plenty of opportunities to ask questions and engage with our experts.
Finally, join our next Ask Me Anything (AMA) tomorrow at 9am PT, here in the Copilot for Microsoft 365 Tech Community. Feel free to post your questions onto the event page ahead of time, and our panel of experts will answer them during the event.
The Publisher failed to allocate a new set of identity ranges for the subscription
Problem:
===========
Assume that you have tables with Identity columns declared as datatype INT and you are using Auto Identity management for those articles in a Merge Publication.
This Publication has one or more subscribers and you tried to re-initialize one subscriber using a new Snapshot.
Merge agent fails with this error:
>>
Source: Merge Replication Provider
Number: -2147199417
Message: The Publisher failed to allocate a new set of identity ranges for the subscription. This can occur when a Publisher or a republishing Subscriber has run out of identity ranges to allocate to its own Subscribers or when an identity column data type does not support an additional identity range allocation. If a republishing Subscriber has run out of identity ranges, synchronize the republishing Subscriber to obtain more identity ranges before restarting the synchronization. If a Publisher runs out of identit
Cause:
============
The identity range the Merge agent is trying to allocate exceeds the maximum value the INT data type can hold.
Resolution
=================
Assume that the publisher database has only one Merge publication with two subscribers, and your merge articles have this definition:
>>>
exec sp_addmergearticle @publication = N'MergeRepl_ReproDB', @article = N'tblCity', @source_owner = N'dbo', @source_object = N'tblCity', @type = N'table', @description = N'', @creation_script = N'', @pre_creation_cmd = N'drop', @schema_option = 0x000000004C034FD1, @identityrangemanagementoption = N'auto', @pub_identity_range = 1000, @identity_range = 1000, @threshold = 90, @destination_owner = N'dbo', @force_reinit_subscription = 1, @column_tracking = N'false', @subset_filterclause = N'', @vertical_partition = N'false', @verify_resolver_signature = 1, @allow_interactive_resolver = N'false', @fast_multicol_updateproc = N'true', @check_permissions = 0, @subscriber_upload_options = 0, @delete_tracking = N'true', @compensate_for_errors = N'false', @stream_blob_columns = N'false', @partition_options = 0
exec sp_addmergearticle @publication = N'MergeRepl_ReproDB', @article = N'tblCity1', @source_owner = N'dbo', @source_object = N'tblCity1', @type = N'table', @description = N'', @creation_script = N'', @pre_creation_cmd = N'drop', @schema_option = 0x000000004C034FD1, @identityrangemanagementoption = N'auto', @pub_identity_range = 1000, @identity_range = 1000, @threshold = 90, @destination_owner = N'dbo', @force_reinit_subscription = 1, @column_tracking = N'false', @subset_filterclause = N'', @vertical_partition = N'false', @verify_resolver_signature = 1, @allow_interactive_resolver = N'false', @fast_multicol_updateproc = N'true', @check_permissions = 0, @subscriber_upload_options = 0, @delete_tracking = N'true', @compensate_for_errors = N'false', @stream_blob_columns = N'false', @partition_options = 0
You can run this query against the published database to see which articles' ranges are full or have very few values left:
>>>
select a.name,
    max_used = max_used,
    diff_pub_range_end_max_used = range_end - max_used, -- how many values are left
    pub_range_begin = range_begin,
    pub_range_end = range_end
from dbo.MSmerge_identity_range b,
    sysmergearticles a
where a.artid = b.artid
    and is_pub_range = 1
order by max_used desc
name      max_used    diff_pub_range_end_max_used  pub_range_begin  pub_range_end
--------  ----------  ---------------------------  ---------------  -------------
tblCity   2147483647  0                            2147477647       2147483647
tblCity1  6001        2147477646                   1                2147483647
As you can see above, the diff_pub_range_end_max_used column is zero for tblCity.
When the Merge agent runs, it has to allocate two ranges for each server involved. In the example above we have a Publisher and two Subscribers, and @identity_range is 1000, so ranges must be allocated for three servers: 3 * (2 * 1000) = 6000.
diff_pub_range_end_max_used must therefore be greater than 6000; only then can a new range be allocated for all the servers.
To resolve the issue:
1. Remove the tblCity table from the publication.
2. Change the data type from INT to BIGINT and add the table back to the publication.
3. Generate a new snapshot. It will generate snapshots for all articles, but only this one table will be added back to the existing Subscribers.
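The steps above could be sketched as follows, using the article from the example. The identity column name (CityId) is an assumption, and the exact sp_addmergearticle parameters should match your original article definition:

```sql
-- 1. Remove the article from the merge publication.
EXEC sp_dropmergearticle
    @publication = N'MergeRepl_ReproDB',
    @article = N'tblCity',
    @force_invalidate_snapshot = 1;

-- 2. Widen the identity column from INT to BIGINT.
--    (CityId is a hypothetical column name.)
ALTER TABLE dbo.tblCity ALTER COLUMN CityId BIGINT;

-- 3. Add the article back with auto identity range management.
EXEC sp_addmergearticle
    @publication = N'MergeRepl_ReproDB',
    @article = N'tblCity',
    @source_owner = N'dbo',
    @source_object = N'tblCity',
    @identityrangemanagementoption = N'auto',
    @pub_identity_range = 1000,
    @identity_range = 1000,
    @threshold = 90;

-- 4. Generate a new snapshot.
EXEC sp_startpublication_snapshot @publication = N'MergeRepl_ReproDB';
```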