Category: Microsoft
Lesson Learned #474: Identifying and Preventing Unauthorized Application Access to Azure SQL Database
In recent scenarios encountered with our customers, we have come across a specific need: restricting certain users from using SQL Server Management Studio (SSMS) or other applications to connect to a designated database in Azure SQL Database. A common solution in traditional SQL Server environments, the use of logon triggers, is not available in Azure SQL Database. This limitation poses a unique challenge in database management and security.
To address this challenge, I’d like to share an alternative that combines the power of Extended Events in Azure SQL Database with PowerShell scripting. This method effectively captures and monitors login events, providing administrators with timely alerts whenever a specified user connects to the database using a prohibited application, such as SSMS.
How It Works
Extended Events Setup: We start by setting up an Extended Event in Azure SQL Database. This event is configured to capture login activities, specifically focusing on the application name used for the connection. By filtering for certain applications (like SSMS), we can track unauthorized access attempts.
PowerShell Script: A PowerShell script is then employed to query these captured events at regular intervals. This script connects to the Azure SQL Database, retrieves the relevant event data, and checks for any instances where the specified users have connected via the restricted applications.
Email Alerts: Upon detecting such an event, the PowerShell script automatically sends an email notification to the database administrator. This alert contains details of the unauthorized login attempt, such as the timestamp, username, and application used. This prompt information allows the administrator to take immediate corrective measures.
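The detection step described above boils down to filtering captured connection events by application name (and optionally by user). A minimal sketch in Python, with illustrative field names and a hypothetical watched-user list that are not part of the article's actual script:

```python
# Flag captured login events where a watched user connected via a restricted
# application. The substring match on the application name mirrors the
# LIKE '%Management Studio%' filter used in the Extended Event session below.

RESTRICTED_APPS = ("Management Studio",)   # substring match, like the XEvent filter
WATCHED_USERS = {"user1"}                  # hypothetical list of monitored users

def find_violations(events, restricted_apps=RESTRICTED_APPS, watched_users=WATCHED_USERS):
    """Return the events where a watched user used a restricted application."""
    return [
        e for e in events
        if e["username"] in watched_users
        and any(app in e["application"] for app in restricted_apps)
    ]

events = [
    {"username": "user1", "application": "Microsoft SQL Server Management Studio", "hostname": "PC1"},
    {"username": "user1", "application": "MyLineOfBusinessApp", "hostname": "PC2"},
    {"username": "user2", "application": "Microsoft SQL Server Management Studio", "hostname": "PC3"},
]
print(find_violations(events))
```

Only the first event matches: it is the only one where a watched user connected with a restricted application.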
Advantages
Proactive Monitoring: This approach provides continuous monitoring of the database connections, ensuring that any unauthorized access is quickly detected and reported.
Customizable: The method is highly customizable. Administrators can specify which applications to monitor and can easily adjust the script to cater to different user groups or connection parameters.
No Direct Blocking: While this method does not directly block the connection, it provides immediate alerts, enabling administrators to react swiftly to enforce compliance and security protocols.
This article provides a high-level overview of how to implement this solution. For detailed steps and script examples, administrators are encouraged to tailor the approach to their specific environment and requirements.
Extended Event
CREATE EVENT SESSION Track_SSMS_Logins
ON DATABASE
ADD EVENT sqlserver.sql_batch_starting(
ACTION(sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.username, sqlserver.session_id)
WHERE (sqlserver.client_app_name LIKE '%Management Studio%')
)
ADD TARGET package0.ring_buffer
(SET max_events_limit = 1000, max_memory = 4096)
WITH (EVENT_RETENTION_MODE = NO_EVENT_LOSS, MAX_DISPATCH_LATENCY = 5 SECONDS);
GO
ALTER EVENT SESSION Track_SSMS_Logins ON DATABASE STATE = START;
Query to run using ring buffers
SELECT
    n.value('(@timestamp)[1]', 'datetime2') AS TimeStamp,
    n.value('(action[@name="client_app_name"]/value)[1]', 'varchar(max)') AS Application,
    n.value('(action[@name="username"]/value)[1]', 'varchar(max)') AS Username,
    n.value('(action[@name="client_hostname"]/value)[1]', 'varchar(max)') AS HostName,
    n.value('(action[@name="session_id"]/value)[1]', 'int') AS SessionID
FROM
    (SELECT CAST(target_data AS xml) AS event_data
     FROM sys.dm_xe_database_session_targets
     WHERE event_session_address =
           (SELECT address FROM sys.dm_xe_database_sessions WHERE name = 'Track_SSMS_Logins')
       AND target_name = 'ring_buffer') AS tab
CROSS APPLY event_data.nodes('/RingBufferTarget/event') AS q(n);
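The `target_data` column holds XML shaped like `/RingBufferTarget/event`, with one `<action name="…"><value>…</value></action>` element per captured action — the same paths the XQuery `.value()` calls above walk. To make that shape concrete, here is a Python sketch that extracts the same fields from a sample payload (the sample data is illustrative, not real captured output):

```python
import xml.etree.ElementTree as ET

# Illustrative ring-buffer payload, shaped like the XML the query above reads.
sample = """
<RingBufferTarget>
  <event name="sql_batch_starting" timestamp="2024-01-15T10:30:00.000Z">
    <action name="client_app_name"><value>Microsoft SQL Server Management Studio</value></action>
    <action name="username"><value>user1</value></action>
    <action name="client_hostname"><value>PC1</value></action>
    <action name="session_id"><value>55</value></action>
  </event>
</RingBufferTarget>
"""

def parse_ring_buffer(xml_text):
    """Flatten each /RingBufferTarget/event into a dict of its actions."""
    rows = []
    for ev in ET.fromstring(xml_text).findall("event"):
        row = {"timestamp": ev.get("timestamp")}          # event attribute
        for action in ev.findall("action"):
            row[action.get("name")] = action.findtext("value")
        rows.append(row)
    return rows

rows = parse_ring_buffer(sample)
print(rows[0]["client_app_name"], rows[0]["session_id"])
```

This mirrors the T-SQL query one-to-one: the `@timestamp` attribute comes from the `event` node, and each named `action` contributes one column.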
PowerShell Script
# Connection configuration
$Database = "DBName"
$Server = "Servername.database.windows.net"
$Username = "username"
$Password = "pwd!"
$emailFrom = "EmailFrom@ZYX.com"
$emailTo = "EmailTo@XYZ.com"
$smtpServer = "smtpservername"
$smtpUsername = "smtpusername"
$smtpPassword = "smtppassword"
$smtpPort = 25
$ConnectionString = "Server=$Server;Database=$Database;User Id=$Username;Password=$Password;"
# Last check date (watermark file); default to the minimum date on the first run
$LastCheckFile = "C:\temp\LastCheck.txt"
$LastCheck = Get-Content $LastCheckFile -ErrorAction SilentlyContinue
if (!$LastCheck) {
    $LastCheck = ([DateTime]::MinValue).ToString("yyyy-MM-ddTHH:mm:ss")
}
# SQL query: read the ring buffer and keep only events newer than the watermark
$Query = @"
SELECT
    n.value('(@timestamp)[1]', 'datetime2') AS TimeStamp,
    n.value('(action[@name="client_app_name"]/value)[1]', 'varchar(max)') AS Application,
    n.value('(action[@name="username"]/value)[1]', 'varchar(max)') AS Username,
    n.value('(action[@name="client_hostname"]/value)[1]', 'varchar(max)') AS HostName,
    n.value('(action[@name="session_id"]/value)[1]', 'int') AS SessionID
FROM
    (SELECT CAST(target_data AS xml) AS event_data
     FROM sys.dm_xe_database_session_targets
     WHERE event_session_address =
           (SELECT address FROM sys.dm_xe_database_sessions WHERE name = 'Track_SSMS_Logins')
       AND target_name = 'ring_buffer') AS tab
CROSS APPLY event_data.nodes('/RingBufferTarget/event') AS q(n)
WHERE
    n.value('(@timestamp)[1]', 'datetime2') > '$LastCheck'
"@
# Create and open SQL connection
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = $ConnectionString
$SqlConnection.Open()
# Create SQL command
$SqlCommand = $SqlConnection.CreateCommand()
$SqlCommand.CommandText = $Query
# Execute SQL command and load the results into a DataSet
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter $SqlCommand
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet) | Out-Null
$SqlConnection.Close()
# Process the results
$Results = $DataSet.Tables[0]
# Check for new events
if ($Results.Rows.Count -gt 0) {
    # Prepare email content with the details of each captured event
    $EmailBody = $Results | Out-String
    $smtp = New-Object Net.Mail.SmtpClient($smtpServer, $smtpPort)
    $smtp.EnableSsl = $true
    $smtp.Credentials = New-Object System.Net.NetworkCredential($smtpUsername, $smtpPassword)
    $mailMessage = New-Object Net.Mail.MailMessage($emailFrom, $emailTo)
    $mailMessage.Subject = "Alert: SQL Access in database $Database"
    $mailMessage.Body = "SQL Access Alert in database $Database on server $Server since $LastCheck.`r`n`r`n$EmailBody"
    $smtp.Send($mailMessage)
    # Save the current timestamp for the next check
    Get-Date -Format "o" | Out-File $LastCheckFile
}
# Remember to schedule this script to run every 5 minutes using Windows Task Scheduler
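The heart of the script's incremental behavior is the watermark file: each run only reports events captured since the previous run. A language-neutral sketch of that pattern in Python, with an illustrative file name and ISO-8601 timestamps as assumptions:

```python
from datetime import datetime, timezone
from pathlib import Path

# Watermark pattern: persist the time of the last check in a small file so
# each scheduled run only alerts on events newer than the previous run.

def load_last_check(path):
    """Read the saved watermark; on the first run (or a corrupt file), take everything."""
    try:
        return datetime.fromisoformat(path.read_text().strip())
    except (FileNotFoundError, ValueError):
        return datetime.min.replace(tzinfo=timezone.utc)

def save_last_check(now, path):
    path.write_text(now.isoformat())

def new_events(events, last_check):
    """Keep only events strictly newer than the watermark."""
    return [e for e in events if e["timestamp"] > last_check]
```

One caveat worth noting, which applies to the PowerShell script as well: the watermark is only advanced after events are found and mailed, so the same quiet interval may be re-scanned on consecutive runs; that is harmless here because an empty result set sends no mail.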
Of course, using SQL Auditing or Log Analytics would be another alternative.
Microsoft Tech Community – Latest Blogs – Read More
Lesson Learned #474:Identifying and Preventing Unauthorized Application Access to Azure SQL Database
In recent scenarios encountered with our customers, we have come across a specific need: restricting certain users from using SQL Server Management Studio (SSMS) or other applications to connect to a designated database in Azure SQL Database. A common solution in traditional SQL Server environments, like the use of LOGIN TRIGGERS, is not available in Azure SQL Database. This limitation poses a unique challenge in database management and security.
To address this challenge, I’d like to share an alternative that combines the power of Extended Events in Azure SQL Database with PowerShell scripting. This method effectively captures and monitors login events, providing administrators with timely alerts whenever a specified user connects to the database using a prohibited application, such as SSMS.
How It Works
Extended Events Setup: We start by setting up an Extended Event in Azure SQL Database. This event is configured to capture login activities, specifically focusing on the application name used for the connection. By filtering for certain applications (like SSMS), we can track unauthorized access attempts.
PowerShell Script: A PowerShell script is then employed to query these captured events at regular intervals. This script connects to the Azure SQL Database, retrieves the relevant event data, and checks for any instances where the specified users have connected via the restricted applications.
Email Alerts: Upon detecting such an event, the PowerShell script automatically sends an email notification to the database administrator. This alert contains details of the unauthorized login attempt, such as the timestamp, username, and application used. This prompt information allows the administrator to take immediate corrective measures.
Advantages
Proactive Monitoring: This approach provides continuous monitoring of the database connections, ensuring that any unauthorized access is quickly detected and reported.
Customizable: The method is highly customizable. Administrators can specify which applications to monitor and can easily adjust the script to cater to different user groups or connection parameters.
No Direct Blocking: While this method does not directly block the connection, it provides immediate alerts, enabling administrators to react swiftly to enforce compliance and security protocols.
This article provides a high-level overview of how to implement this solution. For detailed steps and script examples, administrators are encouraged to tailor the approach to their specific environment and requirements.
Extended Event
CREATE EVENT SESSION Track_SSMS_Logins
ON DATABASE
ADD EVENT sqlserver.sql_batch_starting(
ACTION(sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.username, sqlserver.session_id)
WHERE (sqlserver.client_app_name LIKE ‘%Management Studio%’)
)
ADD TARGET package0.ring_buffer
(SET max_events_limit = 1000, max_memory = 4096)
WITH (EVENT_RETENTION_MODE = NO_EVENT_LOSS, MAX_DISPATCH_LATENCY = 5 SECONDS);
GO
ALTER EVENT SESSION Track_SSMS_Logins ON DATABASE STATE = START;
Query to run using ring buffers
SELECT
n.value(‘(@timestamp)[1]’, ‘datetime2’) AS TimeStamp,
n.value(‘(action[@name=”client_app_name”]/value)[1]’, ‘varchar(max)’) AS Application,
n.value(‘(action[@name=”username”]/value)[1]’, ‘varchar(max)’) AS Username,
n.value(‘(action[@name=”client_hostname”]/value)[1]’, ‘varchar(max)’) AS HostName,
n.value(‘(action[@name=”session_id”]/value)[1]’, ‘int’) AS SessionID
FROM
(SELECT CAST(target_data AS xml) AS event_data
FROM sys.dm_xe_database_session_targets
WHERE event_session_address =
(SELECT address FROM sys.dm_xe_database_sessions WHERE name = ‘Track_SSMS_Logins’)
AND target_name = ‘ring_buffer’) AS tab
CROSS APPLY event_data.nodes(‘/RingBufferTarget/event’) AS q(n);
Powershell Script
# Connection configuration
$Database = “DBNAme”
$Server = “Servername.database.windows.net”
$Username = “username”
$Password = “pwd!”
$emailFrom = “EmailFrom@ZYX.com”
$emailTo = “EmailTo@XYZ.com”
$smtpServer = “smtpservername”
$smtpUsername = “smtpusername”
$smtpPassword = “smtppassword”
$smtpPort=25
$ConnectionString = “Server=$Server;Database=$Database;User Id=$Username;Password=$Password;”
# Last check date
$LastCheckFile = “c:tempLastCheck.txt”
$LastCheck = Get-Content $LastCheckFile -ErrorAction SilentlyContinue
if (!$LastCheck) {
$LastCheck = [DateTime]::MinValue
}
# SQL query
$Query = @”
SELECT
n.value(‘(@timestamp)[1]’, ‘datetime2’) AS TimeStamp,
n.value(‘(action[@name=”client_app_name”]/value)[1]’, ‘varchar(max)’) AS Application,
n.value(‘(action[@name=”username”]/value)[1]’, ‘varchar(max)’) AS Username,
n.value(‘(action[@name=”client_hostname”]/value)[1]’, ‘varchar(max)’) AS HostName,
n.value(‘(action[@name=”session_id”]/value)[1]’, ‘int’) AS SessionID
FROM
(SELECT CAST(target_data AS xml) AS event_data
FROM sys.dm_xe_database_session_targets
WHERE event_session_address =
(SELECT address FROM sys.dm_xe_database_sessions WHERE name = ‘Track_SSMS_Logins’)
AND target_name = ‘ring_buffer’) AS tab
CROSS APPLY event_data.nodes(‘/RingBufferTarget/event’) AS q(n)
WHERE
n.value(‘(@timestamp)[1]’, ‘datetime2’) > ‘$LastCheck’
“@
# Create and open SQL connection
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = $ConnectionString
$SqlConnection.Open()
# Create SQL command
$SqlCommand = $SqlConnection.CreateCommand()
$SqlCommand.CommandText = $Query
# Execute SQL command
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter $SqlCommand
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$SqlConnection.Close()
# Process the results
$Results = $DataSet.Tables[0]
# Check for new events
if ($Results.Rows.Count -gt 0) {
# Prepare email content
$EmailBody = $Results | Out-String
$smtp = New-Object Net.Mail.SmtpClient($smtpServer, $smtpPort)
$smtp.EnableSsl = $true
$smtp.Credentials = New-Object System.Net.NetworkCredential($smtpUsername, $smtpPassword)
$mailMessage = New-Object Net.Mail.MailMessage($emailFrom, $emailTo)
$mailMessage.Subject = “Alert: SQL Access in database $Database”
$mailMessage.Body = “SQL Access Alert in database $Database on server $Server at $LastCheck.”
$smtp.Send($EmailBody)
# Save the current timestamp for the next check
Get-Date -Format “o” | Out-File $LastCheckFile
}
# Remember to schedule this script to run every 5 minutes using Windows Task Scheduler
Of course, that using SQL auditing o Log analytics will be another alternative.
Microsoft Tech Community – Latest Blogs –Read More
Lesson Learned #474:Identifying and Preventing Unauthorized Application Access to Azure SQL Database
In recent scenarios encountered with our customers, we have come across a specific need: restricting certain users from using SQL Server Management Studio (SSMS) or other applications to connect to a designated database in Azure SQL Database. A common solution in traditional SQL Server environments, like the use of LOGIN TRIGGERS, is not available in Azure SQL Database. This limitation poses a unique challenge in database management and security.
To address this challenge, I’d like to share an alternative that combines the power of Extended Events in Azure SQL Database with PowerShell scripting. This method effectively captures and monitors login events, providing administrators with timely alerts whenever a specified user connects to the database using a prohibited application, such as SSMS.
How It Works
Extended Events Setup: We start by setting up an Extended Event in Azure SQL Database. This event is configured to capture login activities, specifically focusing on the application name used for the connection. By filtering for certain applications (like SSMS), we can track unauthorized access attempts.
PowerShell Script: A PowerShell script is then employed to query these captured events at regular intervals. This script connects to the Azure SQL Database, retrieves the relevant event data, and checks for any instances where the specified users have connected via the restricted applications.
Email Alerts: Upon detecting such an event, the PowerShell script automatically sends an email notification to the database administrator. This alert contains details of the unauthorized login attempt, such as the timestamp, username, and application used. This prompt information allows the administrator to take immediate corrective measures.
Advantages
Proactive Monitoring: This approach provides continuous monitoring of the database connections, ensuring that any unauthorized access is quickly detected and reported.
Customizable: The method is highly customizable. Administrators can specify which applications to monitor and can easily adjust the script to cater to different user groups or connection parameters.
No Direct Blocking: While this method does not directly block the connection, it provides immediate alerts, enabling administrators to react swiftly to enforce compliance and security protocols.
This article provides a high-level overview of how to implement this solution. For detailed steps and script examples, administrators are encouraged to tailor the approach to their specific environment and requirements.
Extended Event
CREATE EVENT SESSION Track_SSMS_Logins
ON DATABASE
ADD EVENT sqlserver.sql_batch_starting(
ACTION(sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.username, sqlserver.session_id)
WHERE (sqlserver.client_app_name LIKE ‘%Management Studio%’)
)
ADD TARGET package0.ring_buffer
(SET max_events_limit = 1000, max_memory = 4096)
WITH (EVENT_RETENTION_MODE = NO_EVENT_LOSS, MAX_DISPATCH_LATENCY = 5 SECONDS);
GO
ALTER EVENT SESSION Track_SSMS_Logins ON DATABASE STATE = START;
Query to run using ring buffers
SELECT
n.value(‘(@timestamp)[1]’, ‘datetime2’) AS TimeStamp,
n.value(‘(action[@name=”client_app_name”]/value)[1]’, ‘varchar(max)’) AS Application,
n.value(‘(action[@name=”username”]/value)[1]’, ‘varchar(max)’) AS Username,
n.value(‘(action[@name=”client_hostname”]/value)[1]’, ‘varchar(max)’) AS HostName,
n.value(‘(action[@name=”session_id”]/value)[1]’, ‘int’) AS SessionID
FROM
(SELECT CAST(target_data AS xml) AS event_data
FROM sys.dm_xe_database_session_targets
WHERE event_session_address =
(SELECT address FROM sys.dm_xe_database_sessions WHERE name = ‘Track_SSMS_Logins’)
AND target_name = ‘ring_buffer’) AS tab
CROSS APPLY event_data.nodes(‘/RingBufferTarget/event’) AS q(n);
Powershell Script
# Connection configuration
$Database = “DBNAme”
$Server = “Servername.database.windows.net”
$Username = “username”
$Password = “pwd!”
$emailFrom = “EmailFrom@ZYX.com”
$emailTo = “EmailTo@XYZ.com”
$smtpServer = “smtpservername”
$smtpUsername = “smtpusername”
$smtpPassword = “smtppassword”
$smtpPort=25
$ConnectionString = “Server=$Server;Database=$Database;User Id=$Username;Password=$Password;”
# Last check date
$LastCheckFile = “c:tempLastCheck.txt”
$LastCheck = Get-Content $LastCheckFile -ErrorAction SilentlyContinue
if (!$LastCheck) {
$LastCheck = [DateTime]::MinValue
}
# SQL query
$Query = @”
SELECT
n.value(‘(@timestamp)[1]’, ‘datetime2’) AS TimeStamp,
n.value(‘(action[@name=”client_app_name”]/value)[1]’, ‘varchar(max)’) AS Application,
n.value(‘(action[@name=”username”]/value)[1]’, ‘varchar(max)’) AS Username,
n.value(‘(action[@name=”client_hostname”]/value)[1]’, ‘varchar(max)’) AS HostName,
n.value(‘(action[@name=”session_id”]/value)[1]’, ‘int’) AS SessionID
FROM
(SELECT CAST(target_data AS xml) AS event_data
FROM sys.dm_xe_database_session_targets
WHERE event_session_address =
(SELECT address FROM sys.dm_xe_database_sessions WHERE name = ‘Track_SSMS_Logins’)
AND target_name = ‘ring_buffer’) AS tab
CROSS APPLY event_data.nodes(‘/RingBufferTarget/event’) AS q(n)
WHERE
n.value(‘(@timestamp)[1]’, ‘datetime2’) > ‘$LastCheck’
“@
# Create and open SQL connection
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = $ConnectionString
$SqlConnection.Open()
# Create SQL command
$SqlCommand = $SqlConnection.CreateCommand()
$SqlCommand.CommandText = $Query
# Execute SQL command
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter $SqlCommand
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$SqlConnection.Close()
# Process the results
$Results = $DataSet.Tables[0]
# Check for new events
if ($Results.Rows.Count -gt 0) {
# Prepare email content
$EmailBody = $Results | Out-String
$smtp = New-Object Net.Mail.SmtpClient($smtpServer, $smtpPort)
$smtp.EnableSsl = $true
$smtp.Credentials = New-Object System.Net.NetworkCredential($smtpUsername, $smtpPassword)
$mailMessage = New-Object Net.Mail.MailMessage($emailFrom, $emailTo)
$mailMessage.Subject = “Alert: SQL Access in database $Database”
$mailMessage.Body = “SQL Access Alert in database $Database on server $Server at $LastCheck.”
$smtp.Send($EmailBody)
# Save the current timestamp for the next check
Get-Date -Format “o” | Out-File $LastCheckFile
}
# Remember to schedule this script to run every 5 minutes using Windows Task Scheduler
Of course, that using SQL auditing o Log analytics will be another alternative.
Microsoft Tech Community – Latest Blogs –Read More
Lesson Learned #474:Identifying and Preventing Unauthorized Application Access to Azure SQL Database
In recent scenarios encountered with our customers, we have come across a specific need: restricting certain users from using SQL Server Management Studio (SSMS) or other applications to connect to a designated database in Azure SQL Database. A common solution in traditional SQL Server environments, like the use of LOGIN TRIGGERS, is not available in Azure SQL Database. This limitation poses a unique challenge in database management and security.
To address this challenge, I’d like to share an alternative that combines the power of Extended Events in Azure SQL Database with PowerShell scripting. This method effectively captures and monitors login events, providing administrators with timely alerts whenever a specified user connects to the database using a prohibited application, such as SSMS.
How It Works
Extended Events Setup: We start by setting up an Extended Event in Azure SQL Database. This event is configured to capture login activities, specifically focusing on the application name used for the connection. By filtering for certain applications (like SSMS), we can track unauthorized access attempts.
PowerShell Script: A PowerShell script is then employed to query these captured events at regular intervals. This script connects to the Azure SQL Database, retrieves the relevant event data, and checks for any instances where the specified users have connected via the restricted applications.
Email Alerts: Upon detecting such an event, the PowerShell script automatically sends an email notification to the database administrator. This alert contains details of the unauthorized login attempt, such as the timestamp, username, and application used. This prompt information allows the administrator to take immediate corrective measures.
Advantages
Proactive Monitoring: This approach provides continuous monitoring of the database connections, ensuring that any unauthorized access is quickly detected and reported.
Customizable: The method is highly customizable. Administrators can specify which applications to monitor and can easily adjust the script to cater to different user groups or connection parameters.
No Direct Blocking: While this method does not directly block the connection, it provides immediate alerts, enabling administrators to react swiftly to enforce compliance and security protocols.
This article provides a high-level overview of how to implement this solution. For detailed steps and script examples, administrators are encouraged to tailor the approach to their specific environment and requirements.
Extended Event
CREATE EVENT SESSION Track_SSMS_Logins
ON DATABASE
ADD EVENT sqlserver.sql_batch_starting(
ACTION(sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.username, sqlserver.session_id)
WHERE (sqlserver.client_app_name LIKE ‘%Management Studio%’)
)
ADD TARGET package0.ring_buffer
(SET max_events_limit = 1000, max_memory = 4096)
WITH (EVENT_RETENTION_MODE = NO_EVENT_LOSS, MAX_DISPATCH_LATENCY = 5 SECONDS);
GO
ALTER EVENT SESSION Track_SSMS_Logins ON DATABASE STATE = START;
Query to run using ring buffers
SELECT
n.value(‘(@timestamp)[1]’, ‘datetime2’) AS TimeStamp,
n.value(‘(action[@name=”client_app_name”]/value)[1]’, ‘varchar(max)’) AS Application,
n.value(‘(action[@name=”username”]/value)[1]’, ‘varchar(max)’) AS Username,
n.value(‘(action[@name=”client_hostname”]/value)[1]’, ‘varchar(max)’) AS HostName,
n.value(‘(action[@name=”session_id”]/value)[1]’, ‘int’) AS SessionID
FROM
(SELECT CAST(target_data AS xml) AS event_data
FROM sys.dm_xe_database_session_targets
WHERE event_session_address =
(SELECT address FROM sys.dm_xe_database_sessions WHERE name = ‘Track_SSMS_Logins’)
AND target_name = ‘ring_buffer’) AS tab
CROSS APPLY event_data.nodes(‘/RingBufferTarget/event’) AS q(n);
Powershell Script
# Connection configuration
$Database = “DBNAme”
$Server = “Servername.database.windows.net”
$Username = “username”
$Password = “pwd!”
$emailFrom = “EmailFrom@ZYX.com”
$emailTo = “EmailTo@XYZ.com”
$smtpServer = “smtpservername”
$smtpUsername = “smtpusername”
$smtpPassword = “smtppassword”
$smtpPort=25
$ConnectionString = “Server=$Server;Database=$Database;User Id=$Username;Password=$Password;”
# Last check date
$LastCheckFile = “c:tempLastCheck.txt”
$LastCheck = Get-Content $LastCheckFile -ErrorAction SilentlyContinue
if (!$LastCheck) {
$LastCheck = [DateTime]::MinValue
}
# SQL query
$Query = @”
SELECT
n.value(‘(@timestamp)[1]’, ‘datetime2’) AS TimeStamp,
n.value(‘(action[@name=”client_app_name”]/value)[1]’, ‘varchar(max)’) AS Application,
n.value(‘(action[@name=”username”]/value)[1]’, ‘varchar(max)’) AS Username,
n.value(‘(action[@name=”client_hostname”]/value)[1]’, ‘varchar(max)’) AS HostName,
n.value(‘(action[@name=”session_id”]/value)[1]’, ‘int’) AS SessionID
FROM
(SELECT CAST(target_data AS xml) AS event_data
FROM sys.dm_xe_database_session_targets
WHERE event_session_address =
(SELECT address FROM sys.dm_xe_database_sessions WHERE name = ‘Track_SSMS_Logins’)
AND target_name = ‘ring_buffer’) AS tab
CROSS APPLY event_data.nodes(‘/RingBufferTarget/event’) AS q(n)
WHERE
n.value(‘(@timestamp)[1]’, ‘datetime2’) > ‘$LastCheck’
“@
# Create and open SQL connection
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = $ConnectionString
$SqlConnection.Open()
# Create SQL command
$SqlCommand = $SqlConnection.CreateCommand()
$SqlCommand.CommandText = $Query
# Execute SQL command
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter $SqlCommand
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$SqlConnection.Close()
# Process the results
$Results = $DataSet.Tables[0]
# Check for new events
if ($Results.Rows.Count -gt 0) {
# Prepare email content
$EmailBody = $Results | Out-String
$smtp = New-Object Net.Mail.SmtpClient($smtpServer, $smtpPort)
$smtp.EnableSsl = $true
$smtp.Credentials = New-Object System.Net.NetworkCredential($smtpUsername, $smtpPassword)
$mailMessage = New-Object Net.Mail.MailMessage($emailFrom, $emailTo)
$mailMessage.Subject = “Alert: SQL Access in database $Database”
$mailMessage.Body = “SQL Access Alert in database $Database on server $Server at $LastCheck.”
$smtp.Send($EmailBody)
# Save the current timestamp for the next check
Get-Date -Format “o” | Out-File $LastCheckFile
}
# Remember to schedule this script to run every 5 minutes using Windows Task Scheduler
Of course, that using SQL auditing o Log analytics will be another alternative.
Microsoft Tech Community – Latest Blogs –Read More
Lesson Learned #474:Identifying and Preventing Unauthorized Application Access to Azure SQL Database
In recent scenarios encountered with our customers, we have come across a specific need: restricting certain users from using SQL Server Management Studio (SSMS) or other applications to connect to a designated database in Azure SQL Database. A common solution in traditional SQL Server environments, like the use of LOGIN TRIGGERS, is not available in Azure SQL Database. This limitation poses a unique challenge in database management and security.
To address this challenge, I’d like to share an alternative that combines the power of Extended Events in Azure SQL Database with PowerShell scripting. This method effectively captures and monitors login events, providing administrators with timely alerts whenever a specified user connects to the database using a prohibited application, such as SSMS.
How It Works
Extended Events Setup: We start by setting up an Extended Event in Azure SQL Database. This event is configured to capture login activities, specifically focusing on the application name used for the connection. By filtering for certain applications (like SSMS), we can track unauthorized access attempts.
PowerShell Script: A PowerShell script is then employed to query these captured events at regular intervals. This script connects to the Azure SQL Database, retrieves the relevant event data, and checks for any instances where the specified users have connected via the restricted applications.
Email Alerts: Upon detecting such an event, the PowerShell script automatically sends an email notification to the database administrator. This alert contains details of the unauthorized login attempt, such as the timestamp, username, and application used. This prompt information allows the administrator to take immediate corrective measures.
Advantages
Proactive Monitoring: This approach provides continuous monitoring of the database connections, ensuring that any unauthorized access is quickly detected and reported.
Customizable: The method is highly customizable. Administrators can specify which applications to monitor and can easily adjust the script to cater to different user groups or connection parameters.
No Direct Blocking: While this method does not directly block the connection, it provides immediate alerts, enabling administrators to react swiftly to enforce compliance and security protocols.
This article provides a high-level overview of how to implement this solution. For detailed steps and script examples, administrators are encouraged to tailor the approach to their specific environment and requirements.
Extended Event
CREATE EVENT SESSION Track_SSMS_Logins
ON DATABASE
ADD EVENT sqlserver.sql_batch_starting(
ACTION(sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.username, sqlserver.session_id)
WHERE (sqlserver.client_app_name LIKE ‘%Management Studio%’)
)
ADD TARGET package0.ring_buffer
(SET max_events_limit = 1000, max_memory = 4096)
WITH (EVENT_RETENTION_MODE = NO_EVENT_LOSS, MAX_DISPATCH_LATENCY = 5 SECONDS);
GO
ALTER EVENT SESSION Track_SSMS_Logins ON DATABASE STATE = START;
Query to run using ring buffers
SELECT
n.value(‘(@timestamp)[1]’, ‘datetime2’) AS TimeStamp,
n.value(‘(action[@name=”client_app_name”]/value)[1]’, ‘varchar(max)’) AS Application,
n.value(‘(action[@name=”username”]/value)[1]’, ‘varchar(max)’) AS Username,
n.value(‘(action[@name=”client_hostname”]/value)[1]’, ‘varchar(max)’) AS HostName,
n.value(‘(action[@name=”session_id”]/value)[1]’, ‘int’) AS SessionID
FROM
(SELECT CAST(target_data AS xml) AS event_data
FROM sys.dm_xe_database_session_targets
WHERE event_session_address =
(SELECT address FROM sys.dm_xe_database_sessions WHERE name = ‘Track_SSMS_Logins’)
AND target_name = ‘ring_buffer’) AS tab
CROSS APPLY event_data.nodes(‘/RingBufferTarget/event’) AS q(n);
PowerShell Script
# Connection configuration
$Database = "DBName"
$Server = "Servername.database.windows.net"
$Username = "username"
$Password = "pwd!"
$emailFrom = "EmailFrom@ZYX.com"
$emailTo = "EmailTo@XYZ.com"
$smtpServer = "smtpservername"
$smtpUsername = "smtpusername"
$smtpPassword = "smtppassword"
$smtpPort = 25
$ConnectionString = "Server=$Server;Database=$Database;User Id=$Username;Password=$Password;"
# Last check date
$LastCheckFile = "C:\temp\LastCheck.txt"
$LastCheck = Get-Content $LastCheckFile -ErrorAction SilentlyContinue
if (!$LastCheck) {
    $LastCheck = [DateTime]::MinValue
}
# SQL query
$Query = @"
SELECT
    n.value('(@timestamp)[1]', 'datetime2') AS TimeStamp,
    n.value('(action[@name="client_app_name"]/value)[1]', 'varchar(max)') AS Application,
    n.value('(action[@name="username"]/value)[1]', 'varchar(max)') AS Username,
    n.value('(action[@name="client_hostname"]/value)[1]', 'varchar(max)') AS HostName,
    n.value('(action[@name="session_id"]/value)[1]', 'int') AS SessionID
FROM
    (SELECT CAST(target_data AS xml) AS event_data
     FROM sys.dm_xe_database_session_targets
     WHERE event_session_address =
         (SELECT address FROM sys.dm_xe_database_sessions WHERE name = 'Track_SSMS_Logins')
     AND target_name = 'ring_buffer') AS tab
CROSS APPLY event_data.nodes('/RingBufferTarget/event') AS q(n)
WHERE
    n.value('(@timestamp)[1]', 'datetime2') > '$LastCheck'
"@
# Create and open SQL connection
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = $ConnectionString
$SqlConnection.Open()
# Create SQL command
$SqlCommand = $SqlConnection.CreateCommand()
$SqlCommand.CommandText = $Query
# Execute SQL command
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter $SqlCommand
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$SqlConnection.Close()
# Process the results
$Results = $DataSet.Tables[0]
# Check for new events
if ($Results.Rows.Count -gt 0) {
    # Prepare email content
    $EmailBody = $Results | Out-String
    $smtp = New-Object Net.Mail.SmtpClient($smtpServer, $smtpPort)
    $smtp.EnableSsl = $true
    $smtp.Credentials = New-Object System.Net.NetworkCredential($smtpUsername, $smtpPassword)
    $mailMessage = New-Object Net.Mail.MailMessage($emailFrom, $emailTo)
    $mailMessage.Subject = "Alert: SQL Access in database $Database"
    $mailMessage.Body = "SQL Access Alert in database $Database on server $Server at $LastCheck.`n$EmailBody"
    # Send the MailMessage object (not the body string)
    $smtp.Send($mailMessage)
    # Save the current timestamp for the next check
    Get-Date -Format "o" | Out-File $LastCheckFile
}
# Remember to schedule this script to run every 5 minutes using Windows Task Scheduler
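The comment above mentions Windows Task Scheduler; as a sketch, the registration could look like the following (the task name and script path are placeholder values, and this assumes the ScheduledTasks module that ships with Windows 8 / Windows Server 2012 and later):

```powershell
# Sketch: run the monitoring script every 5 minutes via Task Scheduler
# "Track_SSMS_Logins_Check" and the script path are placeholder values
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\temp\Check-SqlLogins.ps1"
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Minutes 5)
Register-ScheduledTask -TaskName "Track_SSMS_Logins_Check" -Action $action -Trigger $trigger
```

Run this once from an elevated PowerShell session; the task then re-runs the script on the 5-minute interval.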
Of course, using SQL Auditing or Log Analytics would be another alternative.
Microsoft Tech Community – Latest Blogs – Read More
Validate your skills with our new certification for Microsoft Fabric Analytics Engineers
We’re looking for Microsoft Fabric Analytics Engineers to take our new beta exam. Do you have subject matter expertise in designing, creating, and deploying enterprise-scale data analytics solutions? If so, and if you know how to transform data into reusable analytics assets by using Microsoft Fabric components, such as lakehouses, data warehouses, notebooks, dataflows, data pipelines, semantic models, and reports, be sure to check out this exam. Other helpful qualifications include the ability to implement analytics best practices in Fabric, including version control and deployment.
If this is your skill set, we have a new certification for you. The Microsoft Certified: Fabric Analytics Engineer Associate certification validates your expertise in this area and offers you the opportunity to prove your skills. To earn this certification, pass Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric, currently in beta.
Is this the right certification for you?
This certification could be a great fit if you have in-depth familiarity with the Fabric solution and you have experience with data modeling, data transformation, Git-based source control, exploratory analytics, and languages, including Structured Query Language (SQL), Data Analysis Expressions (DAX), and PySpark.
Review the Exam DP-600 (beta) page for details, and check out the self-paced learning paths and instructor-led training there. The Exam DP-600 study guide alerts you to key topics covered on the exam.
Ready to prove your skills?
Take advantage of the discounted beta exam offer. The first 300 people who take Exam DP-600 (beta) on or before January 25, 2024, can get 80 percent off market price.
To receive the discount, when you register for the exam and are prompted for payment, use code DP600Winfield. This is not a private access code. The seats are offered on a first-come, first-served basis. As noted, you must take the exam on or before January 25, 2024. Please note that this beta exam is not available in Turkey, Pakistan, India, or China.
The rescore process starts on the day an exam goes live, 8 to 12 weeks after the beta period ends, and final scores for beta exams are released approximately 10 days after that. For details on the timing of beta exam rescoring and results, read my post Creating high-quality exams: The path from beta to live.
Get ready to take Exam DP-600 (beta)
Explore the Fabric Career Hub. Access live training, skills challenges, group learning, and career insights.
Join the Fabric Cloud Skills Challenge. Complete all modules in the challenge within 30 days and become eligible for 50% off the cost of a Microsoft Certification exam. This 50% discount can’t be used toward the Exam DP-600 (beta). If you miss the beta period, you can use it later once the exam goes live or for another live certification exam.
Looking for in-depth training? Check out the new course Microsoft Fabric Analytics Engineer. Connect with Microsoft Training Services Partners in your area for in-person training.
Need other preparation ideas? Check out my blog post Just How Does One Prepare for Beta Exams?
Did you know that you can take any role-based exam online? Online delivered exams—taken from your home or office—can be less hassle, less stress, and even less worry than traveling to a test center, especially if you’re adequately prepared for what to expect. To find out more, check out my blog post Online proctored exams: What to expect and how to prepare.
Ready to get started?
Remember, the number of spots for the discounted beta exam offer is limited to the first 300 candidates taking Exam DP-600 (beta) on or before January 25, 2024.
Related announcements
9 ways Microsoft Learn helps you with the skills-first economy
Introducing a new resource for all role-based Microsoft Certification exams
Microsoft Learn: Four key features to help expand your knowledge and advance your career
Meet learners who changed their career with the help of Microsoft Learn
Introducing Automatic File and URL (Detonation) Analysis
The Microsoft Defender Threat Intelligence (MDTI) team continuously adds new threat intelligence capabilities to MDTI and Defender XDR, giving customers new ways to hunt, research, and contextualize threats.
Today, we are excited to share a new feature that enhances our file and URL analysis (detonation) capabilities in the threat intelligence blade within the Defender XDR user interface. If MDTI cannot return any results when a customer searches for a file or URL, MDTI now automatically detonates it to improve search coverage and add to our corpus of knowledge of the global threat landscape.
Here’s how it works:
The detonation request for the searched file or URL entity is processed asynchronously in the background in the United States region.
If the end user is not served with reputation and detonation results at the time of the search request, a subsequent search request for the same entity is initiated in the background.
Although there are no fixed SLAs regarding the volume and availability of the auto-detonated results, we aim to provide the results within 2 hours, depending on the load.
Next time you search and don’t find anything, don’t worry. The system is working in the background to give you better results later!
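The search-then-retry pattern described above can be sketched as a simple poll-until-ready loop. This is a generic illustration only, not the MDTI API; `lookup_reputation` and the result shape are hypothetical stand-ins for whatever interface you query.

```python
import time

# Hypothetical stand-in for a reputation lookup; the real service is queried
# through the Defender XDR UI. Returns None until background detonation
# has produced results for the entity.
def lookup_reputation(entity: str, store: dict):
    return store.get(entity)

def search_with_retry(entity: str, store: dict, attempts: int = 3, delay: float = 0.0):
    """An empty first result simply means detonation is still running in the
    background, so re-query the same entity later."""
    for _ in range(attempts):
        result = lookup_reputation(entity, store)
        if result is not None:
            return result
        time.sleep(delay)  # in practice, wait minutes to ~2 hours between checks
    return None

# Simulate the backend finishing detonation between queries.
store = {}
print(search_with_retry("https://example.test/sample", store))   # None (still detonating)
store["https://example.test/sample"] = {"verdict": "malicious"}
print(search_with_retry("https://example.test/sample", store))   # {'verdict': 'malicious'}
```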
Next steps
Whether you are just kick-starting a threat intelligence program or looking to augment your existing threat intelligence toolset, the MDTI standard version can add critical context to your existing security investigations, keep your organization informed on current threats through leading research and intel profiles, provide crucial brand intelligence, and help you to collect powerful threat intelligence associated with your organization or others in your industry – all free of charge.
To learn more about how you and your organization can leverage MDTI, watch our overview video and follow our “Become an MDTI Ninja” training path today.
December ’23 Monthly M365 Webinar – Microsoft Collaboration Framework
Dan Carroll and Richard Wakeman supported a great discussion around the Microsoft Collaboration Framework and explored the current state of collaboration capabilities across your partner ecosystem.
Recording here: https://www.microsoft.com/en-us/videoplayer/embed/RW1g6Zt
All GCCH M365 Webinar Recordings Here!
December ’23, Microsoft Collaboration Framework Office Hours – https://www.microsoft.com/en-us/videoplayer/embed/RW1g6Zt
October ’23, New Channels Experience and New Webhook Connector in Teams – https://www.microsoft.com/en-us/videoplayer/embed/RW1dYng
September ’23, New Teams App – https://www.microsoft.com/en-us/videoplayer/embed/RW1c5kr
August ’23, Teams Phone Device Update and Cross-Cloud Teams Collaboration Capabilities – https://www.microsoft.com/en-us/videoplayer/embed/RW1aCjk
June ’23, Demystifying Task Management in M365 – https://www.microsoft.com/en-us/videoplayer/embed/RW16V80
June ’23, Teams Premium – https://www.microsoft.com/en-us/videoplayer/embed/RW16yZi
May ’23, M365 Search and Search Analytics, Teams Panels – https://www.microsoft.com/en-us/videoplayer/embed/RW14RyF
April ’23, Company Communicator and Viva Connections – https://www.microsoft.com/en-us/videoplayer/embed/RW12prs
March ’23, Cross-Cloud Collaboration and Viva Personal Insights – https://www.microsoft.com/en-us/videoplayer/embed/RW10qiX
February ’23, 1st and 3rd Party App Teams Integration Update, Learning Pathways, and New and Coming Features – https://www.microsoft.com/en-us/videoplayer/embed/RWXhnb
January ’23, Teams as a Platform Overview (intranet concept), OneDrive session #2, and @mention functionality – https://www.microsoft.com/en-us/videoplayer/embed/RWWURs
December ’22, Cross-Cloud Collaboration Overview, OneDrive for Sharing and Teams Integration, Teams Phone and CQD Update – https://www.microsoft.com/en-us/videoplayer/embed/RE5dWYA
November ’22, App Integration with Teams, Teams Meeting Options, Microsoft Whiteboard – https://www.microsoft.com/en-us/videoplayer/embed/RE5dkYn
Announcing Public Preview of Confidential VMs with Intel TDX in Azure Virtual Desktop
We are excited to announce that Azure Virtual Desktop now supports the public preview of DCesv5 and ECesv5-series confidential VMs. These confidential VMs are powered by 4th Gen Intel® Xeon® Scalable processors with Intel® Trust Domain Extensions (Intel® TDX) and enable organizations to bring confidential workloads to the cloud without code changes to applications. Through the gated preview, we continued to enhance performance with our Intel partnership. These new virtual machines are up to 20% faster than 3rd Gen Intel Xeon virtual machines, and we expect performance for I/O intensive workloads to continue to improve as the technology matures.
Azure confidential VMs (CVMs) offer VM memory encryption with integrity protection, which strengthens guest protections to deny the hypervisor and other host management components code access to the VM memory and state. For additional CVM security benefits, please see the CVM documentation for more information.
For more information on AVD’s support for confidential VMs, please see this blog.
For more information about Intel TDX confidential VMs, please see this blog.
Note: Intel TDX is offered in Europe West, Central US, and East US 2 regions. Europe North will be available in January 2024.
How to deploy Intel TDX Confidential VMs in AVD Host Pool Provisioning
For the Virtual machine location, select “Europe West”, “Central US”, or “East US 2”.
Select Confidential Virtual Machines from the Security Type dropdown in the AVD Host Pool Virtual Machine blade.
From there, go down to Virtual machine size and click the “Change size” link.
You will then be directed to a table listing all available SKUs; at the top, make sure the “Type” is set to “Confidential Compute”.
Expand the DC- or EC-series categories and select any of the DCesv5/ECesv5 SKUs appropriate for your needs.
Getting Started
To get started, please visit Azure Virtual Desktop to learn more about the various benefits AVD provides and to get started with your first deployment.
Visit Create a host pool – Azure Virtual Desktop to start deploying your first confidential VM in Azure Virtual Desktop through the Azure Portal. For more information about any of these features, please visit Azure Virtual Desktop security best practices – Azure.
Continue the conversation. Find best practices. Bookmark the Azure Virtual Desktop Community. Have feedback on the service? Share your thoughts and upvote others on the Azure Virtual Desktop Feedback board.
Identity in focus: Exploring the new ITDR experience within Microsoft Defender
Earlier this year I shared the news that the features and functionality of Microsoft Defender for Identity had been converged into Microsoft Defender XDR and were now a core part of that experience. Today I am excited to discuss some new enhancements to how our customers can find and engage with their Identity security capabilities.
New navigation
Identities have become an inherent part of modern security, and the latest update to the Microsoft Defender XDR navigation further elevates identity security within the SOC experience with a new dedicated section for the domain. As illustrated in the image below, Defender for Identity customers will now see a section titled “Identities”, which today encapsulates three new identity-specific pages or views.
1. ITDR Dashboard
The new ITDR dashboard is designed to provide SOC professionals with a single, prioritized view of Identity-specific security information and recommendations. Pulling relevant alerts and insights from across their identity footprint, this pane helps SOC teams better understand their identity posture and quickly manage potential identity-related security risks.
The page itself is broken down into three main areas. At the top, users benefit from a visual representation of their unique identity landscape, breaking down the number and location of corporate identities across Entra ID, on-premises Active Directory and hybrid identities.
Just below this area is a section dedicated to critical recommended actions. Here users will see important steps they should take immediately to minimize risks, such as eliminating lateral movement paths and removing dormant accounts from sensitive groups.
The bottom section of the page consists of different cards, each offering security professionals a focused view into a specific element of their ITDR practice. These widgets offer identity-specific filters of broader security capabilities and serve as a jumping-off point into other areas of the Defender XDR portal. For example, the “identity posture” card surfaces the Identity recommendations within Secure Score, and the “Identity-Related Incidents” card highlights security incidents with identity elements or alerts. There are also some exciting new features available through these cards, like the “highly privileged identities” widget, which summarizes sensitive accounts within the environment, including Entra ID security administrators and global admin users. This consolidated view will give SOC teams additional insight to implement more targeted and effective management strategies, helping enhance the organization’s overall security posture. Similarly, the “deployment health” card offers insight into both the deployment status and overall health of Defender for Identity agents across the environment, and also sheds some light on available licenses for Defender for Identity and Entra ID Protection.
For more information about this page and the available widgets, see the documentation here.
2. Health Issues
The existing “Health issues” page from “Settings” has now been elevated to its own standalone page within the Identities tab. Here customers can find a deeper view into the deployment health of their Defender for Identity sensors and see any current issues, along with recommended fixes, to help optimize their Defender for Identity protections.
For more information, see Microsoft Defender for Identity health alerts.
3. Tools
This page provides links to helpful resources relating to Defender for Identity and ITDR. Here customers can find links to documentation and other resources like our capacity sizing tool and readiness script to help them better prepare and maintain their infrastructure and protections.
Check out our updated documentation to learn more about these new updates and follow the What’s New page to keep up with the coming enhancements and new widgets the team is working on.
To conclude, I want to again thank our dedicated customers. Our team’s mission is to improve the protections Defender for Identity provides, and we could not do that without your continued support, suggestions, and feedback.
KRB_AP_ERR_BAD_INTEGRITY
First cousin once removed to KRB_AP_ERR_MODIFIED
Most anyone who would be interested in reading an article like this has very likely encountered the error, KRB_AP_ERR_MODIFIED. This error tells us one thing: The account secret (aka password hash) that is being used to decipher the ticket cannot decipher the ticket.
The most common reasons are:
The computer upon which the decipher occurs has a broken Secure Channel.
(In short, the secret (aka computer’s password hash) is not the same between the computer and the Domain and/or the DC that issued the ticket.)
The Service Principal Name is on the wrong account.
Misconfiguration of a service.
Wrong account being used.
Clustering incorrectly configured.
Name resolution routes connections to the wrong server (CNAMEs are often the culprit).
Something on the network has mangled the packet.
Malicious activity.
So what does KRB_AP_ERR_BAD_INTEGRITY tell us?
KRB_AP_ERR_BAD_INTEGRITY tells us one thing: a failure to decipher a Kerberos referral ticket.
When an account (the client), be it a user or a machine, wants to access resources in another trusting domain, the client must first get a referral ticket (Inter-Realm TGT) from a Key Distribution Center (KDC, aka Domain Controller) in its own domain. The client can then present the referral ticket to a KDC in the Trusting Domain.
The referral ticket is enciphered with a secret shared between the two domains. This common secret is stored on the Trusted Domain Object (TDO) in the domain partition of Active Directory. Since the KDCs in the different domains share a common secret, they can both encipher and decipher tickets with that secret.
If the receiving KDC cannot decipher the referral ticket, using the secret on its copy of the TDO in its domain, then the resulting error is KRB_AP_ERR_BAD_INTEGRITY.
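The failure mode can be modeled in a few lines. Real Kerberos encrypts referral tickets with keys derived from the trust password; the toy sketch below uses an HMAC tag purely to model the outcome of the integrity check when the two sides of the trust hold different secrets.

```python
import hashlib
import hmac

def encipher_referral(ticket: bytes, trust_secret: bytes):
    # The issuing KDC protects the referral with the secret shared via the TDO.
    tag = hmac.new(trust_secret, ticket, hashlib.sha256).digest()
    return ticket, tag

def accept_referral(ticket: bytes, tag: bytes, trust_secret: bytes) -> bytes:
    # The receiving KDC checks the ticket with *its* copy of the TDO secret.
    expected = hmac.new(trust_secret, ticket, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("KRB_AP_ERR_BAD_INTEGRITY")
    return ticket

secret_v13 = b"shared-secret-version-13"
ticket, tag = encipher_referral(b"referral: krbtgt/EU.CONTOSO.COM", secret_v13)

accept_referral(ticket, tag, secret_v13)            # same secret: accepted
try:
    accept_referral(ticket, tag, b"stale-secret")   # out-of-sync TDO: rejected
except ValueError as err:
    print(err)                                      # KRB_AP_ERR_BAD_INTEGRITY
```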
Let’s Repro!
So, this is all well and good, a bunch of words, now let’s break it and then fix it!
We’re going to examine Kerberos traffic in a network trace when all is working, compared to what we see when we get the error. And then lastly, we’ll try to fix our lab environment.
The Lab
Here’s our lab:
We have a root domain (contoso.com) with two child domains, na.contoso.com and eu.contoso.com, with the NetBIOS names CONTOSO, NA and EU. Each domain has two domain controllers.
For our test scenario, the user jondoe, in the NA domain, wants to access files shared from a server in the EU domain. Our user, NA\jondoe, is interactively logged on to the file server NAFS in the NA domain. To get to the file share, jondoe is going to have to get two referral tickets, one for CONTOSO and one for EU.
To maintain the focus of this document, we will only examine the Kerberos traffic in the network traces. We will not explore DNS or network problems. Note however, that network and name resolution problems could be why the secret on the TDO is wrong.
When everything is working as expected …
In our good scenario, NA\jondoe has interactively logged on to the file server NAFS and then, using File Explorer, connected to \\eufs.eu.contoso.com\eu_share.
In the frames below, we see the Kerberos conversations that the client had with the DCs in the different domains.
legend:
Bright green with black text == Our client
Dark green with white text == NA domain DCs
Black with white Text == CONTOSO Domain DCs
Red with white text == EU Domain DCs
In frames, 1535, 1536, 1543 and 1544, we see our standard AS request and eventual successful response with a TGT from the domain in which the user resides, na.contoso.com. We’ll skip looking into this well-known pattern. Let’s get on to the referrals.
Next, in frame 1553, we see the client attempt to get a service ticket for cifs/eufs.eu.contoso.com. Note the Realm and the Sname. As we can see, they are two different domains.
So, in frame 1556, we get a referral ticket from a NA DC for the CONTOSO domain. Why? Because DCs in NA do not have any accounts with a Service Principal Name (SPN) that matches “cifs/eufs.eu.contoso.com”. So, the DC gives the client a referral ticket to another domain, along the trust path, that may know more about this resource; in this case, since it is a child domain, the referral is for the parent domain.
Let’s take a closer look at frame 1556 …
This is our referral ticket from a NA DC. Note the Crealm and Cname: references to the domain in which the user resides and the user’s name. (Crealm and Cname can also be read as ‘Client Realm’ and ‘Client Name’.) The Sname (Service Name), the service for whom this ticket is intended, is in a different domain, in this case krbtgt/CONTOSO.COM. Note the bit in red at the bottom. This cipher is derived from the secret stored on the contoso.com trust object, the TDO, stored in the domain partition on DCs in the NA domain.
Next, in frame 1565, we see the client make a request to a root DC. One of the things we can see in this TGS request is the ticket we received from a DC in the NA domain. We can see it here, in blue:
Note the bit in green. This is what the client is requesting, a service ticket for krbtgt/EU.CONTOSO.COM.
The root DC, in frame 1567, gives the client yet another referral ticket, and you guessed it, this time using the secret on the TDO for the EU domain that is stored in the domain partition of the CONTOSO domain.
In frame 1587, we see the client present the ticket given to it by the root domain to a DC in the EU domain, asking for a ticket for the service cifs/eufs.eu.contoso.com.
We can see this frame is nearly identical to 1565. But in our next frame, 1589, instead of yet another referral, we get our ticket for the file server EUFS. DCs in EU have an SPN that matches cifs/eufs.eu.contoso.com, and can get on with the business of granting the user the ticket, which happens in frame 1589, below.
At this point the client will make an SMB connection to the file server and present the ticket in the SMB session setup.
In summary, the client wants to get to a resource in another domain. Since the client’s domain does not have this resource, it gets a referral to another domain in the trust-path that might host this resource or know where it might be. This process continues along the trust-path until a KDC can find the SPN for the service the client is requesting.
The term ‘trust-path’, used a few times here, refers to the relationship of the domains and/or forests to one another. In this scenario, in a forest, the path goes from a child to the parent to another child. There is no direct trust between the two child domains. In a multi-forest scenario, the trust path can go from root to root, or perhaps even over so-called “shortcut” trusts. Keeping the trust-path in mind is a good idea.
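As a rough sketch, the referral chase can be modeled as walking the trust path until a realm recognizes the requested SPN. The domain layout mirrors the lab above; the naive derivation of the target realm from the SPN’s host name is an illustrative assumption, not how a KDC actually resolves realms.

```python
# Toy model of referral chasing along the trust path (child -> parent -> child).
PARENT = {"na.contoso.com": "contoso.com", "eu.contoso.com": "contoso.com"}
SPN_HOSTED_BY = {"cifs/eufs.eu.contoso.com": "eu.contoso.com"}

def chase_referrals(client_realm: str, spn: str) -> list:
    """Return the list of realms visited until a KDC can issue the ticket."""
    path, realm = [], client_realm
    while True:
        path.append(realm)
        if SPN_HOSTED_BY.get(spn) == realm:
            return path  # this realm's KDC knows the SPN: service ticket issued
        # Assumption: derive the target realm from the SPN's host name.
        target_realm = spn.split("/", 1)[1].split(".", 1)[1]
        if realm == PARENT.get(target_realm):
            realm = target_realm          # parent of the target: refer down
        else:
            realm = PARENT[realm]         # otherwise refer up the trust path

print(chase_referrals("na.contoso.com", "cifs/eufs.eu.contoso.com"))
# ['na.contoso.com', 'contoso.com', 'eu.contoso.com']
```

Each hop in the returned list corresponds to one referral ticket in the trace: NA issues an Inter-Realm TGT for CONTOSO, CONTOSO issues one for EU, and EU finally issues the service ticket.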
Looking at versions of secrets
Before we break this lab, let’s take a look at something else that is very important and at the heart of what we are discussing: the attributes trustAuthIncoming and trustAuthOutgoing. These are attributes on TDOs; they store our shared secrets.
These attributes have attributes of their own, such as their version, the time they were changed, and the DC upon which the change was made. These ‘extra attributes’, the metadata, are not exposed in tools such as ADSI Edit or the Active Directory Users and Computers (ADUC) snap-in. (It is possible to see the values with ldp.exe, but only on a DC-by-DC basis, which is not very convenient.) We cannot conveniently query these values directly with the common tools. But we can query the metadata about the attributes with repadmin /showobjmeta. Also, this command allows us to query every DC in a domain with one command.
Log on to EUDC01, open an elevated command prompt, and then issue the following command:
repadmin /showobjmeta * “CN=contoso.com,CN=System,DC=eu,DC=contoso,DC=com”
For large environments with many DCs, redirect the output to a text file like so:
repadmin /showobjmeta * “CN=contoso.com,CN=System,DC=eu,DC=contoso,DC=com” > c:\MyData.txt
This is going to give us a lot of information, but we are only interested in the attributes trustAuthIncoming and trustAuthOutgoing on the DCs in the EU domain. For purposes of brevity and formatting, I’m only going to display the portion of interest.
/showobjmeta against DC eudc01.eu.contoso.com
Org.Time/Date Ver Attribute
============= === =========
2023-12-14 15:49:39 13 trustAuthIncoming
2023-12-14 15:35:25 13 trustAuthOutgoing
/showobjmeta against DC eudc02.eu.contoso.com
Org.Time/Date Ver Attribute
============= === =========
2023-12-14 15:49:39 13 trustAuthIncoming
2023-12-14 15:35:25 13 trustAuthOutgoing
Notice the versions (ver), they are consistent on all the DCs in the EU domain. The current version is 13.
Now let’s take a look at the TDO for the EU domain on the DCs in the CONTOSO domain:
repadmin /showobjmeta * “CN=eu.contoso.com,CN=System,DC=contoso,DC=com”
… yields the following:
/showobjmeta against DC rootdc01.contoso.com
Org.Time/Date Ver Attribute
============ === =========
2023-12-14 15:35:25 13 trustAuthIncoming
2023-12-14 15:49:39 13 trustAuthOutgoing
/showobjmeta against DC rootdc02.contoso.com
Org.Time/Date Ver Attribute
============= === =========
2023-12-14 15:35:25 13 trustAuthIncoming
2023-12-14 15:49:39 13 trustAuthOutgoing
Again, we see consistent values on all the DCs in the CONTOSO domain for these attributes.
Keep in mind that these attributes contain our secret. The fact that the version is 13 means that the secret for the trust has been changed 12 times since it was created. Domain Controllers take care of this for us. You can read more about that here: TDO password changes.
Take note that, although this lab environment shows the same version and the same times in each domain, they may not always be the same. If the versions are not the same between the two domains, this is not a cause for concern. Time stamps between the domains will be nearly identical when DCs change the secret as described in the article linked in the previous paragraph. If done manually, as discussed later in this blog, the time stamp differences will be larger.
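If you want to compare versions across many DCs programmatically, one approach is to capture the repadmin /showobjmeta output and diff the trustAuth* versions per DC. The sketch below parses the simplified layout shown above (real output contains more attributes and header lines, which the regex simply skips) and flags any drift within a domain; the sample text and the deliberately mismatched version are illustrative.

```python
import re
from collections import defaultdict

# Sample in the same shape as the repadmin /showobjmeta output shown above,
# with one DC deliberately behind on trustAuthIncoming.
SAMPLE = """\
/showobjmeta against DC eudc01.eu.contoso.com
2023-12-14 15:49:39 13 trustAuthIncoming
2023-12-14 15:35:25 13 trustAuthOutgoing
/showobjmeta against DC eudc02.eu.contoso.com
2023-12-14 15:49:39 12 trustAuthIncoming
2023-12-14 15:35:25 13 trustAuthOutgoing
"""

def trust_versions(text: str) -> dict:
    """Map each trustAuth* attribute to {dc: version}."""
    versions, dc = defaultdict(dict), None
    for line in text.splitlines():
        m = re.search(r"against DC (\S+)", line)
        if m:
            dc = m.group(1)
        m = re.search(r"\s(\d+)\s+(trustAuth(?:Incoming|Outgoing))", line)
        if m and dc:
            versions[m.group(2)][dc] = int(m.group(1))
    return versions

for attr, per_dc in trust_versions(SAMPLE).items():
    if len(set(per_dc.values())) > 1:
        print(f"DRIFT in {attr}: {per_dc}")  # flags eudc02 at version 12
```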
You can read more about where the secrets are stored here, 6.1.6.9.1 trustAuthInfo Attributes. These values AuthenticationInformation and PreviousAuthenticationInformation are not exposed anywhere in the GUI or with any tools. NO HUMANS ALLOWED!
Cue Terminator music
Breaking the Lab
How are we going to break this environment and generate the error?
We will shut down EUDC01 and then change the trust secret. We’re going to change it twice to be sure that the value stored in PreviousAuthenticationInformation is new and unknown to EUDC01. After the trust secret has been changed, we’ll shut down EUDC02, and bring EUDC01 back up. EUDC01 will have the old secret(s) and DCs in the parent domain will have the new secret(s).
Change the Secret
To change the secret on the trust, we’re going to use the netdom command.
Logged on as a domain admin in the parent, run the following command:
netdom trust contoso.com /domain:eu.contoso.com /resetOneSide /passwordT:RedBoat51 /userO:administrator /passwordO:*
And now, on a child DC, logged on as a domain admin in EU:
netdom trust eu.contoso.com /domain:contoso.com /resetOneSide /passwordT:RedBoat51 /userO:administrator /passwordO:*
Notice that the parameter ‘PasswordT’ has the same value in both domains … the shared secret. We’ll give a moment for replication to occur and then do these commands again, with a different ‘passwordT’ value.
Prove we changed it twice
Now that we’ve changed the secret a couple of times, let’s check the version of trustAuthIncoming and trustAuthOutgoing. We’ll use repadmin /showobjmeta like we did before.
Here’s for the parent:
/showobjmeta against DC rootdc01.contoso.com
Org.Time/Date Ver Attribute
============= === =========
2024-01-03 18:20:35 15 trustAuthIncoming
2024-01-03 18:20:35 15 trustAuthOutgoing
/showobjmeta against DC rootdc02.contoso.com
Org.Time/Date Ver Attribute
============= === =========
2024-01-03 18:20:35 15 trustAuthIncoming
2024-01-03 18:20:35 15 trustAuthOutgoing
And for the child domain, EU:
/showobjmeta against DC eudc02.eu.contoso.com
Org.Time/Date Ver Attribute
============= === =========
2024-01-03 18:20:53 15 trustAuthIncoming
2024-01-03 18:20:53 15 trustAuthOutgoing
Notice that we only have one DC in EU; remember, EUDC01 is currently down.
And last, we will shut down EUDC02 and bring EUDC01 back up. We must make sure that EUDC02 is all the way down before we start EUDC01. If we don’t, EUDC01 will replicate from EUDC02 and get the new secrets.
Bad Scenario
After all that, the domain trust is now broken, and we should be able to generate a KRB_AP_ERR_BAD_INTEGRITY when we have our user jondoe attempt to get to the file share \\eufs.eu.contoso.com\eu_share. Let’s take a look …
We have an entire crop of KRB_AP_ERR_BAD_INTEGRITY! Enough to make Kerberos salad!
Kerberos Salad by AI – Looks crunchy.
We can ignore frames, 46, 48, 1068 and 1071. These are the result of background group policy processing.
Using our good scenario as a guide, we can see in frames 1048, 1051, 1058 and 1059 our typical pattern for getting a TGT in our domain.
Frames 1291 and 1293, we’re trying to get a TGS for the server eufs.eu.contoso.com from an NA DC, and we get the expected referral ticket:
Then, in frames 1302 and 1307, we ask a DC in the root domain for a ticket for the EU domain and get the expected response:
Note the bit in red here. This is ciphered with a secret that is newer than the secret on EUDC01. In our next frames, 1382 and 1384, we get the error:
How to fix it?
Fixing this lab environment is very easy; we know exactly how we got here. The simplest method would be to boot up EUDC02 and allow the EU DCs to replicate – but that would be cheating, right?
Not Cheating …
Ok – let’s see if we can fix this lab without cheating, that is, without booting up EUDC02 and letting replication in the EU domain do its thing. Instead, we’ll use the netdom command, as we had done before, setting a new secret between CONTOSO and EU while only EUDC01 is up and running.
So, what’s going to happen when we change the TDO secret on EUDC01 and we eventually bring EUDC02 back up – which version will win? Hmmm, who knows, let’s find out!
Here’s the status of the TDOs in each domain before booting EUDC02:
contoso.com
/showobjmeta against DC rootdc01.contoso.com
Org.Time/Date Ver Attribute
============= === =========
2024-01-04 12:52:35 18 trustAuthIncoming
2024-01-04 12:52:35 18 trustAuthOutgoing
/showobjmeta against DC rootdc02.contoso.com
Org.Time/Date Ver Attribute
============= === =========
2024-01-04 12:52:35 18 trustAuthIncoming
2024-01-04 12:52:35 18 trustAuthOutgoing
eu.contoso.com
/showobjmeta against DC eudc01.eu.contoso.com
Org.Time/Date Ver Attribute
============= === =========
2024-01-04 12:35:33 19 trustAuthIncoming
2024-01-04 12:35:33 19 trustAuthOutgoing
Note that EUDC02 is down right now, so we can’t see what the versions of the attributes are – when I broke the lab, the version was at 21 on EUDC02 before it was shut down.
Those readers paying close attention will note that earlier in this doc the values were 15 … how did we get to 19 and 21? The reason is that, during the writing of this doc, I also used this lab to do customer support and had to fix it so I could use it … interrupting my experiments; then I had to break it again :p
Next step is to change the secret on the TDOs for CONTOSO and EU.
For CONTOSO we’ll run this command on ROOTDC01:
netdom trust contoso.com /domain:eu.contoso.com /resetOneSide /passwordT:GoldBird78 /userO:ghost /passwordO:*
For EU we’ll run this command on EUDC01:
netdom trust eu.contoso.com /domain:contoso.com /resetOneSide /passwordT:GoldBird78 /userO:ghost /passwordO:*
Now when we look at the version of our attributes trustAuthIncoming and trustAuthOutgoing we have version 20 on EUDC01 and version 19 on the DCs in the root domain, contoso.com.
At this point, when we test NA\jondoe going to the share in EU, we get all the expected Kerberos referrals and tickets. Since we’ve already examined that network traffic, we’ll not cover it again.
So … we should be all fixed, right? Let’s boot up EUDC02 and see what happens …
Tick, Tock, Tick, Tock …
… uh … broken!?
The version of our attributes is now 21 on both EU DCs! But I didn’t do anything (innocent halo)! Something else must have changed it!
That something was AD replication. Recall that when we broke the lab, we used netdom to change the TDO secrets on EUDC02 twice. So EUDC02 had version 21, and the downed EUDC01 had version 19. Then, when we fixed the trust with netdom and changed the secret on EUDC01, we got version 20. … 20 is less than 21. EUDC02 won replication.
So even though we got the lab to work again, and fixed our trust problem, soon as we booted EUDC02, the problem came back! It had a higher version for the attribute, so the TDO secret on EUDC02 won replication and broke the trust again!
Using the Domains and Trusts snap-in on the PDC in EU (EUDC01), I went and validated the outgoing trust to CONTOSO, and when prompted, entered domain admin credentials to validate the other side of the trust. I was then prompted to reset the password because the trust could not be validated. I chose yes, and then everything started to work again. Yay!
Using the Domains and Trusts snap-in is no different than had I issued the netdom command on both sides of the trust a couple more times before restarting EUDC02. Had I done that, it also would have worked, as EUDC01 would have had version 22 and won replication when EUDC02 was brought back with version 21. (In a case where both DCs have the same version, the one with the most recent time stamp would win replication. If by chance they had the same time stamp, the InvocationID would be used to determine the winner. See the section titled Multi-Master Replication for more on that.)
The lesson here is: when dealing with this and trying to change trust secrets, you may need to change the secret more than once if DCs are being bounced up and down and/or replication is acting up. The best bet is to change it twice, as it won’t hurt anything. Just allow enough time for domain-wide replication to occur between each change.
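The “higher version wins” behavior that bit us here can be sketched as a comparator. This is a simplified model of AD’s attribute conflict resolution (version, then originating timestamp, then originating invocation ID); the real tie-break compares GUIDs, which the sketch approximates with string order, and the stamp values below are illustrative.

```python
from dataclasses import dataclass

@dataclass
class AttributeStamp:
    version: int
    orig_time: str       # ISO-style timestamp, lexicographically comparable
    invocation_id: str   # originating DC's invocation ID (simplified to a string)

def replication_winner(a: AttributeStamp, b: AttributeStamp) -> AttributeStamp:
    # Higher version wins; ties fall back to the newer originating timestamp,
    # then to the invocation ID.
    return max(a, b, key=lambda s: (s.version, s.orig_time, s.invocation_id))

# EUDC01's freshly reset secret (v20) vs. EUDC02's stale-but-higher v21.
eudc01 = AttributeStamp(20, "2024-01-04 12:59:00", "aaaa")
eudc02 = AttributeStamp(21, "2024-01-03 18:20:53", "bbbb")

# The stale-but-higher-version secret wins, reintroducing the break.
print(replication_winner(eudc01, eudc02).version)  # 21
```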
How does any of this help me, you ask.
What this document can’t tell you is how to fix this in your environment; it is likely more complex than this lab environment. But we know one thing for sure – this error is generated because the secret used to encipher the ticket is not the same secret being used to decipher the ticket. Who has the out-of-sync secrets, and why, will need to be investigated and corrected.
When troubleshooting, get a network trace from the client and look for the KRB_AP_ERR_BAD_INTEGRITY error. Keep in mind who the client is. For things like ADFS, IIS, SQL, Exchange and other server applications, the client may be the server-side application and not the end user’s computer.
Once in this state, there are some common symptoms you may notice:
Users are being prompted for passwords over and over.
Logon failures.
Trust validation failures.
Active Directory Replication problems, particularly between Global Catalog servers in different domains.
Clearing the client Kerberos cache fixes the problem, perhaps temporarily or permanently, (klist purge).
Note you can have these exact same symptoms for reasons other than the TDOs not sharing the same secret. If you’re not seeing the KRB_AP_ERR_BAD_INTEGRITY error, the problem is likely something else.
If you do find it …
Run the repadmin /showobjmeta command as we did earlier in this document. Check whether the version values (the metadata) for the attributes trustAuthIncoming and trustAuthOutgoing on the TDO are consistent on all DCs in the domain. Do this for both the domain that created the referral ticket and the domain receiving the referral ticket. Do not worry if the versions are different in the different domains. Remember, the TDO is stored in the domain partition of each domain, and the TDOs are different objects in their respective domains, so they can have different version values. What’s important is that all the DCs in each domain have the same version for trustAuthIncoming and trustAuthOutgoing and that both domains have the same secret.
When you run repadmin /showobjmeta, you may discover that some DCs in the domain cannot be contacted. If that is the case, go to that DC and run the command. You may very well find that the version is different. It could be that a network connectivity failure (or something else) is at the root of the problem. If DCs can’t replicate properly, you may have one or more DCs with a different secret.
If all the versions are consistent across DCs in the respective domains, then we can conclude that the secret is not the same in both domains. If this is the case, use the netdom command, as done previously in this doc, or the Domains and Trusts snap-in, to reset the secret. You may need to do it more than once. If you do it more than once, give a moment or two between changes to allow AD replication to do its thing. You could remove and recreate the trusts, but that incurs more risk and expense compared to using netdom or the Domains and Trusts snap-in.
Conclusion
In my experience, this error most often occurs in complex environments that are undergoing significant change, such as DCs being replaced, upgraded or moved around, or authoritative restores from old backups or snapshots. Network changes – topology, new equipment, security software, etc. – can also lead to this, as these changes can break replication. And lastly, and most unfortunately, large complex environments that are not being properly maintained.
Since we know that the secrets on TDOs are changed automatically once every 30 days (ref: TDO password changes), and that only the previous secret is kept alongside the current one, a replication problem that took hold 60 or more days ago could be why the TDO secrets are out of sync.
References
How trust relationships work for forests in Active Directory
https://learn.microsoft.com/en-us/entra/identity/domain-services/concepts-forest-trust
TDO password changes
Active Directory Forest Recovery – Reset a trust password on one side of the trust
3.3.5.7.5 Cross-Domain Trust and Referrals
TrustedDomain Object
https://learn.microsoft.com/en-us/windows/win32/secmgmt/trusteddomain-object
6.1.6.7 Essential Attributes of a Trusted Domain Object
6.1.6.9.1 trustAuthInfo Attributes
Detailed Concepts: Secure Channel Explained
Service principal names
https://learn.microsoft.com/en-us/windows/win32/ad/service-principal-names
Machine Account Password Process
4768(S, F): A Kerberos authentication ticket (TGT) was requested
https://learn.microsoft.com/en-us/windows/security/threat-protection/auditing/event-4768
How Active Directory Replication Works (Section: Multi-Master Replication)
Active Directory replication troubleshooting guidance
Kerberos Principal Name Canonicalization and KDC-Generated Cross-Realm Referrals (IETF draft; section 8, “Server Referrals”, provides a good walkthrough of how Kerberos referrals work)
https://datatracker.ietf.org/doc/id/draft-ietf-krb-wg-kerberos-referrals-12.html
Microsoft Tech Community – Latest Blogs –Read More
Simplify your app journey with app advisor on ISV Hub
We know that, at times, the app journey can seem complicated. That’s why Microsoft developed app advisor on ISV Hub, a self-serve experience to surface resources, benefits, and incentives to streamline your progress.
App advisor provides a simple, accelerated process that helps ISVs find the right resources, no matter where they start. This self-serve experience, found on the ISV Hub, will help your team stop searching and start doing.
How does app advisor work?
Your app is unique, so we want your journey to be tailored, too. When you navigate to app advisor, you’ll answer a few short questions and be placed in an appropriate stage and step for what you want to learn.
You’ll start with a wizard and answer just a few short questions to get tailored suggestions.
Collected from the best resources, benefits, incentives, and support that thousands of successful partners have used, app advisor provides a streamlined path to the next step for your company. You’ll always know exactly where you are in the journey, be able to mark items complete, and even browse to other steps to quickly look at what’s available.
Answering the questions will bring you to a results page, oriented in a step that matches your responses.
Who should use app advisor?
If you’re considering or already building a B2B app with Microsoft technology, you’ll get value using app advisor. The guidance is tailored to ISVs who will sell on the Microsoft commercial marketplace and show how to make the most of what Microsoft has to offer.
There is no fee associated with using app advisor. And it’s self-paced, meaning you can take as much or as little time as you need with the resources.
Ready to streamline your app journey? With app advisor on ISV Hub, Microsoft is here to help.
Currently available in the United States, app advisor will launch in other languages in early February.
Validate your skills with our new certification for Microsoft Fabric Analytics Engineers
We’re looking for Microsoft Fabric Analytics Engineers to take our new beta exam. Do you have subject matter expertise in designing, creating, and deploying enterprise-scale data analytics solutions? If so, and if you know how to transform data into reusable analytics assets by using Microsoft Fabric components, such as lakehouses, data warehouses, notebooks, dataflows, data pipelines, semantic models, and reports, be sure to check out this exam. Other helpful qualifications include the ability to implement analytics best practices in Fabric, including version control and deployment.
If this is your skill set, we have a new certification for you. The Microsoft Certified: Fabric Analytics Engineer Associate certification validates your expertise in this area and offers you the opportunity to prove your skills. To earn this certification, pass Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric, currently in beta.
Is this the right certification for you?
This certification could be a great fit if you have in-depth familiarity with the Fabric solution and you have experience with data modeling, data transformation, Git-based source control, exploratory analytics, and languages, including Structured Query Language (SQL), Data Analysis Expressions (DAX), and PySpark.
Review the Exam DP-600 (beta) page for details, and check out the self-paced learning paths and instructor-led training there. The Exam DP-600 study guide alerts you to key topics covered on the exam.
Ready to prove your skills?
Take advantage of the discounted beta exam offer. The first 300 people who take Exam DP-600 (beta) on or before January 25, 2024, can get 80 percent off market price.
To receive the discount, when you register for the exam and are prompted for payment, use code DP600Winfield. This is not a private access code. The seats are offered on a first-come, first-served basis. As noted, you must take the exam on or before January 25, 2024. Please note that this beta exam is not available in Turkey, Pakistan, India, or China.
The rescore process starts on the day an exam goes live—8 to 12 weeks after the beta period, and final scores for beta exams are released approximately 10 days after that. For details on the timing of beta exam rescoring and results, read my post Creating high-quality exams: The path from beta to live.
Get ready to take Exam DP-600 (beta)
Explore the Fabric Career Hub. Access live training, skills challenges, group learning, and career insights.
Join the Fabric Cloud Skills Challenge. Complete all modules in the challenge within 30 days and become eligible for 50% off the cost of a Microsoft Certification exam. This 50% discount can’t be used toward the Exam DP-600 (beta). If you miss the beta period, you can use it later once the exam goes live or for another live certification exam.
Looking for in-depth training? Check out the new course Microsoft Fabric Analytics Engineer. Connect with Microsoft Training Services Partners in your area for in-person training.
Need other preparation ideas? Check out my blog post Just How Does One Prepare for Beta Exams?
Did you know that you can take any role-based exam online? Online delivered exams—taken from your home or office—can be less hassle, less stress, and even less worry than traveling to a test center, especially if you’re adequately prepared for what to expect. To find out more, check out my blog post Online proctored exams: What to expect and how to prepare.
Ready to get started?
Remember, the number of spots for the discounted beta exam offer is limited to the first 300 candidates taking Exam DP-600 (beta) on or before January 25, 2024.
Related announcements
9 ways Microsoft Learn helps you with the skills-first economy
Introducing a new resource for all role-based Microsoft Certification exams
Microsoft Learn: Four key features to help expand your knowledge and advance your career
Meet learners who changed their career with the help of Microsoft Learn