Category: Microsoft
How can I be a delegate when scheduling Teams calls?
I am sending invites on behalf of someone in Outlook (I am a delegate on their account), but when I include a Teams call in the invite it doesn't let anyone join until I join, even though I am not on the invite! How can I change this?
Saving a file to a SharePoint directory in Excel – VBA
I have tried a few variations, looked at YouTube videos, and asked ChatGPT. I have two options on the dropdown for file destination: (1) My Choice; (2) a predefined SharePoint location. I have had the "My Choice" option working for some time and added option 2, which is the problem area.
I get through to the stream.SaveToFile statement and I see a spinner pop up; about 10 seconds later, it returns a 3004 error.
I have tried both http:// and changing it to a network directory setup (ChatGPT) so it reads: \\mycompany.sharepoint.com\sites\OBO\SitePages\CollabHome\OBO_STG_Export_Files. The HTTP address of the SharePoint folder is: https://mycompany.sharepoint.com/sites/OBO/Shared%20Documents/Forms/AllItems.aspx?id=%2Fsites%2FOBO%2FShared%20Documents%2FOBO%5FQA%5FExport%5FFiles&viewid=a7e2c126%2Dddda%2D4583%2D8ae2%2D55346c1a7cbf
The code is below. It seems to be how I am forming things in the stream.SaveToFile call.
Destination filename would be something like: banner-product-20240529142717
Sub ExportToCSVCheckBox(ws_base As String) ' ws_base equates to "product"
Dim src_ws As Worksheet
Dim create_csvs As Worksheet
Dim rng As Range
Dim cell As Range
Dim filePath As String
Dim rowIndex As Long
Dim colIndex As Long
Dim line As String
Dim delimiter As String
Dim banner As Variant
Dim outputSelection As String
Dim csv_filename As String
Dim expFile As Variant
Dim timeStamp As String
Dim stream As Object ' ADODB.Stream
'
' This subroutine provides the following capabilities:
' 1) Reads cells from the Create_CSVs worksheet for the banner, which files to export, and where to
' place the files
' 2) Generates the output in UTF-8 format to allow for special French characters
' 3) Blanks out any 0-filled cells provided by Excel – CCv2 does not like them
' 4) Replaces the traditional "," delimiter with "|" – CCv2 wants it that way
' 5) Puts double quotes around cell contents to deal with contents that
' have double quotes in them. 12" Latex Balloon needs to be written as "12"" Latex Balloon" so
' that it shows in CCv2 as 12" Latex Balloon
'
On Error GoTo ErrorHandler
' Set the worksheet location for the input variables
Set create_csvs = ThisWorkbook.Sheets("Create_CSVs")
' Get the banner
banner = create_csvs.Range("B3").Value
' Get the output location
outputSelection = create_csvs.Range("B17").Value
' Build the export file base name
expFile = banner & "-" & ws_base
' Get the current date and time stamp
timeStamp = Format(Now, "yyyymmddhhmmss")
' Create the filename for the CSV
csv_filename = expFile & "-" & timeStamp & ".csv"
' Set the name of the worksheet that holds the data for export
Set src_ws = ThisWorkbook.Sheets(expFile)
‘ Define the file path for the CSV file
If outputSelection = "My Choice" Then
filePath = Application.GetSaveAsFilename(InitialFileName:=csv_filename, FileFilter:="CSV Files (*.csv), *.csv")
' Check if user cancelled the Save As dialog
If filePath = "False" Then
Exit Sub
End If
Else ' assume we have a SharePoint location
filePath = create_csvs.Range("C18") & csv_filename
End If
' Define the range
Set rng = src_ws.UsedRange ' Adjust as needed
' Define the custom delimiter
delimiter = "|"
' Create the ADODB.Stream object
Set stream = CreateObject("ADODB.Stream")
stream.Type = 2 ' adTypeText – we want to save text/string data
stream.Charset = "UTF-8" ' Charset for the source text data
' Open the stream
stream.Open
‘ Loop through each row in the range
For rowIndex = 1 To rng.Rows.Count
line = ""
' Loop through each column in the row
For colIndex = 1 To rng.Columns.Count
' Get the cell value
Set cell = rng.Cells(rowIndex, colIndex)
If cell.Value <> "" And cell.Value <> 0 Then ' Cannot have 0 values for IMPEX
line = line & """" & cell.Value & """"
End If
If colIndex < rng.Columns.Count Then
' Append the custom delimiter
line = line & delimiter
End If
Next colIndex
' Write the line to the stream
stream.WriteText line & vbCrLf
Next rowIndex
MsgBox "filepath = " & filePath
' Save the stream to a file
stream.SaveToFile filePath, 2 ' 2 = adSaveCreateOverWrite
' Close the stream
stream.Close
' Notify the user that the export is complete
MsgBox "Data has been exported to " & filePath, vbInformation
Exit Sub
ErrorHandler:
MsgBox "An error occurred: " & Err.Description & " (Error Number: " & Err.Number & ")"
End Sub
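As an aside, the quoting rule described in the header comments (a value such as 12" Latex Balloon must be written as "12"" Latex Balloon", and empty or 0 cells blanked out) can be sketched in Python for clarity; note that this sketch doubles embedded quotes, which the VBA above does not yet do:

```python
def csv_field(value):
    """Wrap a non-empty, non-zero value in quotes, doubling embedded quotes."""
    if value in ("", 0):
        return ""  # blank out empty/zero cells, as the VBA does for IMPEX
    return '"' + str(value).replace('"', '""') + '"'

def csv_line(cells, delimiter="|"):
    """Join quoted fields with the pipe delimiter CCv2 expects."""
    return delimiter.join(csv_field(c) for c in cells)

line = csv_line(['12" Latex Balloon', 0, "Red"])
```

Here `line` becomes `"12"" Latex Balloon"||"Red"`: the embedded quote is doubled and the 0 cell is emitted as an empty field.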
Collaborate and Learn
Hello, it is a pleasure to take part in this forum. We are taking our first steps in the knowledge and use of AI.
We are available to learn about applications and solutions developed for the third sector, and to share best practices.
Congratulations and Welcome to our new Microsoft Global Community Initiative (MGCI) Regional Leaders
The Microsoft Global Community Initiative (MGCI) Advisors would like to extend a special welcome to the new group of Regional Leaders joining us from all over the world! We are looking forward to growing our global communities together with your support! Curious who the new Regional Leaders are? Check them out here! Discover the details about MGCI.
How to become MGCI Regional Leader
Earlier this year, at the request of the community, we opened nominations to invite additional regional leaders into MGCI, adding more local volunteers around the world to help facilitate support, activities, and excitement around producing and amplifying free community events.
Regional leaders are self-nominated, non-Microsoft employees, approved by board members and Microsoft. The term is one year and can be extended. Board members are appointed for an initial 18-month term that can be extended. Regional leaders are added quarterly.
MGCI Subcommittees
MGCI relies on the dedication of board members and regional leaders who lead the three subcommittees—Communications & Amplification, Training and Resources, and Technology and Tools—to drive the initiative forward. These volunteers bring their unique skills and passion to improve communication channels, offer valuable training and resources, and harness technology for the betterment of MGCI and its members. Through their volunteer efforts, MGCI thrives, fostering a spirit of collaboration and community empowerment. General members are invited to volunteer for a subcommittee after joining https://aka.ms/MGCISignUp — volunteers are welcome!
Communications & Amplification
The MGCI Communications & Amplification subcommittee drives community growth and engagement and crafts the overall communications and amplification strategy for MGCI. Subcommittee members help write, edit, and create posts, blurbs, graphics, and GIFs to amplify the overall program and communitydays.org events, driving growth and awareness. Board Officer Leads: @JenniferMason & Isidora Katanic.
Training & Resources
The MGCI Training subcommittee holds monthly training calls to build hands-on knowledge for community event organizers and for those interested in learning how to organize an event. This content and these best practices expand and foster knowledge within the community. Regional leaders craft the training strategy and curate topics; subcommittee members can help build and lead training. Regional Leads: David Leveille and Wes Preston.
Technology and Tools
The MGCI Technology and Tools subcommittee helps with testing and quality control of our one-stop-shop Community Event website (communitydays.org), which provides a worldwide listing of all Community Days events, both paid and free, in the Microsoft ecosystem. Subcommittee members will be asked to review, test, and quality-control new features, pages, and content for communitydays.org and other community tools used to run MGCI. Board Officer Lead: Tom Daly.
The next training meeting is coming up on Thursday, May 30 at 8:00 AM PT; to be invited, please join MGCI today: https://aka.ms/MGCISignUp.
Thank you in advance to our new Regional Leaders for volunteering, and to our board members and existing regional leaders for their ongoing contributions.
Discover: https://aka.ms/MGCI
Join: https://aka.ms/MGCISignUp
Advisors: https://aka.ms/MGCIAdvisors
MGCI is community-led and supported by Microsoft. All are welcome to join!
— Heather
Azure SQL Managed Instance Point-In-Time Restore Service Level Expectation
Introduction
Customers evaluating or already using SQL Managed Instance (SQL MI) need to estimate how much time a database restore would take to complete. This estimation could be required when designing an SLA and/or SLO, or to assess whether a database with certain characteristics would be able to fit into predefined recovery time objectives (RTO).
This Point-In-Time Restore Service Level Expectation (SLE) solution was created to provide guidance for the expected time it would take to restore a database to a point in time, by providing a database restore time expectation based on input parameters.
Note that this SLE was built for PITR database restore operations, not for long-term retention (LTR) backup restore operations.
Reference documentation: Point-in-time restore – Azure SQL Managed Instance | Microsoft Learn
Factors that affect database restore time
The following list contains several factors that may affect the recovery time of a database using automated database backups:
The size of the database being restored
The compute size of the target database
The number of transaction logs involved in the restore plan
The amount of activity that needs to be replayed to recover to the restore point
The network bandwidth if the restore is to a different region
The number of concurrent restore requests that are processed in the target region
Whether a system update is performed, which temporarily interrupts any database restore operation
Approach and modeling of the estimations
The approach chosen to generate a database restore time expectation is based on data from past restore operations across worldwide Azure regions. This data was collected from the backend telemetry of our SQL MI service through the end of 2023. It is assumed that future restore operations will take a similar amount of time as past restores with comparable database characteristics (such as backup size and number of files). However, future service improvements could reduce restore times, at which point this analysis will be revised to incorporate them.
To model the estimations, 6 linear regressions were used with a series of independent variables, which were fed with restore operations data obtained from the Azure SQL MI backend telemetry.
To store the results of the regressions and to use them to make estimations, an Excel file called SQL MI Database Restore Service Level Expectation.xlsx can be used. The stored procedure at the bottom of this blog named usp_SQL_MI_Database_Restore_Service_Level_Expectation can be used to make estimations as well.
It is important to note that while regression analysis can be powerful, it assumes a linear relationship between the variables and makes certain assumptions about the data, such as normality and homoscedasticity. This method has its limitations: both in the data used for the regressions and in future database restores, there will be outliers that are very hard to predict. Even so, these are mathematical models that provide an expectation based on actual database restores. In certain scenarios, such as large databases in the Business Critical service tier, the number of actual database restores may be relatively low because there are few large databases compared to medium and smaller ones; consequently, the models generated for them are less accurate. For the purposes of this work, a linear regression with an r² greater than 0.65 was considered good enough from a statistical point of view.
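To illustrate the modeling approach (this is not the actual training pipeline; all data and names here are synthetic and hypothetical), an ordinary-least-squares fit with the same five predictors can be sketched with NumPy:

```python
import numpy as np

# Synthetic restore telemetry: columns are vCores, full backup GB,
# differential backup GB, log backup count, total log backup GB
rng = np.random.default_rng(0)
X = rng.uniform([4, 10, 0, 0, 0], [80, 450, 50, 144, 20], size=(200, 5))

# Hypothetical "true" coefficients plus noise generate restore minutes
true_coef = np.array([-0.2, 0.135, 0.16, 0.08, 1.14])
y = 4.25 + X @ true_coef + rng.normal(0, 1, 200)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# Coefficient of determination (r^2), the goodness-of-fit check used in the post
pred = A @ beta
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

Under the quality bar described above, a fitted model would be kept only if its r² exceeds 0.65.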
Different models based on service tier and database size
Since database restores at different service tiers perform differently due to the number of resources available, and in order to model estimations more accurately, six different estimation models were created, covering the two service tiers and three database size categories:
Choosing the right estimation model
– Small Databases: General Purpose (GP Database Restore SLE tab) smaller than 450 GB; Business Critical (BC Database Restore SLE tab) smaller than 400 GB
– Medium Databases: General Purpose between 450 GB and 1,500 GB; Business Critical between 400 GB and 2,000 GB
– Large Databases: General Purpose greater than 1,500 GB; Business Critical larger than 2,000 GB
For example, if you are trying to use the Excel file to estimate the restore time of a database that runs on Business Critical and the database is 1,800 GB in size, then use the BC Restore SLE tab and the second regression for Medium databases of the Excel file.
Reference documentation: Resource limits & purchasing model
Two options to obtain restore time estimations
This Database Restore Service Level Expectation (SLE) solution presents two options to obtain restore time estimations:
An Excel file to manually enter values of a hypothetical database
To obtain a copy of this file please email datasqlninja@microsoft.com.
A stored procedure specifying an existing database and a point in time (PITR)
The T-SQL code of this stored procedure called usp_SQL_MI_Database_Restore_Service_Level_Expectation is at the end of this blog.
Stored Procedure for existing databases
The stored procedure option provides estimations for an existing database on the SQL MI where the stored procedure is created and executed. The stored procedure named usp_SQL_MI_Database_Restore_Service_Level_Expectation accompanying this solution performs the calculations itself with only two input parameters. To use it, first create the stored procedure in any database of the SQL MI, then execute it as explained below.
Input Parameters
The only two parameters needed to run this stored procedure are shown below:
EXEC usp_SQL_MI_Database_Restore_Service_Level_Expectation @Database='AdventureWorks2019', @PITR_Time = '2024-01-26 09:21:00.000';
Running the above example estimates the restore time of the AdventureWorks2019 database to a point in time of 9:21 AM – January 26th 2024. Please enter the name of the existing database and the point in time you would like to obtain estimations for.
Results
The result of the execution of the above example looks like this:
Note that it provides estimations for restoring the existing database in both General Purpose and Business Critical with various vCore configurations, as well as information about the restore plan for the specified point-in-time restore (PITR).
Excel file for hypothetical databases
Please use the accompanying Excel file named SQL MI Database Restore Service Level Expectation.xlsx and choose the estimation model in the tab and section that corresponds to the service tier you want to use for the estimation (General Purpose or Business Critical) and the size of the database. Once you locate the appropriate model, you will enter the input parameters of a hypothetical database to perform restore estimations.
Input parameters
Inside the Excel file, the cells colored green are the cells where data must be entered to generate an estimation. vCore is the only cell that contains a drop-down with the valid values for each model based on tier and database size. The file comes with sample values in the cells for illustrative purposes.
As you enter values one by one, a number of minutes and hours will start to be calculated, but ignore it until you have entered all five values in the corresponding green cells.
Note that for some of the input parameters you will have to be creative and make assumptions, e.g., how many log backups are needed in the restore plan for a given point in time? Or how big a differential backup do I want to use for an estimation? Each green cell performs validations to make sure valid values are entered.
Once all input parameters are entered into the green cells, the expected database restore time is calculated underneath and displayed both in minutes with one decimal and in hours with two decimals. Both represent the same amount of time, only differing in the time unit; e.g., 113.4 minutes equals 1.89 hours.
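For a concrete sense of the arithmetic, the spreadsheet's calculation can be reproduced with the General Purpose small-database coefficients that appear in the stored procedure at the end of this post; the input values below are purely illustrative:

```python
# GP small-database (<= 450 GB full backup) regression coefficients,
# taken from the stored procedure at the end of this post
INTERCEPT = 4.25409547
COEF = {
    "vcore_count": -0.204980468,
    "full_backup_size_GB": 0.135130245,
    "diff_backup_size_GB": 0.163912245,
    "log_backup_count": 0.081560044,
    "total_log_backup_size_GB": 1.144489012,
}

def restore_minutes(vcores, full_gb, diff_gb, log_count, log_gb):
    est = (INTERCEPT
           + COEF["vcore_count"] * vcores
           + COEF["full_backup_size_GB"] * full_gb
           + COEF["diff_backup_size_GB"] * diff_gb
           + COEF["log_backup_count"] * log_count
           + COEF["total_log_backup_size_GB"] * log_gb)
    return max(round(est, 2), 1)  # floor of 1 minute, as in the stored procedure

# Illustrative inputs: 8 vCores, 100 GB full, 10 GB diff, 20 logs totaling 2 GB
minutes = restore_minutes(8, 100, 10, 20, 2)
hours = round(minutes / 60, 2)
```

With these illustrative inputs the model yields about 21.7 minutes (0.36 hours); as in the spreadsheet, the two figures express the same estimate in different time units.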
Number of vCores
This input parameter represents the number of vCores configured on the hypothetical SQL Managed Instance that will be the target of the database restore. In most cases the number of vCores changes the amount of resources available for the restore operation, which translates to a slight time difference.
Please enter into the green cell next to vcore_count the number that you want to use to generate an estimation.
Note that the number of vCores that can be configured in SQL MI depends on the following:
The service tier (General Purpose or Business Critical)
The hardware series (Standard, Premium, Memory Optimized Premium)
The maximum instance reserved storage size (1 TB, 2 TB, 4 TB, 5.5 TB and 16 TB)
Feature: Number of vCores
– General Purpose: 4, 8, 16, 24, 32, 40, 64, 80
– Business Critical:
Standard-series (Gen5): 4, 8, 16, 24, 32, 40, 64, 80
Premium-series: 4, 6, 8, 10, 12, 16, 20, 24, 32, 40, 48, 56, 64, 80, 96, 128
Memory optimized premium-series: 4, 6, 8, 10, 12, 16, 20, 24, 32, 40, 48, 56, 64, 80, 96, 128
Feature: Max instance reserved storage size
– General Purpose: 2 TB for 4 vCores; 8 TB for 8 vCores; 16 TB for other sizes
– Business Critical:
Standard-series (Gen5): 1 TB for 4, 8, 16 vCores; 2 TB for 24 vCores; 4 TB for 32, 40, 64, 80 vCores
Premium-series: 1 TB for 4, 6 vCores; 2 TB for 8, 10, 12 vCores; 4 TB for 16, 20 vCores; 5.5 TB for 24, 32, 40, 48, 56 vCores; 5.5 TB or 16 TB (depending on the region) for 64, 80, 96, 128 vCores
Memory optimized premium-series: 1 TB for 4, 6 vCores; 2 TB for 8, 10, 12 vCores; 4 TB for 16, 20 vCores; 5.5 TB for 24 vCores; 5.5 TB or 8 TB (depending on the region) for 32, 40 vCores; 12 TB for 48, 56 vCores; 16 TB for 64, 80, 96, 128 vCores
The tables above were extracted from the following documentation:
Reference documentation: Resource limits – Azure SQL Managed Instance | Microsoft Learn
Full Backup Size in GB
This input parameter represents the size in gigabytes (GB) of the full database backup that is going to be used to restore the database to the point in time chosen. This is equal to the actual size of the database at the time the full backup was taken and is the most significant factor that affects estimations.
Please enter into the green cell next to full_backup_size_GB a number in GB that you want to use as the size of the database to generate an estimation.
Reference documentation: Full database backups (SQL Server) – SQL Server | Microsoft Learn
Differential Backup Size in GB
This input parameter represents the size of the differential backup that may need to be restored after the full backup to get to the point in time specified in the operation. The size of this differential backup depends on the number and size of changes made in the database since the last full backup was completed and the point in time chosen. Considering a full backup is normally taken once a week and a differential backup is normally taken every 12 hours, the differential backup could contain database changes accumulated for up to ~6.5 days. Note that if a point in time is chosen close enough to the last full backup, then no differential backup is needed for that restore plan.
Please enter into the green cell next to diff_backup_size_GB a number in GB if you want to use a differential backup in the estimation. The most typical value we observe in the telemetry is between 1% and 18% of the full backup size.
Reference documentation: Differential Backups (SQL Server) – SQL Server | Microsoft Learn
Log Backup Count
This input parameter represents the number of transaction log backups that need to be restored to bring the database to the point in time specified in the database restore operation.
Considering that log backups are taken every 5 to 10 minutes as long as there are database changes, and that differential backups are normally taken every 12 hours, at most 144 log backups (one every 5 minutes for 12 hours) could be needed in the restore plan to reach any point in time specified in the database restore operation.
Please enter into the green cell next to log_backup_count the number of log backups you want to use to generate an estimation. If you want to model a worst-case scenario, use 144; for a best-case scenario, use 0. The most typical value we observe in the telemetry is between 0 and 55.
Reference documentation: Transaction log backups – SQL Server | Microsoft Learn
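The worst-case figure of 144 follows directly from the backup cadence, assuming one log backup every 5 minutes (the most frequent cadence) across the 12 hours between differential backups. A quick check:

```python
# Worst-case number of log backups in a restore plan:
# differential backups every 12 hours, log backups assumed every 5 minutes
minutes_between_diffs = 12 * 60
log_backup_interval_minutes = 5
worst_case_log_backups = minutes_between_diffs // log_backup_interval_minutes  # 144
```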
Total Log Backup Size in GB
This input parameter represents the sum in GB of all the log backups needed in the restore plan after the full + potential differential backup restores to get to the point in time. This total size depends on the number of database changes that occurred since the last full or differential backup.
Please enter into the green cell next to total_log_backup_size_GB a number in GB that you want to use to generate an estimation. The most typical value we observe in the telemetry is between 0% and 5% of the full backup size.
Feedback and suggestions
We hope that this post has helped you. If you have feedback or suggestions for improving this asset, please contact the Azure Databases SQL Customer Success Engineering Team. Thanks for your support!
The following is the T-SQL to create the aforementioned stored procedure (usp_SQL_MI_Database_Restore_Service_Level_Expectation) to calculate the restore SLE:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =================================================================================================================================================================================
-- Disclaimer: This stored procedure is provided "as-is". Microsoft provided guidance in this stored procedure with the understanding that MICROSOFT MAKES NO WARRANTIES,
-- EXPRESS OR IMPLIED, WITH RESPECT TO THE INFORMATION CONTAINED HEREIN.
-- Authors: Raghavendra Srinivasan, Diego Caracciolo, Microsoft Corporation
-- Create date: 1/30/2024
-- Description: This stored procedure provides a SQL MI Database Restore Service Level Expectation (SLE) based on the backup history information of a database in the current SQL MI
-- This stored procedure is part of a solution called "Azure SQL MI Database Restore Service Level Expectation",
-- whose main document is called Azure SQL MI Database Restore Service Level Expectation.docx
-- Parameters: @Database = The name of the database in the current SQL Managed Instance for which you want to obtain an SLE
-- @PITR_Time = The point in time at which you want to obtain a database restore SLE
-- Example: EXEC usp_SQL_MI_Database_Restore_Service_Level_Expectation @Database='MyDatabase1', @PITR_Time = '2024-01-26 09:21:00.000'
-- =================================================================================================================================================================================
CREATE PROCEDURE usp_SQL_MI_Database_Restore_Service_Level_Expectation
@Database NVARCHAR(255),
@PITR_Time DATETIME2
AS
BEGIN
SET NOCOUNT ON;
BEGIN TRY
DECLARE @PITR_Checkpoint DATETIME2 = DATEADD(MINUTE, 30, @PITR_Time);
DECLARE @LogBackupFileCount INT = 0;
DECLARE @vcores TABLE (Vcore INT)
DECLARE @FullBackup FLOAT, @DiffBackup FLOAT, @LogBackup FLOAT;
DECLARE @vcore_count FLOAT, @full_backup_size_GB FLOAT, @diff_backup_size_GB FLOAT, @total_log_backup_size_GB FLOAT, @log_backup_count FLOAT, @Intercept FLOAT;
-- Check if the temporary tables exist; drop and recreate
IF OBJECT_ID('tempdb..#dataSET') IS NOT NULL
DROP TABLE #dataSET;
IF OBJECT_ID('tempdb..#FinalResult') IS NOT NULL
DROP TABLE #FinalResult;
CREATE TABLE #dataSET (
Id INT IDENTITY(1, 1),
database_name NVARCHAR(256),
backup_start_date DATETIME2,
backup_finish_date DATETIME2,
[type] NVARCHAR(64),
backupsize_GB FLOAT
);
CREATE TABLE #FinalResult (
DatabaseName NVARCHAR(256),
Backup_start_date DATETIME2,
Backup_finish_date DATETIME2,
[Type] NVARCHAR(64),
Backupsize_GB FLOAT,
BackupFileCount int
);
-- CTE to find the latest full backup before the specified PITR time
WITH LatestFullBackup AS (
SELECT TOP 1 database_name, checkpoint_lsn, backup_finish_date
FROM msdb.dbo.backupset AS bs
WHERE database_name = @Database AND [type] = 'D' AND backup_finish_date < @PITR_Time
ORDER BY backup_start_date DESC
)
-- Populate the #dataSET temporary table
INSERT INTO #dataSET
SELECT DISTINCT
bs.database_name,
bs.backup_start_date,
bs.backup_finish_date,
CASE bs.[type]
WHEN 'D' THEN 'Full Backup'
WHEN 'I' THEN 'Differential Backup'
WHEN 'L' THEN 'Log Backup'
ELSE 'Unknown'
END AS backup_type,
bs.backup_size / 1024 / 1024 / 1024 AS backup_size_GB
FROM
msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf ON bs.media_set_id = bmf.media_set_id
JOIN LatestFullBackup AS lfb ON bs.database_name = lfb.database_name AND (bs.database_backup_lsn = lfb.checkpoint_lsn OR lfb.checkpoint_lsn = bs.checkpoint_lsn)
WHERE
bs.database_name = @Database AND bs.backup_finish_date < @PITR_Checkpoint
ORDER BY
bs.backup_start_date DESC;
-- Get log backup count
IF EXISTS (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Differential Backup' and backup_finish_date <= @PITR_Time)
BEGIN
SELECT @LogBackupFileCount = COUNT(1)
FROM #dataSET
WHERE [type] = 'Log Backup' and
id <= (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Differential Backup' and backup_finish_date <= @PITR_Time) and
id >= (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Log Backup' and backup_finish_date >= @PITR_Time ORDER BY id DESC)
END
ELSE
BEGIN
SELECT @LogBackupFileCount = COUNT(1)
FROM #dataSET
WHERE [type] = 'Log Backup' and
id >= (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Log Backup' and backup_finish_date >= @PITR_Time ORDER BY id DESC)
END
-- Get full database backup
INSERT INTO #FinalResult
SELECT
database_name,
backup_start_date,
backup_finish_date,
[type],
ROUND(backupsize_GB, 2) AS backupsize_GB,
1
FROM #dataSET
WHERE [type] = 'Full Backup';
-- Get log backups
IF EXISTS (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Differential Backup' and backup_finish_date <= @PITR_Time)
BEGIN
INSERT INTO #FinalResult
SELECT
database_name,
MIN(backup_start_date) AS min_backup_start_date,
MAX(backup_finish_date) AS max_backup_finish_date,
'Log Backup',
ROUND(SUM(backupsize_GB), 2),
@LogBackupFileCount
FROM #dataSET
WHERE [type] = 'Log Backup' and
id <= (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Differential Backup' and backup_finish_date <= @PITR_Time) and
id >= (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Log Backup' and backup_finish_date >= @PITR_Time ORDER BY id DESC)
GROUP BY database_name;
END
ELSE
BEGIN
INSERT INTO #FinalResult
SELECT
database_name,
MIN(backup_start_date) AS min_backup_start_date,
MAX(backup_finish_date) AS max_backup_finish_date,
'Log Backup',
ROUND(SUM(backupsize_GB), 2),
@LogBackupFileCount
FROM #dataSET
WHERE [type] = 'Log Backup'
AND id >= (
SELECT TOP 1 id
FROM #dataSET
WHERE [type] = 'Log Backup'
AND backup_finish_date >= @PITR_Time
ORDER BY id DESC
)
GROUP BY database_name;
END
-- Get differential backups
INSERT INTO #FinalResult
SELECT TOP 1
database_name,
backup_start_date,
backup_finish_date,
[type],
ROUND(backupsize_GB, 2),
1
FROM #dataSET
WHERE [type] = 'Differential Backup'
AND backup_finish_date <= @PITR_Time
ORDER BY Id ASC;
SELECT 'Backups required for PITR restore' AS Description, *
FROM #FinalResult
ORDER BY backup_finish_date;
SELECT @FullBackup = ISNULL(Backupsize_GB, 0)
FROM #FinalResult
WHERE Type = 'Full Backup';
SELECT @DiffBackup = ISNULL(Backupsize_GB, 0)
FROM #FinalResult
WHERE Type = 'Differential Backup';
SELECT @LogBackup = ISNULL(Backupsize_GB, 0)
FROM #FinalResult
WHERE Type = 'Log Backup';
IF @DiffBackup IS NULL
SET @DiffBackup = 0;
-- Calculate for General Purpose SLO
IF (@FullBackup <= 450)
BEGIN
SET @Intercept = 4.25409547;
SET @vcore_count = -0.204980468;
SET @full_backup_size_GB = 0.135130245;
SET @diff_backup_size_GB = 0.163912245;
SET @log_backup_count = 0.081560044;
SET @total_log_backup_size_GB = 1.144489012;
SELECT
'SQL MI General Purpose Database Restore Service Level Expectation (Minutes)' AS SLO,
IIF(
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_4_vCores,
IIF(
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_8_vCores,
IIF(
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_16_vCores,
IIF(
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_24_vCores,
IIF(
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_32_vCores;
END
ELSE IF (@FullBackup > 450 AND @FullBackup <= 1500)
BEGIN
SET @Intercept = 34.85312089;
SET @vcore_count = -0.17126727;
SET @full_backup_size_GB = 0.035174866;
SET @diff_backup_size_GB = 0.296089387;
SET @log_backup_count = 0.141328736;
SET @total_log_backup_size_GB = 0.774604541;
SELECT
'SQL MI General Purpose Database Restore Service Level Expectation (Minutes)' AS SLO,
IIF(
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_4_vCores,
IIF(
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_8_vCores,
IIF(
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_16_vCores,
IIF(
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_24_vCores,
IIF(
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_32_vCores;
END
ELSE IF (@FullBackup > 1500)
BEGIN
SET @Intercept = 112.8998712;
SET @vcore_count = -2.211989797;
SET @full_backup_size_GB = 0.027989985;
SET @diff_backup_size_GB = 0.216673484;
SET @log_backup_count = 0.000731196;
SET @total_log_backup_size_GB = 0.688150443;
SELECT
'SQL MI General Purpose Database Restore Service Level Expectation (Minutes)' AS SLO,
IIF(
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_8_vCores,
IIF(
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_16_vCores,
IIF(
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_24_vCores,
IIF(
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_32_vCores;
END
-- Calculate for Business Critical SLO
IF (@FullBackup <= 400)
BEGIN
SET @Intercept = 1.909336538;
SET @vcore_count = -0.041561702;
SET @full_backup_size_GB = 0.066450199;
SET @diff_backup_size_GB = 0.07559076;
SET @log_backup_count = 0.033185528;
SET @total_log_backup_size_GB = 0.297277091;
SELECT
'SQL MI Business Critical Database Restore Service Level Expectation (Minutes)' AS SLO,
IIF(
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_4_vCores,
IIF(
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_8_vCores,
IIF(
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_16_vCores,
IIF(
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_24_vCores,
IIF(
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_32_vCores;
END
ELSE IF (@FullBackup > 400 AND @FullBackup <= 2000)
BEGIN
SET @Intercept = 17.95688297;
SET @vcore_count = -0.556172229;
SET @full_backup_size_GB = 0.050444533;
SET @diff_backup_size_GB = 0.013286432;
SET @log_backup_count = 0.05186371;
SET @total_log_backup_size_GB = 0.180903906;
SELECT
'SQL MI Business Critical Database Restore Service Level Expectation (Minutes)' AS SLO,
IIF(
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_4_vCores,
IIF(
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_8_vCores,
IIF(
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_16_vCores,
IIF(
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_24_vCores,
IIF(
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_32_vCores;
END
ELSE IF (@FullBackup > 2000)
BEGIN
SET @Intercept = 177.1004847;
SET @vcore_count = -1.766472406;
SET @full_backup_size_GB = 0.017504939;
SET @diff_backup_size_GB = -0.044402262;
SET @log_backup_count = -0.164065766;
SET @total_log_backup_size_GB = 0.401938221;
SELECT
'SQL MI Business Critical Database Restore Service Level Expectation (Minutes)' AS SLO,
IIF(
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_16_vCores,
IIF(
ROUND(@Intercept + (20 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (20 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_20_vCores,
IIF(
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_24_vCores,
IIF(
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_32_vCores;
END
END TRY
BEGIN CATCH
-- Handle the exception
PRINT 'An error occurred: ' + ERROR_MESSAGE();
END CATCH
-- Drop the temporary tables outside the TRY...CATCH block
IF OBJECT_ID('tempdb..#dataSET') IS NOT NULL
DROP TABLE #dataSET;
IF OBJECT_ID('tempdb..#FinalResult') IS NOT NULL
DROP TABLE #FinalResult;
END
GO
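For readers who want to experiment with the estimate outside SQL Server, here is a minimal Python sketch of the linear model the script evaluates for each vCore column: intercept plus weighted terms for vCore count, backup sizes, and log-backup count, rounded to two decimals and floored at 1 minute. The coefficients shown are the General Purpose (full backup up to 450 GB) tier values copied from the script; the function name and dictionary layout are my own illustration, not part of the original.

```python
def estimate_restore_minutes(vcores, full_gb, diff_gb, log_count, log_gb, coef):
    """Linear restore-time estimate in minutes, floored at 1 (mirrors the script's IIF/ROUND logic)."""
    est = (coef["intercept"]
           + vcores * coef["vcore_count"]
           + full_gb * coef["full_backup_size_GB"]
           + diff_gb * coef["diff_backup_size_GB"]
           + log_count * coef["log_backup_count"]
           + log_gb * coef["total_log_backup_size_GB"])
    return max(round(est, 2), 1)

# General Purpose tier for full backups <= 450 GB (coefficients taken from the script above)
GP_SMALL = {
    "intercept": 4.25409547,
    "vcore_count": -0.204980468,
    "full_backup_size_GB": 0.135130245,
    "diff_backup_size_GB": 0.163912245,
    "log_backup_count": 0.081560044,
    "total_log_backup_size_GB": 1.144489012,
}

# Example: 8 vCores, 100 GB full backup, no diff, 10 log backups totaling 5 GB
print(estimate_restore_minutes(8, 100, 0, 10, 5, GP_SMALL))  # → 22.67
```

Swapping in the other coefficient sets from the script reproduces the remaining tiers and service levels.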
Microsoft Tech Community – Latest Blogs
Consulado da Portela de São Paulo
We would like to connect with people who can help our organization prosper; we are also available to offer support.
Need Help in moving Desktop Pro 2021 to Desktop Pro 2024
Everything is stuck. I want to upgrade from QuickBooks Desktop Pro 2021 to Desktop Pro 2024.
Sync computer attributes from on-prem AD to Azure AD
Hi Team,
We need a solution to sync the on-prem device distinguishedName attribute to Azure AD, so that we can filter on it in Conditional Access and create a dynamic device group. Please let us know the possible solutions.
AI Summit Brasil 2024
It was a pleasure to take part in the Microsoft event. Very well organized, excellent content!
In return, I invite you to join AI Summit Brasil 2024, taking place on September 2 and 3, where Waldemir Cambiucci will speak about responsible AI. https://aisummit.org.br/
See you there!
All ADF dataflow stages disappear after publish
Hello,
I added a source to my Data Flow and several stages on top of the existing ones. I published the Data Flow and got “publish succeeded”. When the pipeline ran, it reported an error in the new source, and when I tried to delete that source, I could not. I published again successfully, and all my stages disappeared.
It should not even be possible to publish a data flow in this state, since a data flow with no connected sink fails validation; in any case, it should have been published with all its steps.
How do I get the stages back? Why did they disappear, and what happened? They were all there before (this is from the last pipeline monitoring run). How is this possible?
SharePoint Access Request Email Form Error
All users are receiving the following error when clicking Approve or Decline in the email form for SharePoint access requests: Could not complete the requested action. Please try again later.
These users have the appropriate permissions as the workaround to navigate to the SharePoint Online access request page and Approve/Decline works without issue. The error is only occurring in Outlook (desktop client and web client).
I’ve attached a couple screenshots. The error pops up whether you select Edit or View, and whether you click Approve or Decline.
I do have a ticket open with Microsoft, but it’s been open for a week and no resolution yet, so I’m hoping to find someone else who’s seen this and solved it.
Impossible to recover my outlook account
Hello everyone,
I need to recover an account I no longer have access to. I have been trying for years to get it back; it holds really important information for me. I changed the phone number linked to the account, so I can no longer use SMS verification, which is the only recovery option I have.
I have tried the recovery form several times, and it always fails because of the information I am missing. That is understandable, since it has been years since I last had access.
The good news is that I still have the phone number associated with this email. Could someone from the Microsoft support team contact me privately? I will provide all the information I have about this account.
Thank you. This is very important; I need your help!
All Chat History
I have a user who wants all of their chats to appear in the left-hand navigation under ‘Chat’. Chats currently appear under two headings, ‘Pinned’ and ‘Recent’. Is there a limit or date range for chats that appear under Recent, and is there any way to get all of a user’s chat messages to appear there?
Windows Clock and System Clock discrepency
We have an issue with Windows 11 domain-joined workstations where the Windows clock lags behind the system clock, resulting in problems with shared-folder browsing, Start Menu icons not appearing correctly, and so on.
SFC /SCANNOW found no issues.
Where should I go to find logs to begin investigating the issue?
New T-SQL Enhancements in SQL Server 2022 | Data Exposed: MVP Edition
Not yet up to speed on all the new T-SQL Enhancements available in SQL Server 2022 and Azure SQL Database? Want to learn nine new capabilities in just nine minutes? Join MVP Lenni Lobel on Data Exposed to learn about DATE_BUCKET, DATETRUNC, LEAST and GREATEST, STRING_SPLIT ordinal, GENERATE_SERIES, TRIM improvements, IS DISTINCT FROM, Windowing improvements, and bit functions… each with a live demo, all in just 9 minutes!
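As a rough illustration of the DATE_BUCKET semantics mentioned above, the Python sketch below floors a timestamp to the start of its fixed-width bucket counted from an origin. This is only an approximation of the T-SQL behavior for fixed-width parts like minutes and days (not the engine's implementation, and it does not cover month or quarter buckets); the function and variable names are my own.

```python
from datetime import datetime, timedelta

def date_bucket(width, date, origin):
    """Floor `date` to the start of its `width`-sized bucket, with buckets counted from `origin`."""
    buckets = (date - origin) // width  # whole buckets elapsed since the origin
    return origin + buckets * width

origin = datetime(2000, 1, 1)
# Bucket a timestamp into 15-minute windows
d = datetime(2024, 5, 29, 14, 27, 17)
print(date_bucket(timedelta(minutes=15), d, origin))  # prints 2024-05-29 14:15:00
```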
Resources:
View/share our latest episodes on Microsoft Learn and YouTube!
Troubleshooting S9 Game
Hey everyone! If you’ve run into any technical snags while playing S9 Game, whether it’s pesky glitches, performance hiccups, or platform-specific issues, this is the spot to share. Let’s pool our knowledge, brainstorm solutions, and help each other out to ensure a smoother gaming experience for all! I downloaded it from the website 9game.cc.
What is the Maximum Transfer Limit on PayPal?
Technically, there is no limit on your maximum transfer amount if you have a verified PayPal account. Generally, though, the maximum transfer limit for a single transaction is $60,000. If you don’t have a PayPal account, you can send a one-time payment of up to $4,000 USD.
What is the PayPal Transfer Limit Per Day?
The maximum amount that can be transferred in a single day is $100,000. But sometimes, the maximum transfer limit on PayPal varies depending on the user’s account type, location, PayPal balance, and history.
If you haven’t verified your PayPal account, there’s a limit on the total amount of money you can send. There is also a $500 maximum limit for withdrawals from an unverified account. Additional PayPal fees may apply.
Bank Account Transfer Limits
PayPal’s instant transfer limit for banks is a daily maximum of $25,000 and is $10,000 per transaction. This is significantly higher than the maximum sending limit if you connect through a debit card because your identity is already verified.
Debit Card Transfer Limits
When connecting with a debit card, there is a minimum transfer amount of $1.00 per transaction. The maximum limits for debit card payments are as follows:
$5,000 maximum per transaction
$5,000 maximum per day and week
$15,000 maximum per month
Lifting PayPal Transfer Limits
PayPal works better when you trust them. The best way to get around strict transfer limits is to become a verified user of the online payment platform. When you provide more information about your identity, it eliminates security risks.
Once your account is verified, a business can send up to $60,000 per transaction. There are also no limits on how much you can send from your PayPal account overall. Since your financial institution is now connected, the transactions become more like a bank transfer or a wire transfer.
Freelancers and small business owners should be the first to verify their accounts. Having a higher maximum limit enables you to use PayPal as a more reliable payment method. You can also accept payments more quickly with a verified account.
How to disable new item glimmer icon in a SharePoint Online list?
I have a Power Automate workflow that deletes all the items in a SharePoint Online list daily and then repopulates the list. Of course, all items are then always shown as “New Items” with the glimmer icon in the list view (three blue lines radiating from the top-left corner):
I want to avoid showing this icon for aesthetic reasons. Is there a way to disable this glimmer icon for a specific list? Either disabling through a PowerShell command or hide it from the list view through JSON column formatting? Thank you for your help, as I couldn’t find any information apart from outdated solutions for SharePoint 2013.
Why am I receiving error H202, indicating a communication problem with the server?
Our bookkeeper uses her Win10 pro workstation to access QB, which is operated on its own Win10 pro “server”. She has access to two company databases. Suddenly, this morning, she is having trouble exiting Single User mode on the primary company database. She encounters the H202 problem when she attempts.