Category: Microsoft
New Blog | Simplify triage with the new Alert Timeline
Every second counts when it comes to detecting and responding to potential security breaches, and in today’s ever-evolving cybersecurity landscape, tools that facilitate rapid triage and decision-making become essential for upholding strong security hygiene.
Today, we’re excited to introduce the latest addition to our rich reporting feature set, the alert timeline: a new view that minimizes the time needed for triage and investigation without compromising the quality of analysis.
Simplifying alerts for faster response
Alerts are an important factor to consider within the broader mix of events, alerts, and incidents.
Alerts are primarily informational, qualified events where predefined system logic indicates an issue before human input validates the information into an incident. Alerts mostly occur close to real time, so that validation and restoration can happen as quickly as possible to keep MTTD (mean time to detect) and MTTR (mean time to resolve) low.
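MTTD and MTTR can be computed directly from incident timestamps. A minimal Python sketch, using made-up illustrative timestamps rather than real Defender data:

```python
from datetime import datetime, timedelta

def mean_delta(pairs):
    """Average the elapsed time across (start, end) timestamp pairs."""
    deltas = [end - start for start, end in pairs]
    return sum(deltas, timedelta()) / len(deltas)

# Illustrative incidents: (occurred, detected, resolved) -- sample data only
incidents = [
    (datetime(2024, 5, 1, 10, 0), datetime(2024, 5, 1, 10, 5), datetime(2024, 5, 1, 11, 0)),
    (datetime(2024, 5, 2, 9, 0),  datetime(2024, 5, 2, 9, 15), datetime(2024, 5, 2, 10, 30)),
]

mttd = mean_delta([(o, d) for o, d, _ in incidents])  # mean time to detect
mttr = mean_delta([(o, r) for o, _, r in incidents])  # mean time to resolve
print(mttd, mttr)  # 0:10:00 1:15:00
```

The same aggregation applies whatever the telemetry source; only the timestamp fields change.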
The new alert timeline revamps the way users interact with alerts in the Defender portal and adds a more nuanced layer of visibility to the telemetry data across your organization. The alert timeline is designed to complement the existing ‘process tree’ view and offers users a comprehensive perspective on each alert. While the process tree provides a detailed breakdown of the alert’s associated processes and activities, the alert timeline presents a condensed chronological view that facilitates rapid triage and decision-making.
Getting Started
Navigate to Investigation & Response > Alerts in the Defender portal to explore the new alert timeline tab.
Read the full post here: Simplify triage with the new Alert Timeline
By Lior Liberman
Figure 1: Alert timeline in the Defender portal
Planner web part error when using anchor links
Hi
On a SharePoint Online Modern page, the Planner web part is displayed correctly when the page is loaded. However, after jumping to an anchor link on the page, the error ‘Page not found’ appears instead of the web part. Does anyone have any idea why this is the case or how the error could be avoided?
Planner Notification
Hello, we are trying to utilize the new Planner integration with Teams. We have a planner in the main Teams Channel. However, everyone who is part of that Teams Channel (a.k.a. the whole company) gets notifications when tasks are created in that planner. Is there a way to still use and keep that planner in that Teams Channel (so we can ensure everyone has access to it), but make it so that the planner does not notify everyone every time? Thank you.
Truly am not understanding
I have a workbook with three sheets in it. I am having the hardest time figuring out how to create a formula to pull data from a table into a column. Maybe what I want in my head just won’t work, I don’t know, so I am likely not using the correct jargon, and for that I apologize. Sheet 1 is my main sheet, and I would like information to fill in automatically when selecting certain items. Column B of Sheet 1 is a drop-down list, and I would love to have the item selected there automatically enter a number, based on the selection, into Column D. I have data from Sheet 2 supplying the information for column one and can’t seem to figure out how to get the data from Sheet 3 to work. I understand that I should use the VLOOKUP function; however, that’s about as far as I am able to figure out, as nothing I create works.
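For the setup described, a hedged sketch of the kind of formula that might work, assuming the lookup table sits in Sheet3!A2:B50 with the drop-down values in its first column and the numbers to return in its second (the range is an illustrative assumption):

```
=IFERROR(VLOOKUP($B2, Sheet3!$A$2:$B$50, 2, FALSE), "")
```

Entered in D2 of Sheet 1 and filled down, this looks up the value chosen in B2, returns the matching number from the second column of the table, and shows a blank instead of #N/A when nothing is selected. The FALSE argument forces an exact match.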
Power Platform & Dynamics 365 Newsletter – May 2024
May 2024 Edition
The newsletter is attached below.
Welcome to the May 2024 Edition of the Power Platform and Dynamics 365 CE Newsletter!
As usual we have a lot of news and highlights to catch up on and plan!
Business Application Launch Event On-Demand
2024 Release Wave 1 is released
Copilot in D365 Customer Insights
Copilot in D365 Field Service
Upcoming Microsoft events
Review the new deprecations and removal updates.
Review Power Apps, Dataverse, Copilot product updates.
Tip: Use Copilot to summarize the doc or ask questions.
Thanks & regards.
Power Platform & D365 Newsletter Team
—
Premier Support for Partners (PSfP) and Advanced Support for Partners (ASfP) are paid partner offerings at Microsoft that provide unmatched value through a wide range of Partner benefits including account management, direct-from-Microsoft advisory consultations, the highest level of reactive support available including up to 15-minute response times on critical cases, and coverage across cloud, hybrid, and on-prem.
Please review these resources to learn more and consider booking a meeting to speak directly with our teams for a better understanding of the value-added benefits of PSfP and ASfP.
Book a meeting with a PSfP Specialist
Book a meeting with an ASfP Evangelist
Visit the ASfP Website
Download the ASfP Fact Sheet
View the ASfP Impact Slide
Stop by the ASfP Partner Community
Visit the PSfP Website
Access Europe meeting on Wed 5 June – Extended File Properties / Manage Import/Export Data Tasks
The next Access Europe meeting will be on Wednesday 5 June 2024 starting at 18:00 UK time (6PM UTC+1) and finishing by about 19:30 (7.30PM)
The start time is equivalent to 19:00 (7PM) in Central Europe and 10AM in Seattle / PST
Please note that the UK is now on Summer Time (UTC+1). For local times, please check World Time Buddy
In this month’s session, I will be demonstrating two free applications:
Extended File Properties
There are currently over 320 extended file properties, though no files will ever contain all of those.
This app provides an easy way of obtaining and storing all the extended properties for an individual file or all files in a selected folder and its subfolders.
This has many possible uses, including cataloguing large numbers of photos or music files and identifying duplicates.
Manage Import / Export (IMEX) Data Tasks
Prior to Access 2007, import/export (IMEX) specifications were always saved in two system tables: MSysIMEXSpecs and MSysIMEXColumns.
These were only available for text files, e.g. CSV.
This approach is still available, but was supplemented in Access 2007 with a wizard-driven, XML-based system of Import/Export Data Tasks.
The new approach is very simple to use and works for a wider range of files, including Excel spreadsheets, Access tables, XML, and HTML as well as text files.
However, ease of use has also led to much greater obscurity in how the newer approach works.
This app was created to allow users to view and easily edit the contents of Import/Export Data Tasks, making them a much more useful feature.
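Both mechanisms can also be driven from VBA, which may help when comparing them. A brief sketch; the specification, table, task, and file names are illustrative placeholders:

```vba
' Legacy specification (stored in MSysIMEXSpecs / MSysIMEXColumns):
DoCmd.TransferText acImportDelim, "ProductsSpec", "tblProducts", "C:\data\products.csv", True

' Access 2007+ saved Import/Export Data Task (the XML-based approach):
DoCmd.RunSavedImportExport "Import-products"
```

The saved-task call takes only the task name; everything else (file, format, destination) lives in the task's XML, which is exactly the obscurity the app aims to open up.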
For more details about this session, see: https://accessusergroups.org/europe/event/access-europe-2024-06-05/ or https://isladogs.co.uk/aeu-28/
The meeting will again be held on Zoom. When the time comes, you can connect using: Join Zoom Meeting. If you are asked, use: Meeting ID: 924 3129 5683 ; Passcode: 661210
For more connection options, please see the AccessUserGroups.org web page for this event
As always, the session will be recorded and the video uploaded to YouTube after the event
AI for ChangeMakers Award
Hello everyone,
Our AI for ChangeMakers event was fantastic, and we learned a lot about artificial intelligence and what we can implement in our organizations. I believe we are all left wanting more!
With that in mind, and following what I announced at the end of my presentation, we are already in discussion with Lucia and Microsoft’s Philanthropies team about the AI for ChangeMakers Award, to recognize and publicize best practices in the use of Microsoft AI solutions by Brazilian organizations. We will share more information about the award soon. Best regards.
Microsoft at SAP Sapphire 2024
Microsoft and SAP have been partners and customers of each other for over 30 years, collaborating on innovative business solutions and helping thousands of joint customers accelerate their business transformation. The Microsoft Cloud is the leading cloud platform for running SAP workloads in the cloud, including RISE with SAP, and we are super excited to share our presence this year at SAP Sapphire 2024, SAP’s flagship event of the year.
Join us at SAP Sapphire 2024
Join Microsoft at SAP Sapphire 2024, taking place in Orlando from June 3-5, 2024, and in Barcelona from June 11-13, 2024. This blog provides details on our session line-up, where you can find us, and the networking events you can attend to learn more.
Sessions
We have an amazing line-up of sessions, a total of 12 across both locations. Make sure you register with the links below to learn from Microsoft and SAP experts about the latest in AI, RISE, and BTP, and to hear directly from customers who are already innovating and achieving their business outcomes with the synergy of SAP and the Microsoft Cloud. Note: these are in-person sessions.
1. Customer success story: Accenture’s RISE with SAP journey with Microsoft and generative AI
2. Strategy talk: Unlock innovation with AI on the Microsoft Cloud
3. Customer success story: Security: Managing users and privileged access in hybrid environments
4. Strategy talk: The next normal: Setting objectives amid innovation for 2027 and beyond
5. Customer success story: How Microsoft and SAP are co-innovating to deliver AI for HR
6. ASUG Power Peer Group: Community alliance meet-up for SAP S/4 HANA: Live town hall
7. Customer success story: Microsoft’s journey from SAP ERP HCM to SAP SuccessFactors solutions
8. Customer success story: Microsoft’s planning and source-to-contract transformation
9. Customer success story: Microsoft: Real-world insights on strategic sourcing success
10. Deepdive: Empowering cloud supply chains with Microsoft and SAP
11. Customer success story: Security: Managing users and privileged access in hybrid environments
Booth
We will have an incredible line-up of executives and subject matter experts from Microsoft on-site to help address any questions and foster connections. Find us at booth #501 in Orlando and at booth #5.105 in Barcelona!
Networking receptions
What’s an event like Sapphire without some networking? We cordially invite you to the events below to meet and mingle with Microsoft and our partner executives and get the most out of your experience:
1. Microsoft Customer reception at the Rocks Lounge, Monday June 3rd from 8 PM – Registration form
2. Canada Welcome Reception at the Hampton Social, Monday June 3rd from 7 PM – no RSVP required.
3. Microsoft + Capgemini + SAP Client Appreciation Night, Tuesday June 4th from 7 PM – RSVP
4. Microsoft + Wipro + Rizing Evening reception, Tuesday June 4th – RSVP
5. (Barcelona) Microsoft + Wipro + Rizing at the El Palace, Tuesday, June 11th from 7 PM – RSVP
6. (Barcelona) Microsoft + Capgemini at Pulitzer Barcelona on June 12th from 17:30 CET – RSVP
Learn more about the Microsoft and SAP partnership:
1. Joint e-book from Microsoft and SAP: RISE with SAP
Microsoft Tech Community – Latest Blogs
Booking confirmation not being sent to all Staff Assigned members
“Notify all staff via email when a booking assigned to them is created or changed” is enabled, but only one random staff member is receiving the booking email instead of all of them. It worked fine until yesterday.
How can I be a delegate when scheduling teams calls
I am sending invites on behalf of someone in Outlook (I am a delegate on their account), but when I include a Teams call in the invite it doesn’t let anyone join until I join, even though I am not on the invite! How can I change this?
Saving a file to a SharePoint directory in Excel – VBA
I have tried a few variations, looked at YouTube videos, and asked ChatGPT. I have two options on the dropdown for file destination: (1) My Choice; (2) a predefined SharePoint location. I have had the “My Choice” option working for some time and added option 2, which is the problem area.
I get through to the stream.SaveToFile statement and see a spinner pop up; about 10 seconds later, it returns a 3004 error.
I have tried both http:// and changing it to a network directory setup (ChatGPT) so it reads: \\mycompany.sharepoint.com\sites\OBO\SitePages\CollabHome\OBO_STG_Export_Files. The HTTP address of the SharePoint folder is: https://mycompany.sharepoint.com/sites/OBO/Shared%20Documents/Forms/AllItems.aspx?id=%2Fsites%2FOBO%2FShared%20Documents%2FOBO%5FQA%5FExport%5FFiles&viewid=a7e2c126%2Dddda%2D4583%2D8ae2%2D55346c1a7cbf
The code is below. The problem seems to be how I am forming things in stream.SaveToFile.
The destination filename would be something like: banner-product-20240529142717
Sub ExportToCSVCheckBox(ws_base As String) ' ws_base equates to "product"
    Dim src_ws As Worksheet
    Dim create_csvs As Worksheet
    Dim rng As Range
    Dim cell As Range
    Dim filePath As String
    Dim rowIndex As Long
    Dim colIndex As Long
    Dim line As String
    Dim delimiter As String
    Dim banner As Variant
    Dim outputSelection As String
    Dim csv_filename As String
    Dim expFile As Variant
    Dim timeStamp As String
    Dim stream As Object ' ADODB.Stream
    '
    ' This subroutine provides the following capabilities:
    ' 1) Reads cells from Create_CSVs worksheet for banner, which files to export and where to
    '    place the files
    ' 2) Generates the output in UTF-8 format to allow for special French characters
    ' 3) Blanks out any 0-filled cells provided by Excel - CCv2 does not like them
    ' 4) Replaces the traditional "," delimiter with "|" - CCv2 wants it that way
    ' 5) Puts double quotes around cell contents to deal with cell contents that
    '    have double quotes in them. "12" Latex Balloon" needs to be "12"" Latex Balloon" so
    '    that it shows in CCv2 as 12" Latex Balloon
    '
    On Error GoTo ErrorHandler
    ' Set the worksheet location for the input variables
    Set create_csvs = ThisWorkbook.Sheets("Create_CSVs")
    ' Get the banner
    banner = create_csvs.Range("B3").Value
    ' Get the output location
    outputSelection = create_csvs.Range("B17")
    ' Build the export base name, e.g. "banner-product" (also the source sheet name)
    expFile = banner & "-" & ws_base
    ' Get the current date and time stamp
    timeStamp = Format(Now, "yyyymmddhhmmss")
    ' Create the filename for the CSV
    csv_filename = expFile & "-" & timeStamp & ".csv"
    ' Set the name of the worksheet that holds the data for export
    Set src_ws = ThisWorkbook.Sheets(expFile)
    ' Define the file path for the CSV file
    If outputSelection = "My Choice" Then
        filePath = Application.GetSaveAsFilename(InitialFileName:=csv_filename, FileFilter:="CSV Files (*.csv), *.csv")
        ' Check if user cancelled the Save As dialog
        If filePath = "False" Then
            Exit Sub
        End If
    Else ' assume we have a SharePoint location
        filePath = create_csvs.Range("C18") & csv_filename
    End If
    ' Define the range
    Set rng = src_ws.UsedRange ' Adjust as needed
    ' Define the custom delimiter
    delimiter = "|"
    ' Create the ADODB.Stream object
    Set stream = CreateObject("ADODB.Stream")
    stream.Type = 2 ' Specify stream type - we want to save text/string data.
    stream.Charset = "UTF-8" ' Specify charset for the source text data.
    ' Open the stream
    stream.Open
    ' Loop through each row in the range
    For rowIndex = 1 To rng.Rows.Count
        line = ""
        ' Loop through each column in the row
        For colIndex = 1 To rng.Columns.Count
            ' Get the cell value
            Set cell = rng.Cells(rowIndex, colIndex)
            If cell.Value <> "" And cell.Value <> 0 Then ' Cannot have 0 values for IMPEX
                line = line & """" & cell.Value & """"
            End If
            If colIndex < rng.Columns.Count Then
                ' Append the cell value to the line with the custom delimiter
                line = line & delimiter
            End If
        Next colIndex
        ' Write the line to the stream
        stream.WriteText line & vbCrLf
    Next rowIndex
    MsgBox "filepath = " & filePath
    ' Save the stream to a file
    stream.SaveToFile filePath, 2 ' 2 = adSaveCreateOverWrite
    ' Close the stream
    stream.Close
    ' Notify the user that the export is complete
    MsgBox "Data has been exported to " & filePath, vbInformation
    Exit Sub
ErrorHandler:
    MsgBox "An error occurred: " & Err.Description & " (Error Number: " & Err.Number & ")"
End Sub
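One possible explanation for the 3004 error, offered as a hypothesis rather than a confirmed fix: ADODB.Stream.SaveToFile expects a local or UNC file-system path, so an https:// SharePoint URL (or the URL with its separators stripped) will not work. A common workaround is the WebDAV form of the site path, which depends on the Windows WebClient service being active; the path below is illustrative only:

```vba
' Hypothetical WebDAV (UNC) form of a SharePoint document-library path:
filePath = "\\mycompany.sharepoint.com@SSL\DavWWWRoot\sites\OBO\Shared Documents\OBO_QA_Export_Files\" & csv_filename
stream.SaveToFile filePath, 2 ' 2 = adSaveCreateOverWrite
```

Another option is to save to a folder synced locally by the OneDrive client and let the sync engine push the file to SharePoint.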
Collaborate and Learn
Hello, it is a pleasure to take part in this forum. We are taking our first steps in the knowledge and use of AI.
We are ready to learn about applications and solutions developed for the third sector, and to share good practices.
Congratulations and Welcome to our new Microsoft Global Community Initiative (MGCI) Regional Leaders
The Microsoft Global Community Initiative (MGCI) Advisors would like to extend a special welcome to the new group of Regional Leaders joining us from all over the world! We are looking forward to growing our global communities together with your support! Curious who the new Regional Leaders are? Check them out here! Discover the details about MGCI.
How to become MGCI Regional Leader
Earlier this year we opened nominations to invite additional regional leaders into MGCI from the request of the community to have more local regional volunteers around the world to help facilitate more support, activities, and excitement regarding how to produce and amplify free community events.
Regional leaders are self-nominated, non-Microsoft employees, approved by board members and Microsoft. The term is one year and can be extended. Board members are appointed for an initial 18-month term that can be extended. Regional leaders are added quarterly.
MGCI Subcommittees
MGCI relies on the dedication of board members and regional leaders who lead the three subcommittees—Communications & Amplification, Training and Resources, and Technology and Tools—to drive the initiative forward. These volunteers bring their unique skills and passion to improve communication channels, offer valuable training and resources, and harness technology for the betterment of MGCI and its members. Through their volunteer efforts, MGCI thrives, fostering a spirit of collaboration and community empowerment. General members are invited to volunteer for a subcommittee after joining https://aka.ms/MGCISignUp — volunteers are welcome!
Communications & Amplification
The MGCI communications & amplification subcommittee drives community growth and engagement. This subcommittee crafts the overall strategy for communications/amplification for MGCI. Subcommittee members help write, edit, and create posts, blurbs, graphics and GIFs for usage to amplify the overall program and communitydays.org events which drives growth and awareness. Board Officer Leads: @JenniferMason & Isidora Katanic.
Training & Resources
The MGCI training subcommittee holds monthly training calls to drive hands on knowledge for those that are community event organizers or are interested in learning how to organize an event. This content and best practices expand and foster knowledge within the community. Regional leaders will craft training strategy and topic curation. Subcommittee members can help build and lead training. Regional Leads: David Leveille and Wes Preston.
Technology and Tools
The MGCI Tools and Technology subcommittee helps with testing and quality control over our one stop shop Community Event website (communitydays.org) which provides a worldwide listing of all Community Days events, paid events and free events in the Microsoft ecosystems. Subcommittee members will be asked to review, test and quality control new features, pages and content for communitydays.org and other community tools used to run the MGCI. Board Officer Lead: Tom Daly.
The next event training meeting is coming up on Thursday, May 30 at 8:00 AM PT. To be invited, please join MGCI today: https://aka.ms/MGCISignUp.
Thank you in advance to our new Regional Leaders for your volunteer contributions and our board members and existing regional leaders for their contributions.
Discover: https://aka.ms/MGCI
Join: https://aka.ms/MGCISignUp
Advisors: https://aka.ms/MGCIAdvisors
MGCI is community-led and supported by Microsoft. All are welcome to join!
— Heather
Azure SQL Managed Instance Point-In-Time Restore Service Level Expectation
Introduction
Customers evaluating or already using Azure SQL Managed Instance (SQL MI) often need to estimate how long a database restore will take to complete. This estimate may be required when designing an SLA and/or SLO, or to assess whether a database with certain characteristics fits within predefined recovery time objectives (RTOs).
This Point-In-Time Restore Service Level Expectation (SLE) solution provides guidance on the expected time to restore a database to a point in time, based on a set of input parameters.
Note that this SLE was built for PITR database restore operations, not for long-term retention (LTR) backup restore operations.
Reference documentation: Point-in-time restore – Azure SQL Managed Instance | Microsoft Learn
Factors that affect database restore time
The following list contains several factors that may affect the recovery time of a database using automated database backups:
The size of the database being restored
The compute size of the target database
The number of transaction logs involved in the restore plan
The amount of activity that needs to be replayed to recover to the restore point
The network bandwidth if the restore is to a different region
The number of concurrent restore requests that are processed in the target region
Whether a system update is being performed, which temporarily interrupts any database restore operation
Approach and modeling of the estimations
The approach chosen to generate a database restore time expectation is based on data from past restore operations across worldwide Azure regions. This data was collected from the backend telemetry of our SQL MI service through the end of 2023. It is assumed that future restore operations will take a similar amount of time as past restores with comparable database characteristics (such as backup size and number of files). However, improvements to our service may reduce restore times in the future, at which point this analysis will be revised to incorporate those advancements.
To model the estimations, 6 linear regressions were used with a series of independent variables, which were fed with restore operations data obtained from the Azure SQL MI backend telemetry.
To store the results of the regressions and to use them to make estimations, an Excel file called SQL MI Database Restore Service Level Expectation.xlsx can be used. The stored procedure at the bottom of this blog named usp_SQL_MI_Database_Restore_Service_Level_Expectation can be used to make estimations as well.
It is important to note that while regression analysis can be powerful, it assumes a linear relationship between the variables and makes certain assumptions about the data, such as normality and homoscedasticity. This method has its limitations: both the data used for the regressions and future database restores will contain outliers that are very hard to predict. Even so, these are mathematical models that aim to provide an expectation based on actual database restores. In certain scenarios, such as with large databases in the Business Critical SLO, the number of actual database restores may be relatively low because there are few large databases compared to medium and smaller ones; consequently, the resulting models are less accurate. For the purpose of this work, a linear regression with an r² greater than 0.65 was considered good enough from a statistical point of view.
Different models based on service tier and database size
Since database restores perform differently across service tiers due to the resources available, and in order to model estimations more accurately, six estimation models were created. The table below shows the two service tiers and the three database size categories, for a total of six models:
Choosing the right estimation model | General Purpose (GP Database Restore SLE tab) | Business Critical (BC Database Restore SLE tab)
Small Databases | Smaller than 450 GB | Smaller than 400 GB
Medium Databases | Between 450 GB and 1,500 GB | Between 400 GB and 2,000 GB
Large Databases | Greater than 1,500 GB | Larger than 2,000 GB
For example, if you are using the Excel file to estimate the restore time of a 1,800 GB database running on Business Critical, use the second regression (Medium databases) in the BC Database Restore SLE tab.
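The tier and size thresholds above can be sanity-checked with a small helper like the sketch below (a hypothetical function, not part of the published solution; the cutoffs are taken from the table above and match the ranges used in the stored procedure):

```python
def pick_restore_sle_model(tier: str, full_backup_gb: float) -> str:
    """Return the estimation model implied by the size tables.

    tier: "GP" (General Purpose) or "BC" (Business Critical).
    GP splits at 450 GB and 1,500 GB; BC splits at 400 GB and 2,000 GB.
    """
    small_cut, medium_cut = (450, 1500) if tier == "GP" else (400, 2000)
    if full_backup_gb <= small_cut:
        return f"{tier} Small"
    if full_backup_gb <= medium_cut:
        return f"{tier} Medium"
    return f"{tier} Large"

# An 1,800 GB Business Critical database falls in the Medium category,
# while the same size on General Purpose would be Large.
print(pick_restore_sle_model("BC", 1800))  # -> BC Medium
print(pick_restore_sle_model("GP", 1800))  # -> GP Large
```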
Reference documentation: Resource limits & purchasing model
Two options to obtain restore time estimations
This Database Restore Service Level Expectation (SLE) solution presents two options to obtain restore time estimations:
An Excel file to manually enter values of a hypothetical database
To obtain a copy of this file please email datasqlninja@microsoft.com.
A stored procedure specifying an existing database and a point in time (PITR)
The T-SQL code of this stored procedure called usp_SQL_MI_Database_Restore_Service_Level_Expectation is at the end of this blog.
Stored Procedure for existing databases
The stored procedure option provides estimations for an existing database on the SQL MI where the stored procedure is created and executed. The stored procedure named usp_SQL_MI_Database_Restore_Service_Level_Expectation accompanying this solution performs the calculations itself from only two input parameters. To use it, first create the stored procedure in any database on the SQL MI, then execute it as explained below.
Input Parameters
The only two parameters needed to run this stored procedure are shown below:
EXEC usp_SQL_MI_Database_Restore_Service_Level_Expectation @Database='AdventureWorks2019', @PITR_Time = '2024-01-26 09:21:00.000';
Running the example above estimates the restore time of the AdventureWorks2019 database to a point in time of 9:21 AM on January 26, 2024. Enter the name of an existing database and the point in time for which you would like estimations.
Results
The result of the execution of the above example looks like this:
Note that it provides estimations for restoring the existing database to both General Purpose and Business Critical in various vCore configurations, as well as information about the restore plan for the specified point-in-time restore (PITR).
Excel file for hypothetical databases
Please use the accompanying Excel file named SQL MI Database Restore Service Level Expectation.xlsx and choose the estimation model in the tab and section that corresponds to the service tier you want to use for the estimation (General Purpose or Business Critical) and the size of the database. Once you locate the appropriate model, you will enter the input parameters of a hypothetical database to perform restore estimations.
Input parameters
Inside the Excel file, the cells colored green are where data must be entered to generate an estimation. vCore is the only cell with a drop-down listing the valid values for each model based on tier and database size. The file comes with sample values in the cells for illustrative purposes.
As you enter values one by one, minutes and hours figures will start to be calculated; ignore them until you have entered all five values in the corresponding green cells.
Note that for some of the input parameters you will have to be creative and make assumptions, e.g., how many log backups are needed in the restore plan for a given point in time? Or how big a differential backup do I want to use for an estimation? Each green cell performs validation to ensure valid values are entered.
Once all input parameters are entered into the green cells, the expected database restore time is calculated underneath and displayed both in minutes (one decimal) and in hours (two decimals). Both represent the same amount of time in different units, e.g., 113.4 minutes equals 1.89 hours.
Number of vCores
This input parameter represents the number of vCores configured on the hypothetical SQL Managed Instance that will be the target of the database restore. In most cases the number of vCores changes the amount of resources available for the restore operation, which translates to a slight time difference.
Please enter into the green cell next to vcore_count the number that you want to use to generate an estimation.
Note that the number of vCores that can be configured in SQL MI depends on the following:
The service tier (General Purpose or Business Critical)
The hardware series (Standard, Premium, Memory Optimized Premium)
The maximum instance reserved storage size (1 TB, 2 TB, 4 TB, 5.5 TB and 16 TB)
Number of vCores
General Purpose: 4, 8, 16, 24, 32, 40, 64, 80
Business Critical:
– Standard-series (Gen5): 4, 8, 16, 24, 32, 40, 64, 80
– Premium-series: 4, 6, 8, 10, 12, 16, 20, 24, 32, 40, 48, 56, 64, 80, 96, 128
– Memory optimized premium-series: 4, 6, 8, 10, 12, 16, 20, 24, 32, 40, 48, 56, 64, 80, 96, 128
Max instance reserved storage size
General Purpose:
– 2 TB for 4 vCores
– 8 TB for 8 vCores
– 16 TB for other sizes
Business Critical, Standard-series (Gen5):
– 1 TB for 4, 8, 16 vCores
– 2 TB for 24 vCores
– 4 TB for 32, 40, 64, 80 vCores
Business Critical, Premium-series:
– 1 TB for 4, 6 vCores
– 2 TB for 8, 10, 12 vCores
– 4 TB for 16, 20 vCores
– 5.5 TB for 24, 32, 40, 48, 56 vCores
– 5.5 TB or 16 TB (depending on the region) for 64, 80, 96, 128 vCores
Business Critical, Memory optimized premium-series:
– 1 TB for 4, 6 vCores
– 2 TB for 8, 10, 12 vCores
– 4 TB for 16, 20 vCores
– 5.5 TB for 24 vCores
– 5.5 TB or 8 TB (depending on the region) for 32, 40 vCores
– 12 TB for 48, 56 vCores
– 16 TB for 64, 80, 96, 128 vCores
The tables above were extracted from the following documentation:
Reference documentation: Resource limits – Azure SQL Managed Instance | Microsoft Learn
Full Backup Size in GB
This input parameter represents the size in gigabytes (GB) of the full database backup that is going to be used to restore the database to the point in time chosen. This is equal to the actual size of the database at the time the full backup was taken and is the most significant factor that affects estimations.
Please enter into the green cell next to full_backup_size_GB a number in GB that you want to use as the size of the database to generate an estimation.
Reference documentation: Full database backups (SQL Server) – SQL Server | Microsoft Learn
Differential Backup Size in GB
This input parameter represents the size of the differential backup that may need to be restored after the full backup to get to the point in time specified in the operation. The size of this differential backup depends on the number and size of changes made in the database since the last full backup was completed and the point in time chosen. Considering a full backup is normally taken once a week and a differential backup is normally taken every 12 hours, the differential backup could contain database changes accumulated for up to ~6.5 days. Note that if a point in time is chosen close enough to the last full backup, then no differential backup is needed for that restore plan.
Please enter into the green cell next to diff_backup_size_GB a number in GB if you want to use a differential backup in the estimation. The most typical value we observe in the telemetry is between 1% and 18% of the full backup size.
Reference documentation: Differential Backups (SQL Server) – SQL Server | Microsoft Learn
Log Backup Count
This input parameter represents the number of transaction log backups that need to be restored to bring the database to the point in time specified in the database restore operation.
Considering that a log backup is normally taken every 5 minutes as long as there are database changes, and that differential backups are normally taken every 12 hours, we can estimate that at most 144 log backups (12 hours × 12 log backups per hour) could be needed in the restore plan to reach any point in time specified in the database restore operation.
Please enter into the green cell next to log_backup_count a number of log backups you want to use to generate an estimation. If you want to model a worst-case scenario, use 144 and for best case scenario use 0. The most typical value we observe in the telemetry is between 0 and 55.
Reference documentation: Transaction log backups – SQL Server | Microsoft Learn
Total Log Backup Size in GB
This input parameter represents the sum in GB of all the log backups needed in the restore plan after the full + potential differential backup restores to get to the point in time. This total size depends on the number of database changes that occurred since the last full or differential backup.
Please enter into the green cell next to total_log_backup_size_GB a number in GB that you want to use to generate an estimation. The most typical value we observe in the telemetry is between 0% and 5% of the full backup size.
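Putting the five inputs together: both the Excel file and the stored procedure evaluate a linear formula of the form intercept + Σ(coefficient × input). The sketch below reproduces that calculation in Python using the General Purpose small-database coefficients that appear in the stored procedure at the end of this post; the function name and the sample input values are hypothetical.

```python
# Coefficients for the GP small-database model (full backup <= 450 GB),
# taken from the stored procedure later in this post.
INTERCEPT = 4.25409547
COEF = {
    "vcore_count": -0.204980468,
    "full_backup_size_GB": 0.135130245,
    "diff_backup_size_GB": 0.163912245,
    "log_backup_count": 0.081560044,
    "total_log_backup_size_GB": 1.144489012,
}

def gp_small_restore_minutes(vcores, full_gb, diff_gb, log_count, log_gb):
    """Expected restore time in minutes, with the same 1-minute floor
    the stored procedure applies via IIF(... > 1, ..., 1)."""
    est = (INTERCEPT
           + COEF["vcore_count"] * vcores
           + COEF["full_backup_size_GB"] * full_gb
           + COEF["diff_backup_size_GB"] * diff_gb
           + COEF["log_backup_count"] * log_count
           + COEF["total_log_backup_size_GB"] * log_gb)
    return max(round(est, 2), 1)

# Hypothetical inputs: 8 vCores, 100 GB full backup, 10 GB differential,
# 20 log backups totaling 2 GB.
minutes = gp_small_restore_minutes(8, 100, 10, 20, 2)
print(minutes, "minutes =", round(minutes / 60, 2), "hours")
```

Note how the vCore coefficient is negative: more vCores slightly reduce the expected restore time, matching the observation in the vCores section above.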
Feedback and suggestions
We hope that this post has helped you. If you have feedback or suggestions for improving this asset, please contact the Azure Databases SQL Customer Success Engineering Team. Thanks for your support!
The following is the T-SQL to create the aforementioned stored procedure (usp_SQL_MI_Database_Restore_Service_Level_Expectation) to calculate the restore SLE:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =================================================================================================================================================================================
-- Disclaimer: This stored procedure is provided "as-is". Microsoft provided guidance in this stored procedure with the understanding that MICROSOFT MAKES NO WARRANTIES,
-- EXPRESS OR IMPLIED, WITH RESPECT TO THE INFORMATION CONTAINED HEREIN.
-- Authors: Raghavendra Srinivasan, Diego Caracciolo, Microsoft Corporation
-- Create date: 1/30/2024
-- Description: This stored procedure provides a SQL MI Database Restore Service Level Expectation (SLE) based on the backup history information of a database in the current SQL MI.
-- This stored procedure is part of a solution called "Azure SQL MI Database Restore Service Level Expectation",
-- whose main document is called Azure SQL MI Database Restore Service Level Expectation.docx
-- Parameters: @Database = The name of the database in the current SQL Managed Instance for which you want to obtain an SLE
-- @PITR_Time = The point in time at which you want to obtain a database restore SLE
-- Example: EXEC usp_SQL_MI_Database_Restore_Service_Level_Expectation @Database='MyDatabase1', @PITR_Time = '2024-01-26 09:21:00.000'
-- =================================================================================================================================================================================
CREATE PROCEDURE usp_SQL_MI_Database_Restore_Service_Level_Expectation
@Database NVARCHAR(255),
@PITR_Time DATETIME2
AS
BEGIN
SET NOCOUNT ON;
BEGIN TRY
DECLARE @PITR_Checkpoint DATETIME2 = DATEADD(MINUTE, 30, @PITR_Time);
DECLARE @LogBackupFileCount INT = 0;
DECLARE @vcores TABLE (Vcore INT);
DECLARE @FullBackup FLOAT, @DiffBackup FLOAT, @LogBackup FLOAT;
DECLARE @vcore_count FLOAT, @full_backup_size_GB FLOAT, @diff_backup_size_GB FLOAT, @total_log_backup_size_GB FLOAT, @log_backup_count FLOAT, @Intercept FLOAT;
-- Check if the temporary tables exist; drop and recreate them
IF OBJECT_ID('tempdb..#dataSET') IS NOT NULL
DROP TABLE #dataSET;
IF OBJECT_ID('tempdb..#FinalResult') IS NOT NULL
DROP TABLE #FinalResult;
CREATE TABLE #dataSET (
Id INT IDENTITY(1, 1),
database_name NVARCHAR(256),
backup_start_date DATETIME2,
backup_finish_date DATETIME2,
[type] NVARCHAR(64),
backupsize_GB FLOAT
);
CREATE TABLE #FinalResult (
DatabaseName NVARCHAR(256),
Backup_start_date DATETIME2,
Backup_finish_date DATETIME2,
[Type] NVARCHAR(64),
Backupsize_GB FLOAT,
BackupFileCount int
);
-- CTE to find the latest full backup before the specified PITR time
WITH LatestFullBackup AS (
SELECT TOP 1 database_name, checkpoint_lsn, backup_finish_date
FROM msdb.dbo.backupset AS bs
WHERE database_name = @Database AND [type] = 'D' AND backup_finish_date < @PITR_Time
ORDER BY backup_start_date DESC
)
-- Populate the #dataSET temporary table
INSERT INTO #dataSET
SELECT DISTINCT
bs.database_name,
bs.backup_start_date,
bs.backup_finish_date,
CASE bs.[type]
WHEN 'D' THEN 'Full Backup'
WHEN 'I' THEN 'Differential Backup'
WHEN 'L' THEN 'Log Backup'
ELSE 'Unknown'
END AS backup_type,
bs.backup_size / 1024 / 1024 / 1024 AS backup_size_GB
FROM
msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf ON bs.media_set_id = bmf.media_set_id
JOIN LatestFullBackup AS lfb ON bs.database_name = lfb.database_name AND (bs.database_backup_lsn = lfb.checkpoint_lsn OR lfb.checkpoint_lsn = bs.checkpoint_lsn)
WHERE
bs.database_name = @Database AND bs.backup_finish_date < @PITR_Checkpoint
ORDER BY
bs.backup_start_date DESC;
-- Get log backup count
IF EXISTS (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Differential Backup' and backup_finish_date <= @PITR_Time)
BEGIN
SELECT @LogBackupFileCount = COUNT(1)
FROM #dataSET
WHERE [type] = 'Log Backup' and
id <= (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Differential Backup' and backup_finish_date <= @PITR_Time) and
id >= (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Log Backup' and backup_finish_date >= @PITR_Time ORDER BY id DESC)
END
ELSE
BEGIN
SELECT @LogBackupFileCount = COUNT(1)
FROM #dataSET
WHERE [type] = 'Log Backup' and
id >= (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Log Backup' and backup_finish_date >= @PITR_Time ORDER BY id DESC)
END
-- Get full database backup
INSERT INTO #FinalResult
SELECT
database_name,
backup_start_date,
backup_finish_date,
[type],
ROUND(backupsize_GB, 2) AS backupsize_GB,
1
FROM #dataSET
WHERE [type] = 'Full Backup';
-- Get log backups
IF EXISTS (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Differential Backup' and backup_finish_date <= @PITR_Time)
BEGIN
INSERT INTO #FinalResult
SELECT
database_name,
MIN(backup_start_date) AS min_backup_start_date,
MAX(backup_finish_date) AS max_backup_finish_date,
'Log Backup',
ROUND(SUM(backupsize_GB), 2),
@LogBackupFileCount
FROM #dataSET
WHERE [type] = 'Log Backup' and
id <= (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Differential Backup' and backup_finish_date <= @PITR_Time) and
id >= (SELECT TOP 1 id FROM #dataSET WHERE [type] = 'Log Backup' and backup_finish_date >= @PITR_Time ORDER BY id DESC)
GROUP BY database_name;
END
ELSE
BEGIN
INSERT INTO #FinalResult
SELECT
database_name,
MIN(backup_start_date) AS min_backup_start_date,
MAX(backup_finish_date) AS max_backup_finish_date,
'Log Backup',
ROUND(SUM(backupsize_GB), 2),
@LogBackupFileCount
FROM #dataSET
WHERE [type] = 'Log Backup'
AND id >= (
SELECT TOP 1 id
FROM #dataSET
WHERE [type] = 'Log Backup'
AND backup_finish_date >= @PITR_Time
ORDER BY id DESC
)
GROUP BY database_name;
END
-- Get differential backups
INSERT INTO #FinalResult
SELECT TOP 1
database_name,
backup_start_date,
backup_finish_date,
[type],
ROUND(backupsize_GB, 2),
1
FROM #dataSET
WHERE [type] = 'Differential Backup'
AND backup_finish_date <= @PITR_Time
ORDER BY Id ASC;
SELECT 'Backups required for PITR restore' as Description, *
FROM #FinalResult
ORDER BY backup_finish_date;
SELECT @FullBackup = ISNULL(Backupsize_GB, 0)
FROM #FinalResult
WHERE Type = 'Full Backup';
SELECT @DiffBackup = ISNULL(Backupsize_GB, 10)
FROM #FinalResult
WHERE Type = 'Differential Backup';
SELECT @LogBackup = ISNULL(Backupsize_GB, 0)
FROM #FinalResult
WHERE Type = 'Log Backup';
IF @DiffBackup IS NULL
SET @DiffBackup = 0;
-- Calculate for General Purpose SLO
IF (@FullBackup <= 450)
BEGIN
SET @Intercept = 4.25409547;
SET @vcore_count = -0.204980468;
SET @full_backup_size_GB = 0.135130245;
SET @diff_backup_size_GB = 0.163912245;
SET @log_backup_count = 0.081560044;
SET @total_log_backup_size_GB = 1.144489012;
SELECT
'SQL MI General Purpose Database Restore Service Level Expectation (Minutes)' AS SLO,
IIF(
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_4_vCores,
IIF(
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_8_vCores,
IIF(
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_16_vCores,
IIF(
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_24_vCores,
IIF(
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_32_vCores;
END
ELSE IF (@FullBackup > 450 AND @FullBackup <= 1500)
BEGIN
SET @Intercept = 34.85312089;
SET @vcore_count = -0.17126727;
SET @full_backup_size_GB = 0.035174866;
SET @diff_backup_size_GB = 0.296089387;
SET @log_backup_count = 0.141328736;
SET @total_log_backup_size_GB = 0.774604541;
SELECT
'SQL MI General Purpose Database Restore Service Level Expectation (Minutes)' AS SLO,
IIF(
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_4_vCores,
IIF(
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_8_vCores,
IIF(
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_16_vCores,
IIF(
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_24_vCores,
IIF(
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_32_vCores;
END
ELSE IF (@FullBackup > 1500)
BEGIN
SET @Intercept = 112.8998712;
SET @vcore_count = -2.211989797;
SET @full_backup_size_GB = 0.027989985;
SET @diff_backup_size_GB = 0.216673484;
SET @log_backup_count = 0.000731196;
SET @total_log_backup_size_GB = 0.688150443;
SELECT
'SQL MI General Purpose Database Restore Service Level Expectation (Minutes)' AS SLO,
IIF(
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_8_vCores,
IIF(
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_16_vCores,
IIF(
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_24_vCores,
IIF(
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS GP_32_vCores;
END
-- Calculate for Business Critical SLO
IF (@FullBackup <= 400)
BEGIN
SET @Intercept = 1.909336538;
SET @vcore_count = -0.041561702;
SET @full_backup_size_GB = 0.066450199;
SET @diff_backup_size_GB = 0.07559076;
SET @log_backup_count = 0.033185528;
SET @total_log_backup_size_GB = 0.297277091;
SELECT
'SQL MI Business Critical Database Restore Service Level Expectation (Minutes)' AS SLO,
IIF(
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_4_vCores,
IIF(
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_8_vCores,
IIF(
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_16_vCores,
IIF(
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_24_vCores,
IIF(
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_32_vCores;
END
ELSE IF (@FullBackup > 400 AND @FullBackup <= 2000)
BEGIN
SET @Intercept = 17.95688297;
SET @vcore_count = -0.556172229;
SET @full_backup_size_GB = 0.050444533;
SET @diff_backup_size_GB = 0.013286432;
SET @log_backup_count = 0.05186371;
SET @total_log_backup_size_GB = 0.180903906;
SELECT
'SQL MI Business Critical Database Restore Service Level Expectation (Minutes)' AS SLO,
IIF(
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (4 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_4_vCores,
IIF(
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (8 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_8_vCores,
IIF(
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_16_vCores,
IIF(
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_24_vCores,
IIF(
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_32_vCores;
END
ELSE IF (@FullBackup > 2000)
BEGIN
SET @Intercept = 177.1004847;
SET @vcore_count = -1.766472406;
SET @full_backup_size_GB = 0.017504939;
SET @diff_backup_size_GB = -0.044402262;
SET @log_backup_count = -0.164065766;
SET @total_log_backup_size_GB = 0.401938221;
SELECT
'SQL MI Business Critical Database Restore Service Level Expectation (Minutes)' AS SLO,
IIF(
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (16 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_16_vCores,
IIF(
ROUND(@Intercept + (20 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (20 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_20_vCores,
IIF(
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (24 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_24_vCores,
IIF(
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2) > 1,
ROUND(@Intercept + (32 * @vcore_count) + (@Fullbackup * @full_backup_size_GB) + (@Diffbackup * @diff_backup_size_GB) + (@LogBackupFileCount * @log_backup_count) + (@LogBackup * @total_log_backup_size_GB), 2),
1
) AS BC_32_vCores;
END
END TRY
BEGIN CATCH
-- Handle the exception
PRINT 'An error occurred: ' + ERROR_MESSAGE();
END CATCH
-- Drop the temporary tables outside the try-catch block
IF OBJECT_ID('tempdb..#dataSET') IS NOT NULL
DROP TABLE #dataSET;
IF OBJECT_ID('tempdb..#FinalResult') IS NOT NULL
DROP TABLE #FinalResult;
END
GO
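The script above repeats the same ROUND(...) regression expression inside every IIF branch, once per vCore count. As a minimal sketch only (with hypothetical sample input values, not figures from the original script), the repetition could be factored out by unpivoting the vCore counts with CROSS APPLY (VALUES ...), producing one row per vCore count instead of one column:

```sql
-- Hypothetical refactor of the repeated IIF/ROUND pattern.
-- Coefficients are the > 2000 GB full-backup set from the script above;
-- the backup-size inputs below are made-up sample values.
DECLARE @Intercept float = 177.1004847,
        @vcore_coef float = -1.766472406,
        @full_coef float = 0.017504939,
        @diff_coef float = -0.044402262,
        @log_count_coef float = -0.164065766,
        @log_size_coef float = 0.401938221;
DECLARE @FullBackup float = 2500,          -- full backup size (GB), sample value
        @DiffBackup float = 100,           -- diff backup size (GB), sample value
        @LogBackupFileCount float = 50,    -- sample value
        @LogBackup float = 20;             -- total log backup size (GB), sample value

SELECT v.vcores,
       -- Floor the estimate at 1 minute, matching IIF(... > 1, ..., 1) above
       IIF(est.minutes > 1, est.minutes, 1) AS estimated_restore_minutes
FROM (VALUES (16), (20), (24), (32)) AS v(vcores)
CROSS APPLY (SELECT ROUND(@Intercept
                          + (v.vcores * @vcore_coef)
                          + (@FullBackup * @full_coef)
                          + (@DiffBackup * @diff_coef)
                          + (@LogBackupFileCount * @log_count_coef)
                          + (@LogBackup * @log_size_coef), 2)) AS est(minutes);
```

This evaluates the linear model once per row, so adding a new vCore SKU means adding one value to the VALUES list rather than duplicating the whole expression.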
Microsoft Tech Community – Latest Blogs
Consulado da Portela de São Paulo
We would like to connect with people who can help our organization prosper; we are also available to offer support.
Need Help in moving Desktop Pro 2021 to Desktop Pro 2024
Why is everything stuck? I want to upgrade from QuickBooks Desktop Pro 2021 to Desktop Pro 2024.
Sync computer attributes from on-premises AD to Azure AD
Hi Team,
I need a solution to sync the on-premises device distinguishedName attribute to Azure AD so we can create a Conditional Access filter and a dynamic device group. Please let us know the possible solutions.
AI Summit Brasil 2024
It was a pleasure to take part in the Microsoft event. Very well organized, excellent content!
In return, I invite you to AI Summit Brasil 2024, taking place on September 2 and 3, where Waldemir Cambiucci will speak about responsible AI. https://aisummit.org.br/
See you there!!!
All ADF dataflow stages disappear after publish
Hello,
I created a new source for my Dataflow and added several stages to the existing ones. I published the Dataflow and got a publish success message. When the pipeline ran, it showed an error in the new source in the Dataflow. I tried to delete the new source, but could not. I published successfully again, and all my stages disappeared.
It should not even be possible to publish a data flow in this state, because there is no connected sink, which should raise an error. In any case, it should have been published with all of its steps.
How do I get them back, why did they disappear, and what happened?
It was also like this before (this is from the last pipeline monitoring run).
How is this possible?
SharePoint Access Request Email Form Error
All users are receiving the following error when clicking Approve or Decline in the email form for SharePoint access requests: Could not complete the requested action. Please try again later.
These users have the appropriate permissions; as a workaround, navigating to the SharePoint Online access request page and approving/declining there works without issue. The error occurs only in Outlook (desktop client and web client).
I’ve attached a couple screenshots. The error pops up whether you select Edit or View, and whether you click Approve or Decline.
I do have a ticket open with Microsoft, but it’s been open for a week and no resolution yet, so I’m hoping to find someone else who’s seen this and solved it.