Category: Microsoft
Accelerating Co-Pilot Adoption for Enhanced Business Efficiency at StriveTechAI
At StriveTechAI, we are excited to announce our commitment to accelerating the adoption of Microsoft Copilot programs among our existing customers. As a leader in AI-driven solutions, we recognise the transformative potential of Copilot to streamline operations and drive strategic initiatives across all organisational sectors.
Deepening Integration with Microsoft Copilot
Microsoft Copilot has revolutionised how businesses interact with their data and workflows. Our goal is to deepen Copilot’s integration into our clients’ ecosystems, ensuring that every department—from finance and HR to operations and marketing—can leverage AI to enhance their productivity and decision-making processes.
Seeking Experienced Copilot Adopters
We are actively seeking highly experienced Copilot adopters to support this ambitious initiative. The ideal candidates will possess extensive knowledge in deploying and managing AI solutions and a proven track record of strategic implementation across diverse organisational structures.
Role of Expert Adopters
Our expert adopters will play a crucial role in:
Guiding Clients: You will provide hands-on guidance to our clients, helping them understand how to implement Copilot strategically to optimise their workflows and data analysis.
Strategic Implementation: Beyond basic integration, you will advise on aligning Copilot’s capabilities with long-term business goals, ensuring that AI adoption supports broader strategic outcomes.
Training and Development: Facilitate training sessions and workshops to enable clients’ teams to use Copilot effectively, fostering an environment of AI literacy and self-sufficiency.
Focus on Security and Data Integrity
A significant part of this acceleration program involves addressing and mitigating potential security risks associated with AI implementation:
Data Labelling and Management: You will advise our clients on data labelling and management best practices, which are crucial for maintaining the integrity and security of sensitive information.
Reducing Security Risks: Through strategic data handling and tailored security protocols, you will help minimise the risk of data breaches, ensuring that our clients’ deployments of Microsoft Copilot are secure and compliant with prevailing data protection regulations.
Join Us in Transforming Business Operations
We invite skilled professionals who are passionate about AI and have a keen understanding of Microsoft Copilot to join us on this journey. Together, we will implement technology and transform it into a cornerstone of business strategy and operations.
Contact Us
If you are interested in collaborating with us to drive AI adoption and want to make a tangible impact on the operational success of businesses across industries, please reach out. Visit our careers page or contact us directly at info@strivetech-ai.com.
Let’s harness the power of Microsoft Copilot together and lead the way in intelligent business transformation.
Sharepoint Delta API getting GeneralException Error
Hi All,
We are getting a GeneralException (500 error) for some tenants, with the message “General Exception while processing”, when calling the SharePoint Sites Delta API.
Would appreciate any support!
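For transient 500s like this, a common workaround while the root cause is investigated is to retry the delta call with exponential backoff. A minimal sketch, where the `fetch` callable stands in for whatever HTTP client you use against the endpoint, and the status codes and retry counts are illustrative assumptions, not Graph SDK behavior:

```python
import time

def call_with_retry(fetch, max_retries=4, base_delay=1.0, transient=(429, 500, 503)):
    """Call fetch() until it returns a non-transient HTTP status or retries run out.

    fetch is any zero-argument callable returning (status_code, body); the
    delay doubles after each transient response (exponential backoff).
    """
    delay = base_delay
    for attempt in range(max_retries + 1):
        status, body = fetch()
        if status not in transient or attempt == max_retries:
            return status, body
        time.sleep(delay)
        delay *= 2
```

If the error persists across backed-off retries for the same tenants, it is likely service-side and worth raising as a support case with the request IDs from the failing responses.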
Austin IAMCP Chapter Meeting, “The Future of AI and Its Impact on Your Organization” May 23 – Hybrid
You’re invited to join us in-person or virtually on May 23!
IAMCP (International Association of Microsoft Channel Partners) TOLA Chapter rotates monthly and hosts chapter meetings in Austin, Houston and Dallas. All meetings are hybrid and anyone is welcome to attend, no matter where you are!
Our May meeting:
The Future of AI and Its Impact on Your Organization
Artificial intelligence (AI) will change every aspect of digital work in the coming decade. AI’s reach will proliferate in your organization and those of your partners, customers, and rivals, thanks to continued advances in hardware, software, analytics, and the ability to scale these changes.
Join IAMCP for an interactive panel discussion with key business leaders to explore how their respective organizations are using or planning to use AI to advance and improve organization productivity. You will get the opportunity to interact with our panel and discuss the impact of planning, budgeting, and challenges of AI implementation.
Our Panel:
Stephen Elkins – Texas Client Director – Microsoft
Ricardo Blanco – Deputy Executive Commissioner for IT and Chief Information Officer at Texas Health and Human Services
Tim Weinheimer – Chief Innovations Officer at Hahn Labs
Join us online or in-person (lunch included) in Austin at the Microsoft office –
10900 Stonelake Blvd., Suite B-225
Austin, TX 78759
11:30am-1:00pm CST
Not a member of IAMCP? You can still attend for $30 or, as a new member, join for $1 for your first 90 days!
I have been a member of IAMCP for over two years and I’m the Secretary of the Houston chapter. There are plenty of other virtual meetings every month covering all topics concerning partners. It’s a great way to understand the ecosystem, learn how to gain designations and credentials, and find partner-to-partner opportunities.
In-line poll experience in Outlook Web
I’d like to introduce you to our brand-new poll experience in Outlook web with Microsoft Loop to streamline your workflow. The current process of creating polls in Outlook can be somewhat cumbersome and less user-friendly. Email/poll composers must navigate through a side-pane to create a poll and cannot preview how it will appear to recipients before sending the email. With the in-line experience that Microsoft Loop provides, they can now create polls directly within an email thread. Recipients can then vote in line and instantly check the real-time poll results. Let’s dive into this amazing new feature and explore it together!
Intuitive entry point for insertion
In Outlook web, just click on “Insert” in the ribbon, and select “Polls” to effortlessly add a poll into your email body. Insert a poll into Outlook web now!
In-line poll creation
The enhanced poll experience offers a seamless method for crafting polls directly within the email, eliminating the need to switch between the side-pane and the email body. Additionally, you can add multiple polls and choose from a variety of question types while creating the poll for different scenarios. For example, in an email aimed at gauging team morale, you can incorporate a ranking question to assess activity preferences and a multiple-choice question to determine preferred time slots.
WYSIWYG (What You See Is What You Get)
Previously, email/poll composers could only see a hyperlink to the poll when creating it within an email; now they can preview exactly how the poll will appear to recipients before sending the email.
In-line poll vote experience with live result
Once recipients get the email, they can easily vote in line without being redirected to Forms web, and both the email composer and recipients can see the live results as new responses come in. It’s important to note that the email composer will need to navigate to the “Sent Items” folder in their mailbox to view the live results.
Fallback experience to Forms web
In the initial phase, Loop polls are fully enabled for Outlook web. If recipients receive emails containing embedded polls in Outlook desktop or mobile, they will still see the hyperlink and be redirected to Forms web to continue voting.
Our goal is to make in-line polls available across all platforms towards the end of this year. Try inserting a poll now!
Microsoft Tech Community – Latest Blogs –Read More
M365 Community Conference: Building Collaborative Apps in Teams to bring People together
Session: Building Collaborative Apps in Teams to bring People together
Speakers: Loki Meyburg
Collaboration and productivity are essential for any organization, especially in the hybrid work environment. Microsoft Teams is the ultimate platform for collaboration, allowing you to work together with apps in chats, channels, and meetings. Loki Meyburg explained in his session how to build collaborative apps in Teams and enhance your work experience.
Collaboration vs. Productivity
First, let’s understand the difference between collaboration and productivity, and how Teams can support both aspects of work. Collaboration is the act of working together with multiple people to achieve a common goal, while productivity is the efficiency and effectiveness of individual or collective work efforts. Microsoft Teams enables you to collaborate around apps by sharing, discovering, notifying, and collaborating on app content in various contexts.
Sharing is the first step of collaboration
One of the key features of Teams is the ability to collaborate around shared links. When you share a link to app content in a chat or channel, Teams can automatically unfurl the link and attach a rich interactive preview card, using adaptive cards and bots. The preview card can show relevant information and actions related to the app content, such as a product launch diagram, a survey, or a report. You can also open the app content in a popout window with chat on the side, or share it to a meeting and use it together in real time.
To build these experiences, you can use message extensions, link unfurling, app content stages, and Live Share. Message extensions allow you to register your domain and turn links into adaptive cards. Link unfurling enables bots to unfurl the links and attach the adaptive cards to the messages. App content stages are special views that present the web app in a popout window or a meeting stage. Live Share is a service that allows you to easily enable multiplayer experiences in meetings, with features such as inking, cursors, video, and audio synchronization.
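As a rough illustration of the adaptive-card previews described above, here is a sketch of the kind of attachment a link-unfurling message extension might return. The card body and field values are hypothetical; the contentType and card shape follow the published Adaptive Cards conventions, but verify them against the Teams message-extension documentation for your scenario:

```python
def unfurl_card(title: str, url: str) -> dict:
    """Build a minimal Adaptive Card attachment for a link-unfurl preview.

    The body shows a title and the shared URL; the single action reopens
    the content. All field values here are illustrative placeholders.
    """
    return {
        "contentType": "application/vnd.microsoft.card.adaptive",
        "content": {
            "type": "AdaptiveCard",
            "version": "1.4",
            "body": [
                {"type": "TextBlock", "text": title, "weight": "Bolder"},
                {"type": "TextBlock", "text": url, "isSubtle": True, "wrap": True},
            ],
            "actions": [{"type": "Action.OpenUrl", "title": "Open", "url": url}],
        },
    }
```

In a real message extension this dict would be returned (as JSON) from the bot's link-unfurling handler so Teams can render the preview in place of the bare URL.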
How to build these experiences
Bots are the foundation on which we will build these experiences. They enable everything else. You can use message extensions and link unfurling to attach rich interactive adaptive card previews when your URLs get shared in chats or channels. These previews can show relevant information and actions related to your app content, such as a product launch diagram, a survey, or a report. You can also customize the look and feel of the adaptive cards using templates and styles.
There are two app content stages, collab stage and meeting stage, to present your web app in Teams. The collab stage is a popout window that opens when you click on the app icon in the chat header or the preview card. It allows you to view and interact with the app content along with the chat on the side. The meeting stage is a full-screen view that opens when you share the app content to a meeting. It allows you to collaborate on the app content in real-time with other meeting participants. You can use Live Share to easily enable multiplayer experiences in meetings, with features such as inking, cursors, video, and audio synchronization.
Going from productivity to collaboration
Another important aspect of building collaborative apps in Teams is taking collaboration to the next level. You can enhance the collaboration experience by proactively notifying users and creating focused conversations, leveraging app skills and natural language processing, and using Teams SSO to authenticate users. You can also use some tools and resources for developers, such as Figma UI Kit, Teams Toolkit for Visual Studio Code, and Developer Portal. You should also be aware of some upcoming improvements, such as app rating and review, adaptive card styling, permissions and consent, and instant app tabs.
In conclusion, Teams can help you collaborate around apps in various scenarios and contexts, and you can build these experiences using the Teams platform.
Additional resources
You can find more information about how to build your own collaborative apps like link unfurling, collab stages or the Teams AI library here:
https://aka.ms/teams-link-unfurling
https://aka.ms/teams-collab-stage
https://aka.ms/teams-meeting-stage
https://aka.ms/teams-live-share
https://aka.ms/share-in-teams
https://aka.ms/teams-proactive-messages
https://aka.ms/teams-ai-library
https://aka.ms/teams-sso
https://aka.ms/teams-ui-kit
https://aka.ms/teams-developer-portal
REST API output to handle in collection reference
Hi,
I have the JSON below, which is returned from a REST API call. I need to pass it as a collection reference to handle the mapping in ADF.
I am using the Copy Data activity with a REST API source, and I am trying to configure the mapping as below (the JSON above is what my REST API dataset generates).
Please advise how I can set the collection reference to pass the values to the sink.
Thanks
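Without the original JSON it is hard to give the exact path, but the usual shape of an ADF Copy activity mapping over a nested REST response is a TabularTranslator with a collectionReference pointing at the repeating array. A sketch, expressed here as a Python dict for readability, where the path $['value'] and the field names are assumptions to be replaced with the array property and fields your API actually returns:

```python
# Sketch of an ADF copy-activity "translator" block with a collection reference.
# "$['value']" is an assumed path -- point it at whichever property in your
# REST response holds the repeating array of records.
translator = {
    "type": "TabularTranslator",
    "collectionReference": "$['value']",  # the array ADF iterates over
    "mappings": [
        # paths inside each mapping are relative to the collection reference
        {"source": {"path": "['id']"},   "sink": {"name": "Id"}},
        {"source": {"path": "['name']"}, "sink": {"name": "Name"}},
    ],
}
```

In the Copy Data activity UI this corresponds to ticking the collection reference on the array node in the mapping tab; the per-field paths are then resolved relative to that array.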
Teams Rooms Administration
Hi All,
We have some Teams Rooms accounts on our tenant, with MTR equipment attached, that we would like a small subset of admins to administer exclusively.
Is anyone aware of how to set this up so that when they log in to the MTR Pro Management portal they only see the rooms they are assigned?
MVP’s Favorite Content: Copilot for Power BI, Azure, Windows
In this blog series dedicated to Microsoft’s technical articles, we’ll highlight our MVPs’ favorite article along with their personal insights.
Mohamed El-Qassas, M365, Business Applications MVP, Saudi Arabia
Overview of Copilot for Power BI (preview) – Power BI | Microsoft Learn
“I recommend this content as it offers an in-depth exploration of Power BI’s Copilot feature, including guidance on its configuration for use within your tenant settings.
Furthermore, it includes a practical example demonstrating the utilization of Copilot in Power BI.”
*Relevant Blog: How To Use Copilot In Power BI? | Microsoft Fabric (devoworx.net)
Yoichi Ishikawa, Data Platform MVP, Japan
Update your data model to work well with Copilot for Power BI – Power BI | Microsoft Learn
“When utilizing Copilot for Power BI, it’s recommended to refer to the guidelines for ensuring Copilot operates effectively. Even if you’re not immediately starting to use Copilot, adopting the right approach to your semantic models—such as setting column data types, measures, and types of relationships, among others—can significantly enhance the performance of generative AI. Moreover, it facilitates the sharing of understanding among stakeholders.”
(In Japanese: Copilot for Power BIの活用にあたり、Copilotが効果的に機能するためのガイドラインを参照することが推奨されます。即座にCopilotの使用を開始しない場合でも、セマンティックモデルに対する適切なアプローチ――列のデータ型、メジャー、リレーションシップの種類の設定等――は、生成AIのパフォーマンス向上はもちろん、関係者間の理解の共有にも大いに役立つでしょう。)
*Relevant Activity: Power BI Weekly News – connpass
George Chysovalantis Grammatikos, Microsoft Azure MVP, Greece
Azure on Microsoft Learn | Microsoft Learn
“I highly recommend Microsoft Azure training content because it covers a wide range of topics, from beginner to advanced. The content is up-to-date and provides tutorials, real-life scenarios and hands-on labs ensuring the learner gains practical experience apart from knowledge.”
*Relevant Activities/Resources:
– Optimizing Azure Infrastructure Management with Azure Bicep – Global Azure 2024
Maison da Silva, Windows and Devices MVP, Brazil
Stop-Computer (Microsoft.PowerShell.Management) – PowerShell | Microsoft Learn
“Shutting Down Local or Remote Computer via PowerShell or Windows Terminal.”
*Relevant Blog: Desligando o Computador local ou remoto via PowerShell ou Terminal do Windows – Maison da Silva
Send an Email with Incident Details
I’m new to Logic Apps and trying to build one that sends an email with incident details (username, email, IP) upon incident creation. I’m stuck on extracting these specific fields from the incident log.
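As a sketch of the extraction step: if the incident trigger hands you a Sentinel-style entities array, the fields can be pulled out by entity kind. The kind/properties names below mirror the Microsoft Sentinel entity format but are assumptions to verify against the actual payload in your run history; in a Logic App you would do the equivalent with a Parse JSON action (or the built-in entities actions) and reference the parsed fields in the email body.

```python
def extract_incident_fields(entities):
    """Pull a username, UPN-style email, and IP address from a
    Sentinel-style incident entities list.

    Field names ('kind', 'properties', 'accountName', 'upnSuffix',
    'address') are assumptions modeled on the Sentinel entity format.
    """
    out = {"username": None, "email": None, "ip": None}
    for entity in entities:
        props = entity.get("properties", {})
        if entity.get("kind") == "Account":
            out["username"] = props.get("accountName")
            if props.get("accountName") and props.get("upnSuffix"):
                out["email"] = f"{props['accountName']}@{props['upnSuffix']}"
        elif entity.get("kind") == "Ip":
            out["ip"] = props.get("address")
    return out
```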
To Do – Due Date Selector Not Working
I noticed that when you select a due date in To Do, from the calendar or even via an option like “tomorrow”, it always displays the date of the day before instead of the correct date.
For example, I select 5/20/24 and it displays 5/19/24.
I’ve attached a link to a video showing the issue.
https://drive.google.com/file/d/13_hj9hNBm_PG0p4UMvFoyvlE0AVInROs/view?usp=sharing
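This kind of off-by-one is typically a timezone rendering issue rather than a calendar bug: if the due date is stored as midnight UTC and then rendered in a client timezone behind UTC, the displayed calendar day is the previous one. A small sketch of the effect (that To Do stores the date this way is an assumption, not confirmed):

```python
from datetime import datetime, timezone, timedelta

# A due date stored as midnight UTC on 5/20/24...
due = datetime(2024, 5, 20, 0, 0, tzinfo=timezone.utc)

# ...rendered in a UTC-5 client timezone lands on the previous calendar day.
local = due.astimezone(timezone(timedelta(hours=-5)))
assert due.date().isoformat() == "2024-05-20"
assert local.date().isoformat() == "2024-05-19"
```

If that matches your setup (a timezone west of UTC), it is worth mentioning your timezone when reporting the issue.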
Service Broker Readers
Hi,
I am using service broker as a means to process events.
I have an activation procedure, that reads a single message, processes that message and then finishes.
My queue has 10 readers.
In the activation procedure I log the SPID and the login time of the session, and my expectation would be that each message has a unique combination of SPID and session login time.
That is true most of the time.
Sometimes, however, a session seems to be re-used, meaning that more than one message (so far, always three) is processed by the same session (combination of SPID and login time).
Does Service Broker recycle readers, and if so, is that something that can be configured or controlled?
Thanks a lot in advance
Chris
Intune Suite features
Still no online status for devices? Or the ability to send keystrokes for secure password input for UAC prompts during a remote session? Or unattended access for Windows devices that are not logged in? Seriously, Microsoft… there is a lot of catching up to do. Please take a look at N-able’s remote support products to get a sense of how far behind this still is. I imagine there is a way to see the device runtime (at least for Windows), but I’m not sure how, so if anyone could let me know that would be much appreciated.
Bookings to Excel via Automate
I have a Booking Page that creates a new row in excel spreadsheet when a new booking is made. I have each column in my table mapped.
What I need is the date and time in separate columns in my table. In Power Automate, there are only options to map startdate and enddate, and each “jumbles” the appointment’s start date and time into one column.
Is there a function or expression to make this work?
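One way to see the shape of the transformation: given the ISO-8601 start value the Bookings trigger emits, the date and time parts can be split with two format operations. A sketch of the split in Python (the trigger's exact field name is an assumption):

```python
from datetime import datetime

def split_start(iso_value: str) -> tuple[str, str]:
    """Split an ISO-8601 booking start value into separate date and
    time strings, suitable for two distinct spreadsheet columns."""
    dt = datetime.fromisoformat(iso_value)
    return dt.strftime("%Y-%m-%d"), dt.strftime("%H:%M")

assert split_start("2024-05-23T11:30:00") == ("2024-05-23", "11:30")
```

In Power Automate itself the equivalent would be two expressions over the same start value mapped to two columns, along the lines of formatDateTime(<start value>, 'yyyy-MM-dd') for the date and formatDateTime(<start value>, 'HH:mm') for the time.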
Copilot suggestions
I’m looking for the best way to source a research asset library that contains files on different topics. I want to ask demographic- or product-specific questions and have Copilot pull answers.
The asset library sits within our company intranet SharePoint site, so it’s more than two levels deep.
Also, I have tried to apply a Copilot to a client site containing a document library, and Copilot is not pulling any information on things I know it should be able to find.
Any suggestions would be appreciated.
Thanks!
Targeted release
Hello
Is there a way to pull out a list of users that have targeted release?
I am looking for a PowerShell script.
Regards
JFM_12
Make report with all devices, username and last restart date
Hello,
How can I create a report from Intune/Microsoft Graph for all devices with the last restart date and time?
Thanks
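The device list itself can be pulled from Graph at deviceManagement/managedDevices (properties such as deviceName, userPrincipalName and lastSyncDateTime); whether a true last-restart timestamp is exposed there depends on the Graph version and device type, so treat that property as something to verify. The paging pattern, which is the part that most often trips up full-inventory reports, can be sketched generically (get_page stands in for your authenticated HTTP call):

```python
def list_all(get_page, url):
    """Follow Microsoft Graph's @odata.nextLink paging and collect every
    item from the 'value' arrays into one list.

    get_page(url) must return the decoded JSON page: a dict with 'value'
    and, while more pages remain, '@odata.nextLink'.
    """
    items = []
    while url:
        page = get_page(url)
        items.extend(page.get("value", []))
        url = page.get("@odata.nextLink")
    return items
```

You would start it at something like "https://graph.microsoft.com/v1.0/deviceManagement/managedDevices?$select=deviceName,userPrincipalName,lastSyncDateTime" and then write the collected rows out to CSV.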
Will VBA be obsolete from excel?
We have been receiving news that macros will be obsolete soon. Not sure if we should believe it or not. We have a product largely based on macros; if it’s true, we need to find an alternative.
Formula to get address of selected cell on a worksheet
Hi,
I’m looking for a formula to get the currently selected cell within a worksheet.
No VBA or macros, since my client has that disabled.
Not the last cell edited, since an edit may not occur.
It can’t be affected by changes of selection on other worksheets.
This is obviously information that Excel tracks and there is no good reason why it wouldn’t be readily available. (I know there is always the possibility of bad reasons they may keep this information inaccessible, but I’m hoping that isn’t the case.)
Any help would be appreciated.
Thank you,
John
Lesson Learned #484: Database Performance Monitoring with PowerShell and ReadScale-Out Environments
Today, I handled a service request from our customer seeking additional information on performance monitoring for ReadScale-Out in Azure SQL Database. They needed details on how to find missing indexes, query statistics, and more.
I created a PowerShell script that works for both elastic database pools and standalone databases, executing multiple monitoring queries.
The following PowerShell script connects to a read-only replica, retrieves performance data, and stores it for later analysis. This article guides you through the steps required to implement the script in your own environment; it can be used with a single Azure SQL Database, an Azure SQL elastic pool, or Azure SQL Managed Instance.
The script performs the following checks:
Check the statistics:
If the number of rows in the statistics differs from rows_sampled.
If more than 15 days have passed since the statistics were updated.
Check the statistics associated with any index:
If the number of rows in the statistics differs from rows_sampled.
If more than 15 days have passed since the statistics were updated.
Check if MAXDOP is 0.
Check if we have missing indexes.
Obtain resource usage per database.
Total amount of space and rows per schema and table name.
Query stats summary.
Basically, we need to configure the following parameters:
$server: Azure SQL server name, e.g. "xxxxx.database.windows.net".
$user: user name.
$passwordSecure: password.
$Db: database name; type the value ALL to check all databases.
$Folder: folder where the log file will be generated with all the issues found.
$DropExisting: 1 or 0; whether the files already in the destination folder are deleted every time you execute the process.
$ElasticDBPoolName: the script will check all databases associated with this elastic database pool (only for Azure SQL Database).
Every execution creates a subfolder named with the date and time.
PerfChecker.Log contains all the issues found.
Every check performed saves two files:
A .txt file that contains the report of the operation performed.
A .task file that contains a possible mitigation for the issue found.
Script
#----------------------------------------------------------------
# Application: ReadScale-Out Performance Checker
# Purpose: Report performance recommendations for ReadScale-Out
# Checks:
# 1) Statistics health:
#    - Whether the number of rows in the statistics differs from rows_sampled.
#    - Whether more than 15 days have passed since the statistics were last updated.
# 2) Whether we have any auto-tuning recommendations.
# 3) Statistics associated with any index:
#    - Whether the number of rows in the statistics differs from rows_sampled.
#    - Whether more than 15 days have passed since the statistics were last updated.
# 4) Whether MAXDOP is 0.
# 5) Whether we have missing indexes (SQL Server instance).
# 6) Resource usage per database.
# 7) Total amount of space and rows per table.
# 8) Query stats summary.
# Outcomes:
# The folder specified in the $Folder variable will contain a file called PerfChecker.Log with all the
# operations performed and issues found. There will also be one file per database and check with the results gathered.
# Every time the process is executed, a new subfolder is created.
#----------------------------------------------------------------
#----------------------------------------------------------------
# Parameters
#----------------------------------------------------------------
param($server = "myserver.database.windows.net", # Server name to connect to, for example, myserver.database.windows.net
$user = "username", # User name to connect with
$passwordSecure = "pwd!", # Password to connect with
$Db = "DBName", # Database name. Type ALL to check all the databases running on the server
$Folder = "c:\SQLData", # Folder where the log and solution files are saved, for example, c:\PerfChecker
$DropExisting=1, # 1 = drop the previous files saved in the folder with extensions .csv, .txt, .task; any other value leaves the files
$ElasticDBPoolName = "dbPoolName") # Name of the elastic DB pool, if you want to filter by elastic DB pool only
#-------------------------------------------------------------------------------
# Check the statistics status
# 1.- Review whether the number of rows differs from rows_sampled
# 2.- Review whether more than 15 days have passed since the statistics were updated
#-------------------------------------------------------------------------------
function CheckStatistics($connection,$FileName, $FileNameLogSolution , $iTimeOut)
{
try
{
$Item=0
logMsg( "---- Checking Statistics health (Started) (REF: https://docs.microsoft.com/en-us/sql/t-sql/statements/update-statistics-transact-sql?view=sql-server-ver15) ---- " ) (1) $true $FileName
$command = New-Object -TypeName System.Data.SqlClient.SqlCommand
$command.CommandTimeout = $iTimeOut
$command.Connection=$connection
$command.CommandText = "SELECT sp.stats_id, stat.name, o.name, filter_definition, last_updated, rows, rows_sampled, steps, unfiltered_rows, modification_counter, DATEDIFF(DAY, last_updated , getdate()) AS Diff, schema_name(o.schema_id) as SchemaName
FROM sys.stats AS stat
Inner join sys.objects o on stat.object_id=o.object_id
CROSS APPLY sys.dm_db_stats_properties(stat.object_id, stat.stats_id) AS sp
WHERE o.type = 'U' AND (stat.auto_created = '1' OR stat.user_created = '1') order by o.name, stat.name"
$Reader = $command.ExecuteReader();
while($Reader.Read())
{
if( $Reader.GetValue(5) -gt $Reader.GetValue(6)) # If the number of rows differs from rows_sampled
{
$Item=$Item+1
logMsg("Possible outdated (Rows_Sampled is less than the rows of the table):".PadRight(100," ") + " of " + ($Reader.GetValue(11).ToString() +"."+ ($Reader.GetValue(2).ToString() + " " + $Reader.GetValue(1).ToString())).PadRight(400," ")) (2) $true $FileName
logSolution("UPDATE STATISTICS [" + $Reader.GetValue(11).ToString() +"].["+ $Reader.GetValue(2).ToString() + "]([" + $Reader.GetValue(1).ToString() + "]) WITH FULLSCAN") $FileNameLogSolution
}
if( TestEmpty($Reader.GetValue(10))) {}
else
{
if($Reader.GetValue(10) -gt 15) # If more than 15 days have passed since the latest update
{
$Item=$Item+1
logMsg("Possible outdated (15 days since the latest update):".PadRight(100," ") + " of " + ($Reader.GetValue(11).ToString() +"."+ ($Reader.GetValue(2).ToString() + " " + $Reader.GetValue(1).ToString())).PadRight(400," ")) (2) $true $FileName
logSolution("UPDATE STATISTICS [" + $Reader.GetValue(11).ToString() +"].["+ $Reader.GetValue(2).ToString() + "]([" + $Reader.GetValue(1).ToString() + "]) WITH FULLSCAN") $FileNameLogSolution
}
}
}
$Reader.Close();
logMsg( "---- Checking Statistics health (Finished) ---- " ) (1) $true $FileName
return $Item
}
catch
{
logMsg("Not able to run the statistics health checker..." + $Error[0].Exception) (2) $true $FileName
return 0
}
}
#-------------------------------------------------------------------------------
# Check missing indexes.
#-------------------------------------------------------------------------------
function CheckMissingIndexes($connection ,$FileName, $FileNameLogSolution , $iTimeOut)
{
try
{
$Item=0
logMsg( "---- Checking Missing Indexes (Started) Ref: https://docs.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-db-missing-index-groups-transact-sql?view=sql-server-ver15 ---- " ) (1) $true $FileName
$command = New-Object -TypeName System.Data.SqlClient.SqlCommand
$command.CommandTimeout = $iTimeOut
$command.Connection=$connection
$command.CommandText = "SELECT CONVERT (varchar, getdate(), 126) AS runtime,
CONVERT (decimal (28,1), migs.avg_total_user_cost * migs.avg_user_impact *
(migs.user_seeks + migs.user_scans)) AS improvement_measure,
REPLACE(REPLACE('CREATE INDEX missing_index_' + CONVERT (varchar, mig.index_group_handle) + '_' +
CONVERT (varchar, mid.index_handle) + ' ON ' + LTRIM(RTRIM(REPLACE(REPLACE(REPLACE(mid.statement,CHAR(10), ' '), CHAR(13), ' '),' ',''))) +
'(' + ISNULL (mid.equality_columns,'')
+ CASE WHEN mid.equality_columns IS NOT NULL
AND mid.inequality_columns IS NOT NULL
THEN ',' ELSE '' END + ISNULL (mid.inequality_columns, '')
+ ')'
+ ISNULL (' INCLUDE (' + mid.included_columns + ')', ''), CHAR(10), ' '), CHAR(13), ' ') AS create_index_statement,
migs.avg_user_impact
FROM sys.dm_db_missing_index_groups AS mig
INNER JOIN sys.dm_db_missing_index_group_stats AS migs
ON migs.group_handle = mig.index_group_handle
INNER JOIN sys.dm_db_missing_index_details AS mid
ON mig.index_handle = mid.index_handle
ORDER BY migs.avg_total_user_cost * migs.avg_user_impact * (migs.user_seeks + migs.user_scans) DESC"
$Reader = $command.ExecuteReader()
$bFound=$false
$bCol=$false
$ColName=””
$Content = [System.Collections.ArrayList]@()
while($Reader.Read())
{
#Obtain the columns only
if($bCol -eq $false)
{
for ($iColumn=0; $iColumn -lt $Reader.FieldCount; $iColumn++)
{
$bCol=$true
$ColName=$ColName + $Reader.GetName($iColumn).ToString().Replace("`t"," ").Replace("`n"," ").Replace("`r"," ").Replace("`r`n","").Trim() + " || "
}
}
#Obtain the values of every missing indexes
$bFound=$true
$TmpContent=””
for ($iColumn=0; $iColumn -lt $Reader.FieldCount; $iColumn++)
{
$TmpContent= $TmpContent + $Reader.GetValue($iColumn).ToString().Replace("`t"," ").Replace("`n"," ").Replace("`r"," ").Replace("`r`n","").Trim() + " || "
}
$Content.Add($TmpContent) | Out-null
}
if($bFound)
{
logMsg( "---- Missing Indexes found ---- " ) (1) $true $FileName
logMsg( $ColName.Replace("`t","").Replace("`n","").Replace("`r","") ) (1) $true $FileName $false
for ($iColumn=0; $iColumn -lt $Content.Count; $iColumn++)
{
logMsg( $Content[$iColumn].Replace("`t","").Replace("`n","").Replace("`r","").Replace("`r`n","").Trim() ) (1) $true $FileName $false
$Item=$Item+1
}
}
$Reader.Close();
logMsg( "---- Checking missing indexes (Finished) ---- " ) (1) $true $FileName
return $Item
}
catch
{
logMsg("Not able to run the missing indexes checker..." + $Error[0].Exception) (2) $true $FileName
return 0
}
}
#-------------------------------------------------------------------------------
# Check the statistics associated with any index:
# 1.- Review whether the number of rows differs from rows_sampled
# 2.- Review whether more than 15 days have passed since the statistics were updated
#-------------------------------------------------------------------------------
function CheckIndexesAndStatistics($connection, $FileName, $FileNameLogSolution , $iTimeOut )
{
try
{
$Item=0
logMsg( "---- Checking Indexes and Statistics health (Started) - Reference: https://docs.microsoft.com/en-us/sql/t-sql/statements/update-statistics-transact-sql?view=sql-server-ver15 -" ) (1) $true $FileName
$command = New-Object -TypeName System.Data.SqlClient.SqlCommand
$command.CommandTimeout = $iTimeOut
$command.Connection=$connection
$command.CommandText = "SELECT ind.index_id, ind.name, o.name, stat.filter_definition, sp.last_updated, sp.rows, sp.rows_sampled, sp.steps, sp.unfiltered_rows, sp.modification_counter, DATEDIFF(DAY, last_updated , getdate()) AS Diff, schema_name(o.schema_id) as SchemaName
from sys.indexes ind
Inner join sys.objects o on ind.object_id=o.object_id
inner join sys.stats stat on stat.object_id=o.object_id and stat.stats_id = ind.index_id
CROSS APPLY sys.dm_db_stats_properties(stat.object_id, stat.stats_id) AS sp
WHERE o.type = 'U' order by o.name, stat.name"
$Reader = $command.ExecuteReader();
while($Reader.Read())
{
if( $Reader.GetValue(5) -gt $Reader.GetValue(6)) # If the number of rows differs from rows_sampled
{
$Item=$Item+1
logMsg("Possible outdated - (Rows_Sampled is less than the rows of the table):".PadRight(100," ") + " of " + ($Reader.GetValue(11).ToString() +"."+ $Reader.GetValue(2).ToString() + " " + $Reader.GetValue(1).ToString()).PadRight(400," ")) (2) $true $FileName
logSolution("ALTER INDEX [" + $Reader.GetValue(1).ToString() + "] ON [" + $Reader.GetValue(11).ToString() +"].["+ $Reader.GetValue(2).ToString() + "] REBUILD") $FileNameLogSolution
}
if( TestEmpty($Reader.GetValue(10))) {}
else
{
if($Reader.GetValue(10) -gt 15) # If more than 15 days have passed since the latest update
{
$Item=$Item+1
logMsg("Possible outdated - (15 days since the latest update):".PadRight(100," ") + " of " + ($Reader.GetValue(11).ToString() +"."+ $Reader.GetValue(2).ToString() + " " + $Reader.GetValue(1).ToString()).PadRight(400," ")) (2) $true $FileName
logSolution("ALTER INDEX [" + $Reader.GetValue(1).ToString() + "] ON [" + $Reader.GetValue(11).ToString() +"].["+ $Reader.GetValue(2).ToString() + "] REBUILD") $FileNameLogSolution
}
}
}
$Reader.Close();
logMsg( "---- Checking Indexes and Statistics health (Finished) ---- " ) (1) $true $FileName
return $Item
}
catch
{
logMsg("Not able to run the Indexes and Statistics health checker..." + $Error[0].Exception) (2) $true $FileName
return 0
}
}
#-------------------------------------------------------------------------------
# Check if MAXDOP is 0
#-------------------------------------------------------------------------------
function CheckScopeConfiguration($connection ,$FileName, $FileNameLogSolution , $iTimeOut)
{
try
{
$Item=0
logMsg( "---- Checking Scoped Configurations (Started) ---- Ref: https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-database-scoped-configurations-transact-sql?view=sql-server-ver15" ) (1) $true $FileName
$command = New-Object -TypeName System.Data.SqlClient.SqlCommand
$command.CommandTimeout = $iTimeOut
$command.Connection=$connection
$command.CommandText = "select * from sys.database_scoped_configurations"
$Reader = $command.ExecuteReader();
while($Reader.Read())
{
if( $Reader.GetValue(1) -eq "MAXDOP")
{
if( $Reader.GetValue(2) -eq 0)
{
logMsg("You have MAXDOP with value 0" ) (2) $true $FileName
$Item=$Item+1
}
}
}
$Reader.Close();
logMsg( "---- Checking Scoped Configurations (Finished) ---- " ) (1) $true $FileName
return $Item
}
catch
{
logMsg("Not able to run the Scoped Configurations checker..." + $Error[0].Exception) (2) $true $FileName
return 0
}
}
#----------------------------------------------------------------
# Function to connect to the database using retry logic
#----------------------------------------------------------------
Function GiveMeConnectionSource($DBs)
{
for ($i=1; $i -lt 10; $i++)
{
try
{
logMsg( "Connecting to the database..." + $DBs + ". Attempt #" + $i) (1)
$SQLConnection = New-Object System.Data.SqlClient.SqlConnection
$SQLConnection.ConnectionString = "Server="+$server+";Database="+$DBs+";User ID="+$user+";Password="+$password+";Connection Timeout=60;Application Name=ReadScaleOut Collector;ApplicationIntent=ReadOnly"
$SQLConnection.Open()
logMsg("Connected to the database.." + $DBs) (1)
return $SQLConnection
break;
}
catch
{
logMsg("Not able to connect - " + $DBs + " - Retrying the connection..." + $Error[0].Exception) (2)
Start-Sleep -s 5
}
}
}
#--------------------------------------------------------------
# Create a folder
#--------------------------------------------------------------
Function CreateFolder
{
Param( [Parameter(Mandatory)]$Folder )
try
{
$FileExists = Test-Path $Folder
if($FileExists -eq $False)
{
$result = New-Item $Folder -type directory
if($result -eq $null)
{
logMsg("Impossible to create the folder " + $Folder) (2)
return $false
}
}
return $true
}
catch
{
return $false
}
}
#-------------------------------
# Delete a file
#-------------------------------
Function DeleteFile{
Param( [Parameter(Mandatory)]$FileName )
try
{
$FileExists = Test-Path $FileNAme
if($FileExists -eq $True)
{
Remove-Item -Path $FileName -Force
}
return $true
}
catch
{
return $false
}
}
#--------------------------------
# Log the operations
#--------------------------------
function logMsg
{
Param
(
[Parameter(Mandatory=$true, Position=0)]
[string] $msg,
[Parameter(Mandatory=$false, Position=1)]
[int] $Color,
[Parameter(Mandatory=$false, Position=2)]
[boolean] $Show=$true,
[Parameter(Mandatory=$false, Position=3)]
[string] $sFileName,
[Parameter(Mandatory=$false, Position=4)]
[boolean] $bShowDate=$true
)
try
{
if($bShowDate -eq $true)
{
$Fecha = Get-Date -format "yyyy-MM-dd HH:mm:ss"
$msg = $Fecha + " " + $msg
}
If( TestEmpty($SFileName) )
{
Write-Output $msg | Out-File -FilePath $LogFile -Append
}
else
{
Write-Output $msg | Out-File -FilePath $sFileName -Append
}
$Colores="White"
If($Color -eq 1 )
{
$Colores ="Cyan"
}
If($Color -eq 3 )
{
$Colores ="Yellow"
}
if($Color -eq 2 -And $Show -eq $true)
{
Write-Host -ForegroundColor White -BackgroundColor Red $msg
}
else
{
if($Show -eq $true)
{
Write-Host -ForegroundColor $Colores $msg
}
}
}
catch
{
Write-Host $msg
}
}
#--------------------------------
# Log the solution
#--------------------------------
function logSolution
{
Param
(
[Parameter(Mandatory=$true, Position=0)]
[string] $msg,
[Parameter(Mandatory=$false, Position=3)]
[string] $sFileName
)
try
{
Write-Output $msg | Out-File -FilePath $sFileName -Append
}
catch
{
Write-Host $msg
}
}
#--------------------------------
# Does the folder name end with "\" or not?
#--------------------------------
function GiveMeFolderName([Parameter(Mandatory)]$FolderSalida)
{
try
{
$Pos = $FolderSalida.Substring($FolderSalida.Length-1,1)
If( $Pos -ne "\" )
{return $FolderSalida + "\"}
else
{return $FolderSalida}
}
catch
{
return $FolderSalida
}
}
#--------------------------------
# Validate Param
#--------------------------------
function TestEmpty($s)
{
if ([string]::IsNullOrWhitespace($s))
{
return $true;
}
else
{
return $false;
}
}
#--------------------------------
# Separator
#--------------------------------
function GiveMeSeparator
{
Param([Parameter(Mandatory=$true)]
[System.String]$Text,
[Parameter(Mandatory=$true)]
[System.String]$Separator)
try
{
[hashtable]$return=@{}
$Pos = $Text.IndexOf($Separator)
$return.Text= $Text.substring(0, $Pos)
$return.Remaining = $Text.substring( $Pos+1 )
return $Return
}
catch
{
$return.Text= $Text
$return.Remaining = ""
return $Return
}
}
Function Remove-InvalidFileNameChars {
param([Parameter(Mandatory=$true,
Position=0,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true)]
[String]$Name
)
return [RegEx]::Replace($Name, "[{0}]" -f ([RegEx]::Escape([String][System.IO.Path]::GetInvalidFileNameChars())), "")}
function Replace-SpacesInRange {
param (
[string]$InputString)
# Replace runs of whitespace with a single space
$OutputString = ($InputString -replace "\s+"," ")
return $OutputString
}
#-------------------------------------------------------------------------------
# Check the rows, space used, space allocated and number of tables.
#-------------------------------------------------------------------------------
function CheckStatusPerTable($connection ,$FileName, $iTimeOut)
{
try
{
logMsg( "---- Checking Status per Table ---- " ) (1) $true $FileName
$Item=0
$command = New-Object -TypeName System.Data.SqlClient.SqlCommand
$command.CommandTimeout = $iTimeOut
$command.Connection=$connection
$command.CommandText = "SELECT s.Name + '.' + t.name,
SUM(p.rows) AS RowCounts,
SUM(a.total_pages) * 8 AS TotalSpaceKB,
SUM(a.used_pages) * 8 AS UsedSpaceKB,
(SUM(a.total_pages) - SUM(a.used_pages)) * 8 AS UnusedSpaceKB
FROM
sys.tables t
INNER JOIN
sys.indexes i ON t.OBJECT_ID = i.object_id
INNER JOIN
sys.partitions p ON i.object_id = p.OBJECT_ID AND i.index_id = p.index_id
INNER JOIN
sys.allocation_units a ON p.partition_id = a.container_id
LEFT OUTER JOIN
sys.schemas s ON t.schema_id = s.schema_id
WHERE t.is_ms_shipped = 0
AND i.OBJECT_ID > 255
GROUP BY
s.Name + '.' + t.name"
$Reader = $command.ExecuteReader();
$StringReport = "Table".PadRight(100)
$StringReport = $StringReport + "Rows".PadLeft(20) + " "
$StringReport = $StringReport + "Space".PadLeft(20) + " "
$StringReport = $StringReport + "Used".PadLeft(20)
logMsg($StringReport) (1) $false $FileName -bShowDate $false
while($Reader.Read())
{
$Item=$Item+1
$lTotalRows = $Reader.GetValue(1)
$lTotalSpace = $Reader.GetValue(2)
$lTotalUsed = $Reader.GetValue(3)
$lTotalUnUsed = $Reader.GetValue(4)
$StringReport = $Reader.GetValue(0).ToString().PadRight(100).Substring(0,99) + " "
$StringReport = $StringReport + $lTotalRows.ToString('N0').PadLeft(20) + " "
$StringReport = $StringReport + $lTotalSpace.ToString('N0').PadLeft(20) + " "
$StringReport = $StringReport + $lTotalUsed.ToString('N0').PadLeft(20)
logMsg($StringReport) (1) $false $FileName -bShowDate $false
}
$Reader.Close();
return $Item
}
catch
{
$Reader.Close();
logMsg("Not able to run Checking Status per Table..." + $Error[0].Exception) (2)
return 0
}
}
#----------------------------------------------------------------------------------------
# Check query stats and obtain execution count, cpu_time, logical_reads, etc., per query
#----------------------------------------------------------------------------------------
function CheckQueryStats($connection ,$FileName, $iTimeOut)
{
try
{
logMsg( "---- Checking Query Stats ---- " ) (1) $true $FileName
$Item=0
$command = New-Object -TypeName System.Data.SqlClient.SqlCommand
$command.CommandTimeout = $iTimeOut
$command.Connection=$connection
$command.CommandText = "SELECT
qs.execution_count,
qs.total_worker_time AS cpu_time,
qs.total_logical_reads AS logical_reads,
qs.total_logical_writes AS logical_writes,
qs.total_physical_reads AS physical_reads,
qs.total_elapsed_time AS elapsed_time,
st.text AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st"
$Reader = $command.ExecuteReader();
$StringReport = "Execution Count".PadLeft(20) + " "
$StringReport = $StringReport + "CPU Time".PadLeft(20) + " "
$StringReport = $StringReport + "Logical Reads".PadLeft(20) + " "
$StringReport = $StringReport + "Logical Writes".PadLeft(20) + " "
$StringReport = $StringReport + "Physical Reads".PadLeft(20) + " "
$StringReport = $StringReport + "Elapsed Time".PadLeft(20) + " "
$StringReport = $StringReport + "TSQL"
logMsg($StringReport) (1) $false $FileName -bShowDate $false
while($Reader.Read())
{
$Item=$Item+1
$TSQL = $Reader.GetValue(6).ToString().Trim()
$cleanedString = Replace-SpacesInRange -InputString $TSQL
$StringReport = $Reader.GetValue(0).ToString('N0').PadLeft(20) + " "
$StringReport = $StringReport + $Reader.GetValue(1).ToString('N0').PadLeft(20) + " "
$StringReport = $StringReport + $Reader.GetValue(2).ToString('N0').PadLeft(20) + " "
$StringReport = $StringReport + $Reader.GetValue(3).ToString('N0').PadLeft(20) + " "
$StringReport = $StringReport + $Reader.GetValue(4).ToString('N0').PadLeft(20) + " "
$StringReport = $StringReport + $Reader.GetValue(5).ToString('N0').PadLeft(20) + " "
$StringReport = $StringReport + $cleanedString
logMsg($StringReport) (1) $false $FileName -bShowDate $false
}
$Reader.Close();
return $Item
}
catch
{
$Reader.Close();
logMsg("Not able to run Checking Query Stats..." + $Error[0].Exception) (2)
return 0
}
}
#-------------------------------------------------------------------------------
# Show the performance counters of the database
#-------------------------------------------------------------------------------
function CheckStatusPerResource($connection,$FileName, $iTimeOut)
{
try
{
logMsg( "---- Checking Status per Resources ---- " ) (1) $true $FileName
$Item=0
$command = New-Object -TypeName System.Data.SqlClient.SqlCommand
$command.CommandTimeout = $iTimeOut
$command.Connection=$connection
$command.CommandText = "select end_time, avg_cpu_percent, avg_data_io_percent, avg_log_write_percent, avg_memory_usage_percent, max_worker_percent from sys.dm_db_resource_stats order by end_time desc"
$Reader = $command.ExecuteReader();
$StringReport = "Time".PadLeft(20) + " "
$StringReport = $StringReport + "Avg_Cpu".PadLeft(10) + " "
$StringReport = $StringReport + "Avg_DataIO".PadLeft(10) + " "
$StringReport = $StringReport + "Avg_Log".PadLeft(10) + " "
$StringReport = $StringReport + "Avg_Memory".PadLeft(10) + " "
$StringReport = $StringReport + "Max_Workers".PadLeft(10)
logMsg($StringReport) (1) $false $FileName -bShowDate $false
while($Reader.Read())
{
$Item=$Item+1
$lTotalCPU = $Reader.GetValue(1)
$lTotalDataIO = $Reader.GetValue(2)
$lTotalLog = $Reader.GetValue(3)
$lTotalMemory = $Reader.GetValue(4)
$lTotalWorkers = $Reader.GetValue(5)
$StringReport = $Reader.GetValue(0).ToString().PadLeft(20) + " "
$StringReport = $StringReport + $lTotalCPU.ToString('N2').PadLeft(10) + " "
$StringReport = $StringReport + $lTotalDataIO.ToString('N2').PadLeft(10) + " "
$StringReport = $StringReport + $lTotalLog.ToString('N2').PadLeft(10) + " "
$StringReport = $StringReport + $lTotalMemory.ToString('N2').PadLeft(10) + " "
$StringReport = $StringReport + $lTotalWorkers.ToString('N2').PadLeft(10)
logMsg($StringReport) (1) $false $FileName -bShowDate $false
}
$Reader.Close();
return $Item
}
catch
{
logMsg("Not able to run Checking Status per Resources..." + $Error[0].Exception) (2) $true $FileName
return 0
}
}
function sGiveMeFileName{
Param([Parameter(Mandatory=$true)]
[System.String]$DBAccess,
[Parameter(Mandatory=$true)]
[System.String]$File)
try
{
return $sFolderV + $DBAccess + $File
}
catch {
return "_Unknown.csv"
}
}
try
{
Clear
#--------------------------------
# Check the parameters.
#--------------------------------
if (TestEmpty($server)) { $server = read-host -Prompt "Please enter a Server Name" }
if (TestEmpty($user)) { $user = read-host -Prompt "Please enter a User Name" }
if (TestEmpty($passwordSecure))
{
$passwordSecure = read-host -Prompt "Please enter a password" -assecurestring
$password = [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR($passwordSecure))
}
else
{$password = $passwordSecure}
if (TestEmpty($Db)) { $Db = read-host -Prompt "Please enter a Database Name, type ALL to check all databases" }
if (TestEmpty($Folder)) { $Folder = read-host -Prompt "Please enter a Destination Folder (don't include the trailing \) - Example: c:\PerfChecker" }
$DbsArray = [System.Collections.ArrayList]::new()
#--------------------------------
# Variables
#--------------------------------
$CheckStatistics=0
$CheckIndexesAndStatistics=0
$CheckMissingIndexes=0
$CheckScopeConfiguration=0
$CheckStatusPerResource=0
$CheckStatusPerTable=0
$CheckQueryStats=0
$TotalCheckStatistics=0
$TotalCheckIndexesAndStatistics=0
$TotalCheckMissingIndexes=0
$TotalCheckScopeConfiguration=0
$TotalCheckStatusPerResource=0
$TotalCheckStatusPerTable=0
$TotalCheckQueryStats=0
#--------------------------------
# Run the process
#--------------------------------
$timestamp = Get-Date -Format "yyyyMMddHHmmss"
$Folder = $Folder + "\" + $timestamp + "\"
logMsg("Creating the folder " + $Folder) (1)
$result = CreateFolder($Folder) # Creating the folder where we are going to store the results, log and zip.
If( $result -eq $false)
{
logMsg("It was not possible to create the folder") (2)
exit;
}
logMsg("Created the folder " + $Folder) (1)
$sFolderV = GiveMeFolderName($Folder) # Ensuring that the folder name ends with "\".
$LogFile = $sFolderV + "PerfChecker.Log" # Logging the operations.
logMsg("Deleting Operation Log file") (1)
$result = DeleteFile($LogFile) # Delete the log file
logMsg("Deleted Operation Log file") (1)
logMsg("-------------------- Header Filter details --------------") (1)
logMsg(" ServerName: " + $server) (1)
logMsg(" DB Filter : " + $DB) (1)
logMsg(" Folder : " + $Folder) (1)
logMsg(" Delete Files: " + $DropExisting) (1)
logMsg(" Elastic DB Pool Name: " + $ElasticDBPoolName) (1)
logMsg("-------------------- Footer Filter details --------------") (1)
if( $DropExisting -eq 1)
{
foreach ($f in ((Get-ChildItem -Path $sFolderV)))
{
if($f.Extension -in (".txt") -or $f.Extension -in (".task") )
{
logMsg("Deleting Operation file: " + $f.FullName) (1)
$result = DeleteFile($f.FullName)
logMsg("Deleted Operation file: " + $f.FullName) (1)
}
}
}
if($Db -eq "ALL")
{
$SQLConnectionSource = GiveMeConnectionSource "master" # Connecting to the database.
if($SQLConnectionSource -eq $null)
{
logMsg("It is not possible to connect to the database") (2)
exit;
}
$commandDB = New-Object -TypeName System.Data.SqlClient.SqlCommand
$commandDB.CommandTimeout = 6000
$commandDB.Connection=$SQLConnectionSource
if(TestEmpty($ElasticDBPoolName))
{
$commandDB.CommandText = "SELECT name from sys.databases where database_id >= 5 order by name"
}
else
{
$commandDB.CommandText = "SELECT d.name as DatabaseName FROM sys.databases d inner join sys.database_service_objectives dso on d.database_id = dso.database_id WHERE dso.elastic_pool_name = '" + $ElasticDBPoolName + "' ORDER BY d.name"
}
$ReaderDB = $commandDB.ExecuteReader();
while($ReaderDB.Read())
{
[void]$DbsArray.Add($ReaderDB.GetValue(0).ToString())
logMsg("Database Name selected: " + $ReaderDB.GetValue(0).ToString()) (1)
}
$ReaderDB.Close();
$SQLConnectionSource.Close()
}
else
{
[void]$DbsArray.Add($DB)
}
for($iDBs=0;$iDBs -lt $DbsArray.Count; $iDBs=$iDBs+1)
{
logMsg("Connecting to database.." + $DbsArray[$iDBs]) (1)
$SQLConnectionSource = GiveMeConnectionSource($DbsArray[$iDBs]) #Connecting to the database.
if($SQLConnectionSource -eq $null)
{
logMsg("It is not possible to connect to the database " + $DbsArray[$iDBs] ) (2)
exit;
}
logMsg("Connected to database.." + $DbsArray[$iDBs]) (1)
$CheckStatistics=0
$CheckIndexesAndStatistics=0
$CheckMissingIndexes=0
$CheckScopeConfiguration=0
$CheckStatusPerResource=0
$CheckStatusPerTable=0
$CheckQueryStats=0
$FileName=Remove-InvalidFileNameChars($DbsArray[$iDBs])
$CheckStatistics = CheckStatistics $SQLConnectionSource ($sFolderV + $FileName + "_CheckStatistics.Txt") ($sFolderV + $FileName + "_CheckStatistics.Task") (3600)
$CheckIndexesAndStatistics = CheckIndexesAndStatistics $SQLConnectionSource ($sFolderV + $FileName + "_CheckIndexesStatistics.Txt") ($sFolderV + $FileName + "_CheckIndexesStatistics.Task") (3600)
$CheckMissingIndexes = CheckMissingIndexes $SQLConnectionSource ($sFolderV + $FileName + "_CheckMissingIndexes.Txt") ($sFolderV + $FileName + "_CheckMissingIndexes.Task") (3600)
$CheckScopeConfiguration = CheckScopeConfiguration $SQLConnectionSource ($sFolderV + $FileName + "_CheckScopeConfiguration.Txt") ($sFolderV + $FileName + "_CheckScopeConfiguration.Task") (3600)
$CheckStatusPerResource = CheckStatusPerResource $SQLConnectionSource ($sFolderV + $FileName + "_ResourceUsage.Txt") (3600)
$CheckStatusPerTable = CheckStatusPerTable $SQLConnectionSource ($sFolderV + $FileName + "_TableSize.Txt") (3600)
$CheckQueryStats = CheckQueryStats $SQLConnectionSource ($sFolderV + $FileName + "_QueryStats.Txt") (3600)
$TotalCheckStatistics=$TotalCheckStatistics+$CheckStatistics
$TotalCheckIndexesAndStatistics=$TotalCheckIndexesAndStatistics+$CheckIndexesAndStatistics
$TotalCheckMissingIndexes=$TotalCheckMissingIndexes+$CheckMissingIndexes
$TotalCheckScopeConfiguration=$TotalCheckScopeConfiguration+$CheckScopeConfiguration
$TotalCheckStatusPerResource = $TotalCheckStatusPerResource + $CheckStatusPerResource
$TotalCheckStatusPerTable = $TotalCheckStatusPerTable + $CheckStatusPerTable
$TotalCheckQueryStats=$TotalCheckQueryStats+$CheckQueryStats
logMsg("Closing the connection and summary for..... : " + $DbsArray[$iDBs]) (3)
logMsg("Number of Issues with statistics : " + $CheckStatistics ) (1)
logMsg("Number of Issues with statistics/indexes : " + $CheckIndexesAndStatistics ) (1)
logMsg("Number of Issues with Scoped Configuration : " + $CheckScopeConfiguration ) (1)
logMsg("Number of Issues with Missing Indexes : " + $CheckMissingIndexes ) (1)
logMsg("Number of Resource Usage rows : " + $CheckStatusPerResource ) (1)
logMsg("Number of Tables reported : " + $CheckStatusPerTable ) (1)
logMsg("Number of Query stats rows : " + $CheckQueryStats ) (1)
$SQLConnectionSource.Close()
}
Remove-Variable password
logMsg("ReadScale-Out Performance Collector Script was executed correctly") (3)
logMsg("Total Number of Issues with statistics : " + $TotalCheckStatistics ) (1)
logMsg("Total Number of Issues with statistics/indexes : " + $TotalCheckIndexesAndStatistics ) (1)
logMsg("Total Number of Issues with Scoped Configuration : " + $TotalCheckScopeConfiguration ) (1)
logMsg("Total Number of Issues with Missing Indexes : " + $TotalCheckMissingIndexes ) (1)
logMsg("Total Number of Resource Usage rows : " + $TotalCheckStatusPerResource ) (1)
logMsg("Total Number of Tables reported : " + $TotalCheckStatusPerTable ) (1)
logMsg("Total Number of Query Stats rows : " + $TotalCheckQueryStats ) (1)
}
catch
{
logMsg("ReadScale-Out Performance Collector Script failed: " + $Error[0].Exception) (2)
}
finally
{
logMsg("ReadScale-Out Performance Collector Script finished - check the previous status line to see whether it succeeded") (2)
}
Disclaimer: This script is provided as an example and should be used at the user’s own risk. It must be thoroughly tested before being used in a production environment.
Deleted SharePoint folders keep reappearing
We are having problems with a number of folders in a SharePoint site, accessed via a link in OneDrive, which seem to have multiple lives. When we delete them from one user's laptop (OneDrive app) or via the web UI, other users' laptops recreate them soon after. This happens whether or not there are files in the folders.
Today I have had to re-delete hundreds of folders. Each time I do, another user's laptop puts them back again. Sometimes the recreated folders are tagged with the user's laptop name, e.g. FolderName-ComputerName - I think this happens when the original FolderName had already been re-created by another user's laptop.
No sync errors are reported on the users' laptops.
There is no "odd" configuration. All the users are "Members" of the site.
We've had this happen for one or two folders in the past, but in the last two days it's gone crazy.
Any clues about what's happening and how to stop it?