Category: Microsoft
ASPX extension is getting mangled
Hello All,
We have a document library that contains some ASPX files.
In the past we have been able to upload these files normally and when opened they provide the expected functionality.
However, recently, our ASPX file names are being changed to include a query parameter (e.g. “story.aspx?d=wccb138f4041948d281908c2278103ca9”)
This is preventing the files from loading properly (since the extension is no longer “.aspx”).
I suspect that this is a problem with our new Proofpoint solution, but I want to see if anyone has experienced filename mangling when uploading to SharePoint Online (O365).
Any insights are appreciated.
Thank you,
-tomas
Cumulative Update #28 for SQL Server 2019 RTM
The 28th cumulative update release for SQL Server 2019 RTM is now available for download at the Microsoft Downloads site. Please note that registration is no longer required to download Cumulative updates.
To learn more about the release or servicing model, please visit:
CU28 KB Article: https://learn.microsoft.com/troubleshoot/sql/releases/sqlserver-2019/cumulativeupdate28
Starting with SQL Server 2017, we adopted a new modern servicing model. Please refer to our blog for more details on Modern Servicing Model for SQL Server
Microsoft® SQL Server® 2019 RTM Latest Cumulative Update: https://www.microsoft.com/download/details.aspx?id=100809
Update Center for Microsoft SQL Server: https://learn.microsoft.com/en-us/troubleshoot/sql/releases/download-and-install-latest-updates
Warning: PIM disconnects users from Teams Mobile
I have been working with Microsoft Support on this issue for three months. Hopefully I can save others the trouble.
Sometime around April 2024, I and my colleagues started seeing regular alerts on our mobile devices saying “Open Teams to continue receiving notifications for <email address>”, or “<email address> needs to sign in to see notifications”. Just as promised, after this message appears, we do not get notified about messages and Teams calls do not ring on our mobile devices until we open Teams. We eventually determined that these alerts coincided with activating or deactivating PIM roles.
Apparently, a change was made to Privileged Identity Management in Microsoft Entra ID around that time whereby users’ tokens are invalidated when a role is activated or deactivated. Quoting the Microsoft Support rep:
“When a user’s role changes (either due to activation or expiration), Skype AAD will revoke existing tokens of that users. Skype AAD will also notify PNH about that token revocation. This is expected behavior and is working as designed. These changes were rolled out in Skype AAD in April/May 2024 which is since when you are facing the issue as well.”
(I’ve never heard of Skype AAD or PNH, but who am I to question the expert?)
Anyway, as far as I can tell, this change was not announced or documented anywhere, so hopefully this message will show up in the search results of my fellow admins who are dealing with this. It’s not a bug, it’s “expected behavior and working as designed”.
Flow to send a daily email noting missing data in a SharePoint List
I need to create a flow that will send out a daily email showing all of the records in a SharePoint list that are missing the task approval dates (Approval Column). I have been playing around with building a flow but I am stumped when needing to enter the Query Filter info.
Migrating a new company’s Microsoft licenses and management to my MSP company
My MSP company is in the process of onboarding a new company that uses some Microsoft services and licenses via another Microsoft partner agency. How do I migrate their licenses and services to my company and manage them? Your detailed response will be appreciated.
Forward Request in API Management not working in Basic Tier but works in Consumption
Hoping this is the correct place to post this…
I am using the API Management policy “forward-request” to extend the timeout beyond 100 seconds. I have three environments – stage, train and live. In Stage and Train, the extended timeout works by placing the policy appropriately in the “All Operations” section of the APIs. Both these environments are on the Consumption tier.
Doing the exact same thing in our Live environment, which is on the Basic tier, the policy does not work. Has anyone else had this issue?
Unable To Open Photo Forwarded To Email in Forms
Greetings!
I have created an MS Form in which the responses are forwarded to email using the following discussion (https://techcommunity.microsoft.com/t5/microsoft-forms/microsoft-forms-help-with-forwarding/m-p/1825857). While the look is different from the current version of Power Automate, the steps walked me through how to complete this task. Purpose and flow are outlined below.
Function: Our Team Members submit content and a photo to add to social media.
Flow:
The Form is completed by the Team Member.
An email is generated with the post information.
The email is forwarded to the Admin who has access to our social media accounts.
The Admin can then copy and paste the post content, download the photo, and add it to social media.
The Form submission and email work as designed. However, the photo that is forwarded is not an image file type that can be opened by a photo viewer. Attached are screenshots of the Form settings and the reply that the user receives in their Inbox.
Any assistance would be HUGELY helpful! Thanks in advance.
Enabling ISVs to accelerate AI app development with Azure AI and GitHub
More than 60,000 organizations use Microsoft Azure AI today to explore the power of custom AI applications. The market is quickly moving from experimentation to scale, and more developers around the world are becoming AI developers. To support this shift, Microsoft is partnering with GitHub to empower developers to build AI applications directly from GitHub.com, with seamless integrations with Codespaces and Microsoft Visual Studio Code. This collaboration brings Azure AI’s leading model selection to developers through GitHub Models, along with simple APIs to empower responsible, production-ready AI applications.
Today, Azure offers the largest and most complete model library in the market, including the latest models from OpenAI, Meta, Mistral, and Cohere, as well as updates to its own Phi-3 family of small language models. GitHub Models allows developers to explore and utilize the latest models along with AI innovations and next-generation frontier models. This offering gives developers the flexibility to choose the best combination of unique capabilities, performance metrics, and cost efficiencies.
While continuous model innovation brings more choice, it also brings complexity when selecting the right model for the right scenario. Developers have a range of options for cloud vs. edge, general-purpose vs. task-specific, and more. Organizations often need multiple models to enable better quality, lower cost of goods sold, and to address complex use cases for each industry. GitHub Models simplifies model experimentation and selection across the best of the Azure AI catalog, allowing developers to quickly compare models, parameters, and prompts.
Azure AI aims to help customers rapidly go from idea to code to cloud by making Azure AI an open, modular platform. With Azure AI on GitHub, developers can utilize Codespaces to set up a prototype or use the Prompty extension to generate code with GitHub Models directly in Microsoft Visual Studio Code. In the coming months, Azure AI will expand its integration further, bringing Azure AI’s language, vision, and multi-modal services to GitHub, along with additional Azure AI toolchain elements.
Developers building with AI want to be confident their AI applications are trustworthy, safe, and secure. GitHub Models provides a strong foundation with built-in safety and security controls from Azure AI. Azure AI works with model providers and partners such as HiddenLayer to reduce emerging threats, from cybersecurity vulnerabilities to malware and other signs of tampering. GitHub Models integrates Azure AI Content Safety for top foundation models, enabling built-in, real-time protection against risks such as the generation of harmful content, copyrighted materials, hallucinations, and new AI-specific attacks.
Increased model selection gives developers the broadest range of options for their applications, but each model brings increased complexity. To counteract this, Azure AI provides a single API for model inference, allowing developers to compare performance across a diverse set of foundational models in a uniform and consistent way. The Azure AI Inference SDK provides client libraries in Python and JavaScript, with support for C# and .NET coming soon. This SDK simplifies common tasks related to authentication, security, and retries in the developer’s programming language of choice.
Beyond these new integrations, Microsoft is making it easier for organizations to access GitHub Enterprise through Azure, combining GitHub’s cloud-native platform with Azure’s robust enterprise-grade security and scalability. Organizations with an existing Azure subscription can purchase GitHub products via self-service, directly through Microsoft Sales, or via Microsoft Cloud Solution Providers. Companies can now spin up a GitHub instance directly from the Azure Portal and connect their Microsoft Entra ID with GitHub to facilitate user management and access control. New customers can explore these capabilities with a free 30-day trial of GitHub Enterprise.
To learn more, check out this new blog: Accelerating AI app development with Azure AI and GitHub | Microsoft Azure Blog
MGDC for SharePoint FAQ: How to gather insights from a large Files dataset?
In this post, we’ll cover recommendations on how to gather insights from the SharePoint Files dataset in the Microsoft Graph Data Connect (MGDC). If you’re not familiar with MGDC for SharePoint, start with https://aka.ms/SharePointData.
1. DO – Follow the process for other MGDC datasets
The SharePoint Files dataset in MGDC delivers the largest results, reaching hundreds of millions and even billions of rows for the largest tenants. There is one row for each document (item in a document library) in SharePoint or OneDrive.
Even though it’s typically very large, this dataset follows the same MGDC process as other datasets. From a pipeline and data source standpoint, it’s just another dataset and it uses the same procedure. There is also support for sampling, filtering, deltas and history of the last 21 days. For details, see MGDC for SharePoint FAQ: Dataset types and features.
2. DO NOT – Load hundreds of millions of rows in Power BI
In general, when using MGDC, you can pull the resulting datasets into Power BI to create your own analytics. Power BI is an amazing tool that makes it easy to pull lots of data and do all sorts of aggregations and summarizations, showing them as reports, charts and dashboards. Power BI can easily read that data, which is basically a set of folders with JSON files inside.
However, if your tenant ends up outputting tens of millions of rows or more, it is possible that Power BI won’t be able to handle that large dataset. From my experience, Power BI Desktop running on a fast PC with 32GB of RAM can typically handle a few million rows of data. If you have more than that, which is common for the Files dataset, you will need to do some preparation work outside Power BI.
We will discuss a few of these options below.
3. DO – Create a summarized table in ADLS Gen2
If you have too many rows to load into Power BI, you could run a Synapse pipeline to do some data aggregation before pulling the data into Power BI.
For instance, you could use a Synapse data flow to summarize the Files Dataset by “Extension”, calculating the number of sites, files, and bytes for each file extension and pulling that summary into Power BI. This will require some additional work in Synapse, and you will have less flexibility on how to pivot the data once you are in Power BI.
Here are the steps to follow, using a summary by file extension as an example:
Go to the Azure portal and create a new Synapse instance.
In Synapse, create a new data flow. Make sure to enable “data flow debug”.
Add a source to your data flow, pointing to the existing ADLS Gen2 storage, using JSON as the format
Use an aggregate transformation to summarize the data as needed
In the aggregate settings, for the group by column, choose extension
In the aggregate settings, for the aggregates, choose these 3 columns:
SiteCount – countAllDistinct(SiteId)
FileCount – count()
TotalSizeGB – round(sum(SizeInBytes)/1024/1024/1024,2)
Use the “data preview” tab to make sure things are working as expected (see picture below).
Add a sink to write the summarized data back to a new JSON file in ADLS Gen2.
Trigger the pipeline to execute the data flow and summarize the data.
Pull the summarized data into Power BI.
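For smaller extracts, the same aggregation the data flow performs can be sketched in Python with pandas. This is an illustration only: the rows below are invented sample data, and the column names (Extension, SiteId, SizeInBytes) follow the Files dataset schema described above.

```python
import pandas as pd

# Invented sample rows shaped like the Files dataset
files = pd.DataFrame([
    {"Extension": "docx", "SiteId": "site-1", "SizeInBytes": 2_147_483_648},
    {"Extension": "docx", "SiteId": "site-2", "SizeInBytes": 1_073_741_824},
    {"Extension": "pptx", "SiteId": "site-1", "SizeInBytes": 3_221_225_472},
])

# Mirror the data flow aggregation: distinct sites, file count, total size in GB
summary = files.groupby("Extension").agg(
    SiteCount=("SiteId", "nunique"),
    FileCount=("SiteId", "size"),
    TotalSizeGB=("SizeInBytes", lambda s: round(s.sum() / 1024**3, 2)),
).reset_index()

print(summary)
```

The same group-by/aggregate shape carries over directly to the Synapse data flow settings in the steps above.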
4. DO – Create a summarized table using a Synapse Notebook with PySpark
You can use Azure Synapse Analytics with a notebook to summarize a JSON-file-based dataset in ADLS Gen2.
Here’s a step-by-step guide to help you get started:
Go to the Azure portal and create a new Synapse workspace if you don’t already have one.
Make sure the storage account is configured to allow access by the Synapse workspace. In the Access Control section of the storage account configuration, you can add a role assignment for the app that represents the Synapse workspace.
In Synapse Studio, create a new notebook.
Add some code to the notebook. Use PySpark to read the JSON file from ADLS Gen2, perform the necessary summarization operations (for example, group by a field and calculate the sum of another field) and write the summarized data back to ADLS Gen2.
Here’s an example:
from pyspark.sql import SparkSession
from pyspark.sql.functions import count, countDistinct, sum

spark = SparkSession.builder.appName("SummarizeJSON").getOrCreate()

# Read the file as text and parse each line as JSON
# (abfss:// paths use the dfs.core.windows.net endpoint)
input_json_path = "abfss://<container>@<account>.dfs.core.windows.net/<filespath>/<filename.json>"
rdd = spark.sparkContext.textFile(input_json_path)
df = spark.read.json(rdd)

# Group by Extension and aggregate
summary_df = df.groupBy("Extension").agg(
    count("*").alias("FileCount"),
    countDistinct("SiteId").alias("SiteCount"),
    sum("SizeInBytes").alias("TotalBytes")
)

# Write the output
output_json_path = "abfss://<container>@<account>.dfs.core.windows.net/<extensionspath>"
summary_df.write.mode("overwrite").json(output_json_path)
Execute the cell in your notebook to perform the summarization.
Use the new summarized data when loading into Power BI.
5. DO – Load the data into a SQL Server database
For large datasets, you might also want to move the entire dataset from the folder with JSON files into tables in a SQL Server. If your dataset is larger than 100GB, this could become expensive, and you would need to consider using indexes to help with your query. Columnstore indexes might be particularly useful for analytical queries that end up reading the entire table.
In Azure Synapse, you can use a “Copy Data” task where the source is your Azure Data Lake Storage Gen2 (ADLSGen2) and the destination (called the sink) is a table in SQL. You could also use the “Data Flow” task shown previously to transform the data and sink to SQL.
Moving to SQL will typically also require you to flatten the dataset, projecting nested objects. That means that objects inside objects must be represented as a flat list of properties. For instance, instead of having an “Author” object with two properties inside (“Email” and “Name”), you get two columns (“Author.Email” and “Author.Name”). In the Files Dataset, you will need to flatten the “Sensitivity Label Info”, the “Author” and the “Modified By” columns.
Note that you must first land the Microsoft Graph Data Connect dataset in an Azure Storage account before you can transform it and/or move it to SQL Server.
After the data is available in SQL Server, use the Power BI option to get data using a SQL query. Here is an example of a SQL query to summarize the data in the Files table by extension:
SELECT
Extension,
COUNT(*) AS FileCount,
COUNT(DISTINCT SiteId) AS SiteCount,
SUM(SizeInBytes) AS TotalBytes
FROM Files
GROUP BY Extension
If performance is more important than absolute accuracy, you might want to help SQL by using approximate distinct counts. This delivers faster results and guarantees up to a 2% error rate within a 97% probability. Here is an example:
SELECT
Extension,
COUNT(*) AS FileCount,
APPROX_COUNT_DISTINCT(SiteId) AS SiteCount,
SUM(SizeInBytes) AS TotalBytes
FROM Files
GROUP BY Extension
6. DO – Filter data
Instead of reducing the size of the data by summarizing, you can also filter the data in the Files dataset. That could be done by filtering the dataset for a specific site or possibly looking only at files with a specific author. You can use any of the methods described here (Synapse data flow, Synapse notebook or SQL Server query) to perform this filtering.
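As a small sketch of this filtering approach, using pandas for brevity (the site IDs, file names, and author names below are made up):

```python
import pandas as pd

# Made-up rows from a Files dataset extract
files = pd.DataFrame([
    {"SiteId": "site-1", "FileName": "a.docx", "Author.Name": "Alice"},
    {"SiteId": "site-2", "FileName": "b.docx", "Author.Name": "Bob"},
    {"SiteId": "site-1", "FileName": "c.pptx", "Author.Name": "Bob"},
])

# Keep only rows for one site, then narrow further to one author
one_site = files[files["SiteId"] == "site-1"]
one_author = one_site[one_site["Author.Name"] == "Bob"]
print(len(one_site), len(one_author))
```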
7. DO – Join the Files dataset with the Sites dataset
It might also be useful to join the Files dataset with the Sites dataset, so you can do specific aggregations or filtering. For instance, you could look into how the Files are distributed across the different types of SharePoint sites, using the Template or Template Id.
Here is an example using a SQL query:
SELECT
S.RootWebTemplate AS Template,
COUNT(*) AS FileCount,
COUNT(DISTINCT F.SiteId) AS SiteCount,
SUM(F.SizeInBytes) AS TotalBytes
FROM Files AS F
LEFT OUTER JOIN Sites AS S ON F.SiteId = S.Id
GROUP BY S.RootWebTemplate
Here is an example where we first calculate a summary of files per site, then do the join. This eliminates the need to use a COUNT DISTINCT:
SELECT
S.RootWebTemplate AS Template,
COUNT(*) AS TotalSites,
SUM(G.FilesPerSite) AS TotalFiles,
SUM(G.BytesPerSite) AS TotalBytes
FROM
(SELECT
F.SiteId,
COUNT(*) AS FilesPerSite,
SUM(F.SizeInBytes) AS BytesPerSite
FROM Files AS F
GROUP BY F.SiteId) AS G
LEFT OUTER JOIN Sites AS S ON G.SiteId = S.Id
GROUP BY S.RootWebTemplate
8. DO NOT – Join Files with Permissions on ItemId
You should be very careful when you attempt to join the Files dataset with the Permissions dataset. These are typically huge datasets and the way they should be joined is a bit complicated. You definitely do not want to join them by Item Id, since not every permission has an ItemId (it could be a permission on a Site, Library or Folder) and not every file has an associated permission (again, it could be a permission declared further up in the hierarchy).
If you must find the permissions for a given ItemId, the correct way to do it is by ScopeId. I suggest that you first filter the Files dataset for a specific ItemId and then join that with the Permissions dataset using the ScopeId. Note that a single item might have multiple permissions (with different roles, for instance) and these permissions might be granted for different item types.
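The filter-then-join pattern can also be sketched with a pandas merge (the IDs and role names below are invented, not real dataset values):

```python
import pandas as pd

# Invented Files and Permissions rows keyed by SiteId + ScopeId
files = pd.DataFrame([
    {"SiteId": "s1", "ItemId": "i1", "ScopeId": "scope-a", "FileName": "a.docx"},
    {"SiteId": "s1", "ItemId": "i2", "ScopeId": "scope-b", "FileName": "b.docx"},
])
permissions = pd.DataFrame([
    {"SiteId": "s1", "ScopeId": "scope-a", "RoleDefinition": "Read"},
    {"SiteId": "s1", "ScopeId": "scope-a", "RoleDefinition": "Contribute"},
])

# Filter for one item first, then join on SiteId + ScopeId (never on ItemId)
one_file = files[(files["SiteId"] == "s1") & (files["ItemId"] == "i1")]
matched = one_file.merge(permissions, on=["SiteId", "ScopeId"], how="left")
print(matched["RoleDefinition"].tolist())
```

Note that one item can match several permission rows, one per role or sharing link.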
Here is an example of a SQL query that shows the permissions for a given file, identified by a SiteId and an ItemId. It is very important to filter the output, otherwise the query might return billions of rows and take a very long time to process.
SELECT
F.SiteId,
F.ItemId,
F.ScopeId,
F.SiteUrl,
F.DirName,
F.FileName,
F.Extension,
P.ItemType,
P.ItemURL,
P.RoleDefinition,
P.LinkId,
P.LinkScope
FROM Files AS F
LEFT OUTER JOIN Permissions AS P
ON F.SiteId = P.SiteId AND F.ScopeId = P.ScopeId
WHERE F.SiteId = 'FCBDFC28-9335-4666-A852-6B1C1E7EC165'
AND F.ItemId = '647DCA3A-A3B8-4DBA-B1E8-6000389E696A'
To understand more about permission scopes, see MGDC for SharePoint FAQ: What is in the Permissions dataset?
9. Conclusion
I hope this clarifies what you can do with the Files dataset in MGDC for SharePoint. Let us know in the comments if you have other suggestions on how to get more out of the Files dataset.
For further details about the schema of all SharePoint datasets in MGDC, including SharePoint Sites and SharePoint File Actions, see https://aka.ms/SharePointDatasets.
Msgbox if all text boxes are blank
Good day,
Is there a way I can generate a msgbox when a specific button is clicked if all text boxes in a userform have been left blank? I know I can create code for each box individually but I’m thinking there must be a way to combine everything together and generate the msgbox.
Thanks in advance for any assistance!!!
Changing table border color
Hello everyone,
Is there a way to change the color of the table borders? Maybe by adding CSS styles to the page? I’m not sure if it’s possible, and if so, I don’t know how to do it either.
Thank you in advance,
Color Code SharePoint List Column Cells based on Text in the Column
I have a Teams / SharePoint list for tracking projects. I have a Choice column which includes the text “Red”, “Amber” and “Green”. When the user selects one, I want to set the background color of the cell accordingly. I would like to use these specific colors, if possible.
Red (#D20303)
Amber(#cbb100)
Green(#12ae00)
Amber and Green are working, Red is not. I tried many different codes and nothing works. Here is my code …
{
  "elmType": "div",
  "txtContent": "@currentField",
  "style": {
    "color": "#fff",
    "padding-left": "14px",
    "background-color": {
      "operator": "?",
      "operands": [
        {
          "operator": "==",
          "operands": [
            "@currentField",
            "Red"
          ]
        },
        "#D20303",
        {
          "operator": "?",
          "operands": [
            {
              "operator": "==",
              "operands": [
                "@currentField",
                "Green"
              ]
            },
            "#12ae00",
            {
              "operator": "?",
              "operands": [
                {
                  "operator": "==",
                  "operands": [
                    "@currentField",
                    "Amber"
                  ]
                },
                "#cbb100",
                {
                  "operator": "?",
                  "operands": [
                    {
                      "operator": "==",
                      "operands": [
                        "@currentField",
                        "Yellow"
                      ]
                    },
                    "",
                    ""
                  ]
                }
              ]
            }
          ]
        }
      ]
    }
  }
}
What am I missing?
Thanks,
John
Monitoring business continuity using Azure Advisor and Azure Monitor in MySQL – Flexible Server
In the era of digital transformation, businesses cannot afford downtime or data loss. High availability is crucial for maintaining business continuity and ensuring that applications are always accessible to users. Azure Database for MySQL – Flexible Server is designed with high availability (HA) as a fundamental feature, providing robust solutions to keep your databases operational even in the face of unexpected failures, but how do you ensure you’re getting the most out of it? This blog post delves into some of the effective monitoring and alerting mechanisms that are crucial for maintaining high availability.
High Availability in Azure Database for MySQL – Flexible Server
High Availability (HA) in Azure Database for MySQL – Flexible Server is designed to ensure uninterrupted database operation, even in the event of hardware or infrastructure failures in the availability zone. When high availability is configured, the flexible server automatically provisions and manages a standby replica in an alternate zone. The data and log files are hosted in zone-redundant storage (ZRS). The standby server continuously reads and replays the log files from the primary server’s storage account, which is protected by storage-level replication. If there’s a failover:
The standby replica is activated.
The binary log files of the primary server continue to be applied to the standby server to bring it up to the last committed transaction on the primary.
With a zone-redundant HA architecture, the guaranteed SLA is 99.99% uptime, and failover time is typically between 60 and 120 seconds.
Factors that impact High Availability
Lack of primary keys on tables – Azure Database for MySQL – Flexible Server allows configuring high availability with automatic failover. If a failure is detected on the primary, the standby replica in an HA configuration takes over the role of the primary to ensure business continuity. The failover time is typically between 60 and 120 seconds, but in the absence of primary keys the failover time may be adversely affected. Azure Database for MySQL – Flexible Server leverages logical MySQL replication to replicate changes to the standby server. The absence of primary keys increases the overhead on the replication process, potentially causing replication lag or delays in applying changes on the standby server, thereby increasing the failover time or sometimes even impairing HA functionality altogether.
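Since missing primary keys directly affect replication and failover, it can help to identify such tables up front. The following query is a generic sketch against MySQL’s information_schema (not an Azure-specific tool) that lists user tables without a primary key:

```sql
-- List base tables that have no PRIMARY KEY constraint,
-- excluding MySQL's built-in system schemas.
SELECT t.table_schema, t.table_name
FROM information_schema.tables AS t
LEFT JOIN information_schema.table_constraints AS c
       ON  c.table_schema    = t.table_schema
       AND c.table_name      = t.table_name
       AND c.constraint_type = 'PRIMARY KEY'
WHERE t.table_type = 'BASE TABLE'
  AND c.constraint_name IS NULL
  AND t.table_schema NOT IN
      ('mysql', 'information_schema', 'performance_schema', 'sys');
```

Adding a primary key (or at minimum a non-null unique key) to each table this returns helps keep replication to the standby efficient and failover times predictable.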
Storage IOPS – Storage IOPS have a significant impact on failover time in high availability mode in Azure Database for MySQL – Flexible Server. High IOPS contribute to faster data synchronization between the primary and standby servers, reduce transaction processing time, and ensure that the system can handle the increased I/O load during failover. If you have pre-provisioned IOPS and usage reaches the provisioned threshold, replication lag and failover time can be adversely affected. It is therefore recommended to switch to the autoscale IOPS configuration for an optimized failover time in an HA configuration. You can learn more about autoscale IOPS here.
What is Azure Advisor?
Azure Advisor is a free, personalized cloud consultant service provided by Microsoft Azure. It analyses your Azure resources and usage telemetry to provide recommendations that help you improve the cost, security, performance, reliability, and operational efficiency of your cloud environment. By leveraging these insights, you can make data-driven decisions to enhance your Azure Database for MySQL – Flexible Server.
Leveraging Azure Advisor Alerts for High Availability
Azure Advisor monitors your MySQL – Flexible Server configurations and usage patterns. It identifies potential risks and provides alerts to help you address them before they impact your database availability.
You can access Advisor via the Azure portal by navigating to Settings > Advisor recommendations.
Actionable Recommendations
Each alert from Azure Advisor comes with detailed recommendations on how to resolve the identified issues. These recommendations are customized to your specific environment, ensuring that they are relevant and actionable. For high availability, recommendations might include an Advisor highlighting:
The lack of a primary key on a table.
High IO usage.
An example of an active advisor notification appearing in Azure portal is shown in the following screenshot:
Each advisor notification comes with a recommended action for the end user. Acting on these recommendations ensures a healthy HA configuration.
Automated Alerts and Notifications
Advisor recommendations can go unnoticed, meaning no prompt action is taken on them. Therefore, Azure Advisor also integrates with Azure Monitor to provide automated alerts and notifications. You can set up alert rules to receive notifications via email, SMS, or other channels whenever an issue is detected and an Azure Advisor notification is triggered. This ensures you are promptly informed of any potential problems affecting high availability, allowing for quick action. To configure alerts for new Advisor recommendations on your Azure resources, follow the detailed steps in the article Create Azure Advisor alerts on new recommendations by using the Azure portal.
A sample alert generated appears in the following screenshot:
Note – You can set up alerts to be notified when you have a new Advisor recommendation on one of your resources. These alerts can notify you through email or text message. They can also be used to integrate with your existing systems through a webhook.
Conclusion
Monitoring HA health in Azure Database for MySQL – Flexible Server is vital for ensuring continuous availability and enhancing disaster recovery preparedness. By using Azure Advisor and Azure Monitor, you can detect potential issues with your HA configuration before they cause disruption, which ultimately contributes to a more reliable and resilient database environment.
If you have any questions or tips on using Azure Advisor with Azure Database for MySQL – Flexible Server, please share them in the Comments below. Happy monitoring!
Microsoft Tech Community – Latest Blogs –Read More
Removal of Default Tabs – “Staff Notebook” & “Reflect”
I have a few users who would like to toggle these tabs off or hide them. I have not been able to find a way to do this. If I right-click on the tabs, “App Permission” is the only option. Is there a way to perform this function via Powershell or O365 Admin Center?
See screenshot
New Outlook app not sending email notifications on Windows 11
Hi everyone,
I’m having trouble with the New Outlook app on my Windows 11 machine. Despite trying to reinstall and reset the app, it’s still not sending me notifications for new emails.
The strange thing is that I can’t find the New Outlook app listed in the Windows 11 Settings > System > Notifications. I suspect this might be the root cause of the issue, as the app seems to be completely missing from the system’s notification settings.
Does anyone else have this problem? Any suggestions on how to fix this would be greatly appreciated.
Thanks!
Convert MPP into Excel file with transposition of every stage of sub-tasks in successive columns
Dear All,
I hope you are doing well; it’s my first time ever in the Microsoft forum.
As explained in the subject title, I want to know if it’s possible to convert an MPP file into an Excel file with all its sub-tasks in different columns. Meaning that:
– all tasks at the same level are in the same column
– missing sub-task levels are represented by empty cells
For example :
MPP view :
Excel view would be then :
Or even better like that (file ready to share :-)) :
I would really appreciate your help
Many thanks in advance
Kind regards
Lesson plan for the 6th-grade primary school Language class according to the New Curriculum
Hello! Can you create a lesson plan for the 6th-grade primary school Language class according to the New Curriculum?
Switch from GMAIL to O365 tracking training?
Good Afternoon,
In the near future we will be switching from Gmail to O365. As with any company, you can send people a training video and they can say that they completed it when in reality they did not.
Is there a way, using something like https://support.microsoft.com/en-us/training, to send people training videos and then get a notification of whether or not they watched them?
I would like to get out as much information as possible before the switch-over, but I would also like to know whether people are actually trying to learn.
Thanks.
Azure SQL free database: Azure Data Studio, PowerApps, GitHub copilot, Power BI, and Generative AI
This quick refresher covers the Azure SQL Database free offer and its integration with Azure Data Studio, Power Apps, Power BI, and Generative AI. The free offer provides a set amount of compute and storage for one serverless database per Azure subscription, perfect for both new and existing customers looking to develop for free and create proof-of-concept projects with Azure SQL Database. The offer renews monthly without expiration. Customers can upgrade to a continue usage mode to unlock additional capabilities and scale their database without losing data or settings.
Resources:
New Azure SQL Database free offer
View/share our latest episodes on Microsoft Learn and YouTube!
JDBC Driver 12.8 for SQL Server Released
Version 12.8 of the Microsoft JDBC Driver for SQL Server has been released. Version 12.8.0 brings several added features, changes, and fixed issues over the previous production release.
Added
Java 22 support
Credential caching when using Managed Identity Credential or Default Azure Credential
Caching of the SQLServerBulkCopy object when using bulkcopy for batch insert
Connection level caching for destination column metadata in bulkcopy
SQL Server message handler and support for SQLException chaining
Full support for RFC4180 for CSV bulk insert operations
Direct construction of a microsoft.sql.DateTimeOffset instance from a java.time.OffsetDateTime value
Changed
Enum SQLServerSortOrder is now public
Removed synchronized from Socket overrides
Revised previous RMERR/RMFAIL changes by making the default RMFAIL
Updated dependencies
Enhanced support for TDSType.GUID
Fixed issues
19 bug fixes detailed in the release notes
Getting the latest release
The latest bits are available to download from Microsoft, from the GitHub repository, and via Maven Central.
Add the JDBC 12.8 RTW driver to your Maven project by adding the following code to your POM file to include it as a dependency in your project (choose .jre8 for Java 8/1.8 or .jre11 for Java 11 and up).
<dependency>
<groupId>com.microsoft.sqlserver</groupId>
<artifactId>mssql-jdbc</artifactId>
<version>12.8.0.jre11</version>
</dependency>
Help us improve the JDBC Driver by filing issues on GitHub or contributing to the project.