Category: Microsoft
Microsoft Forms – “Anyone can respond” not working anymore
Hello,
I have a Microsoft form set to “Anyone can respond” that I shared with external members of my organization, and the link was perfectly functional until now for anyone having the link.
Since the end of the week of 07/24/19, this form is no longer accessible without logging in (when testing the link in private browsing, I am redirected to the login page), and it is also not accessible while logged in to a Microsoft account external to my organization.
Has anyone else encountered this problem or is it a residual impact of the global bug related to the latest Microsoft update? Do you know how to report this bug if necessary? Thanks in advance.
Windows 11 – missing “View online” on the right click view of folder or file
Just upgraded my system to Windows 11. OneDrive is running well. It’s strange that the right-click menu of a file or folder is missing the “View online” option, but it has the “Always keep on this device” option. How can I get this solved and show “View online” on right-click?
When using Windows 10, the “View online” option was there in the right-click menu.
Checklist: How companies use Microsoft 365 securely
Microsoft 365 provides companies with numerous protective measures that ensure a high level of security for all applications and data. However, a prerequisite for the secure use of Microsoft 365 is that the security functions are actually licensed, activated, used, and actively managed. In addition, all users and administrators must be informed about the risks and trained in secure use. Our checklist provides information on the security aspects you should consider when using Microsoft 365.
I wrote a Blog Post where you can read the most important things about this Topic.
https://www.msb365.blog/?p=5709
Need some KQL for DNS
I need a few KQL queries for the use cases below; the tables are _Im_Dns and ASimDnsActivityLogs.
Monitor DNS for Brand Abuse – This search looks for DNS requests for faux domains similar to the domains that you want to have monitored for abuse.
DNS Query Length with high standard deviation –
The following analytic identifies DNS queries with unusually large lengths by computing the standard deviation of query lengths and filtering those exceeding twice the standard deviation. It leverages DNS query data from the Network_Resolution data model, focusing on the length of the domain names being resolved. This activity is significant as unusually long DNS queries can indicate data exfiltration or command-and-control communication attempts. If confirmed malicious, this activity could allow attackers to stealthily transfer data or maintain persistent communication channels within the network.
Detect Long DNS TXT Record Response –
This search is used to detect attempts to use DNS tunneling, by calculating the length of responses to DNS TXT queries. Endpoints using DNS as a method of transmission for data exfiltration, Command And Control, or evasion of security controls can often be detected by noting unusually large volumes of DNS traffic. Deprecated because this detection should focus on DNS queries instead of DNS responses.
Large Volume of DNS ANY Queries –
The following analytic identifies a large volume of DNS ANY queries, which may indicate a DNS amplification attack. It leverages the Network_Resolution data model to count DNS queries of type “ANY” directed to specific destinations. This activity is significant because DNS amplification attacks can overwhelm network resources, leading to Denial of Service (DoS) conditions. If confirmed malicious, this activity could disrupt services, degrade network performance, and potentially be part of a larger Distributed Denial of Service (DDoS) attack, impacting the availability of critical infrastructure.
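For illustration, here is roughly the shape of query I have in mind for the second use case (a sketch only: field names assume the ASIM DNS schema, and the lookback window and two-sigma threshold are values to tune):

```kql
// Sketch: flag DNS queries longer than average + 2 * standard deviation
let lookback = 1d;
let threshold = toscalar(
    _Im_Dns(starttime=ago(lookback))
    | extend QueryLength = strlen(DnsQuery)
    | summarize AvgLen = avg(QueryLength), StdevLen = stdev(QueryLength)
    | project AvgLen + 2 * StdevLen);
_Im_Dns(starttime=ago(lookback))
| extend QueryLength = strlen(DnsQuery)
| where QueryLength > threshold
| project TimeGenerated, SrcIpAddr, DnsQuery, QueryLength
```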
Cannot remove custom domain
Hi all.
Hoping for some advice from the community. I’ve been waiting for MS support for over a month, and it seems with each new technician we talk with, we just go through the same actions, which have been for the most part nothing different than what we had tried prior to raising the support ticket. Escalation options seem to have been exhausted. So, you could probably appreciate that I’m extremely disappointed with the support service so far.
The issue is that we cannot remove a domain from a tenant. The domain is shown as meeting the requirements for removal. We receive a generic error when we attempt to remove the domain from the console or via MSOnline (PowerShell). When we attempt to remove the domain from Azure AD (PowerShell) using the Remove-AzureADDomain command, we get an error that states “Details: PropertyName – supportedServices, PropertyErrorCode – DomainSupportedServiceUnsetNotAllowed”. The domain supported services are “SupportedServices : {Email, OfficeCommunicationsOnline, MoeraDomain}”.
History: When the domain was initially added, it became stuck in a loop between “Setup Incomplete” and “Domain setup is complete”. None of the KB or forum articles resolved this loop. In the end I added the supported services manually using the documented Set-AzureAdDomain command. The loop was resolved, and the domain showed as Verified. However, users were unable to receive email and we were only able to add TXT DNS records.
Our goal is to scrap the original tenant due to the uncertainty of domain tenant health and remove the domain from the tenant to be used in a new tenant, however we are unable to do so as per my earlier documented error. I’ve attempted to unset the variable trying a multitude of “bracketed null” variations to no avail.
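For reference, the unset attempts looked roughly like the following (contoso.com stands in for our actual domain):

```powershell
# Placeholder domain name; every variant returns DomainSupportedServiceUnsetNotAllowed
Set-AzureADDomain -Name contoso.com -SupportedServices @()
Set-AzureADDomain -Name contoso.com -SupportedServices $null
Remove-AzureADDomain -Name contoso.com
```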
Any insight into a resolution would be appreciated.
How to take screenshot on Windows 11 with advanced editing feature?
I’m currently in search of a powerful screenshot tool for Windows 11 that offers advanced editing features. I’ve tried a few basic tools like the Snipping Tool and Snip & Sketch, but they don’t quite meet my needs. I require something with more robust editing capabilities such as annotations, blurring sensitive information, adding shapes, text, and other advanced editing options.
I’m open to free and paid options if it significantly improves my productivity. I would also appreciate it if the tool supports various file formats and provides easy sharing options. Anyone has experience with a tool that fits this description or can suggest one that might work for my needs, please share your insights.
Thanks
“Add required attendees” field not showing anymore
Hi,
When scheduling a meeting in Teams (free usage with a Gmail account), the “Add required attendees” field has been missing since last week. This issue had not occurred during the past year or so.
Already did a reinstall of the Teams app on the respective Windows PC and also tried it on an iPad, but ran into the same problem. It also does not make a difference whether the classic or new version of Teams is used. Maybe it has to do with expiration of the free usage period, although I could not find anything about the existence of such a period.
Your help would be much appreciated.
How can I find tasks where I forgot to add a due date? (Windows App)
Hello,
in the view “Planned” I can see all my tasks by date. Good.
But what if I forgot to add a due date to some tasks? In the “web version” All – To Do (office.com) I can see the due date in a column, so it is easy to find. In the App, this column is missing 🙁
Kind regards
Jens
How to use 9 new features in SharePoint [Co-Authoring Pages, Web Parts Enhancements]
In this video tutorial, explore the latest features in #SharePoint! Authors can now view real-time changes in Pages, sections, and web parts. Users can apply shapes over images and enhance text web parts with new styles and font size control. Additionally, add curated image backgrounds and customize page sections with banners. Excitingly, set link expirations and enable approvals workflows in any list in a few clicks.
#MicrosoftSharePoint #Microsoft365 #Productivity #NewFeatures #MVPbuzz
Build Powerful RAG Apps Without Code Using LangFlow and Azure OpenAI
Ok. So you want to supplement your LLM chat application with your own knowledge base, but you could not be bothered with the cumbersome code development that might be involved. Or you are more visually oriented and would like to make sense of your application workflow.
LangFlow is a drag-and-drop framework that helps you build fully customizable GenAI applications. You can assemble several components with a few clicks to create the exact Retrieval Augmented Generation application you envision, powered by your data source. This means you can access a more reliable, easy-to-build, GenAI model built to address your unique business needs.
In this tutorial, we will use some of the available building components in LangFlow to make an application that provides food recommendations based on the US dietary guidelines. This project builds upon our previous work, where we integrated Azure OpenAI and Document Intelligence to scan food products and get more insights to guide our nutrition.
This project requires that you have:
Access to Azure OpenAI
Python 3.10
Creating a LangFlow account
To start LangFlow, ensure you are working in an environment with Python 3.10. Then go to your terminal and run the following (I promise, you will not have to write any code beyond this):
python -m pip install langflow -U
python -m langflow run
If everything runs successfully, you should get something similar to the image below. Click on the endpoint. This should direct you to your LangFlow account. Proceed to create a new project.
Creating the workflow
Deploying an Azure OpenAI model
Now, open another page, log onto your Azure portal, and create an Azure OpenAI resource (fill out this registration form if you don’t already have access):
Subscription: Select your active subscription
Resource group: Select an existing resource group or create a new one
Name: Name your resource.
Region: Select any region from the available list of regions.
Pricing: Select Standard S0
After deploying the resource, click on “Go to Azure OpenAI Studio” on the top pane
Scroll down on the left pane and click on the “Deployments” page
Click on “create new deployment” next
Select GPT4o in the list of models
Assign a name to the deployment (note the deployment name)
Reduce the token rate limit to 7K and then “Create”
Once that has been successfully deployed, go back to the Azure portal. Click on the Azure OpenAI resource you just deployed and copy the endpoint and keys.
Building your application pipeline
Go back to your LangFlow page. Note the panes on the right; this is where you will find your necessary building components. Click on the inputs dropdown menu and drag the chat input pane onto the canvas. Do the same with the outputs menu and select ‘chat output’. Go to the models pane, then find and select ‘Azure OpenAI’.
Once that is done, connect the components as seen in the image above:
Paste the following into the template field in the ‘Prompt’ pane:
You are an AI assistant that helps users resolve their question
Question: {question}
Connect the input node in ‘Chat Input’ to the ‘Prompt’ question node
Connect the prompt message in ‘Chat Input’ to the text node in ‘Azure OpenAI’.
Fill in the copied credentials.
Now let’s test it. Click on the play button in the top-right corner on the Chat Output pane. Click on ‘playground’ in the bottom right of the screen and ask any question. It should return a response successfully:
Creating a vector database and populating it with vector embeddings
Keeping to the objective of our application, we need a database to store, retrieve, and query our data on dietary guidelines. We need to be able to search this database and retrieve information closely related to our query. To do this, we will use Azure OpenAI embeddings to create the vectors that represent this relationship.
First, sign up on Astra DB. Once that is done, you should get an interface like the image below. If not, toggle the drop-down in the red box and select Astra DB. Then select ‘create database’ on the right in the yellow box. Name the database and select Azure as the provider (please note, this costs a minimal fee to use), select us-east-2 as the region, and proceed to create the database. Once created, note the database details on the right of the create database page.
Now go back to your LangFlow page. Save the current flow and then create a new project. On this page populate the screen with the following components:
‘File’
‘Recursive Character Text splitting’
‘Azure OpenAI Embeddings’
‘Astra DB’
You should have something similar to the image below:
Return to Azure OpenAI Studio and follow the steps for creating and deploying a vector embedding model (select text-embedding-ada-002 as the model type) as stated earlier. Now go back to the LangFlow pane and do the following:
File component:
Attach the US dietary guideline PDF.
Connect the output to the input node in the recursive character text split pane
Recursive split:
Leave as default
Connect to Astra DB ‘Ingest Data’ node
Azure OpenAI Embeddings:
Fill in the endpoint and API keys as you did with the GPT4o model
Fill in the name of the deployed model
Connect the embeddings model to Astra DB
Astra DB:
Return to the Astra DB page and copy the API endpoint and token from the database details
Name the collection
Fill the parameters
Press the play button on the top right and then give it a few minutes to complete
The process above will create a collection of the vector embeddings in the database and might take a few minutes. Return to the Astra DB database page; you should see that the collection has been created, and you can proceed to inspect it.
Creating the final workflow
Next, open the previously saved flow and connect the Astra DB pane to the existing flow:
Fill in the Astra DB details the same as before
Create a new thread that connects chat input data to Astra DB
Drag ‘Parse Data’ onto the pane and connect Astra DB ‘search result’ node to ‘data’
Copy the template below and paste into the template field of the ‘prompt’ pane:
You are an AI assistant that recommends healthy meals based on the {guidelines} provided
question: {question}
context: {guidelines}
Connect the text node in ‘Parse Data’ to the ‘guidelines’ node in templates. If all instructions are followed correctly, it should look something like this:
Now we have completed the pipeline and are ready to test.
Click on ‘playground’ and test out your app!
Bonus:
You can export this flow and integrate it into your app. LangFlow provides several API codes. Below, we generate a chat widget by simply copying the API code:
Click on API in the bottom left of the screen next to ‘Playground’ in LangFlow. Click on the Chat widget HTML and copy the code.
Create an HTML file in VS Code. Paste the code. Right-click and select ‘Open with Live Server’.
This will direct you to a page with the chat widget.
References and further reading:
1. AstraDB On Azure
Microsoft Tech Community – Latest Blogs
Why? and How to Ground a Large Language Models using your Data? (RAG)
Introduction
Large language models (LLMs) can perform many different tasks with text, audio, images, and even videos, allowing them to be multimodal. With these many capabilities, there is a fear that LLMs might not always follow the goal that you created them for, which might be to do a specific task related to your business. For example, respond to customers’ inquiries or assist the company’s employees in finding answers to their questions. Modifying the system message (meta prompt) with prompt engineering to achieve these specific tasks will still leave you with nondeterministic responses generated randomly. To mitigate this, you can use retrieval-augmented generation (RAG), which we will explain in this blog.
Why Use RAG?
The top reasons why you’d consider using the RAG technique are:
Provide Grounding and Context for the Large Language Model:
Overcome the outdated training data limitation.
Lower starting cost compared to other solutions like fine-tuning.
Supercharge data retrieval with a powerful generative model.
What is RAG?
RAG is intelligently retrieving a subset of data from data stores to provide specific, contextual knowledge to the large language model to support how it answers a user’s prompt (question or query).
Intelligent retrieval is crucial, as the model will only respond based on the retrieved data; if the data is bad, the model will give non-relevant responses.
You might often see RAG used with vector databases, but using this technique is not limited to that; you can also use RAG with your pre-existing SQL or No-SQL database. Just pass the query response to a large language model, and it will rephrase it into text-based responses that closely resemble human language and structure.
Why are Vector Databases commonly used with RAG?
A user question is in natural language that has some context and semantics for what they are talking about. Applying a full-text search to a user query strips it of everything and fully or partially matches the text in the question to text from your database. That does not allow for using synonyms in the search or semantically matching what the user means with database records; it has to be the exact word from the database to pull the right records. Here comes the vector database to provide that missing component.
Vector databases store vector embeddings that hold the semantic meaning of words. Vector embeddings consist of a vector (or array) of numbers that represent real-world words.
(Artwork by: sfoteini)
You can use any embedding model (like text-embedding-ada-002) to generate this array of numbers, but note that each embedding model generates vector embeddings of different lengths, so you need to make sure that all of your embeddings are of the same length and that you have configured that in your vector database too.
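As a simple guard, you can check dimension consistency before inserting embeddings into your vector database. A minimal sketch (the function is my own illustration; 1536 is the vector length of text-embedding-ada-002, and other models differ):

```python
def validate_dimensions(embeddings, expected_dim=1536):
    """Raise if any embedding's length differs from the dimension the vector DB expects."""
    for i, vec in enumerate(embeddings):
        if len(vec) != expected_dim:
            raise ValueError(
                f"Embedding {i} has {len(vec)} dimensions, expected {expected_dim}"
            )

# Toy vectors standing in for real model output
validate_dimensions([[0.1] * 1536, [0.2] * 1536])  # passes silently
```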
How to do RAG?
There are two ways to do RAG:
– With a preexisting SQL or No-SQL database.
– With Vector database.
Preexisting database:
– Create a new Azure OpenAI Chat deployment. (see guide for creating a ChatGPT deployment here)
– Get the Model API Key and Endpoint to integrate it into your application.
– Feed the returned records from your database to the model.
– Return the model’s response to the user.
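The “feed the returned records to the model” step amounts to packing the records into the chat prompt as grounding context. A minimal sketch (the function name and system-message wording are my own, not from any SDK):

```python
def build_rag_messages(question, records):
    """Pack database records into a chat-completion message list as grounding context."""
    context = "\n".join(str(r) for r in records)
    return [
        {
            "role": "system",
            "content": "Answer the user's question using only the context below.\n"
                       f"Context:\n{context}",
        },
        {"role": "user", "content": question},
    ]

# Toy records standing in for rows returned by your SQL or No-SQL query
messages = build_rag_messages(
    "Which orders shipped late?",
    [{"order": 1001, "shipped": "2024-05-02", "due": "2024-05-01"}],
)
```

The resulting message list is what you would pass to your chat deployment; the model’s text response is what you return to the user.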
Vector database:
– Choose a vector database. (Currently, not all databases support storing vectors so you might need to migrate to a database that supports storing vectors and vector operations.)
– Create a new Azure OpenAI Embeddings deployment. (see guide for creating an embedding deployment here)
– Get the Model API Key and Endpoint to use it
– Choose the columns/keys that you want to convert to vector embeddings.
– Make API calls to generate your embeddings and store them in your database.
– Create a new Azure OpenAI Chat deployment. (see guide for creating a ChatGPT deployment here)
– Get the Model API Key and Endpoint to integrate it into your application.
– When a user asks a question, convert it to vector embeddings and compare the similarity of the embeddings with what you have in the database (this can now compare relationships, patterns, and meaning of words)
– Feed the returned records from your database to the model.
– Return the model’s response to the user.
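The similarity comparison in the vector flow is commonly cosine similarity. A self-contained sketch with toy three-dimensional vectors (real embeddings would be, for example, 1536-dimensional vectors from your deployed embedding model):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, store, k=2):
    """Return the k records whose embeddings are most similar to the query."""
    return sorted(
        store, key=lambda rec: cosine_similarity(query_vec, rec["vec"]), reverse=True
    )[:k]

# Toy 3-d "embeddings"; a real store would hold model-generated vectors per record
store = [
    {"text": "pasta recipes", "vec": [0.9, 0.1, 0.0]},
    {"text": "car repair", "vec": [0.0, 0.2, 0.9]},
    {"text": "italian food", "vec": [0.8, 0.3, 0.1]},
]
results = top_k([1.0, 0.2, 0.0], store)  # the two food-related records rank highest
```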
This GitHub sample shows the difference between full-text, vector-only, and RAG search with Azure Cosmos DB for MongoDB vCore and LangChain, available at Azure-Samples/Cosmic-Food-RAG-app.
You can try it to test the three approaches and see what these techniques can add to your business.
Conclusion
RAG can streamline your recommendation system and take it to the next level with the power of human-like responses and semantic similarity, understanding what your user actually needs. You can start by adding a large language model layer between your database and the user to see the difference it makes before migrating your entire database to a vector database. It all depends on your use case and the nature of the data you have. Start by testing a small subset of your data first, then migrate if the results are looking good.
Further Reading
What is Azure OpenAI Service? – Azure AI services
Azure OpenAI Service embeddings – Azure OpenAI – embeddings and cosine similarity
Vector Search – Azure Cosmos DB for MongoDB vCore
Advanced Prompt engineering techniques
18 Lessons, Get Started Building with Generative AI
RAG and generative AI – Azure AI Search
RAG techniques: Cleaning user questions with an LLM
RAG techniques: Function calling for more structured retrieval
Found this useful? Share it with others and follow me to get updates on:
Twitter (twitter.com/john00isaac)
LinkedIn (linkedin.com/in/john0isaac)
Feel free to share your comments and/or inquiries in the comment section below.
See you in future blogs!
Azure Monitor: How To Get Alerts for Disconnected Arc Agents
Ciao Readers,
Just a week without writing. The moment for another blog post has arrived.
In this post, I am going to show you how to set up alerts for disconnected Arc agents using Azure Monitor. If you are not familiar with Azure Arc, it is a service that lets you manage and govern your hybrid cloud resources from a single pane of glass. More about it in the Azure Arc overview public documentation page.
One of the benefits of using Arc is that it allows you to collect data from your hybrid resources, so you can monitor their health and performance; Arc is a prerequisite for enabling Azure Monitor on them. With that in mind, why is it important to get an alert when a hybrid virtual machine gets disconnected, or the Arc agent status is reported as Offline? Ouch, you did not know they were offline!!!
There are several reasons, spanning management, compliance, and monitoring, why you need to be aware of whether your resources are communicating properly. Let me give you a few of them:
When a hybrid virtual machine is onboarded, every connection is authenticated using a Managed Identity created automatically during the onboarding process. This System Assigned Managed Identity is renewed automatically and expires if the system does not communicate for more than 60 days. Should this be the case, there is no way to reset the identity: you have to offboard and re-onboard the machine, together with all the installed extensions and configurations.
When the hybrid machine is disconnected, no monitoring data can be sent. This can lead to some really bad outcomes:
Customers go blind about infrastructure health
The machine will keep the unsent monitoring data in a local cache on the C drive, using up to 10 GB of disk space
Old cached data will be deleted, so monitoring data loss is expected
Machines with small disks can quickly and easily run out of disk space. Can you imagine that on a Domain Controller?
I just gave you two reasons, and I do not think you need any additional ones, right? I think you have grasped by now the importance of being alerted as soon as possible when an Arc agent gets disconnected. Yes, the sooner, the better.
Therefore, you will agree with me that it is necessary to create an alert. To achieve that goal, you can take advantage of the ability to create alerts with Azure Resource Graph and Log Analytics.
Let us have a look at the query to be used. The query should give you back one line per monitored server (any alert should give you actionable information and the affected resource is the first in the list) where the last status is reported as Disconnected.
A good query should return records for hybrid machines not connected for a given amount of time. The value in this case is your choice, but I would recommend something not too wide (15 minutes could be a good compromise).
Once you have a good record set, you should configure the alert rule to use the Table rows as Measure and the Count as aggregation type. The Aggregation granularity, which drives the date range the query will consider, could be set to 1 day.
The alert rule logic will be then configured to measure the number of rows returned by the query. The alert will fire if records (even a single one) are returned.
Assuming that your preference will be to get an alert where resources have not been connecting for the last 15 minutes, you create an alert that uses a query similar to the following one:
arg("").resources
| where type == "microsoft.hybridcompute/machines"
| where tostring(properties.status) == "Disconnected"
| extend lastContactedDate = todatetime(properties.lastStatusChange)
| where lastContactedDate <= ago(15m)
| extend status = tostring(properties.status)
| project id, Computer=name, status, lastContactedDate
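If you want to validate the record set programmatically before wiring up the alert rule, a minimal sketch along these lines could run the same query against Azure Resource Graph. This is my own illustration, not part of the official alert setup: it assumes the azure-identity and azure-mgmt-resourcegraph packages and a signed-in credential, and note that when querying Resource Graph directly the arg("") prefix is not needed.

```python
def disconnected_query(minutes: int = 15) -> str:
    """Build the KQL query for Arc machines disconnected longer than `minutes`."""
    return (
        'resources\n'
        '| where type == "microsoft.hybridcompute/machines"\n'
        '| where tostring(properties.status) == "Disconnected"\n'
        '| extend lastContactedDate = todatetime(properties.lastStatusChange)\n'
        f'| where lastContactedDate <= ago({minutes}m)\n'
        '| extend status = tostring(properties.status)\n'
        '| project id, Computer=name, status, lastContactedDate'
    )

def count_disconnected(subscription_id: str, minutes: int = 15) -> int:
    """Run the query via Azure Resource Graph; the alert condition would
    fire whenever this count is >= 1."""
    # Requires: pip install azure-identity azure-mgmt-resourcegraph
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.resourcegraph import ResourceGraphClient
    from azure.mgmt.resourcegraph.models import QueryRequest

    client = ResourceGraphClient(DefaultAzureCredential())
    result = client.resources(QueryRequest(
        subscriptions=[subscription_id],
        query=disconnected_query(minutes),
    ))
    return result.total_records
```

Running `count_disconnected("<your-subscription-id>")` from an authenticated session mirrors what the alert rule evaluates on its schedule.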
Running the suggested query returns something similar to the following image, which would fire the alert given the sample alert logic condition above:
I trust you will all be more than able to continue with the alert creation from here, so I will stop to avoid tiring your eyes any further.
Thanks for reading through!
Disclaimer
The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without a warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.
Microsoft Tech Community – Latest Blogs –Read More
How to fix Error 15243 in Q.B Desktop after update?
I’ve been experiencing Q.B Error 15243 during payroll updates. Despite following troubleshooting steps, the issue persists. Has anyone else faced this problem? Any insights or solutions would be greatly appreciated. Thanks!
How to fix Error 15222 in Q.B Desktop after update?
I’m facing Q.B Error 15222 while trying to download payroll updates. The error message states that the update couldn’t be completed due to a connectivity issue. I’ve tried rebooting my system and checking my internet connection, but nothing seems to work. Any suggestions?
How to fix Error 12007 in Q.B Desktop after update?
Every time I try to update my Q.B payroll, I encounter Q.B Error 12007. It seems to be related to a connection issue.
The Subscription Does Not Contain Any Registered ASNs
https://learn.microsoft.com/en-us/azure/internet-peering/howto-subscription-association-portal
I have followed this documentation and enabled Microsoft.Peering. But when I try to add an ASN, it says “your subscription does not contain any registered ASNs.” Please help me fix this issue. Screenshot: https://i.is.cc/sEq7Idf.png
TrackingID#2407180030009866
This is my tracking ID; please check this issue and respond to my email, which has not been addressed for the last 48 hours.
Will Microsoft release a Windows 10 for ARM RTM product for licensed users?
Will Microsoft release a Windows 10 for ARM RTM product for licensed users in the future?
Connect Azure SQL Server via User Assigned Managed Identity under Django
TOC
Why we use it
Architecture
How to use it
References
Why we use it
This tutorial will introduce how to integrate Microsoft Entra with Azure SQL Server to avoid using fixed usernames and passwords. By utilizing user-assigned managed identities as a programmatic bridge, it becomes easier for Azure-related PaaS services (such as Function App or App Services) to communicate with the database without storing connection information in plain text.
Architecture
I will introduce each service or component and their configurations in subsequent chapters according to the order of A-D:
A: The company’s account administrator needs to create or designate a user as the database administrator. This role can only be assigned to one person within the database and is responsible for basic configuration and the creation and maintenance of other database users. It is not intended for development or actual system operations.
B: The company’s security department needs to create one or more user-assigned managed identities. In the future, the Web App will issue access requests to the database under different user identities.
C: The company’s data department needs to create or maintain a database and designate Microsoft Entra as the only login method, eliminating other fixed username/password combinations.
D: The company’s development department needs to create a Web App (or other service) as the basic unit of the business system. Programmers within this unit will write business logic (e.g., accessing the database) and deploy it here.
How to use it
A: As this article does not dive into the detailed configuration of Microsoft Entra, it only outlines the process. The company’s account administrator needs to create or designate a user as the database administrator. In this example, we will call this user “cch,” and the account “cch@thexxxxxxxxxxxx” will be used in subsequent steps.
B: Please create a user-assigned managed identity from the Azure Portal, and copy the Client ID and Resource ID once you’ve created the identity, for further use.
C-1: Create a database/SQL server. During this process, you need to specify the user created in Step A as the database administrator. Be sure to select “Microsoft Entra-only authentication”; in this mode, the username/password will no longer be used. Then, click on “Next: Networking.”
Since this article does not cover the detailed network configuration of the database, temporarily allow public access during the tutorial. Use the default values for other settings, click on “Review + Create,” and then click “Create” to finish the setup.
During this process, you need to specify the user-assigned managed identity created in Step B as the entity that will actually operate the database.
Leave the default values for the rest of the settings.
C-2: After the database has been created, you can log in using the identity “cch@thexxxxxxxxxxxx” obtained in Step A, which is the database administrator. Open a PowerShell terminal and, using the “cch” account, enter the following command to log in to SQL Server. You will need to change the <text> placeholders to follow your company’s naming conventions.
sqlcmd -S <YOUR_SERVER_NAME>.database.windows.net -d <YOUR_DB_NAME> -U <YOUR_FULL_USER_EMAIL> -G
You will be prompted for two-step verification.
Returning to the console, we will now create user accounts in SQL Server for the managed identity set up in Step B. First, we introduce the method for the user-assigned managed identity. The purpose of the commands is to grant database-related operational permissions to the newly created user. This is just an example; in actual scenarios, you should follow your company’s security policies and adjust accordingly. Please enter the following commands.
CREATE USER [<YOUR_IDENTITY_NAME>] FROM EXTERNAL PROVIDER;
USE [<YOUR_DB_NAME>];
EXEC sp_addrolemember 'db_owner', '<YOUR_IDENTITY_NAME>';
For testing purposes, we will create a test table, and insert some data.
CREATE TABLE TestTable (
Column1 INT,
Column2 NVARCHAR(100)
);
INSERT INTO TestTable (Column1, Column2) VALUES (1, 'First Record');
INSERT INTO TestTable (Column1, Column2) VALUES (2, 'Second Record');
D-1: In this example, we can create a Web App with any SKU/region. For the development language (stack), we choose Python as a demonstration, though other languages also support the same functionality. Since this article does not cover the detailed network configuration or other specifics of the Web App, we will use the default values for other settings. Simply click on “Review + Create,” and then click on “Create” to complete the process.
D-2: After the Web App has been created, please open Azure Cloud Shell in bash mode and enter the following command. You will need to change the <text> placeholders to follow your company’s naming conventions.
az webapp identity assign --resource-group <YOUR_RG_NAME> --name <YOUR_APP_NAME> --identities <RESOURCE_ID_IN_STEP_B>
D-3: Programmers can now deploy the code to the Web App. In this tutorial, we use Quickstart: Deploy a Python (Django, Flask, or FastAPI) web app to Azure – Azure App Service | Microsoft Learn to complete the example. Other languages also have their respective SQL Server connectors and follow the same principles.
In requirements.txt, in addition to the existing entries, please add the following package: mssql-django
In quickstartproject/settings.py, include the following example content. You will need to change the <text> placeholders to follow your company’s naming conventions.
DATABASES = {
    'default': {
        'ENGINE': 'mssql',
        'NAME': '<YOUR_DB_NAME>',
        'HOST': '<YOUR_SERVER_NAME>.database.windows.net',
        'PORT': '1433',
        'USER': '<CLIENT_ID_IN_STEP_B>',
        'OPTIONS': {
            'driver': 'ODBC Driver 18 for SQL Server',
            'extra_params': 'Authentication=ActiveDirectoryMsi',
        },
    },
}
In hello_azure/views.py, include the following example content.
from django.db import connection
from django.http import HttpResponse

def index(request):
    raw_text = ""
    with connection.cursor() as cursor:
        cursor.execute("SELECT Column2 FROM TestTable")
        rows = cursor.fetchall()
    for row in rows:
        raw_text += row[0] + "\n"  # accumulate each value instead of overwriting
    return HttpResponse(raw_text, content_type='text/plain')
Please note that the code provided in this tutorial is only suitable for the testing phase; its purpose is to verify usability, and it is not intended for production use. Ultimately, please make the corresponding modifications based on the business functionality and security guidelines of your own environment.
Once the deployment is complete, you can proceed with testing. We can observe that the Web App will call the authentication endpoint in the background to get an access token. It will then use this token to interact with the database and subsequently print out the queried data.
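For the curious, here is roughly what happens behind Authentication=ActiveDirectoryMsi on App Service (you do not need this code yourself; the ODBC driver performs it for you). This sketch is my own illustration, not part of the tutorial: it builds the token request App Service exposes through the IDENTITY_ENDPOINT and IDENTITY_HEADER environment variables it injects into the Web App.

```python
import urllib.parse

def msi_token_request(identity_endpoint: str, identity_header: str, client_id: str):
    """Build the GET request used to obtain a managed-identity access token
    for Azure SQL (resource https://database.windows.net/) on App Service."""
    params = urllib.parse.urlencode({
        "api-version": "2019-08-01",
        "resource": "https://database.windows.net/",
        "client_id": client_id,  # selects the user-assigned identity from Step B
    })
    # identity_endpoint and identity_header come from the IDENTITY_ENDPOINT
    # and IDENTITY_HEADER environment variables injected by App Service.
    url = f"{identity_endpoint}?{params}"
    headers = {"X-IDENTITY-HEADER": identity_header}
    return url, headers
```

The JSON response to this request carries the access token that is then presented to SQL Server, which validates it against the database user created from the external provider in Step C-2.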
References:
Authenticate with Microsoft Entra ID in sqlcmd – SQL Server | Microsoft Learn
SQL Query Incremental data load
I have source, staging, and target tables, and I want to bring incremental data into staging from the source.
I have 2 date columns, CreatedDate and UpdatedDate, to work with to bring the incremental data into stage.
Table structure: (ID_Pk, CreatedDate, UpdatedDate)
CreatedDate and UpdatedDate are dates with timestamps, and ID is the PK.
e.g. CreatedDate/UpdatedDate format: ‘2024-05-09 16:13:03.5722250’
I have to write SQL to get only incremental data from the source table, using UpdatedDate if it is not null; in case UpdatedDate is null, use CreatedDate to pull the incremental data.
I got vmaxCreatedDate and vmaxUpdatedDate from the target table, put them into variables, and wrote the query below. My question: is this SQL correct for an incremental data load? I was instructed to pick incremental data using UpdatedDate and, in case it is NULL, use CreatedDate to bring in only incremental data.
Select * from SourceTbl where updateDate > vmaxUpdatedDate and updateDate is not null
UNION
Select * from SourceTbl where createdDate > vmaxCreatedDate
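For illustration only (not part of the original question): the stated rule, use UpdatedDate and fall back to CreatedDate when it is NULL, can also be expressed as a single COALESCE predicate. A minimal, self-contained sketch follows; sqlite3 is used purely for demonstration, table and column names follow the post, and a single watermark variable is assumed rather than the two separate ones above.

```python
import sqlite3

# In-memory demo table shaped like the post's SourceTbl.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE SourceTbl (ID_Pk INT, CreatedDate TEXT, UpdatedDate TEXT)")
conn.executemany(
    "INSERT INTO SourceTbl VALUES (?, ?, ?)",
    [
        (1, "2024-05-01 08:00:00", None),                  # old row, never updated
        (2, "2024-05-01 08:00:00", "2024-05-10 09:00:00"), # old row, updated recently
        (3, "2024-05-10 10:00:00", None),                  # created recently
    ],
)
# A row is "incremental" when its effective change date (UpdatedDate if present,
# otherwise CreatedDate) is later than the watermark taken from the target table.
watermark = "2024-05-09 00:00:00"
rows = conn.execute(
    "SELECT ID_Pk FROM SourceTbl WHERE COALESCE(UpdatedDate, CreatedDate) > ?",
    (watermark,),
).fetchall()
print([r[0] for r in rows])  # prints [2, 3]
```

Whether a single watermark is valid depends on how vmaxCreatedDate and vmaxUpdatedDate relate in your target table; with two distinct watermarks, the UNION approach in the post is the closer match.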
SharePoint list view grouping and filtering not working
Hi,
I’m troubleshooting an issue with a SharePoint list. We use Approval Queues to collect all requests for approval and created a list view for all users; in that view, we group by process and sort A to Z. But since last Friday, the view has not been showing the data.
Has anyone experienced the same issue like us?
Thank you.