Unlock the power of data in Azure with SQL Server on Linux Azure VMs and Azure AI Search
In a world awash with data, the challenge lies in our ability to comprehend and engage with it seamlessly. My colleague Muazma has shed light on this topic in her insightful blog, “Chat with your data in Azure SQL Database.”
Inspired by her work, I pondered the possibility of applying similar principles to SQL Server on Linux VMs hosted on Azure. This blog is the result of that contemplation. Here are the steps we’ll cover:
We’ll begin by setting up a Linux-based Virtual Machine on Azure and proceed to install SQL Server on it.
Next, we’ll implement TLS 1.2 encryption to secure connections to SQL Server, utilizing certbot for certificate creation with Let’s Encrypt serving as the certificate authority.
We’ll then import Kaggle’s dataset into SQL Server using the Import Flat File wizard.
Following that, we’ll create the Azure AI service and associated indexes, with the Azure SQL Server on Linux VM as the data source.
Lastly, we’ll utilize Azure OpenAI studio to interact with the data.
To follow along, all you need is an Azure subscription and an account set up with access to Azure OpenAI Studio.
Step 1: Create the Azure SQL Server on Linux based VM:
Let’s start by setting up a Linux-based VM on Azure. For this demonstration, I’ll be configuring an Ubuntu 22.04 VM. Below is the script to first create a resource group, followed by the creation of a VM named SQLLinux22 running Ubuntu 22.04.
# Create the resource group:
az group create --name myrgdemo --location centralindia
# Create the VM using the Ubuntu image 0001-com-ubuntu-minimal-jammy:
az vm create --resource-group myrgdemo --name sqllinux22 --size "Standard_B4ms" --location centralindia --image "Canonical:0001-com-ubuntu-minimal-jammy:minimal-22_04-lts:22.04.202405131" --admin-username "amvin" --admin-password 'MY$trongPass123*#' --authentication-type all --generate-ssh-keys
Once the VM is set up, proceed to install SQL Server by following the guidelines provided in the official Microsoft documentation.
Step 2: Enable TLS 1.2 encryption on SQL Server on Linux to secure SQL Server connections:
Following the installation of SQL Server, it’s time to move on to step 2: enabling TLS 1.2 encryption to secure connections to SQL Server. But first, you need to set up a DNS name for the VM you’ve created, as it’s necessary for generating the certificate. You can configure the DNS name via the Azure portal. Once it’s set up, it will appear as shown. Remember to also enable port 80 in the VM’s Network Settings, which is required by Certbot for certificate creation.
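If you prefer the CLI over the portal, both steps can be scripted. This is a sketch assuming the default resource names from Step 1; the public IP name `sqllinux22PublicIP` is an assumption based on Azure's default naming, so check `az network public-ip list` for yours:

```shell
# Set a DNS label on the VM's public IP (public IP name is an assumption; verify with: az network public-ip list -g myrgdemo)
az network public-ip update --resource-group myrgdemo --name sqllinux22PublicIP --dns-name sqllinux22

# Open port 80, which Certbot needs for the HTTP-01 challenge
az vm open-port --resource-group myrgdemo --name sqllinux22 --port 80 --priority 900
```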
With that completed, it’s now time to install Certbot and generate the necessary certificate. Log into the VM using your preferred SSH client and execute the following commands. These will install Certbot and then create the certificate using the DNS name you have set up.
amvin@sqllinux22:~$ sudo snap install --classic certbot
2024-05-16T21:11:23Z INFO Waiting for automatic snapd restart…
certbot 2.10.0 from Certbot Project (certbot-eff✓) installed
## After the installation go ahead and create the certificate and private key file.
amvin@sqllinux22:~$ sudo certbot certonly --standalone --key-type rsa --preferred-challenges http -d sqllinux22.centralindia.cloudapp.azure.com
Saving debug log to /var/log/letsencrypt/letsencrypt.log
Enter email address (used for urgent renewal and security notices)
(Enter 'c' to cancel): xxxxxxxxxxx
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
Please read the Terms of Service at
https://letsencrypt.org/documents/LE-SA-v1.4-April-3-2024.pdf. You must agree in
order to register with the ACME server. Do you agree?
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
(Y)es/(N)o: Y
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
Would you be willing, once your first certificate is successfully issued, to
share your email address with the Electronic Frontier Foundation, a founding
partner of the Let’s Encrypt project and the non-profit organization that
develops Certbot? We’d like to send you email about our work encrypting the web,
EFF news, campaigns, and ways to support digital freedom.
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
(Y)es/(N)o: Y
Account registered.
Requesting a certificate for sqllinux22.centralindia.cloudapp.azure.com
Successfully received certificate.
Certificate is saved at: /etc/letsencrypt/live/sqllinux22.centralindia.cloudapp.azure.com/fullchain.pem
Key is saved at: /etc/letsencrypt/live/sqllinux22.centralindia.cloudapp.azure.com/privkey.pem
This certificate expires on 2024-08-14.
These files will be updated when the certificate renews.
Certbot has set up a scheduled task to automatically renew this certificate in the background.
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
If you like Certbot, please consider supporting our work by:
* Donating to ISRG / Let’s Encrypt: https://letsencrypt.org/donate
* Donating to EFF: https://eff.org/donate-le
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
## With this we have now created the required files as shown below:
root@sqllinux22:/etc/letsencrypt/live/sqllinux22.centralindia.cloudapp.azure.com# ll
total 28
drwxr-xr-x 2 root root 4096 May 16 21:13 ./
drwx------ 3 root root 4096 May 16 21:13 ../
-rw-r--r-- 1 root root  692 May 16 21:13 README
lrwxrwxrwx 1 root root 66 May 16 21:13 cert.pem -> ../../archive/sqllinux22.centralindia.cloudapp.azure.com/cert1.pem
lrwxrwxrwx 1 root root 67 May 16 21:13 chain.pem -> ../../archive/sqllinux22.centralindia.cloudapp.azure.com/chain1.pem
lrwxrwxrwx 1 root root 71 May 16 21:13 fullchain.pem -> ../../archive/sqllinux22.centralindia.cloudapp.azure.com/fullchain1.pem
lrwxrwxrwx 1 root root 69 May 16 21:13 privkey.pem -> ../../archive/sqllinux22.centralindia.cloudapp.azure.com/privkey1.pem
Move the certificate and key files to the /var/opt/mssql/secrets directory for SQL Server’s use, enable TLS 1.2 encryption as demonstrated below, and then restart SQL Server.
# Copy the cert and key to the secrets folder as shown below. We convert the key
# from .pem format to .key using openssl.
sudo cp /etc/letsencrypt/live/sqllinux22.centralindia.cloudapp.azure.com/fullchain.pem /var/opt/mssql/secrets/fullchain.pem
sudo openssl rsa -in /etc/letsencrypt/live/sqllinux22.centralindia.cloudapp.azure.com/privkey.pem -out /var/opt/mssql/secrets/privkey.key
# Make sure the mssql service account can read the files
sudo chown mssql:mssql /var/opt/mssql/secrets/fullchain.pem /var/opt/mssql/secrets/privkey.key
sudo chmod 600 /var/opt/mssql/secrets/privkey.key
# Enable TLS 1.2 as shown below using the mssql-conf for SQL Server
sudo /opt/mssql/bin/mssql-conf set network.tlscert /var/opt/mssql/secrets/fullchain.pem
sudo /opt/mssql/bin/mssql-conf set network.tlskey /var/opt/mssql/secrets/privkey.key
sudo /opt/mssql/bin/mssql-conf set network.tlsprotocols 1.2
sudo /opt/mssql/bin/mssql-conf set network.forceencryption 0
# Restart SQL Server and confirm that TLS 1.2 is enabled as seen in the errorlog:
amvin@sqllinux22:~$ sudo systemctl restart mssql-server
# Now, let's read the errorlog to confirm the certificate is loaded
root@sqllinux22:~# cat /var/opt/mssql/log/errorlog | grep "Allowed TLS"
2024-05-16 21:23:16.78 Server Successfully initialized the TLS configuration. Allowed TLS protocol versions are ['1.2']. Allowed TLS ciphers are ['ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-ECDSA-AES128-SHA:AES256-GCM-SHA384:AES128-GCM-SHA256:AES256-SHA256:AES128-SHA256:AES256-SHA:AES128-SHA:!DHE-RSA-AES256-GCM-SHA384:!DHE-RSA-AES128-GCM-SHA256:!DHE-RSA-AES256-SHA:!DHE-RSA-AES128-SHA'].
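One caveat worth covering: Certbot renews the certificate automatically, but SQL Server reads the copies under /var/opt/mssql/secrets, which go stale at renewal. A sketch of one way to handle this, assuming the same paths as above, is a Certbot deploy hook that re-copies the files and restarts SQL Server (scripts in this directory run after each successful renewal):

```shell
# Save as /etc/letsencrypt/renewal-hooks/deploy/mssql-tls.sh and make it executable
# (sketch; paths assume the setup shown earlier in this post)
#!/bin/bash
set -e

LIVE=/etc/letsencrypt/live/sqllinux22.centralindia.cloudapp.azure.com

# Refresh the copies SQL Server actually reads
cp "$LIVE/fullchain.pem" /var/opt/mssql/secrets/fullchain.pem
openssl rsa -in "$LIVE/privkey.pem" -out /var/opt/mssql/secrets/privkey.key

# Keep permissions usable by the mssql service account
chown mssql:mssql /var/opt/mssql/secrets/fullchain.pem /var/opt/mssql/secrets/privkey.key
chmod 600 /var/opt/mssql/secrets/privkey.key

# Pick up the renewed certificate
systemctl restart mssql-server
```

Remember to `chmod +x` the script so Certbot can execute it.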
Step 3: Load the dataset into SQL Server using the Import Flat File Wizard:
Now that we’ve reached the third step, it’s time to load the dataset into SQL Server on the Linux Azure VM. I downloaded the dataset locally and connected to the SQL Server Azure VM using SQL Server Management Studio (SSMS) on my Windows machine.
Next, I used the SSMS Import Flat File Wizard to transfer the data from the file to SQL Server on the Azure VM. Once the dataset is loaded into the table, the data is displayed as follows.
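If you prefer scripting the load over clicking through the wizard, BULK INSERT is one alternative. This is only a sketch: the table name, columns, and file path below are assumptions for illustration (the wizard infers the table for you, while here you create it first and the CSV must be accessible from the VM):

```sql
-- Sketch: create a staging table and bulk-load the CSV
-- (table name, columns, and path are hypothetical)
CREATE TABLE dbo.Products (
    ProductName NVARCHAR(200),
    Category    NVARCHAR(100),
    Price       NVARCHAR(50)
);

BULK INSERT dbo.Products
FROM '/var/opt/mssql/data/products.csv'   -- path on the Linux VM
WITH (FORMAT = 'CSV', FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a');
```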
Step 4: Create the Azure AI search service, Index and import data from SQL Server:
With the data loaded, let’s log in to the Azure portal, search for AI Search, click Create, and create the search service as shown below:
Once the search service is created, we need to link it to the Azure SQL Server VM to import the data. During the import process, it’s crucial to map the correct data types; in this instance, I chose the string data type for all columns. Additionally, take note of the facets I enabled on the columns, as shown below.
After the index is created and the data is imported, it should appear as follows, and you can also execute a sample query to confirm that you can retrieve data as illustrated below:
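Besides Search Explorer in the portal, you can also query the index over REST. A sketch with curl, where the service name, index name, and query key are placeholders for your own values:

```shell
# Query the Azure AI Search index over REST (replace the <...> placeholders)
curl -s -X POST "https://<your-search-service>.search.windows.net/indexes/<your-index>/docs/search?api-version=2023-11-01" \
  -H "Content-Type: application/json" \
  -H "api-key: <your-query-key>" \
  -d '{"search": "birthday", "top": 5}'
```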
Step 5: Use Azure OpenAI Studio to chat with your data:
That’s it! We are now in the final stage, where using Azure OpenAI Studio I can open the Chat playground, configure the data source to point to the Azure AI Search index I created above, and chat with my data as shown below. Try asking scenario- and context-based questions like “What products should I buy for a kid’s birthday?” or “Suggest items to buy to decorate a room.”
Hope you enjoyed reading this! Happy Friday!
SharePoint Intranet Site doesn’t show images to some users
Hi there!
I’m having a problem with my SharePoint intranet site. Suddenly some users began to not be able to see certain areas of the site, especially images.
I checked the permissions: only three of us have Full Control, and the other users are part of “Everyone except external users”.
I’ve had this problem for weeks and it’s affecting us, since we publish content frequently and it hurts the visuals of the site.
Those of us who have full control have no problems viewing, but other users do.
Hope the community can help me!
Thanks.
Yaml to execute SQL scripts in a folder via Azure DevOps pipeline
Greetings!!!
We have a git repo directory ExternalSQLScripts with sub-directories for Tables, Views, Functions, and StoredProcedures. We want to loop through each subdirectory and execute all the .sql files on the external SQL Server. We only have access to execute SQL Server database object scripts, and on this SQL Server instance we cannot do a .dacpac deployment.
I have the below yaml code which is throwing errors.
Code:
variables:
  sqlServerConnection: $(System.ConnectionStrings.DatabaseConnectionString)
  sqlScriptPath: $(Build.SourcesDirectory)/SQLScript

steps:
- script: |
    # Install SqlServer module
    if (-!Test-Path (Get-Module -ListAvailable SqlServer)) {
      Install-Module SqlServer -Scope CurrentUser -Force
    }
    Get-ChildItem -Path $sqlScriptPath -Filter "*.sql" -Recurse | ForEach-Object {
      $scriptPath = $_.FullName
      $scriptName = $_.BaseName
      try {
        Invoke-Sqlcmd -ServerInstance $sqlServerConnection -Database [System.DefaultWorkingDirectory] -InputFile $scriptPath
        Write-Host "Successfully executed script: $scriptName"
      } catch {
        Write-Error "Error executing script: $scriptName - $($_.Exception.Message)"
      }
    }
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: $(sqlScriptPath)
    artifactName: sql-scripts
Thanks in advance…
One Drive Newbie
I want to start using OneDrive for work. Where do I even start? Can I connect OneDrive to mirror a SharePoint document library?
Problem with Autofill when Using VLOOKUP Function
Hello, I’m having a little problem when trying to drag and drop a formula to autofill with the VLOOKUP function. I have a long list and I’m using this function to display the highest sales in rank order. Not sure if this is the best way to do it, but it’s how I first learned how to do it. So my formula is this:
=VLOOKUP(LARGE($C$4:$C$10001,513),$C$4:$E$10001,3,FALSE).
What I would like is for the 513 to increment to 514 when I autofill to the next row, then 515, 516, etc. I have been manually changing this number the whole time, which is a little tedious.
I have a similar problem with using this formula as well: =LARGE($C$4:$C$1002,513). As stated above, I’d like the 513 rank number to adjust to 514 when I autofill the next line. Here is a screen shot that might help to see what I have. The highlighted yellow cell is my VLOOKUP function and the cell to the right is my LARGE function:
Any help with this would be greatly appreciated. Thank you!
Problem with discord and Microsoft Edge.
So, I recently started using Edge as my default browser. There were some minor issues but I fixed them; I like the design, and everything works great.
But I just realized a problem with screen sharing on Discord. Specifically, when I share Edge as a window, the Discord app just restarts, while sharing the entire screen works fine. I tested with different browsers, games, and apps, and updated Discord; every other program works fine. Did anyone encounter the same problem? I assume it has something to do with privacy or something, but it is annoying.
Surface Hub 2 MTRoW – how do you now factory wipe the device?
As above, how would you now factory reset? Do you need to create a USB stick of some sort?
Help with a copilot task
I’m trying to prompt Copilot to create a 3-5 page Word document based on content in a OneDrive folder, trained on a sample of up to 20 documents saved in another folder. How would this be organized and prompted? Thanks
Complicated vlookup example
Hi there,
I have a dataset where there are three different columns with names. I would like VLOOKUP or INDEX, etc to use those names (in 3 different columns) and search for all 3 against one column in another sheet. Once the matches are found, I’d like to get the contents in the column 8 rows to the right for all 3 names in one cell (so merged). Example, if I have names Sam, Sophia and Liz and the columns 8 rows to the right had the following
Sam – Jungle Group
Sophia – Safari Group
Liz – Forest Group
I’d like the input to return to be Jungle Group, Safari Group, Forest Group
Let me know if this is possible.
Shifted to Edge from chrome – 3 things I miss
Hi I shifted to edge, after using chrome for 10 years.
1. Edge should allow multiple user profiles on mobile like Chrome, and multiple Microsoft account logins.
2. Please give users a choice of which extensions to import when importing Chrome data.
3. Please make the new tab page clean by default; it took me a month to clean it up, and now my Edge is faster than Chrome.
Azure AI Services on AKS
Host your AI Language Containers and Web Apps on Azure Kubernetes Cluster: Flask Web App Sentiment Analysis
In this post, we’ll explore how to integrate Azure AI Containers into our applications running on Azure Kubernetes Service (AKS). Azure AI Containers enable you to harness the power of Azure’s AI services directly within your AKS environment, giving you complete control over where your data is processed. By streamlining the deployment process and ensuring consistency, Azure AI Containers simplify the integration of cutting-edge AI capabilities into your applications. Whether you’re developing tools for education, enhancing accessibility, or creating innovative user experiences, this guide will show you how to seamlessly incorporate Azure’s AI Containers into your web apps running on AKS.
Why Containers?
Azure AI services provides several Docker containers that let you use the same APIs that are available in Azure, on-premises. Using these containers gives you the flexibility to bring Azure AI services closer to your data for compliance, security or other operational reasons. Container support is currently available for a subset of Azure AI services.
Azure AI Containers offer:
Immutable infrastructure: Consistent and reliable system parameters for DevOps teams, with flexibility to adapt and avoid configuration drift.
Data control: Choose where data is processed, essential for data residency or security requirements.
Model update control: Flexibility in versioning and updating deployed models.
Portable architecture: Deploy on Azure, on-premises, or at the edge, with Kubernetes support.
High throughput/low latency: Scale for demanding workloads by running Azure AI services close to data and logic.
Scalability: Built on scalable cluster technology like Kubernetes for high availability and adaptable performance.
Source: https://learn.microsoft.com/en-us/azure/ai-services/cognitive-services-container-support
Workshop
Our solution will utilize the Azure AI Language service with the Text Analytics container for Sentiment Analysis. We will build a Python Flask web app, containerize it with Docker, and push it to Azure Container Registry. The AKS cluster we create will pull the Flask image, along with the Microsoft-provided Sentiment Analysis image directly from mcr.microsoft.com, and we will make all required configurations on the cluster: an Ingress controller with an SSL certificate presenting a simple Web UI where we can write text, submit it for analysis, and get the results. Our Web UI will look like this:
Azure Kubernetes Cluster, Azure Container Registry & Azure Text Analytics
These are our main resources, plus a Virtual Network for AKS which is deployed automatically. Our solution is hosted entirely on AKS, with a Let’s Encrypt certificate (created separately) providing HTTPS and an Ingress controller publicly serving our Flask UI, which calls the Sentiment Analysis service, also hosted on AKS, via REST. The difference is that Flask is built as a custom Docker image pulled from Azure Container Registry, while Sentiment Analysis is a ready-made Microsoft image which we pull directly.
In case your Azure Subscription does not have an AI Service you have to create a Language Service of Text Analytics using the Portal due to the requirement to accept the Responsible AI Terms. For more detail go to https://go.microsoft.com/fwlink/?linkid=2164190 .
My preference, as a best practice, is to create an AKS cluster with the default System node pool and add an additional User node pool to deploy my apps, but it is really a matter of preference at the end of the day. So let’s start deploying! Start from your terminal by logging in with az login, and set your subscription with az account set --subscription "YourSubName".
## Change the values in < > with your own values and remove the < >!
## Create the AKS cluster
az aks create \
  --resource-group <your-resource-group> \
  --name <your-cluster-name> \
  --node-count 1 \
  --node-vm-size standard_a4_v2 \
  --nodepool-name agentpool \
  --generate-ssh-keys \
  --nodepool-labels nodepooltype=system \
  --no-wait \
  --aks-custom-headers AKSSystemNodePool=true \
  --network-plugin azure

## Add a user node pool
az aks nodepool add \
  --resource-group <your-resource-group> \
  --cluster-name <your-cluster-name> \
  --name userpool \
  --node-count 1 \
  --node-vm-size standard_d4s_v3 \
  --no-wait

## Create the Azure Container Registry
az acr create \
  --resource-group <your-resource-group> \
  --name <your-acr-name> \
  --sku Standard \
  --location northeurope

## Attach the ACR to the AKS cluster
az aks update -n <your-cluster-name> -g <your-resource-group> --attach-acr <your-acr-name>
The Language service is created from the portal for the reasons we explained earlier. Search for Language and create a new Language service, leaving the default selections (no Custom QnA, no Custom Text Classification) on the F0 (Free) SKU. You may see a VNET menu appear in the Networking tab; just ignore it, as long as you leave the default Public Access enabled it won’t create a Virtual Network. The cloud resource exists for billing and metrics.
A Flask Web App has a directory structure where we store index.html in the Templates directory and our CSS and images in the Static directory. So in essence it looks like this:
sentiment-aks/
└── flaskwebapp/
    ├── app.py
    ├── requirements.txt
    ├── Dockerfile
    ├── static/
    │   ├── style.css
    │   └── logo.png
    └── templates/
        └── index.html
The requirements.txt should list the needed packages:
## requirements.txt
Flask==3.0.0
requests==2.31.0

## index.html
<!DOCTYPE html>
<html>
<head>
    <title>Sentiment Analysis App</title>
    <link rel="stylesheet" type="text/css" href="{{ url_for('static', filename='style.css') }}">
</head>
<body>
    <img src="{{ url_for('static', filename='logo.png') }}" class="icon" alt="App Icon">
    <h2>Sentiment Analysis</h2>
    <form id="textForm">
        <textarea name="text" placeholder="Enter text here..."></textarea>
        <button type="submit">Analyze</button>
    </form>
    <div id="result"></div>
    <script>
        document.getElementById('textForm').onsubmit = async function(e) {
            e.preventDefault();
            let formData = new FormData(this);
            let response = await fetch('/analyze', {
                method: 'POST',
                body: formData
            });
            let resultData = await response.json();
            let results = resultData.results;
            if (results) {
                let displayText = `Document: ${results.document}\nSentiment: ${results.overall_sentiment}\n`;
                displayText += `Confidence - Positive: ${results.confidence_positive}, Neutral: ${results.confidence_neutral}, Negative: ${results.confidence_negative}`;
                document.getElementById('result').innerText = displayText;
            } else {
                document.getElementById('result').innerText = 'No results to display';
            }
        };
    </script>
</body>
</html>

## style.css
body {
font-family: Arial, sans-serif;
background-color: #f0f8ff; /* Light blue background */
margin: 0;
padding: 0;
display: flex;
flex-direction: column;
align-items: center;
justify-content: center;
height: 100vh;
}
h2 {
color: #0277bd; /* Darker blue for headings */
}
.icon {
height: 100px; /* Adjust the size as needed */
margin-top: 20px; /* Add some space above the logo */
}
form {
background-color: white;
padding: 20px;
border-radius: 8px;
width: 300px;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
}
textarea {
width: 100%;
box-sizing: border-box;
height: 100px;
margin-bottom: 10px;
border: 1px solid #0277bd;
border-radius: 4px;
padding: 10px;
}
button {
background-color: #029ae4; /* Blue button */
color: white;
border: none;
padding: 10px 15px;
border-radius: 4px;
cursor: pointer;
}
button:hover {
background-color: #0277bd;
}
#result {
margin-top: 20px;
}
And here is the most interesting file, our app.py. Notice the use of a REST API call directly to the Sentiment Analysis endpoint which we will declare in the YAML file for the Kubernetes deployment.
## app.py
from flask import Flask, render_template, request, jsonify
import requests
import os

app = Flask(__name__)

@app.route('/', methods=['GET'])
def index():
    return render_template('index.html')  # HTML file with input form

@app.route('/analyze', methods=['POST'])
def analyze():
    # Extract text from the form submission
    text = request.form['text']
    if not text:
        return jsonify({'error': 'No text provided'}), 400

    # Fetch the API endpoint from environment variables
    endpoint = os.environ.get("CONTAINER_API_URL")

    # Ensure required configurations are available
    if not endpoint:
        return jsonify({'error': 'API configuration not set'}), 500

    # Construct the full URL for the sentiment analysis API
    url = f"{endpoint}/text/analytics/v3.1/sentiment"

    headers = {
        'Content-Type': 'application/json'
    }
    body = {
        'documents': [{'id': '1', 'language': 'en', 'text': text}]
    }

    # Make the HTTP POST request to the sentiment analysis API
    response = requests.post(url, json=body, headers=headers)
    if response.status_code != 200:
        return jsonify({'error': 'Failed to analyze sentiment'}), response.status_code

    # Process the API response
    data = response.json()
    results = data['documents'][0]
    detailed_results = {
        'document': text,
        'overall_sentiment': results['sentiment'],
        'confidence_positive': results['confidenceScores']['positive'],
        'confidence_neutral': results['confidenceScores']['neutral'],
        'confidence_negative': results['confidenceScores']['negative']
    }

    # Return the detailed results to the client
    return jsonify({'results': detailed_results})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5001, debug=False)
And finally we need a Dockerfile; make sure it sits at the same level as your app.py file.
## Dockerfile
# Use an official Python runtime as a parent image
FROM python:3.10-slim
# Set the working directory in the container
WORKDIR /app
# Copy the current directory contents into the container at /app
COPY . /app
# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
# Make port 5001 available to the world outside this container
EXPOSE 5001
# Define environment variable
ENV CONTAINER_API_URL="http://sentiment-service/"
# Run app.py when the container launches
CMD ["python", "app.py"]
Our Web UI is ready to build! We need Docker running on our development environment, and we need to log in to Azure Container Registry:
## Login to ACR
az acr login -n <your-acr-name>
## Build and Tag our image
docker build -t <acr-name>.azurecr.io/flaskweb:latest .
docker push <acr-name>.azurecr.io/flaskweb:latest
You can go to the portal, and under Azure Container Registry > Repositories you will find our new image ready to be pulled!
Kubernetes Deployments
Let’s start deploying our AKS services ! As we already know we can pull the Sentiment Analysis Container from Microsoft directly and that’s what we are going to do with the following tasks. First, we need to login to our AKS Cluster so from Azure Portal head over to your AKS Cluster and click on the Connect link on the menu. Azure will provide the command to connect from our terminal:
Select Azure CLI and just copy-paste the commands to your Terminal.
Now we can run kubectl commands and manage our Cluster and AKS Services.
We need a YAML file for each service we are going to build, including the Certificate at the end. For now let’s create the Sentiment Analysis Service, as a Container, with the following file. Pay attention as you need to get the Language Service Key and Endpoint from the Text Analytics resource we created earlier, and in the nodeSelector block we must enter the name of the User Node Pool we created.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sentiment-deployment
spec:
  replicas: 1
  selector:
    matchLabels:
      app: sentiment
  template:
    metadata:
      labels:
        app: sentiment
    spec:
      containers:
      - name: sentiment
        image: mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment:latest
        ports:
        - containerPort: 5000
        resources:
          limits:
            memory: "8Gi"
            cpu: "1"
          requests:
            memory: "8Gi"
            cpu: "1"
        env:
        - name: Eula
          value: "accept"
        - name: Billing
          value: "https://<your-Language-Service>.cognitiveservices.azure.com/"
        - name: ApiKey
          value: "xxxxxxxxxxxxxxxxxxxx"
      nodeSelector:
        agentpool: userpool
---
apiVersion: v1
kind: Service
metadata:
  name: sentiment-service
spec:
  selector:
    app: sentiment
  ports:
  - protocol: TCP
    port: 5000
    targetPort: 5000
  type: ClusterIP
Save the file and run from your Terminal:
kubectl apply -f sentiment-deployment.yaml
In a few seconds you can observe the service running from the AKS Services and Ingresses menu.
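Before wiring up the Flask front end, you can sanity-check the container from your terminal. A sketch using kubectl port-forward and the same REST path the app will call:

```shell
# Forward the ClusterIP service to localhost
kubectl port-forward svc/sentiment-service 5000:5000 &

# Call the sentiment endpoint directly with a sample document
curl -s http://localhost:5000/text/analytics/v3.1/sentiment \
  -H "Content-Type: application/json" \
  -d '{"documents": [{"id": "1", "language": "en", "text": "I love this product!"}]}'
```

You should get back a JSON document with a sentiment label and confidence scores.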
Now let's bring in our Flask container. In the same manner, create a new YAML:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: flask-service
spec:
  replicas: 1
  selector:
    matchLabels:
      app: flask
  template:
    metadata:
      labels:
        app: flask
    spec:
      containers:
      - name: flask
        image: <your-ACR-name>.azurecr.io/flaskweb:latest
        ports:
        - containerPort: 5001
        env:
        - name: CONTAINER_API_URL
          value: "http://sentiment-service:5000"
        resources:
          requests:
            cpu: "500m"
            memory: "256Mi"
          limits:
            cpu: "1"
            memory: "512Mi"
      nodeSelector:
        agentpool: userpool
---
apiVersion: v1
kind: Service
metadata:
  name: flask-lb
spec:
  type: LoadBalancer
  selector:
    app: flask
  ports:
  - protocol: TCP
    port: 80
    targetPort: 5001
kubectl apply -f flask-service.yaml
Notice the CONTAINER_API_URL environment value for the Flask container: it uses the Service name of our Sentiment Analysis container directly, since AKS has its own DNS resolver for easy communication between services. In fact, if we hit the Service's public IP we will have HTTP access to the Web UI.
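If you want to confirm that in-cluster name resolution yourself, a quick sketch (assuming everything runs in the default namespace) is to resolve the Service name from a throwaway pod:

```shell
# Spin up a temporary pod and resolve the sentiment Service by name
kubectl run dnstest --rm -it --image=busybox:1.36 --restart=Never -- nslookup sentiment-service

# The Flask pod reaches it the same way: http://sentiment-service:5000
```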
But let's see how we can import our certificate. We won't describe how to obtain one; all we need are the PEM files, meaning privkey.pem and cert.pem. If we have a PFX, we can export them with OpenSSL. Once these files are in place, we will create a Secret in AKS to hold our certificate key and file. We just need to run this command from within the directory containing the PEM files:
kubectl create secret tls flask-app-tls --key privkey.pem --cert cert.pem --namespace default
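If you start from a PFX instead of PEM files, the OpenSSL export mentioned above looks roughly like this. The self-signed certificate and the `pass:demo` password here are purely illustrative so the commands can run end to end; with a real PFX you would only need the last two commands:

```shell
# Purely illustrative: create a throwaway key/cert and bundle them into a PFX
openssl req -x509 -newkey rsa:2048 -keyout tmp.key -out tmp.crt \
  -days 1 -nodes -subj "/CN=your.domain.host"
openssl pkcs12 -export -inkey tmp.key -in tmp.crt -out cert.pfx -passout pass:demo

# Export the private key and the certificate back out as PEM files
openssl pkcs12 -in cert.pfx -passin pass:demo -nocerts -nodes -out privkey.pem
openssl pkcs12 -in cert.pfx -passin pass:demo -clcerts -nokeys -out cert.pem
```

The resulting privkey.pem and cert.pem are exactly what the kubectl create secret tls command expects.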
Once the Secret is created, we will deploy a Kubernetes Ingress which will handle HTTPS and point to the Flask Service. Remember to add an A record at your DNS registrar mapping the hostname you are going to use to the public IP, once you see the IP address:
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: flask-app-ingress
  namespace: default
spec:
  tls:
  - hosts:
    - your.domain.host
    secretName: flask-app-tls
  rules:
  - host: your.domain.host
    http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: flask-lb
            port:
              number: 80
kubectl apply -f flask-app-ingress.yaml
From AKS, under Services and ingresses, Ingresses, you will see the assigned public IP. Add it to your DNS, and once the name servers are updated you can hit your hostname over HTTPS!
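Once your DNS record has propagated, a quick check from any terminal (the hostname below is a placeholder for your own):

```shell
# -I fetches only the response headers over HTTPS, confirming the TLS chain and routing
curl -I https://your.domain.host/
```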
Final Thoughts
As we’ve explored, the combination of Azure AI Containers and AKS offers a powerful and flexible solution for deploying AI-driven applications in cloud-native environments. By leveraging these technologies, you gain granular control over your data and model deployments, while maintaining the scalability and portability essential for modern applications. Remember, this is just the starting point. As you delve deeper, consider the specific requirements of your project and explore the vast possibilities that Azure AI Containers unlock. Embrace the power of AI within your AKS deployments, and you’ll be well on your way to building innovative, intelligent solutions that redefine what’s possible in the cloud.
Architecture
My Teams invites go straight to the recipients' spam folder
Hello all,
I have been using Teams at my previous job, and now that I tried to use it with my personal Microsoft 365 account, I seem to be running into an issue.
I use the browser version of Teams to schedule meetings, but the invites go straight to my recipients' spam folder. Even when they recover the invite and accept it, they get this message:
Your message wasn’t delivered to email address removed for privacy reasons because the address couldn’t be found or is unable to receive mail.
Apparently the invites are being sent from this outlook email (no idea where that came from) instead of the email I have in my Microsoft 365 and Teams account.
Please help!
Windows Time
We have an issue with the Windows clock on the taskbar falling behind on Windows 11 workstations (4 so far) deployed out of the box. The system time itself seems to be fine; rebooting a machine brings the clock back to normal.
We have a Windows Active Directory in house and began deploying Windows 11 Pro machines as a desktop refresh initiative. Each machine on the domain synchronizes time with Active Directory (except the 3 DCs, which synchronize with ntp.org).
Can I change just the sheet, but keep the cells in multiple references
I have a workbook that tracks data in monthly tabs. I also have a sheet that pulls multiple datapoints from each worksheet for an easily printable report.
Since each of the monthly sheets is identical, is there a way that, as I go into a new month, I can change just which sheet my report pulls from, but keep the cell references?
For example, if I have data pulling from =May!A1, =May!C3, =May!F45, is there an easy way to change all of the Mays to Junes?
Thanks in advance.
Text Element – Broken aria-describedby Error element
When creating a form with a text element, by default it includes a broken aria-describedby attribute. Here is a sample test form, for example, with just a text input. That text input includes the following code:
<input aria-label="Single line text" maxlength="4000" placeholder="Enter your answer" aria-labelledby="QuestionId_rf7ddf945f19f48f7a19e9be4ce25a328 QuestionInfo_rf7ddf945f19f48f7a19e9be4ce25a328" aria-describedby="rf7ddf945f19f48f7a19e9be4ce25a328_validationError" class="-as-61" spellcheck="true" data-automation-id="textInput">
With the problematic aria-describedby="rf7ddf945f19f48f7a19e9be4ce25a328_validationError" attribute. In this case, I don't think that attribute is needed, and it could simply be removed to resolve the issue.
Looking for MSOLAP OLEDB Driver lower than 16.0.134.22 (64bit)
Hello,
I currently have issues using the latest driver of MSOLAP OLEDB in Visual Studio SSIS (2022) for connecting and querying AAS instances. The error message is:
TITLE: Connection Manager
——————————
Test connection failed because of an error in initializing provider. Internal error: An unexpected error occurred (file 'pfadalauthinfo.cpp', line 1031, function 'PFAdalAuthInfoConfigurationsWrapper::GetInstance').
Does anyone know of a link to a version higher than 16.0.20.201 but lower than 16.0.134.22?
Thank you!
Outlook Web Search Progressively getting Worse – is the Archive Box the issue?
I'm using the new Outlook desktop app and the Outlook web app to manage my emails. I've noticed my search feature has been getting progressively worse over the past few weeks. Before, when I searched an individual's name, their email would typically pop up instantly as a search option. Same thing for general keywords: I would search a keyword and get results almost instantly.
Now it can take 1-2 minutes for searches to load. Often, when I type the names of individuals I email on a regular basis, they don't even pop up as a search option.
I started using the Archive box back in February and I practice inbox zero. So my inbox typically only has 10-20 emails in it at any given point with everything else in the Archive box.
Would the Archive box impact Outlook's ability to search and return search results?