Category: News
YAML to execute SQL scripts in a folder via Azure DevOps pipeline
Greetings!!!
We have a git repo directory ExternalSQLScripts with sub-directories for Tables, Views, Functions, and StoredProcedures. We need to loop through each subdirectory and execute all the .sql files on the external SQL Server. We only have access to execute database object scripts, and on this SQL Server instance we cannot do a .dacpac deployment.
I have the YAML code below, which is throwing errors.
Code:
variables:
  sqlServerConnection: $(System.ConnectionStrings.DatabaseConnectionString)
  sqlScriptPath: $(Build.SourcesDirectory)/ExternalSQLScripts

steps:
- powershell: |
    # Install the SqlServer module if it is not already available
    if (-not (Get-Module -ListAvailable -Name SqlServer)) {
      Install-Module SqlServer -Scope CurrentUser -Force
    }
    Get-ChildItem -Path "$(sqlScriptPath)" -Filter "*.sql" -Recurse | ForEach-Object {
      $scriptPath = $_.FullName
      $scriptName = $_.BaseName
      try {
        # -Database expects a database name; $(databaseName) is a placeholder variable to define
        Invoke-Sqlcmd -ServerInstance "$(sqlServerConnection)" -Database "$(databaseName)" -InputFile $scriptPath
        Write-Host "Successfully executed script: $scriptName"
      } catch {
        Write-Error "Error executing script: $scriptName - $($_.Exception.Message)"
      }
    }
  displayName: Execute SQL scripts

- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: $(sqlScriptPath)
    artifactName: sql-scripts
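For what it's worth, the intended traversal can be sketched in Python as well; this is a hypothetical helper (not part of the pipeline), assuming the subdirectory names above and a fixed dependency order so that Tables run before Views, and so on:

```python
from pathlib import Path
import tempfile

# Hypothetical sketch: visit each object-type subdirectory in
# dependency order and collect every .sql file under it.
ORDER = ["Tables", "Views", "Functions", "StoredProcedures"]

def collect_sql_scripts(root):
    scripts = []
    for subdir in ORDER:
        folder = Path(root) / subdir
        if folder.is_dir():
            scripts.extend(sorted(folder.rglob("*.sql")))
    return scripts

# Demo on a throwaway directory tree mimicking ExternalSQLScripts
root = Path(tempfile.mkdtemp())
(root / "Views").mkdir()
(root / "Views" / "vw_orders.sql").write_text("-- view script")
(root / "Tables").mkdir()
(root / "Tables" / "tbl_orders.sql").write_text("-- table script")

names = [p.name for p in collect_sql_scripts(root)]
# Tables come before Views regardless of filesystem order
```

Each collected path would then be handed to Invoke-Sqlcmd (or any other SQL client) in that order.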
Thanks in advance…
One Drive Newbie
I want to start using OneDrive for work. Where do I even start? Can I connect OneDrive to mirror a SharePoint document library?
Problem with Autofill when Using VLOOKUP Function
Hello, I’m having a little problem when trying to drag and drop a formula to autofill with the VLOOKUP function. I have a long list and I’m using this function to display the highest sales in rank order. Not sure if this is the best way to do it, but it’s how I first learned how to do it. So my formula is this:
=VLOOKUP(LARGE($C$4:$C$10001,513),$C$4:$E$10001,3,FALSE).
What I would like is for the 513 to increment to 514 when I autofill to the next row, then to 515, 516, etc. I have been manually changing this number the whole time, but it's a little tedious.
I have a similar problem with this formula as well: =LARGE($C$4:$C$1002,513). As stated above, I'd like the 513 rank number to adjust to 514 when I autofill the next line. Here is a screenshot that might help show what I have. The highlighted yellow cell is my VLOOKUP function, and the cell to the right is my LARGE function:
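Not an Excel answer as such, but the incrementing-rank idea can be sketched in Python. In the sheet itself, one common approach (an assumption about your layout) is to replace the constant 513 with something like 512+ROWS($C$4:C4), which grows by one for each row you fill down:

```python
# LARGE(range, k) in Excel returns the k-th largest value; the sketch
# below shows k incrementing per output row, which is what dragging
# the formula down should achieve.
def large(values, k):
    return sorted(values, reverse=True)[k - 1]

sales = [90, 70, 80, 60, 50]
first_rank = 2  # stands in for the 513 in the original formula
ranked = [large(sales, first_rank + offset) for offset in range(3)]
# ranked holds the 2nd, 3rd and 4th largest values in order
```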
Any help with this would be greatly appreciated. Thank you!
Problem with discord and Microsoft Edge.
So, I recently started using Edge as my default browser. There were some minor issues, but I fixed them; I like the design, and everything works great.
But I just realised a problem with sharing my screen on Discord. Specifically, when I share Edge as a window, the Discord app just restarts, while if I share the entire screen it works fine. I tested different browsers, games and apps, and updated Discord; every other program works fine. Did anyone encounter the same problem? I assume it has something to do with privacy settings, but it is annoying.
Surface Hub 2 MTRoW – how do you now factory wipe the device?
As above, how would you now factory reset the device? Do you need to create a USB stick of some sort?
Help with a copilot task
I'm trying to prompt Copilot to create a 3-5 page Word document based on content in a OneDrive folder, trained on a sample of up to 20 documents saved in another folder. How would this be organized and prompted? Thanks
Complicated vlookup example
Hi there,
I have a dataset with three different columns of names. I would like VLOOKUP or INDEX, etc. to take those names (in 3 different columns) and search for all 3 against one column in another sheet. Once the matches are found, I'd like to get the contents of the column 8 columns to the right for all 3 names in one cell (so merged). For example, if I have the names Sam, Sophia and Liz, and the column 8 columns to the right had the following:
Sam – Jungle Group
Sophia – Safari Group
Liz – Forest Group
I’d like the input to return to be Jungle Group, Safari Group, Forest Group
Let me know if this is possible.
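In Python terms, the behaviour being asked for looks like this (the names and groups are the sample data from the post; the lookup table is a stand-in for the second sheet):

```python
# Stand-in for the lookup sheet: each name maps to the value found
# 8 columns to its right.
lookup = {
    "Sam": "Jungle Group",
    "Sophia": "Safari Group",
    "Liz": "Forest Group",
}

def merged_groups(names, table):
    # Look up every name and merge the hits into one cell-style string
    return ", ".join(table[n] for n in names if n in table)

result = merged_groups(["Sam", "Sophia", "Liz"], lookup)
```

In recent Excel versions, wrapping three separate lookups in TEXTJOIN(", ", TRUE, ...) is one way to get the same merged cell, though the exact formula depends on your sheet layout.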
Shifted to Edge from chrome – 3 things I miss
Hi, I shifted to Edge after using Chrome for 10 years.
1. Edge should allow multiple user profiles on mobile, like Chrome, and multiple Microsoft account logins.
2. Please give users a choice of which extensions to import when importing Chrome data.
3. Please make the new tab page clean; it took me 1 month to make it clean, and now my Edge is faster than Chrome.
Azure AI Services on AKS
Host your AI Language Containers and Web Apps on Azure Kubernetes Cluster: Flask Web App Sentiment Analysis
In this post, we’ll explore how to integrate Azure AI Containers into our applications running on Azure Kubernetes Service (AKS). Azure AI Containers enable you to harness the power of Azure’s AI services directly within your AKS environment, giving you complete control over where your data is processed. By streamlining the deployment process and ensuring consistency, Azure AI Containers simplify the integration of cutting-edge AI capabilities into your applications. Whether you’re developing tools for education, enhancing accessibility, or creating innovative user experiences, this guide will show you how to seamlessly incorporate Azure’s AI Containers into your web apps running on AKS.
Why Containers?
Azure AI services provides several Docker containers that let you use the same APIs that are available in Azure, on-premises. Using these containers gives you the flexibility to bring Azure AI services closer to your data for compliance, security or other operational reasons. Container support is currently available for a subset of Azure AI services.
Azure AI Containers offer:
- Immutable infrastructure: consistent and reliable system parameters for DevOps teams, with flexibility to adapt and avoid configuration drift.
- Data control: choose where data is processed, essential for data residency or security requirements.
- Model update control: flexibility in versioning and updating deployed models.
- Portable architecture: deploy on Azure, on-premises, or at the edge, with Kubernetes support.
- High throughput/low latency: scale for demanding workloads by running Azure AI services close to data and logic.
- Scalability: built on scalable cluster technology like Kubernetes for high availability and adaptable performance.
Source: https://learn.microsoft.com/en-us/azure/ai-services/cognitive-services-container-support
Workshop
Our solution will use the Azure AI Language service with the Text Analytics container for Sentiment Analysis. We will build a Python Flask web app, containerize it with Docker, and push it to Azure Container Registry. An AKS cluster, which we will create, will pull the Flask image from the registry along with the Microsoft-provided Sentiment Analysis image directly from mcr.microsoft.com. We will then make all the required configurations on the cluster: an Ingress Controller with an SSL certificate, presenting a simple web UI where we can write text, submit it for analysis, and get the results. Our web UI will look like this:
Azure Kubernetes Cluster, Azure Container Registry & Azure Text Analytics
These are our main resources, plus a Virtual Network for AKS, which is deployed automatically. Our solution is hosted entirely on AKS, with a Let's Encrypt certificate (created separately) providing HTTPS via an Ingress Controller that publicly serves our Flask UI, which in turn calls the Sentiment Analysis service, also hosted on AKS, via REST. The difference is that Flask is built from a custom Docker image pulled from Azure Container Registry, while the Sentiment Analysis container is a ready-made Microsoft image that we pull directly.
If your Azure subscription does not yet have an AI service, you have to create a Language service (Text Analytics) in the Portal, due to the requirement to accept the Responsible AI terms. For more detail see https://go.microsoft.com/fwlink/?linkid=2164190
My preference, as a best practice, is to create an AKS cluster with the default System node pool and add an additional User node pool to deploy my apps, but it is really a matter of preference at the end of the day. So let's start deploying! From your terminal, log in with az login and set your subscription with az account set --subscription "YourSubName"
## Change the values in < > with your values and remove < >!
## Create the AKS Cluster
az aks create \
  --resource-group <your-resource-group> \
  --name <your-cluster-name> \
  --node-count 1 \
  --node-vm-size standard_a4_v2 \
  --nodepool-name agentpool \
  --generate-ssh-keys \
  --nodepool-labels nodepooltype=system \
  --no-wait \
  --aks-custom-headers AKSSystemNodePool=true \
  --network-plugin azure

## Add a User Node Pool
az aks nodepool add \
  --resource-group <your-resource-group> \
  --cluster-name <your-cluster-name> \
  --name userpool \
  --node-count 1 \
  --node-vm-size standard_d4s_v3 \
  --no-wait

## Create Azure Container Registry
az acr create \
  --resource-group <your-resource-group> \
  --name <your-acr-name> \
  --sku Standard \
  --location northeurope

## Attach ACR to AKS
az aks update -n <your-cluster-name> -g <your-resource-group> --attach-acr <your-acr-name>
The Language service is created from the Portal for the reasons we explained earlier. Search for Language and create a new Language service, leaving the default selections (no Custom Question Answering, no Custom Text Classification) on the F0 (Free) SKU. You may see a VNET menu appear in the Networking tab; just ignore it, as long as you leave the default Public Access enabled it won't create a Virtual Network. The Cloud resource is present for billing and metrics.
A Flask web app has a directory structure where we store index.html in the templates directory and our CSS and images in the static directory. In essence it looks like this:

sentiment-aks/
  flaskwebapp/
    app.py
    requirements.txt
    Dockerfile
    static/
      style.css
      logo.png
    templates/
      index.html
The requirements.txt should have the needed packages:

## requirements.txt
Flask==3.0.0
requests==2.31.0

## index.html
<!DOCTYPE html>
<html>
<head>
  <title>Sentiment Analysis App</title>
  <link rel="stylesheet" type="text/css" href="{{ url_for('static', filename='style.css') }}">
</head>
<body>
  <img src="{{ url_for('static', filename='logo.png') }}" class="icon" alt="App Icon">
  <h2>Sentiment Analysis</h2>
  <form id="textForm">
    <textarea name="text" placeholder="Enter text here..."></textarea>
    <button type="submit">Analyze</button>
  </form>
  <div id="result"></div>
  <script>
    document.getElementById('textForm').onsubmit = async function(e) {
      e.preventDefault();
      let formData = new FormData(this);
      let response = await fetch('/analyze', {
        method: 'POST',
        body: formData
      });
      let resultData = await response.json();
      let results = resultData.results;
      if (results) {
        let displayText = `Document: ${results.document}\nSentiment: ${results.overall_sentiment}\n`;
        displayText += `Confidence - Positive: ${results.confidence_positive}, Neutral: ${results.confidence_neutral}, Negative: ${results.confidence_negative}`;
        document.getElementById('result').innerText = displayText;
      } else {
        document.getElementById('result').innerText = 'No results to display';
      }
    };
  </script>
</body>
</html>

## style.css
body {
font-family: Arial, sans-serif;
background-color: #f0f8ff; /* Light blue background */
margin: 0;
padding: 0;
display: flex;
flex-direction: column;
align-items: center;
justify-content: center;
height: 100vh;
}
h2 {
color: #0277bd; /* Darker blue for headings */
}
.icon {
height: 100px; /* Adjust the size as needed */
margin-top: 20px; /* Add some space above the logo */
}
form {
background-color: white;
padding: 20px;
border-radius: 8px;
width: 300px;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
}
textarea {
width: 100%;
box-sizing: border-box;
height: 100px;
margin-bottom: 10px;
border: 1px solid #0277bd;
border-radius: 4px;
padding: 10px;
}
button {
background-color: #029ae4; /* Blue button */
color: white;
border: none;
padding: 10px 15px;
border-radius: 4px;
cursor: pointer;
}
button:hover {
background-color: #0277bd;
}
#result {
margin-top: 20px;
}
And here is the most interesting file, our app.py. Notice the use of a REST API call directly to the Sentiment Analysis endpoint which we will declare in the YAML file for the Kubernetes deployment.
## app.py
from flask import Flask, render_template, request, jsonify
import requests
import os

app = Flask(__name__)

@app.route('/', methods=['GET'])
def index():
    return render_template('index.html')  # HTML file with input form

@app.route('/analyze', methods=['POST'])
def analyze():
    # Extract text from the form submission
    text = request.form['text']
    if not text:
        return jsonify({'error': 'No text provided'}), 400

    # Fetch the API endpoint from an environment variable
    endpoint = os.environ.get("CONTAINER_API_URL")

    # Ensure required configuration is available
    if not endpoint:
        return jsonify({'error': 'API configuration not set'}), 500

    # Construct the full URL for the sentiment analysis API
    url = f"{endpoint}/text/analytics/v3.1/sentiment"
    headers = {
        'Content-Type': 'application/json'
    }
    body = {
        'documents': [{'id': '1', 'language': 'en', 'text': text}]
    }

    # Make the HTTP POST request to the sentiment analysis API
    response = requests.post(url, json=body, headers=headers)
    if response.status_code != 200:
        return jsonify({'error': 'Failed to analyze sentiment'}), response.status_code

    # Process the API response
    data = response.json()
    results = data['documents'][0]
    detailed_results = {
        'document': text,
        'overall_sentiment': results['sentiment'],
        'confidence_positive': results['confidenceScores']['positive'],
        'confidence_neutral': results['confidenceScores']['neutral'],
        'confidence_negative': results['confidenceScores']['negative']
    }

    # Return the detailed results to the client
    return jsonify({'results': detailed_results})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5001, debug=False)
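To make the response handling above concrete, here is the shape of a Text Analytics v3.1 sentiment payload run through the same parsing logic app.py uses; the scores are made up for illustration:

```python
# A canned response in the Text Analytics v3.1 sentiment shape
# (scores are made up), parsed the same way app.py parses it.
sample = {
    "documents": [{
        "id": "1",
        "sentiment": "positive",
        "confidenceScores": {"positive": 0.93, "neutral": 0.05, "negative": 0.02},
    }],
    "errors": [],
}

doc = sample["documents"][0]
detailed = {
    "overall_sentiment": doc["sentiment"],
    "confidence_positive": doc["confidenceScores"]["positive"],
    "confidence_neutral": doc["confidenceScores"]["neutral"],
    "confidence_negative": doc["confidenceScores"]["negative"],
}
```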
And finally we need a Dockerfile; pay attention to keep it on the same level as your app.py file.
## Dockerfile
# Use an official Python runtime as a parent image
FROM python:3.10-slim

# Set the working directory in the container
WORKDIR /app

# Copy the current directory contents into the container at /app
COPY . /app

# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt

# Make port 5001 available to the world outside this container
EXPOSE 5001

# Define environment variable
ENV CONTAINER_API_URL="http://sentiment-service/"

# Run app.py when the container launches
CMD ["python", "app.py"]
Our web UI is ready to build! We need Docker running on our development environment, and we need to log in to Azure Container Registry:
## Login to ACR
az acr login -n <your-acr-name>
## Build and Tag our image
docker build -t <acr-name>.azurecr.io/flaskweb:latest .
docker push <acr-name>.azurecr.io/flaskweb:latest
If you go to the Portal and open Azure Container Registry > Repositories, you will find our new image ready to be pulled!
Kubernetes Deployments
Let's start deploying our AKS services! As we already know, we can pull the Sentiment Analysis container directly from Microsoft, and that is what we are going to do next. First we need to log in to our AKS cluster, so from the Azure Portal head over to your AKS cluster and click the Connect link in the menu. Azure will provide the commands to connect from our terminal:
Select Azure CLI and just copy-paste the commands to your Terminal.
Now we can run kubectl commands and manage our Cluster and AKS Services.
We need a YAML file for each service we are going to build, including the certificate at the end. For now, let's create the Sentiment Analysis service, as a container, with the following file. Pay attention: you need the Language service key and endpoint from the Text Analytics resource we created earlier, and in the nodeSelector block we must enter the name of the User node pool we created.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sentiment-deployment
spec:
  replicas: 1
  selector:
    matchLabels:
      app: sentiment
  template:
    metadata:
      labels:
        app: sentiment
    spec:
      containers:
      - name: sentiment
        image: mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment:latest
        ports:
        - containerPort: 5000
        resources:
          limits:
            memory: "8Gi"
            cpu: "1"
          requests:
            memory: "8Gi"
            cpu: "1"
        env:
        - name: Eula
          value: "accept"
        - name: Billing
          value: "https://<your-Language-Service>.cognitiveservices.azure.com/"
        - name: ApiKey
          value: "xxxxxxxxxxxxxxxxxxxx"
      nodeSelector:
        agentpool: userpool
---
apiVersion: v1
kind: Service
metadata:
  name: sentiment-service
spec:
  selector:
    app: sentiment
  ports:
  - protocol: TCP
    port: 5000
    targetPort: 5000
  type: ClusterIP
Save the file and run from your Terminal:
kubectl apply -f sentiment-deployment.yaml
In a few seconds you can observe the service running in the AKS Services and ingresses menu.
Let's bring in our Flask container now. In the same manner, create a new YAML:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: flask-service
spec:
  replicas: 1
  selector:
    matchLabels:
      app: flask
  template:
    metadata:
      labels:
        app: flask
    spec:
      containers:
      - name: flask
        image: <your-ACR-name>.azurecr.io/flaskweb:latest
        ports:
        - containerPort: 5001
        env:
        - name: CONTAINER_API_URL
          value: "http://sentiment-service:5000"
        resources:
          requests:
            cpu: "500m"
            memory: "256Mi"
          limits:
            cpu: "1"
            memory: "512Mi"
      nodeSelector:
        agentpool: userpool
---
apiVersion: v1
kind: Service
metadata:
  name: flask-lb
spec:
  type: LoadBalancer
  selector:
    app: flask
  ports:
  - protocol: TCP
    port: 80
    targetPort: 5001
kubectl apply -f flask-service.yaml
Observe the CONTAINER_API_URL environment value: it uses the Service name of our Sentiment Analysis container directly, as AKS has its own DNS resolver for easy communication between services. In fact, if we hit the Service's public IP we will have plain HTTP access to the web UI.
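The same resolution shows up in how the Flask side assembles its request URL from the Service name; this is just a sketch of the string handling, nothing cluster-specific:

```python
import os

# In-cluster, CONTAINER_API_URL is simply the Service DNS name; the
# Flask app appends the v3.1 sentiment path to it. rstrip guards
# against a trailing slash such as the Dockerfile default carries.
os.environ["CONTAINER_API_URL"] = "http://sentiment-service:5000"
endpoint = os.environ["CONTAINER_API_URL"].rstrip("/")
url = f"{endpoint}/text/analytics/v3.1/sentiment"
```

Note that app.py as written does not strip a trailing slash, so the Dockerfile default of http://sentiment-service/ would yield a double slash in the path; the Kubernetes env value without a trailing slash sidesteps that.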
But let's see how we can import our certificate. We won't describe how to obtain one; all we need are the PEM files, meaning privkey.pem and cert.pem. If we have a PFX, we can export them with OpenSSL. Once we have these files in place, we will create a secret in AKS that holds our certificate key and file. We just need to run this command from within the directory of our PEM files:
kubectl create secret tls flask-app-tls --key privkey.pem --cert cert.pem --namespace default
Once we create our secret, we will deploy a Kubernetes Ingress which will handle HTTPS and point to the Flask service. Remember to add an A record at your DNS registrar with the hostname you are going to use, once you see the public IP address:
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: flask-app-ingress
  namespace: default
spec:
  tls:
  - hosts:
    - your.domain.host
    secretName: flask-app-tls
  rules:
  - host: your.domain.host
    http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: flask-lb
            port:
              number: 80
kubectl apply -f flask-app-ingress.yaml
From AKS > Services and ingresses > Ingresses you will see the assigned public IP. Add it to your DNS, and once the name servers are updated you can hit your hostname over HTTPS!
Final Thoughts
As we’ve explored, the combination of Azure AI Containers and AKS offers a powerful and flexible solution for deploying AI-driven applications in cloud-native environments. By leveraging these technologies, you gain granular control over your data and model deployments, while maintaining the scalability and portability essential for modern applications. Remember, this is just the starting point. As you delve deeper, consider the specific requirements of your project and explore the vast possibilities that Azure AI Containers unlock. Embrace the power of AI within your AKS deployments, and you’ll be well on your way to building innovative, intelligent solutions that redefine what’s possible in the cloud.
Architecture
Host your AI Language Containers and Web Apps on Azure Kubernetes Cluster: Flask Web App Sentiment Analysis In this post, we’ll explore how to integrate Azure AI Containers into our applications running on Azure Kubernetes Service (AKS). Azure AI Containers enable you to harness the power of Azure’s AI services directly within your AKS environment, giving you complete control over where your data is processed. By streamlining the deployment process and ensuring consistency, Azure AI Containers simplify the integration of cutting-edge AI capabilities into your applications. Whether you’re developing tools for education, enhancing accessibility, or creating innovative user experiences, this guide will show you how to seamlessly incorporate Azure’s AI Containers into your web apps running on AKS.Why Containers ?Azure AI services provides several Docker containers that let you use the same APIs that are available in Azure, on-premises. Using these containers gives you the flexibility to bring Azure AI services closer to your data for compliance, security or other operational reasons. 
Container support is currently available for a subset of Azure AI services.Azure AI Containers offer:Immutable infrastructure: Consistent and reliable system parameters for DevOps teams, with flexibility to adapt and avoid configuration drift.Data control: Choose where data is processed, essential for data residency or security requirements.Model update control: Flexibility in versioning and updating deployed models.Portable architecture: Deploy on Azure, on-premises, or at the edge, with Kubernetes support.High throughput/low latency: Scale for demanding workloads by running Azure AI services close to data and logic.Scalability: Built on scalable cluster technology like Kubernetes for high availability and adaptable performance.Source: https://learn.microsoft.com/en-us/azure/ai-services/cognitive-services-container-supportWorkshopOur Solution will utilize the Azure Language AI Service with the Text Analytics container for Sentiment Analysis. We will build a Python Flask Web App, containerize it with Docker and push it to Azure Container Registry. An AKS Cluster which we will create, will pull the Flask Image along with the Microsoft provided Sentiment Analysis Image directly from mcr.microsoft.com and we will make all required configurations on our AKS Cluster to have an Ingress Controller with SSL Certificate presenting a simple Web UI to write our Text, submit it for analysis and get the results. Our Web UI will look like this: Azure Kubernetes Cluster, Azure Container Registry & Azure Text AnalyticsThese are our main resources and a Virtual Network of course for the AKS which is deployed automatically. Our Solution is hosted entirely on AKS with a Let’s Encrypt Certificate we will create separately offering secure HTTP with an Ingress Controller serving publicly our Flask UI which is calling via REST the Sentiment Analysis service, also hosted on AKS. 
The difference is that Flask is build with a custom Docker Image pulled from Azure Container Registry, while the Sentiment Analysis is a Microsoft ready Image which we pull directly.In case your Azure Subscription does not have an AI Service you have to create a Language Service of Text Analytics using the Portal due to the requirement to accept the Responsible AI Terms. For more detail go to https://go.microsoft.com/fwlink/?linkid=2164190 .My preference as a best practice, is to create an AKS Cluster with the default System Node Pool and add an additional User Node Pool to deploy my Apps, but it is really a matter of preference at the end of the day. So let’s start deploying! Start from your terminal by logging in with az login and set your Subscription with az account set –subscription ‘YourSubName” ## Change the values in < > with your values and remove < >!
## Create the AKS Cluster
az aks create
–resource-group <your-resource-group>
–name <your-cluster-name>
–node-count 1
–node-vm-size standard_a4_v2
–nodepool-name agentpool
–generate-ssh-keys
–nodepool-labels nodepooltype=system
–no-wait
–aks-custom-headers AKSSystemNodePool=true
–network-plugin azure
## Add a User Node Pool
az aks nodepool add
–resource-group <your-resource-group>
–cluster-name <your-cluster-name>
–name userpool
–node-count 1
–node-vm-size standard_d4s_v3
–no-wait
## Create Azure Container Registry
az acr create
–resource-group <your-resource-group>
–name <your-acr-name>
–sku Standard
–location northeurope
## Attach ACR to AKS
az aks update -n <your-cluster-name> -g <your-resource-group> –attach-acr <your-acr-name> The Language Service is created from the Portal for the reasons we explained earlier. Search for Language and create a new Language service leaving the default selections ( No Custom QnA, no Custom Text Classification) on the F0 (Free) SKU. You may see a VNET menu appear in the Networking Tab, just ignore it, as long as you leave the default Public Access enabled it won’t create a Virtual Network. The presence of the Cloud Resource is for Billing and Metrics. A Flask Web App has a directory structure where we store index.html in the Templates directory and our CSS and images in the Static directory. So in essence it looks like this: -sentiment-aks
–flaskwebapp
app.py
requirements.txt
Dockerfile
—static
1.style.css
2.logo.png
—templates
1.index.html The requirements.txt should have the needed packages : ## requirements.txt
Flask==3.0.0
requests==2.31.0## index.html
<!DOCTYPE html>
<html>
<head>
<title>Sentiment Analysis App</title>
<link rel=”stylesheet” type=”text/css” href=”{{ url_for(‘static’, filename=’style.css’) }}”>
</head>
<body>
<img src=”{{ url_for(‘static’, filename=’logo.png’) }}” class=”icon” alt=”App Icon”>
<h2>Sentiment Analysis</h2>
<form id=”textForm”>
<textarea name=”text” placeholder=”Enter text here…”></textarea>
<button type=”submit”>Analyze</button>
</form>
<div id=”result”></div>
<script>
document.getElementById(‘textForm’).onsubmit = async function(e) {
e.preventDefault();
let formData = new FormData(this);
let response = await fetch(‘/analyze’, {
method: ‘POST’,
body: formData
});
let resultData = await response.json();
let results = resultData.results;
if (results) {
let displayText = `Document: ${results.document}nSentiment: ${results.overall_sentiment}n`;
displayText += `Confidence – Positive: ${results.confidence_positive}, Neutral: ${results.confidence_neutral}, Negative: ${results.confidence_negative}`;
document.getElementById(‘result’).innerText = displayText;
} else {
document.getElementById(‘result’).innerText = ‘No results to display’;
}
};
</script>
</body>
</html>## style.css
body {
font-family: Arial, sans-serif;
background-color: #f0f8ff; /* Light blue background */
margin: 0;
padding: 0;
display: flex;
flex-direction: column;
align-items: center;
justify-content: center;
height: 100vh;
}
h2 {
color: #0277bd; /* Darker blue for headings */
}
.icon {
height: 100px; /* Adjust the size as needed */
margin-top: 20px; /* Add some space above the logo */
}
form {
background-color: white;
padding: 20px;
border-radius: 8px;
width: 300px;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
}
textarea {
width: 100%;
box-sizing: border-box;
height: 100px;
margin-bottom: 10px;
border: 1px solid #0277bd;
border-radius: 4px;
padding: 10px;
}
button {
background-color: #029ae4; /* Blue button */
color: white;
border: none;
padding: 10px 15px;
border-radius: 4px;
cursor: pointer;
}
button:hover {
background-color: #0277bd;
}
#result {
margin-top: 20px;
} And here is the most interesting file, our app.py. Notice the use of a REST API call directly to the Sentiment Analysis endpoint which we will declare in the YAML file for the Kubernetes deployment. ## app.py
from flask import Flask, render_template, request, jsonify
import requests
import os

app = Flask(__name__)

@app.route('/', methods=['GET'])
def index():
    return render_template('index.html')  # HTML file with input form

@app.route('/analyze', methods=['POST'])
def analyze():
    # Extract text from the form submission
    text = request.form['text']
    if not text:
        return jsonify({'error': 'No text provided'}), 400
    # Fetch the API endpoint from environment variables
    endpoint = os.environ.get("CONTAINER_API_URL")
    # Ensure the required configuration is available
    if not endpoint:
        return jsonify({'error': 'API configuration not set'}), 500
    # Construct the full URL for the sentiment analysis API
    url = f"{endpoint}/text/analytics/v3.1/sentiment"
    headers = {
        'Content-Type': 'application/json'
    }
    body = {
        'documents': [{'id': '1', 'language': 'en', 'text': text}]
    }
    # Make the HTTP POST request to the sentiment analysis API
    response = requests.post(url, json=body, headers=headers)
    if response.status_code != 200:
        return jsonify({'error': 'Failed to analyze sentiment'}), response.status_code
    # Process the API response
    data = response.json()
    results = data['documents'][0]
    detailed_results = {
        'document': text,
        'overall_sentiment': results['sentiment'],
        'confidence_positive': results['confidenceScores']['positive'],
        'confidence_neutral': results['confidenceScores']['neutral'],
        'confidence_negative': results['confidenceScores']['negative']
    }
    # Return the detailed results to the client
    return jsonify({'results': detailed_results})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5001, debug=False)
And finally we need a Dockerfile; pay attention to keep it at the same level as your app.py file.
## Dockerfile
# Use an official Python runtime as a parent image
FROM python:3.10-slim
# Set the working directory in the container
WORKDIR /app
# Copy the current directory contents into the container at /app
COPY . /app
# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
# Make port 5001 available to the world outside this container
EXPOSE 5001
# Define environment variable
ENV CONTAINER_API_URL="http://sentiment-service/"
# Run app.py when the container launches
CMD ["python", "app.py"]
Our Web UI is ready to build! We need Docker running in our development environment, and we need to log in to Azure Container Registry:
## Login to ACR
az acr login -n <your-acr-name>
## Build and Tag our image
docker build -t <acr-name>.azurecr.io/flaskweb:latest .
docker push <acr-name>.azurecr.io/flaskweb:latest
You can go to the Portal and, under Azure Container Registry > Repositories, you will find our new image ready to be pulled!
## Kubernetes Deployments
Let's start deploying our AKS services! As we already know, we can pull the Sentiment Analysis container directly from Microsoft, and that's what we are going to do with the following tasks. First, we need to log in to our AKS cluster, so from the Azure Portal head over to your AKS cluster and click the Connect link in the menu. Azure will provide the commands to connect from our terminal: select Azure CLI and copy-paste the commands into your terminal. Now we can run kubectl commands and manage our cluster and AKS services.
We need a YAML file for each service we are going to build, including the Certificate at the end. For now, let's create the Sentiment Analysis service, as a container, with the following file. Pay attention: you need to get the Language Service key and endpoint from the Text Analytics resource we created earlier, and in the nodeSelector block we must enter the name of the User Node Pool we created.
## sentiment-deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sentiment-deployment
spec:
  replicas: 1
  selector:
    matchLabels:
      app: sentiment
  template:
    metadata:
      labels:
        app: sentiment
    spec:
      containers:
      - name: sentiment
        image: mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment:latest
        ports:
        - containerPort: 5000
        resources:
          limits:
            memory: "8Gi"
            cpu: "1"
          requests:
            memory: "8Gi"
            cpu: "1"
        env:
        - name: Eula
          value: "accept"
        - name: Billing
          value: "https://<your-Language-Service>.cognitiveservices.azure.com/"
        - name: ApiKey
          value: "xxxxxxxxxxxxxxxxxxxx"
      nodeSelector:
        agentpool: userpool
---
apiVersion: v1
kind: Service
metadata:
  name: sentiment-service
spec:
  selector:
    app: sentiment
  ports:
  - protocol: TCP
    port: 5000
    targetPort: 5000
  type: ClusterIP
Save the file and run from your terminal:
kubectl apply -f sentiment-deployment.yaml
In a few seconds you can observe the service running from the AKS Services and Ingresses menu. Let's continue and bring in our Flask container now. In the same manner, create a new YAML file:
## flask-service.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: flask-service
spec:
  replicas: 1
  selector:
    matchLabels:
      app: flask
  template:
    metadata:
      labels:
        app: flask
    spec:
      containers:
      - name: flask
        image: <your-ACR-name>.azurecr.io/flaskweb:latest
        ports:
        - containerPort: 5001
        env:
        - name: CONTAINER_API_URL
          value: "http://sentiment-service:5000"
        resources:
          requests:
            cpu: "500m"
            memory: "256Mi"
          limits:
            cpu: "1"
            memory: "512Mi"
      nodeSelector:
        agentpool: userpool
---
apiVersion: v1
kind: Service
metadata:
  name: flask-lb
spec:
  type: LoadBalancer
  selector:
    app: flask
  ports:
  - protocol: TCP
    port: 80
    targetPort: 5001
kubectl apply -f flask-service.yaml
Observe the CONTAINER_API_URL environment value: it uses the Service name of our Sentiment Analysis container directly, as AKS has its own DNS resolver for easy communication between services. In fact, if we hit the Service's public IP we will have HTTP access to the Web UI.
But let's see how we can import our Certificate. We won't describe how to get a certificate; all we need is the PEM files, meaning the private key and the certificate. If we have a PFX, we can export them with OpenSSL. Once we have these files in place, we will create a secret in AKS that will hold our certificate key and file. We just need to run this command from within the directory of our PEM files:
kubectl create secret tls flask-app-tls --key privkey.pem --cert cert.pem --namespace default
Once we create our secret, we will deploy a Kubernetes Ingress controller which will manage HTTPS and point to the Flask service. Remember to add an A record at your DNS registrar with the hostname you are going to use and the public IP, once you see the IP address:
## flask-app-ingress.yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: flask-app-ingress
  namespace: default
spec:
  tls:
  - hosts:
    - your.domain.host
    secretName: flask-app-tls
  rules:
  - host: your.domain.host
    http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: flask-lb
            port:
              number: 80
kubectl apply -f flask-app-ingress.yaml
From AKS > Services and Ingresses > Ingresses you will see the assigned public IP. Add it to your DNS, and once the name servers are updated you can hit your hostname using HTTPS!
## Final Thoughts
As we've explored, the combination of Azure AI Containers and AKS offers a powerful and flexible solution for deploying AI-driven applications in cloud-native environments. By leveraging these technologies, you gain granular control over your data and model deployments while maintaining the scalability and portability essential for modern applications. Remember, this is just the starting point. As you delve deeper, consider the specific requirements of your project and explore the vast possibilities that Azure AI Containers unlock. Embrace the power of AI within your AKS deployments, and you'll be well on your way to building innovative, intelligent solutions that redefine what's possible in the cloud.
My Teams invites go straight to the recipients spam folder
Hello all,
I have been using Teams at my previous job, and now that I tried to use it with my personal Microsoft 365 account, I seem to be running into an issue.
I use the browser version of Teams to schedule meetings, but they go straight to my recipients’ spam folder. Even if they are able to recover the invite when they accept, they get this message:
Your message wasn’t delivered to email address removed for privacy reasons because the address couldn’t be found or is unable to receive mail.
Apparently the invites are being sent from this outlook email (no idea where that came from) instead of the email I have in my Microsoft 365 and Teams account.
Please help!
Windows Time
We have an issue with the Windows clock on the taskbar falling behind on Windows 11 workstations (4 so far) deployed out of the box. The system time itself seems to be fine, and rebooting a machine brings the clock back to normal.
We have a Windows Active Directory in house and began deploying Windows 11 Pro machines as a desktop refresh initiative. Each machine on the domain synchronizes to Active Directory (except the 3 DCs, which synchronize to ntp.org).
Can I change just the sheet, but keep the cells in multiple references
I have a workbook that tracks data in monthly tabs. I also have a sheet that pulls multiple datapoints from each worksheet for an easily printable report.
Since each of the monthly sheets is identical, is there a way that, as I go into a new month, I can change just which sheet my report pulls from, but keeps the cell references?
For example if I have data pulling from =May!A1 , =May!C3, =May!F45, is there an easy way to change all of the Mays to Junes?
Thanks in advance.
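Yes. The quickest way is Find & Replace restricted to formulas; a more permanent option is to build the reference with INDIRECT so the sheet name comes from one cell. Both are standard Excel features; the helper cell B1 below is just an illustrative choice:

```
1) Select the report sheet, press Ctrl+H, click Options, set "Look in: Formulas",
   then replace  May!  with  June!

2) Or put the month name in a helper cell (say B1 contains June) and write:
   =INDIRECT("'" & $B$1 & "'!A1")
   =INDIRECT("'" & $B$1 & "'!C3")
   =INDIRECT("'" & $B$1 & "'!F45")
```

Note that INDIRECT is volatile (it recalculates on every workbook change), so on very large workbooks the Find & Replace approach is faster.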
equivalent of BytesAvailable in serialport?
Hi, I am trying to migrate from the `serial` to the `serialport` interface, and I am having a hard time accessing BytesAvailable in `serialport`. While using `serial`, I could access the input buffer and get all bytes using `serial.BytesAvailable`, which by default is 512, so with `fread` I get a 512x1 matrix. However, when using the `serialport` interface, NumBytesAvailable does not give me the same info.
Could anyone help me please? Thanks!
serial, serialport, bytesavailable MATLAB Answers — New Questions
Plotting a graph of the distance between the centroid of an irregular shaped particle and its contour (radius) versus angle (equal intervals of theta from zero to 360 degrees)
I am trying to plot a graph between the radius of an irregular shaped particle (from the centroid to a point on the contour) at equal intervals of theta around the particle (from zero to 360 degrees).
My code (I actually got most of it from here) is not giving me what I want. I would like the graph to look like that of a typical signal but with angles (0 to 360 degrees) on the x-axis and radius on the y-axis.
I would really appreciate if someone could help me get this graph so that I can proceed with further operations using the fast fourier transform algorithm
Thanks!
%Image Threshholding
clc; clearvars; clear all;
img = imread('1_50.jpg'); %Read image
BW = im2bw(img,0.45); %binarize image with a threshold value of 0.45
img2 = bwareaopen(BW, 1000); %Remove small objects
img2 = imfill(BW, 'holes'); %fill holes
centriod_value = regionprops(img2, 'Centroid'); %Finds centroid of image
centroid = centriod_value.Centroid; %Returns centroid in [row, column]
%Display the binary image with the centroid locations superimposed
subplot(1,3,1)
imshow(img2);
hold on
plot(centroid(:,1),centroid(:,2),'b.')
hold on
%tracing the boundary of image
p_boundary = bwboundaries(img2); % in row, column order, not x,y
number_of_boundaries = size(p_boundary,1); %one boundary
for k = 1 : length(p_boundary)
    thisBoundary = p_boundary{k};
    y = thisBoundary(:,2);
    x = thisBoundary(:,1);
    subplot(1,3,2)
    plot(x, y, 'g', 'LineWidth', 2);
    hold on
    plot(centroid(:,1),centroid(:,2),'b.')
    hold on
end
hold on
% Calculate the angles in degrees
deltaY = thisBoundary(:,1) - centroid(1);
deltaX = thisBoundary(:,2) - centroid(2);
angles = atand(deltaY ./ deltaX);
% Calculate the distances.
distances = sqrt((thisBoundary(:,1) - centroid(1)).^2 + (thisBoundary(:,2) - centroid(2)).^2);
% Plot distance vs. angle.
angle = 0:360/128:360;
for i = 1:length(p_boundary)
    angles(i)+360/128;
    subplot(1,3,3)
    plot(angles,distances)
end
contour MATLAB Answers — New Questions
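A likely culprit in the code above is atand, which only returns angles in (-90, 90] and therefore folds opposite quadrants onto each other; a four-quadrant arctangent (atan2d in MATLAB) gives the full 0 to 360 degree sweep, and sorting the points by angle turns the radius profile into the single-valued signal you want. Here is a minimal conceptual sketch of that idea in Python, using a hypothetical four-point boundary around a centroid at the origin:

```python
import math

# Hypothetical boundary points (x, y) around a centroid at the origin.
boundary = [(1.0, 0.0), (0.0, 2.0), (-1.5, 0.0), (0.0, -0.5)]
cx, cy = 0.0, 0.0

profile = []
for x, y in boundary:
    # atan2 covers all four quadrants; plain atan (MATLAB's atand)
    # folds opposite quadrants onto each other.
    theta = math.degrees(math.atan2(y - cy, x - cx)) % 360.0
    r = math.hypot(x - cx, y - cy)
    profile.append((theta, r))

# Sort by angle so radius-vs-angle is a single-valued signal from 0 to 360.
profile.sort()
for theta, r in profile:
    print(f"{theta:6.1f} deg -> r = {r}")
```

In MATLAB the same fix would be angles = mod(atan2d(deltaY, deltaX), 360), then sorting angles and distances together (e.g. with sortrows) before plotting, which also gives a clean input for the FFT step.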
kfoldLoss() values have inconsistent precision between different iterations of a loop
I am training an RBF SVM with leave-one-out cross-validation using 94 observations and I am surprised to find that the precision of the result of kfoldLoss() isn't consistent when comparing models that have the same loss (or accuracy). For example, an accuracy of 76/94 does not always produce exactly the same value, with a variation of around 1e-15. The error is completely negligible except when comparing values or searching for the maximum, etc. The only thing that should be different is which 76 of the 94 folds are correct, but this should have no effect on the value or precision of the result.
I'm using a parfor loop to test many combinations of features (e.g. 260K combinations) and measuring the accuracy using accuracy = 1 - kfoldLoss(Mdl). I then use max() to find the position of the result with the highest accuracy; however, sometimes this does not work because there can be tiny variations in the precision. How is this even possible?
With 94 observations, there are only 94 possible accuracy levels. In my latest test, the peak accuracy is 76 out of 94, which is 0.808510638297872…etc.
Eight of the models tested have this 76 / 94 accuracy but it isn’t stored with the same precision in the same double-precision vector. Precision errors are inevitable, but I would have expected MATLAB to always return the same result for 76 / 94.
I'm using a parfor loop. Could this have something to do with it? Is it possible for one thread to somehow produce a different precision from others? It's an Intel i7-7700 running MATLAB 2024a on Windows 10.
% Abbreviated code. "combinations" is a cell array with each cell
% containing a vector of the features to select from the training data
accuracies = [];
parfor i = 1:length(combinations)
    td_sel = training_data(:, cell2mat(combinations(i)));
    Mdl = fitcsvm(td_sel, response_name, 'KernelFunction', 'RBF', 'KFold', 94, 'CacheSize', 'maximal');
    accuracies(i) = 1 - kfoldLoss(Mdl);
end
[max_val, max_pos] = max(accuracies)
max_val =
0.808510638297872
max_pos =
52793
% Find all values that are very close to this value. But I don’t understand
% how the precision (in the storage of the value) can be different
a = find(abs(accuracies - max_val) < 1e-10)
accuracies(a)
a =
6829
6891
6989
13699
21936
22778
45270
52793
ans =
0.808510638297872
0.808510638297872
0.808510638297872
0.808510638297872
0.808510638297872
0.808510638297872
0.808510638297872
0.808510638297872
accuracies(a) - max_val
ans =
1.0e-15 *
-0.111022302462516
-0.111022302462516
-0.111022302462516
-0.111022302462516
-0.111022302462516
-0.111022302462516
-0.111022302462516
0
accuracies(a) - 76/94
ans =
1.0e-15 *
-0.111022302462516
-0.111022302462516
-0.111022302462516
-0.111022302462516
-0.111022302462516
-0.111022302462516
-0.111022302462516
0
Thanks.
kfoldloss, precision, parfor MATLAB Answers — New Questions
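Yes, this is expected: floating-point addition is not associative, so the same per-fold losses accumulated in a different order (which can plausibly happen inside kfoldLoss depending on the model; the internal summation order is an assumption here, not documented behavior) can differ in the last bit even when the mathematical result is exactly 76/94. A small Python demonstration of the effect:

```python
# Floating-point addition is not associative: the same numbers summed in a
# different grouping can differ in the last ulp, even though both results
# are "the same" value mathematically.
vals = [0.1, 0.2, 0.3]

left = (vals[0] + vals[1]) + vals[2]   # accumulate left-to-right
right = vals[0] + (vals[1] + vals[2])  # different accumulation order

print(left == right)       # False
print(abs(left - right))   # on the order of 1e-16

# A robust comparison quantizes first, so both groupings agree:
print(round(left, 12) == round(right, 12))  # True
```

The robust fix for your search is to compare integer counts of correct folds (e.g. round(accuracies * 94)) or to round the accuracies to, say, 12 decimal places before calling max or find, rather than comparing raw doubles for equality.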
Text Element – Broken aria-describedby Error element
When creating a form with a text element, by default it includes a broken aria-describedby reference. Here is a sample test form, for example, with just a text input. That text input includes the following code:
<input aria-label="Single line text" maxlength="4000" placeholder="Enter your answer" aria-labelledby="QuestionId_rf7ddf945f19f48f7a19e9be4ce25a328 QuestionInfo_rf7ddf945f19f48f7a19e9be4ce25a328" aria-describedby="rf7ddf945f19f48f7a19e9be4ce25a328_validationError" class="-as-61" spellcheck="true" data-automation-id="textInput">
Note the problematic aria-describedby="rf7ddf945f19f48f7a19e9be4ce25a328_validationError" attribute. In this case, I don't think that attribute is needed, and it could just be removed to resolve the issue.
Looking for MSOLAP OLEDB Driver lower than 16.0.134.22 (64bit)
Hello,
I currently have issues using the latest driver of MSOLAP OLEDB in Visual Studio SSIS (2022) for connecting and querying AAS instances. The error message is:
TITLE: Connection Manager
------------------------------
Test connection failed because of an error in initializing provider. Internal error: An unexpected error occurred (file ‘pfadalauthinfo.cpp’, line 1031, function ‘PFAdalAuthInfoConfigurationsWrapper::GetInstance’).
Does anyone know of a link to a version higher than 16.0.20.201 but lower than 16.0.134.22?
Thank you!