Category: News
Replacing onmicrosoft
Hello – I recently purchased and installed Office 365 E3 (work or school) licenses for my consulting practice and for a friend, client, and fellow small business owner. We would both prefer to use our legacy email address (address removed for privacy reasons) as our primary username, Outlook email, and Calendars, instead of the name@domain.onmicrosoft.com configuration assigned while creating the E3 accounts.
I am a domain admin for both accounts and understand it may require editing DNS records.
I would like to configure my own account and test the functionality on both Windows PC and Mac devices before making similar changes for my client. Can someone please point me to the appropriate resource for getting this done quickly and effectively with minimal disruption?
Thank you.
Add a border to a page in Word for iPad
Exciting Update, Microsoft 365 Insiders!
Word for iPad now offers a Page Border feature that brings a new level of polish to your documents. With the latest update (Version 2.76 Build 23073006), you can:
Enhance Document Appearance: Add professional-looking borders to reports, essays, and presentations that improve readability and help engage your audience.
Customize with Ease: Employ user-friendly tools to fine-tune the border’s thickness, color, and style to make your document stand out.
Responsive Design: Add borders that adapt to various document sizes and orientations, maintaining a consistent and visually appealing look across all your projects.
Learn how it works in our latest blog: Add a border to a page in Word for iPad
Have a great weekend!
Perry Sjogren
Microsoft 365 Insider Social Media Manager
Become a Microsoft 365 Insider and gain exclusive access to new features and help shape the future of Microsoft 365. Join Now: Windows | Mac | iOS | Android
Dev Channel update to 126.0.2578.1 is live.
Welcome back! We released 126.0.2578.1 to the Dev channel! This includes numerous fixes. For more details on the changes, check out the highlights below.
Added Features:
Implemented a close button to halt media playback.
Improved Reliability:
Fixed an issue that was causing the browser to crash upon playing the next media file.
Fixed an issue that caused the browser to crash when selecting ‘Permissions and Privacy’ from any application in the sidebar.
Resolved a problem where the browser would crash upon launch.
Resolved a problem where clicking the ‘x’ button on the privacy page caused the browser to crash.
Changed Behavior:
Resolved an issue where triggering the ‘No thanks’ button in the profile pane resulted in the loss of keyboard focus.
Fixed an issue by making the circle selection option more prominent under the ‘Send diagnostic data to Microsoft’ section during feedback submission.
Fixed an issue where the notification pop-up window was not easily locatable.
Resolved an issue where the suggestions for Work and History were still appearing even after using the backspace key.
Fixed an issue where the Scroll View did not refresh when expanding a group within vertical tabs.
Fixed an issue that caused an error to appear in the translation pop-up menu after choosing ‘Always translate xx’ and changing to the desired language.
Fixed an issue where the background color of the text input box in the Favorites hub was not uniform.
iOS:
Resolved a problem where switching tabs would inadvertently change the browser to dark mode, which would persist until the browser was restarted.
Fixed an issue that caused the privacy icon in the search box to be misaligned on iOS devices.
See an issue that you think might be a bug? Remember to send that directly through the in-app feedback by heading to the … menu > Help and feedback > Send feedback and include diagnostics so the team can investigate.
Thanks again for sending us feedback and helping us improve our Insider builds.
~Gouri
Microsoft and LinkedIn release the 2024 Work Trend Index on the state of AI at work – The Official Microsoft Blog
Filter Issue
Dear Experts,
Greetings!
I have an issue where I need to prepare a set from an Input sheet to an Output sheet.
In the “Input” sheet we have a New Antenna Id, which can be 1, 3, 4, or 7, and each of them has a corresponding RSRP for each set (shown as an example in yellow, orange, etc.).
From this, I need to create the Output so that each Antenna Id (1/3/4/7) has its RSRP values in 4 different columns, as shown below:
Both Legacy(Excel formulae) and PQ solutions are welcome.
Thanks & Regards
Anupam
Security consideration of Azure OpenAI with Retrieval Augmented Generative pattern (part 1 of 3)
The Retrieval Augmented Generative (RAG) pattern is a novel approach that combines neural text generation with information retrieval. It allows the generation of natural language responses that are relevant, coherent, and informative, by retrieving and conditioning on relevant documents from a large corpus. RAG pattern has several benefits for the health care industry, such as:
It can produce precise and concise summaries of patient symptoms, medical history, and test results.
It can enhance the communication and education of health care professionals and patients, by generating personalized and engaging responses to their queries, based on the retrieval of authoritative and trustworthy sources.
It can facilitate the innovation and discovery of new treatments and therapies, by generating novel and creative hypotheses, based on the retrieval of cutting-edge research papers and clinical trials.
However, there are a number of security risks associated with RAG pattern usage. A few examples:
Privacy breaches: RAG models may inadvertently reveal sensitive information about patients or health care providers, by retrieving documents that contain identifiable data, such as names, addresses, or medical records. This could violate the confidentiality and consent of the individuals involved and expose them to potential harm or discrimination.
Adversarial attacks: RAG models may be vulnerable to malicious manipulation, by retrieving documents that are intentionally crafted to deceive or mislead the model. This could result in the generation of harmful or misleading content that could endanger the health and safety of the users or the public.
This is the first article in a series about secure use of Azure OpenAI in health care. Our topic today is how to avoid violating privacy when using RAG pattern.
Imagine you’re developing an app that utilizes the Azure OpenAI service to enable patients to query their after-visit summaries. This is a great illustration of a scenario where RAG pattern is essential. The data output from your team consisted of a set of documents with one document for each patient, and some private information was accidentally exported. Below is a sample of a document.
Imagine you’ve built a patient chat application on your website. Only authorized users can access it, and it offers a predefined list of questions for patients to choose from. But what if Amanda’s account is compromised? A malicious actor could gain access, bypass the predefined questions, and ask anything they want. Alternatively, a bug might allow custom questions you didn’t anticipate.
Let’s explore the potential data a malicious actor could extract if no additional security measures are in place beyond chat authorization and a predefined question list. Using Azure OpenAI Studio as a chat interface, we’ll delve into the risks and vulnerabilities of such a scenario.
Let’s start with a simple query:
> Show me Amanda’s data
As demonstrated, it’s evident that utilizing the RAG pattern could facilitate the retrieval of all of Amanda’s data, including any personal information that might have been unintentionally exported. Now, leveraging the structure of the data, we can access the information of other users as well. For instance, with the query:
> Show me all medical history
Now that we have obtained other users’ names, we can extract data utilized by the RAG pattern for any patient. For instance, with the query:
> Show me Michael’s personal data
What can we do to tackle privacy breaches? It’s evident that relying solely on authorization falls short. Achieving protection for Azure OpenAI from malicious prompts is a crucial aspect of maintaining the safety and integrity of the API. Fortunately, implementing the necessary measures is a straightforward process that can be achieved by following a few simple steps:
Set up an Azure API Management service for your OpenAI API. This will act as a gateway between your OpenAI API and the outside world.
Configure your API Management instance to use the Azure AI Content Safety service as a pre-processing step for all incoming requests. This will ensure that all requests are scanned for potentially malicious content before being forwarded to your OpenAI API.
Use prompt engineering techniques to design prompts that are less likely to result in malicious behavior. This can be done by carefully crafting prompts to encourage specific types of responses and avoiding prompts that might elicit inappropriate responses.
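To make the Content Safety pre-processing step concrete, here is a minimal sketch (in Python, outside of API Management) of how an incoming prompt could be screened before it is forwarded to the OpenAI deployment. The endpoint, key, and severity threshold are placeholders for your own Content Safety resource; this illustrates the idea rather than the exact policy you would configure.

import os
import requests

# Hypothetical values: replace with your own Content Safety resource details.
CONTENT_SAFETY_ENDPOINT = os.environ["CONTENT_SAFETY_ENDPOINT"]  # e.g. https://<name>.cognitiveservices.azure.com
CONTENT_SAFETY_KEY = os.environ["CONTENT_SAFETY_KEY"]
SEVERITY_THRESHOLD = 2  # assumption: block anything at or above this severity

def is_prompt_safe(prompt: str) -> bool:
    """Return True if Content Safety reports every category below the threshold."""
    url = f"{CONTENT_SAFETY_ENDPOINT}/contentsafety/text:analyze?api-version=2023-10-01"
    headers = {
        "Ocp-Apim-Subscription-Key": CONTENT_SAFETY_KEY,
        "Content-Type": "application/json",
    }
    response = requests.post(url, json={"text": prompt}, headers=headers)
    response.raise_for_status()
    analysis = response.json()
    # Each entry in categoriesAnalysis carries a category name and a severity score.
    return all(item.get("severity", 0) < SEVERITY_THRESHOLD
               for item in analysis.get("categoriesAnalysis", []))

# Only forward the request to the OpenAI API if the prompt passes screening.
if is_prompt_safe("Show me Amanda's data"):
    print("Prompt passed screening; forward to Azure OpenAI.")
else:
    print("Prompt blocked by Content Safety.")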
Step-by-Step Guide: Building and Integrating Custom Package in ADF Workflow Orchestration Manager
Introduction
The Workflow Orchestration Manager in Azure Data Factory streamlines setting up and managing Apache Airflow environments, enhancing your ability to execute scalable data pipelines efficiently. Apache Airflow, a robust open-source platform, allows for the programming, scheduling, and monitoring of intricate workflows by organizing tasks into data pipelines. This capability is highly valued in data engineering and data science for its adaptability and user-friendliness.
In this guide, I will walk you through a demonstration where we extract insights from GitHub data using the GitHub public API, and run custom operators in a private package within the Workflow Orchestration Manager in Azure Data Factory.
Prerequisites
– Tools and Technologies Needed:
An Azure Data Factory account
Knowledge of Apache Airflow
Knowledge of Python
– Initial Setup:
ADF: create a Workflow Orchestration Manager instance
Airflow (Optional): In this blog, I’m primarily focusing on running custom operators in Airflow. However, if you want to trigger Azure Data Factory (ADF) pipelines directly from Airflow, you’ll need to establish a connection within the Airflow UI. This setup enables the triggering of ADF pipelines from Airflow; for more details, click here.
Table of Contents:
Designing Your Custom Package
Create Custom Package
Building Airflow DAG
Run DAG in ADF Data orchestration manager
Logs And Monitoring
Links
Call-To-Action
Step 1: Designing Your Custom Package
In this tutorial, I am utilizing the GitHub API and have written two Python operators: `GitHubAPIReaderOperator` and `CountLanguagesOperator`. These operators are designed to fetch data from GitHub repositories and count the programming languages used, respectively.
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults
import requests
import logging
import re

class GitHubAPIReaderOperator(BaseOperator):
    @apply_defaults
    def __init__(self, api_url, max_pages=20, token=None, *args, **kwargs):
        super(GitHubAPIReaderOperator, self).__init__(*args, **kwargs)
        self.api_url = api_url
        self.max_pages = max_pages
        self.token = token

    def execute(self, context):
        headers = {"Accept": "application/vnd.github.v3+json"}
        if self.token:
            headers["Authorization"] = f"Bearer {self.token}"
        session = requests.Session()
        session.headers.update(headers)
        next_url = self.api_url
        all_data = []
        page_count = 0
        while next_url and page_count < self.max_pages:
            response = session.get(next_url)
            response.raise_for_status()
            data = response.json()
            all_data.extend(data)
            next_url = self.get_next_link(response.headers.get('Link'))
            page_count += 1
        return all_data

    def get_next_link(self, link_header):
        if link_header:
            links = link_header.split(',')
            next_link = [link for link in links if 'rel="next"' in link]
            if next_link:
                match = re.search(r'<(.*)>', next_link[0])
                if match:
                    return match.group(1)
        return None

class CountLanguagesOperator(BaseOperator):
    @apply_defaults
    def __init__(self, api_url, token=None, *args, **kwargs):
        super(CountLanguagesOperator, self).__init__(*args, **kwargs)
        self.api_url = api_url
        self.token = token

    def execute(self, context):
        repos = context['task_instance'].xcom_pull(task_ids='fetch_github_data')
        headers = {"Accept": "application/vnd.github.v3+json"}
        if self.token:
            headers["Authorization"] = f"Bearer {self.token}"
        session = requests.Session()
        session.headers.update(headers)
        language_counts = {}
        for repo in repos:
            languages_url = repo.get('languages_url')
            if not languages_url:
                continue  # Skip repos without a languages URL
            try:
                response = session.get(languages_url)
                response.raise_for_status()
                languages_data = response.json()
                for language in languages_data.keys():
                    if language in language_counts:
                        language_counts[language] += 1
                    else:
                        language_counts[language] = 1
            except requests.exceptions.HTTPError as error:
                if error.response.status_code == 403:
                    logging.warning(f"Skipping repository due to HTTP 403 Forbidden: {languages_url}")
                    continue
                else:
                    raise
        # Output the results
        for lang, count in language_counts.items():
            logging.info(f"{lang} repositories count: {count}")
        return language_counts
Please check API‘s documentation and limitations.
Step 2: Create the Custom Package
Follow the steps below to create a wheel package in Python.
First, you need a folder hierarchy like the one sketched below:
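The original post shows this hierarchy as a screenshot; the layout below is an assumption, reconstructed from the setup.py that follows (which points package_dir at src) and from the DAG import custom_operators.github_operators.

custom_operators/
    setup.py
    src/
        custom_operators/
            __init__.py
            github_operators.py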
In the setup file, add the package folder name like so:
from setuptools import setup, find_packages

setup(
    name="custom_operators",
    version="0.1.0",
    package_dir={"": "src"},
    packages=find_packages(where="src"),
    install_requires=[
        # List your dependencies here, e.g., 'numpy', 'pandas'
    ],
    classifiers=[
        "Programming Language :: Python :: 3",
        "License :: OSI Approved :: MIT License",
        "Operating System :: OS Independent",
    ],
    python_requires='>=3.6',
)
In CMD, run these commands to create the wheel package:
pip install setuptools wheel
python setup.py sdist bdist_wheel
These commands create a source distribution and a wheel for your package. The wheel file (.whl) will be stored in a newly created dist/ directory under the custom_operators folder.
Step 3: Building Airflow DAG
Now that we have built our custom operators and created the wheel package, we need to create a DAG that will trigger these operators.
For that I created 2 tasks, fetch_github_data and count_languages, each calling one of the operators above.
from airflow import DAG
from datetime import datetime, timedelta
from custom_operators.github_operators import GitHubAPIReaderOperator, CountLanguagesOperator

default_args = {
    'owner': 'airflow',
    'start_date': datetime(2023, 1, 1),
    'depends_on_past': False,
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG(
    'github_language_analysis',
    default_args=default_args,
    description='Analyze GitHub repos for language usage',
    schedule_interval=timedelta(days=1),
)

fetch_github_data = GitHubAPIReaderOperator(
    task_id='fetch_github_data',
    api_url='https://api.github.com/repositories',
    max_pages=10,
    token='<YOUR_GITHUB_TOKEN>',  # Replace with your actual token
    dag=dag
)

count_languages = CountLanguagesOperator(
    task_id='count_languages',
    api_url='https://api.github.com',
    token='<YOUR_GITHUB_TOKEN>',  # Replace with your actual token
    dag=dag
)

fetch_github_data >> count_languages
Step 4: Run DAG in ADF Data orchestration manager
Now that we have built our DAG and our custom package, here is how to run it in ADF.
1. Create a managed Airflow instance in ADF following the Microsoft docs.
2. In your ADLS storage account, create the folder hierarchy as follows:
In the requirements file, include the path to the custom package stored in your ADLS storage account as follows:
/opt/airflow/dags/custom_operators-0.1.0-py3-none-any.whl
3. In the ADF workspace, click on “Import files.” Navigate to your ADLS storage account, locate the “Airflow” folder, and check the “Import requirements” checkbox.
It will take a few minutes for the ADF Workflow Orchestration Manager to update the code and the custom package.
Step 5: Logs and Monitoring
After importing the files, click on the “Monitor” button in the Data Orchestration Manager to view task execution and export Airflow logs. This will open the Airflow UI.
DAG:
Logs in count_languages task :
P.S: For more dynamic work, you can save the languages count as a JSON file and store it in your storage account.
Links:
Install a Private package – Azure Data Factory | Microsoft Learn
How does Workflow Orchestration Manager work? – Azure Data Factory | Microsoft Learn
airflow.operators.python — Airflow Documentation (apache.org)
airflow.providers.microsoft.azure — apache-airflow-providers-microsoft-azure Documentation
Call to Action:
– Make sure to establish all connections before starting to work on managed Airflow.
– Check the Microsoft documentation on Workflow Orchestration Manager.
– Please help us improve by sharing your valuable Workflow Orchestration Manager Preview feedback by emailing us at ManagedAirflow@microsoft.com
– Follow me on LinkedIn: Sally Dabbah | LinkedIn
Unlock power of data in Azure- With SQL Server on Linux Azure VMs and Azure AI search
In a world awash with data, the challenge lies in our ability to comprehend and engage with it seamlessly. My colleague Muazma has shed light on this topic in her insightful blog: “Chat with your data in Azure SQL Database”.
Inspired by her work, I pondered the possibility of applying similar principles to SQL Server on Linux VMs hosted on Azure. This blog is a result of that contemplation and here are the steps that we cover in this blog:
We’ll begin by setting up a Linux-based Virtual Machine on Azure and proceed to install SQL Server on it.
Next, we’ll implement TLS 1.2 encryption to secure connections to SQL Server, utilizing certbot for certificate creation with Let’s Encrypt serving as the certificate authority.
We’ll then import Kaggle’s dataset into SQL Server using the Import Flat File wizard.
Following that, we’ll create the Azure AI service and associated indexes, with the Azure SQL Server on Linux VM as the data source.
Lastly, we’ll utilize Azure OpenAI studio to interact with the data.
To follow my lead, all you need is an Azure Subscription and an account set up to gain access to Azure OpenAI Studio.
Step 1: Create the Azure SQL Server on Linux based VM:
Let’s start by setting up a Linux-based VM on Azure. For this demonstration, I’ll be configuring an Ubuntu 22.04 VM. Below is the script to first create a resource group, followed by the creation of a VM named SQLLinux22 running Ubuntu 22.04.
# let's create the resource group using the command:
az group create --name myrgdemo --location centralindia

# let's create the VM using an Ubuntu image; I am using this image: 0001-com-ubuntu-minimal-jammy
az vm create --resource-group myrgdemo --name sqllinux22 --size "Standard_B4ms" --location "central india" --image "Canonical:0001-com-ubuntu-minimal-jammy:minimal-22_04-lts:22.04.202405131" --admin-username "amvin" --admin-password "MY$trongPass123*#" --authentication-type all --generate-ssh-keys
Once the VM is set up, proceed to install SQL Server by following the guidelines provided in the official Microsoft documentation.
Step 2: Enable TLS 1.2 Encryption on SQL Server on Linux, to secure SQL Server connections:
Following the installation of SQL Server, it’s time to move on to step 2: enabling TLS 1.2 encryption to secure connections to SQL Server. But first, you need to set up a DNS name for the VM you’ve created, as it’s necessary for generating the certificate. You can configure the DNS name via the Azure portal. Once it’s set up, it will appear as shown. Remember to also enable port 80 in the VM’s Network Settings, which is required by Certbot for certificate creation.
With that completed, it’s now time to install Certbot and generate the necessary certificate. Log into the VM using your preferred SSH client and execute the following commands. These will install Certbot and then create the certificate using the DNS name you have set up.
amvin@sqllinux22:~$ sudo snap install --classic certbot
2024-05-16T21:11:23Z INFO Waiting for automatic snapd restart…
certbot 2.10.0 from Certbot Project (certbot-eff✓) installed
## After the installation go ahead and create the certificate and private key file.
amvin@sqllinux22:~$ sudo certbot certonly --standalone --key-type rsa --preferred-challenges http -d sqllinux22.centralindia.cloudapp.azure.com
Saving debug log to /var/log/letsencrypt/letsencrypt.log
Enter email address (used for urgent renewal and security notices)
(Enter ‘c’ to cancel): xxxxxxxxxxx
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
Please read the Terms of Service at
https://letsencrypt.org/documents/LE-SA-v1.4-April-3-2024.pdf. You must agree in
order to register with the ACME server. Do you agree?
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
(Y)es/(N)o: Y
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
Would you be willing, once your first certificate is successfully issued, to
share your email address with the Electronic Frontier Foundation, a founding
partner of the Let’s Encrypt project and the non-profit organization that
develops Certbot? We’d like to send you email about our work encrypting the web,
EFF news, campaigns, and ways to support digital freedom.
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
(Y)es/(N)o: Y
Account registered.
Requesting a certificate for sqllinux22.centralindia.cloudapp.azure.com
Successfully received certificate.
Certificate is saved at: /etc/letsencrypt/live/sqllinux22.centralindia.cloudapp.azure.com/fullchain.pem
Key is saved at: /etc/letsencrypt/live/sqllinux22.centralindia.cloudapp.azure.com/privkey.pem
This certificate expires on 2024-08-14.
These files will be updated when the certificate renews.
Certbot has set up a scheduled task to automatically renew this certificate in the background.
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
If you like Certbot, please consider supporting our work by:
* Donating to ISRG / Let’s Encrypt: https://letsencrypt.org/donate
* Donating to EFF: https://eff.org/donate-le
– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –
## With this we have now created the required files as shown below:
root@sqllinux22:/etc/letsencrypt/live/sqllinux22.centralindia.cloudapp.azure.com# ll
total 28
drwxr-xr-x 2 root root 4096 May 16 21:13 ./
drwx------ 3 root root 4096 May 16 21:13 ../
-rw-r--r-- 1 root root 692 May 16 21:13 README
lrwxrwxrwx 1 root root 66 May 16 21:13 cert.pem -> ../../archive/sqllinux22.centralindia.cloudapp.azure.com/cert1.pem
lrwxrwxrwx 1 root root 67 May 16 21:13 chain.pem -> ../../archive/sqllinux22.centralindia.cloudapp.azure.com/chain1.pem
lrwxrwxrwx 1 root root 71 May 16 21:13 fullchain.pem -> ../../archive/sqllinux22.centralindia.cloudapp.azure.com/fullchain1.pem
lrwxrwxrwx 1 root root 69 May 16 21:13 privkey.pem -> ../../archive/sqllinux22.centralindia.cloudapp.azure.com/privkey1.pem
Move the certificate and necessary files to the “/var/opt/mssql/secrets” directory for SQL Server’s use and to enable TLS 1.2 encryption as demonstrated below. After enabling TLS 1.2 encryption, please restart SQL Server.
# copy the cert and key to the secrets folder as shown below, we are converting the key
# from .pem format to .key using the openssl option.
sudo cp /etc/letsencrypt/live/sqllinux22.centralindia.cloudapp.azure.com/fullchain.pem /var/opt/mssql/secrets/fullchain.pem
sudo openssl rsa -in /etc/letsencrypt/live/sqllinux22.centralindia.cloudapp.azure.com/privkey.pem -out /var/opt/mssql/secrets/privkey.key
# Enable TLS 1.2 as shown below using the mssql-conf for SQL Server
sudo /opt/mssql/bin/mssql-conf set network.tlscert /var/opt/mssql/secrets/fullchain.pem
sudo /opt/mssql/bin/mssql-conf set network.tlskey /var/opt/mssql/secrets/privkey.key
sudo /opt/mssql/bin/mssql-conf set network.tlsprotocols 1.2
sudo /opt/mssql/bin/mssql-conf set network.forceencryption 0
# Restart SQL Server and confirm that TLS 1.2 is enabled as seen in the errorlog:
amvin@sqllinux22:~$ sudo systemctl restart mssql-server
# Now, let's read the errorlog to confirm the certificate is loaded
root@sqllinux22:~# cat /var/opt/mssql/log/errorlog | grep "Allowed TLS"
2024-05-16 21:23:16.78 Server Successfully initialized the TLS configuration. Allowed TLS protocol versions are [‘1.2’]. Allowed TLS ciphers are [‘ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-ECDSA-AES128-SHA:AES256-GCM-SHA384:AES128-GCM-SHA256:AES256-SHA256:AES128-SHA256:AES256-SHA:AES128-SHA:!DHE-RSA-AES256-GCM-SHA384:!DHE-RSA-AES128-GCM-SHA256:!DHE-RSA-AES256-SHA:!DHE-RSA-AES128-SHA’].
Step 3: Load the dataset into SQL Server using the Import Flat file Wizard:
Now that we’ve reached the third step, it’s time to load the dataset into SQL Server on the Linux Azure VM. I connected to the SQL Server Azure VM using SQL Server Management Studio (SSMS) on my Windows machine and downloaded the dataset locally.
Next, I utilized the SSMS import flat file wizard to transfer the data from the file to SQL Server on the Azure VM. Once the dataset is loaded into the table, the data is displayed as follows.
Step 4: Create the Azure AI search service, Index and import data from SQL Server:
With data loaded, let’s log in to the Azure Portal, search for AI Search, click on Create, and create the search service as shown below:
Once the search service is created, we need to link it to the Azure SQL Server VM to import the data into the search index. During the import process, it’s crucial to ensure that you map the correct data types. In this instance, I chose the string data type for all columns during the import. Additionally, take note of the facets I have enabled on the columns, as shown below.
After the index is created and the data is imported, it should appear as follows, and you can also execute a sample query to confirm that you can retrieve data as illustrated below:
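The sample query itself appears as a screenshot in the original post. As a rough illustration of the same check, you could also query the index through the Azure AI Search REST API; the service name, index name, key, and search text below are placeholders for whatever you chose during the import.

import requests

# Placeholder values: substitute your own search service, index, and query key.
SEARCH_ENDPOINT = "https://<your-search-service>.search.windows.net"
SEARCH_INDEX = "<your-index-name>"
SEARCH_API_KEY = "<your-query-key>"

url = f"{SEARCH_ENDPOINT}/indexes/{SEARCH_INDEX}/docs/search?api-version=2023-11-01"
headers = {"Content-Type": "application/json", "api-key": SEARCH_API_KEY}
body = {"search": "toys", "top": 5}  # free-text search, return the first 5 documents

response = requests.post(url, json=body, headers=headers)
response.raise_for_status()
for doc in response.json().get("value", []):
    print(doc)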
Step 5: Use Azure OpenAI Studio to chat with your data:
That’s it! We are now in the final stage: using Azure OpenAI Studio, create a chat playground, configure the data source to point to the Azure AI Search index created above, and you are ready to chat with your data as shown below. Try asking scenario- and context-based questions like “What products to buy for a kid’s birthday?” or “Suggest items to buy for the decor of a room”.
Hope you enjoyed reading this! Happy Friday!
Is it possible to quantize a projected NN?
Fairly new in this area, be kind :) I have trained an LSTM model to filter some signals (approximately following this example: https://it.mathworks.com/help/signal/ug/denoise-eeg-signals-using-differentiable-signal-processing-layers.html). Then, since the aim is to implement the NN in hardware, I reduced the number of learnables using the projection method of compression, which let me reduce the learnables from 4.3k to 1.5k. I then improved the accuracy of the model by fine-tuning the projected LSTM. Finally, I would like to quantize the model to 8 bits. When I use the quantize function in MATLAB I get the following error: "The class deep.internal.quantization.config.ConfigType has no Constant property or Static method named ‘InputProjector’."
Is it possible to quantize a projected NN?
Which characters are replaced by matlab.lang.makeValidName?
The function matlab.lang.makeValidName does 2 things:
First, it checks the input string for special characters which are not allowed in identifiers and replaces those (e.g. with "").
Second, it checks the length of the input string with the possibly replaced characters, and if it is too long, it truncates the string.
However, I would like the function to check only for special characters, not for length.
One option would be to use “strrep”, but then I would like to generate the list of characters which are replaced by matlab.lang.makeValidName to be consistent with makeValidName.
Simulation slowed down by universal bridge diode rectifier
I’m doing a simulation in MATLAB Simulink using Simscape Electrical Specialized Power Systems. The aim of my simulation is to model a synchronous machine with brushless excitation. The brushless excitation requires a rotating rectifier bridge, modeled in my simulation using the "Universal Bridge" block. The addition of this block considerably slows down my simulation (I need 3 of them in my system), to the point that 5 s of simulation can take hours.
I also created my own diode bridge rectifier block using diodes from Specialized Power Systems, but I have the same problem.
The solver I’m using is ode23tb with a max step size of 1e-3 and a relative tolerance of 1e-3 (in order to have correct waveforms I should use at least 1e-4, 1e-4, but it would make the simulation slower). I already tried accelerator and rapid accelerator without any improvement.
Does anyone have any suggestion on how to resolve this simulation speed problem?
SharePoint Intranet Site doesn’t show images to some users
Hi there!
I’m having a problem with my SharePoint intranet site. Suddenly, some users became unable to see certain areas of the site, especially images.
I checked the permissions: only three of us have Full Control, and the other users are part of “Everyone except external users”.
It is a problem I have had for weeks, and it is affecting us since we publish content frequently and it affects the visuals of the site.
Those of us who have Full Control have no problems viewing, but other users do.
Hope the community can help me!
Thanks.
Yaml to execute SQL scripts in a folder via Azure DevOps pipeline
Greetings!!!
We have a git repo directory ExternalSQLScripts with sub-directories for Tables, Views, Functions, and StoredProcedures. We need to loop through each subdirectory and execute all the .sql files on the external SQL Server. We only have access to execute SQL Server database object scripts, and on this SQL Server instance we cannot do a .dacpac deployment.
I have the YAML code below, which is throwing errors.
Code:
variables:
  sqlServerConnection: $(System.ConnectionStrings.DatabaseConnectionString)
  sqlScriptPath: $(Build.SourcesDirectory)/SQLScript

steps:
- script: |
    # Install SqlServer module
    if (-!Test-Path (Get-Module -ListAvailable SqlServer)) {
      Install-Module SqlServer -Scope CurrentUser -Force
    }
    Get-ChildItem -Path $sqlScriptPath -Filter "*.sql" -Recurse | ForEach-Object {
      $scriptPath = $_.FullName
      $scriptName = $_.BaseName
      try {
        Invoke-Sqlcmd -ServerInstance $sqlServerConnection -Database [System.DefaultWorkingDirectory] -InputFile $scriptPath
        Write-Host "Successfully executed script: $scriptName"
      } catch {
        Write-Error "Error executing script: $scriptName - $($_.Exception.Message)"
      }
    }
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: $(sqlScriptPath)
    artifactName: sql-scripts
Thanks in advance…
One Drive Newbie
I want to start using OneDrive for work. Where do I even start? Can I connect OneDrive to mirror a SharePoint document library?
Problem with Autofill when Using VLOOKUP Function
Hello, I’m having a little problem when trying to drag and drop a formula to autofill with the VLOOKUP function. I have a long list and I’m using this function to display the highest sales in rank order. Not sure if this is the best way to do it, but it’s how I first learned how to do it. So my formula is this:
=VLOOKUP(LARGE($C$4:$C$10001,513),$C$4:$E$10001,3,FALSE).
What I would like it to do is have the 513 increment to 514 when I autofill to the next row, then to 515, 516, etc. I have been manually changing this number the whole time, but it’s a little tedious.
I have a similar problem with using this formula as well: =LARGE($C$4:$C$1002,513). As stated above, I’d like the 513 rank number to adjust to 514 when I autofill the next line. Here is a screen shot that might help to see what I have. The highlighted yellow cell is my VLOOKUP function and the cell to the right is my LARGE function:
Any help with this would be greatly appreciated. Thank you!
Problem with discord and Microsoft Edge.
So, I recently started using Edge as my default browser. There were some minor issues but I fixed them; I like the design, and everything works great.
But I just realised a problem with sharing my screen on Discord. Specifically, when I share Edge as a window, Discord (the app) just restarts, while if I share the entire screen, it works fine. I tested different browsers, games, and apps and updated Discord; every other program works fine. Did anyone encounter the same problem? I assume it has something to do with privacy or something, but it is annoying.
Surface Hub 2 MTRoW – how do you now factory wipe the device?
As above, how would you now factory reset the device? Do you need to create a USB stick of some sort?
Help with a copilot task
I’m trying to prompt Copilot to create a 3-5 page Word document based on content in a OneDrive folder, trained on a sample of up to 20 documents saved in another folder. How would this be organized and prompted? Thanks.
Complicated vlookup example
Hi there,
I have a dataset with three different columns of names. I would like VLOOKUP or INDEX, etc., to use those names (in 3 different columns) and search for all 3 against one column in another sheet. Once the matches are found, I’d like to get the contents of the column 8 rows to the right for all 3 names in one cell (so merged). For example, if I have the names Sam, Sophia, and Liz, and the columns 8 rows to the right had the following:
Sam – Jungle Group
Sophia – Safari Group
Liz – Forest Group
I’d like the input to return to be Jungle Group, Safari Group, Forest Group
Let me know if this is possible.
Shifted to Edge from chrome – 3 things I miss
Hi, I shifted to Edge after using Chrome for 10 years.
1. Edge should allow multiple user profiles on mobile like Chrome, and multiple Microsoft account logins.
2. Please give users the choice of which extensions to import while importing Chrome data.
3. Please make the new tab page clean; it took me 1 month to make it clean, and now my Edge is faster than Chrome.
Azure AI Services on AKS
Host your AI Language Containers and Web Apps on Azure Kubernetes Cluster: Flask Web App Sentiment Analysis
In this post, we’ll explore how to integrate Azure AI Containers into our applications running on Azure Kubernetes Service (AKS). Azure AI Containers enable you to harness the power of Azure’s AI services directly within your AKS environment, giving you complete control over where your data is processed. By streamlining the deployment process and ensuring consistency, Azure AI Containers simplify the integration of cutting-edge AI capabilities into your applications. Whether you’re developing tools for education, enhancing accessibility, or creating innovative user experiences, this guide will show you how to seamlessly incorporate Azure’s AI Containers into your web apps running on AKS.
Why Containers?
Azure AI services provides several Docker containers that let you use the same APIs that are available in Azure, on-premises. Using these containers gives you the flexibility to bring Azure AI services closer to your data for compliance, security or other operational reasons. Container support is currently available for a subset of Azure AI services.
Azure AI Containers offer:
Immutable infrastructure: Consistent and reliable system parameters for DevOps teams, with flexibility to adapt and avoid configuration drift.
Data control: Choose where data is processed, essential for data residency or security requirements.
Model update control: Flexibility in versioning and updating deployed models.
Portable architecture: Deploy on Azure, on-premises, or at the edge, with Kubernetes support.
High throughput/low latency: Scale for demanding workloads by running Azure AI services close to data and logic.
Scalability: Built on scalable cluster technology like Kubernetes for high availability and adaptable performance.
Source: https://learn.microsoft.com/en-us/azure/ai-services/cognitive-services-container-support
Workshop
Our solution will utilize the Azure AI Language service with the Text Analytics container for Sentiment Analysis. We will build a Python Flask web app, containerize it with Docker, and push it to Azure Container Registry. An AKS cluster, which we will create, will pull the Flask image along with the Microsoft-provided Sentiment Analysis image directly from mcr.microsoft.com, and we will make all required configurations on our AKS cluster to have an Ingress Controller with an SSL certificate presenting a simple Web UI where we write our text, submit it for analysis, and get the results. Our Web UI will look like this:
Azure Kubernetes Cluster, Azure Container Registry & Azure Text Analytics
These are our main resources, plus a Virtual Network for AKS which is deployed automatically. Our solution is hosted entirely on AKS, with a Let’s Encrypt certificate we will create separately to offer secure HTTP, and with an Ingress Controller publicly serving our Flask UI, which calls the Sentiment Analysis service (also hosted on AKS) via REST. The difference is that Flask is built from a custom Docker image pulled from Azure Container Registry, while Sentiment Analysis is a ready-made Microsoft image which we pull directly.
In case your Azure Subscription does not have an AI Service you have to create a Language Service of Text Analytics using the Portal due to the requirement to accept the Responsible AI Terms. For more detail go to https://go.microsoft.com/fwlink/?linkid=2164190 .
My preference, as a best practice, is to create an AKS Cluster with the default System Node Pool and add an additional User Node Pool to deploy my apps, but it is really a matter of preference at the end of the day. So let’s start deploying! Start from your terminal by logging in with az login and set your subscription with az account set --subscription "YourSubName".
## Change the values in < > with your values and remove < >!
## Create the AKS Cluster
az aks create \
  --resource-group <your-resource-group> \
  --name <your-cluster-name> \
  --node-count 1 \
  --node-vm-size standard_a4_v2 \
  --nodepool-name agentpool \
  --generate-ssh-keys \
  --nodepool-labels nodepooltype=system \
  --no-wait \
  --aks-custom-headers AKSSystemNodePool=true \
  --network-plugin azure

## Add a User Node Pool
az aks nodepool add \
  --resource-group <your-resource-group> \
  --cluster-name <your-cluster-name> \
  --name userpool \
  --node-count 1 \
  --node-vm-size standard_d4s_v3 \
  --no-wait

## Create Azure Container Registry
az acr create \
  --resource-group <your-resource-group> \
  --name <your-acr-name> \
  --sku Standard \
  --location northeurope

## Attach ACR to AKS
az aks update -n <your-cluster-name> -g <your-resource-group> --attach-acr <your-acr-name>
The Language Service is created from the Portal for the reasons we explained earlier. Search for Language and create a new Language service, leaving the default selections (no Custom QnA, no Custom Text Classification) on the F0 (Free) SKU. You may see a VNET menu appear in the Networking tab; just ignore it. As long as you leave the default Public Access enabled, it won’t create a Virtual Network. The presence of the cloud resource is for billing and metrics.
A Flask Web App has a directory structure where we store index.html in the Templates directory and our CSS and images in the Static directory. So in essence it looks like this:
sentiment-aks/
  flaskwebapp/
    app.py
    requirements.txt
    Dockerfile
    static/
      style.css
      logo.png
    templates/
      index.html
The requirements.txt should have the needed packages :
## requirements.txt
Flask==3.0.0
requests==2.31.0

## index.html
<!DOCTYPE html>
<html>
<head>
    <title>Sentiment Analysis App</title>
    <link rel="stylesheet" type="text/css" href="{{ url_for('static', filename='style.css') }}">
</head>
<body>
    <img src="{{ url_for('static', filename='logo.png') }}" class="icon" alt="App Icon">
    <h2>Sentiment Analysis</h2>
    <form id="textForm">
        <textarea name="text" placeholder="Enter text here..."></textarea>
        <button type="submit">Analyze</button>
    </form>
    <div id="result"></div>
    <script>
        document.getElementById('textForm').onsubmit = async function(e) {
            e.preventDefault();
            let formData = new FormData(this);
            let response = await fetch('/analyze', {
                method: 'POST',
                body: formData
            });
            let resultData = await response.json();
            let results = resultData.results;
            if (results) {
                let displayText = `Document: ${results.document}\nSentiment: ${results.overall_sentiment}\n`;
                displayText += `Confidence - Positive: ${results.confidence_positive}, Neutral: ${results.confidence_neutral}, Negative: ${results.confidence_negative}`;
                document.getElementById('result').innerText = displayText;
            } else {
                document.getElementById('result').innerText = 'No results to display';
            }
        };
    </script>
</body>
</html>

## style.css
body {
font-family: Arial, sans-serif;
background-color: #f0f8ff; /* Light blue background */
margin: 0;
padding: 0;
display: flex;
flex-direction: column;
align-items: center;
justify-content: center;
height: 100vh;
}
h2 {
color: #0277bd; /* Darker blue for headings */
}
.icon {
height: 100px; /* Adjust the size as needed */
margin-top: 20px; /* Add some space above the logo */
}
form {
background-color: white;
padding: 20px;
border-radius: 8px;
width: 300px;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
}
textarea {
width: 100%;
box-sizing: border-box;
height: 100px;
margin-bottom: 10px;
border: 1px solid #0277bd;
border-radius: 4px;
padding: 10px;
}
button {
background-color: #029ae4; /* Blue button */
color: white;
border: none;
padding: 10px 15px;
border-radius: 4px;
cursor: pointer;
}
button:hover {
background-color: #0277bd;
}
#result {
margin-top: 20px;
}
And here is the most interesting file, our app.py. Notice the use of a REST API call directly to the Sentiment Analysis endpoint which we will declare in the YAML file for the Kubernetes deployment.
## app.py
from flask import Flask, render_template, request, jsonify
import requests
import os

app = Flask(__name__)

@app.route('/', methods=['GET'])
def index():
    return render_template('index.html')  # HTML file with the input form

@app.route('/analyze', methods=['POST'])
def analyze():
    # Extract text from the form submission
    text = request.form.get('text', '').strip()
    if not text:
        return jsonify({'error': 'No text provided'}), 400

    # Fetch the API endpoint from the environment variables
    endpoint = os.environ.get("CONTAINER_API_URL")

    # Ensure the required configuration is available
    if not endpoint:
        return jsonify({'error': 'API configuration not set'}), 500

    # Construct the full URL for the sentiment analysis API
    # (rstrip tolerates a trailing slash in the configured endpoint)
    url = f"{endpoint.rstrip('/')}/text/analytics/v3.1/sentiment"
    headers = {
        'Content-Type': 'application/json'
    }
    body = {
        'documents': [{'id': '1', 'language': 'en', 'text': text}]
    }

    # Make the HTTP POST request to the sentiment analysis API
    response = requests.post(url, json=body, headers=headers)
    if response.status_code != 200:
        return jsonify({'error': 'Failed to analyze sentiment'}), response.status_code

    # Process the API response
    data = response.json()
    results = data['documents'][0]
    detailed_results = {
        'document': text,
        'overall_sentiment': results['sentiment'],
        'confidence_positive': results['confidenceScores']['positive'],
        'confidence_neutral': results['confidenceScores']['neutral'],
        'confidence_negative': results['confidenceScores']['negative']
    }

    # Return the detailed results to the client
    return jsonify({'results': detailed_results})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5001, debug=False)
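If you want to try the app before containerizing it, here is a minimal local-run sketch. It assumes a sentiment endpoint is already reachable on localhost:5000 (for example via the optional docker run or kubectl port-forward checks shown later):
## Optional local test (assumes a sentiment endpoint on localhost:5000)
pip install -r requirements.txt
export CONTAINER_API_URL="http://localhost:5000"
python app.py
## Then browse to http://localhost:5001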
And finally we need a Dockerfile; make sure it sits at the same level as your app.py file.
## Dockerfile
# Use an official Python runtime as a parent image
FROM python:3.10-slim

# Set the working directory in the container
WORKDIR /app

# Copy the current directory contents into the container at /app
COPY . /app

# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt

# Make port 5001 available to the world outside this container
EXPOSE 5001

# Define a default environment variable (overridden by the Kubernetes deployment)
ENV CONTAINER_API_URL="http://sentiment-service/"

# Run app.py when the container launches
CMD ["python", "app.py"]
Our Web UI is ready to build! We need Docker running on our development environment, and we need to log in to Azure Container Registry:
## Login to ACR
az acr login -n <your-acr-name>
## Build and Tag our image
docker build -t <acr-name>.azurecr.io/flaskweb:latest .
docker push <acr-name>.azurecr.io/flaskweb:latest
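Before moving on, you can optionally smoke-test both images locally. This is only a sketch: it assumes Docker Desktop (for the host.docker.internal name), your Language service endpoint and key, and enough free memory for the sentiment container.
## Optional local smoke test
docker run -d -p 5000:5000 --memory 8g --cpus 1 \
  mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment:latest \
  Eula=accept Billing=https://<your-Language-Service>.cognitiveservices.azure.com/ ApiKey=<your-key>
docker run -d -p 5001:5001 -e CONTAINER_API_URL=http://host.docker.internal:5000 \
  <acr-name>.azurecr.io/flaskweb:latest
## Browse to http://localhost:5001 and submit some text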
In the Portal, under Azure Container Registry > Repositories, you will find our new image ready to be pulled!
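If you prefer the CLI, a quick way to confirm the push (using the repository name from the build above):
## List the tags of the flaskweb repository in ACR
az acr repository show-tags -n <your-acr-name> --repository flaskweb -o table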
Kubernetes Deployments
Let’s start deploying our AKS services! As we already know, we can pull the Sentiment Analysis container directly from Microsoft, and that is what we are going to do in the following tasks. First, we need to connect to our AKS cluster: from the Azure Portal head over to your AKS cluster and click the Connect link in the menu. Azure will provide the commands to connect from our terminal:
Select Azure CLI and just copy-paste the commands to your Terminal.
Now we can run kubectl commands and manage our Cluster and AKS Services.
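A quick sanity check that the connection works and that both node pools are visible (AKS labels each node with its pool name, which is what the nodeSelector below relies on):
## Confirm cluster access and list the nodes in the user node pool
kubectl get nodes -o wide
kubectl get nodes -l agentpool=userpool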
We need a YAML file for each service we are going to deploy, including the certificate at the end. For now, let’s create the Sentiment Analysis service as a container with the following file. Pay attention: you need the Language service key and endpoint from the Text Analytics resource we created earlier, and in the nodeSelector block you must enter the name of the User Node Pool we created.
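You can copy the endpoint and key from the Portal, or pull them with the CLI; a sketch assuming the resource name and resource group you used when creating the Language service:
## Endpoint and key for the Billing / ApiKey values in the YAML below
az cognitiveservices account show -n <your-language-service> -g <your-resource-group> --query properties.endpoint -o tsv
az cognitiveservices account keys list -n <your-language-service> -g <your-resource-group> --query key1 -o tsv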
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sentiment-deployment
spec:
  replicas: 1
  selector:
    matchLabels:
      app: sentiment
  template:
    metadata:
      labels:
        app: sentiment
    spec:
      containers:
      - name: sentiment
        image: mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment:latest
        ports:
        - containerPort: 5000
        resources:
          limits:
            memory: "8Gi"
            cpu: "1"
          requests:
            memory: "8Gi"
            cpu: "1"
        env:
        - name: Eula
          value: "accept"
        - name: Billing
          value: "https://<your-Language-Service>.cognitiveservices.azure.com/"
        - name: ApiKey
          value: "xxxxxxxxxxxxxxxxxxxx"
      nodeSelector:
        agentpool: userpool
---
apiVersion: v1
kind: Service
metadata:
  name: sentiment-service
spec:
  selector:
    app: sentiment
  ports:
  - protocol: TCP
    port: 5000
    targetPort: 5000
  type: ClusterIP
Save the file and run from your Terminal:
kubectl apply -f sentiment-deployment.yaml
Within a few minutes (the sentiment image is large, so the first pull takes a while) you will see the service running under Services and ingresses in the AKS menu.
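You can also verify from the terminal, and optionally exercise the container API through a temporary port-forward. A quick sketch; the JSON body matches what app.py sends:
## Check the sentiment pod and service
kubectl get pods -l app=sentiment
kubectl get svc sentiment-service
## Optional: call the API locally through a port-forward
kubectl port-forward svc/sentiment-service 5000:5000 &
curl -s -X POST "http://localhost:5000/text/analytics/v3.1/sentiment" \
  -H "Content-Type: application/json" \
  -d '{"documents":[{"id":"1","language":"en","text":"I love this demo"}]}'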
Let’s now bring in our Flask container. In the same manner, create a new YAML:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: flask-service
spec:
  replicas: 1
  selector:
    matchLabels:
      app: flask
  template:
    metadata:
      labels:
        app: flask
    spec:
      containers:
      - name: flask
        image: <your-ACR-name>.azurecr.io/flaskweb:latest
        ports:
        - containerPort: 5001
        env:
        - name: CONTAINER_API_URL
          value: "http://sentiment-service:5000"
        resources:
          requests:
            cpu: "500m"
            memory: "256Mi"
          limits:
            cpu: "1"
            memory: "512Mi"
      nodeSelector:
        agentpool: userpool
---
apiVersion: v1
kind: Service
metadata:
  name: flask-lb
spec:
  type: LoadBalancer
  selector:
    app: flask
  ports:
  - protocol: TCP
    port: 80
    targetPort: 5001
kubectl apply -f flask-service.yaml
Observe the CONTAINER_API_URL environment value: it uses the Service name of our Sentiment Analysis container directly, as AKS has its own DNS resolver for easy communication between services. In fact, if we hit the LoadBalancer Service’s public IP we already have HTTP access to the Web UI.
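To find that public IP and confirm the plain-HTTP path works before adding TLS (assuming the LoadBalancer has finished provisioning):
## Get the EXTERNAL-IP assigned to the LoadBalancer service
kubectl get service flask-lb
## Then hit it over HTTP
curl -I http://<external-ip>/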
But let’s see how we can add our certificate. We won’t describe how to obtain one; all we need are the PEM files, meaning the private key (privkey.pem) and the certificate (cert.pem). If we have a PFX we can export them with OpenSSL, as sketched below.
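A minimal sketch of that export, assuming a hypothetical bundle named certificate.pfx (OpenSSL prompts for the PFX password):
## Export the private key and the certificate from a PFX
openssl pkcs12 -in certificate.pfx -nocerts -nodes -out privkey.pem
openssl pkcs12 -in certificate.pfx -clcerts -nokeys -out cert.pem
Once privkey.pem and cert.pem are in place, we create a secret in AKS that holds the certificate key and file. Run this command from within the directory of the PEM files: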
kubectl create secret tls flask-app-tls --key privkey.pem --cert cert.pem --namespace default
Once the secret is created we deploy a Kubernetes Ingress that handles HTTPS and points to the Flask Service. Remember to add an A record at your DNS registrar mapping the hostname you are going to use to the public IP, once you see the IP address:
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: flask-app-ingress
  namespace: default
spec:
  tls:
  - hosts:
    - your.domain.host
    secretName: flask-app-tls
  rules:
  - host: your.domain.host
    http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: flask-lb
            port:
              number: 80
kubectl apply -f flask-app-ingress.yaml
From AKS > Services and ingresses > Ingresses you will see the assigned public IP. Add it to your DNS, and once the name servers have updated you can reach your hostname over HTTPS!
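A few commands to confirm everything end to end, assuming your DNS record has propagated:
## The ADDRESS column shows the ingress public IP
kubectl get ingress flask-app-ingress
## Confirm DNS and the HTTPS endpoint
nslookup your.domain.host
curl -I https://your.domain.host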
Final Thoughts
As we’ve explored, the combination of Azure AI Containers and AKS offers a powerful and flexible solution for deploying AI-driven applications in cloud-native environments. By leveraging these technologies, you gain granular control over your data and model deployments, while maintaining the scalability and portability essential for modern applications. Remember, this is just the starting point. As you delve deeper, consider the specific requirements of your project and explore the vast possibilities that Azure AI Containers unlock. Embrace the power of AI within your AKS deployments, and you’ll be well on your way to building innovative, intelligent solutions that redefine what’s possible in the cloud.
Architecture