Category: News
Excel 365 insert comments show up reverse order (Oldest at top, Newest at bottom)
Hoping anyone can help.
Sometimes our team collaboration causes a single cell to have ~100-200 replies.
I can’t understand why Excel made it so that old comments show up first and new ones appear way, way down at the bottom.
Does anyone know if there is any way to reverse this?
Thank you!
Select applications where Copilot for Microsoft 365 is enabled
Hi,
Is it possible for a Microsoft 365 admin to select or unselect the M365 apps where the Copilot button is displayed? For instance, having Copilot activated only in Teams and Outlook.
IMHO the answer is no, but I would like to get confirmation from experts.
Thanks in advance.
Consolidating Multiple Related Company Email Databases
If I have multiple subsidiaries with respective employee email [address] accounts at each subsidiary, am I able, at the parent company level, to consolidate and have access to a consolidated database of email addresses for all subsidiaries?
Unable to initialize ExchangeService instance
I am trying to write a program that checks for changes in a mailbox’s calendar every time it runs and updates a database. My problem begins at the very top of the assignment, when I initialize the ExchangeService instance: any function I try to run on it returns
The request failed. The remote server returned an error: (401) Unauthorized.
I initialized it like this:
private static void initExchangeService()
{
    m_ExchangeService = new ExchangeService();
    m_ExchangeService.UseDefaultCredentials = true;
    m_ExchangeService.KeepAlive = true;
    string domain = ConfigurationManager.AppSettings["Domain"];
    m_ExchangeService.Credentials = new System.Net.NetworkCredential(CREATOR_MAIL, CREATOR_PASSWORD, domain);
    m_ExchangeService.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, CREATOR_MAIL);
    if (string.IsNullOrEmpty(EXCHANGE_URL))
    {
        m_ExchangeService.AutodiscoverUrl(CREATOR_MAIL);
    }
    else
    {
        m_ExchangeService.Url = new Uri(EXCHANGE_URL);
    }
    writeToLog($"Initialized exchange service for {CREATOR_MAIL}", MESSAGE);
}
The mailbox I am trying to access has 2-step authentication disabled, and I worked with the system admin to ensure, on our end, that I have the permissions I need to run this, but any address I try (even non-real ones) returns the same error.
I’d love some help figuring out what the problem is.
Refresh power query without opening file
Good afternoon,
I would like to update my Power Query without having to be in the file to trigger the refresh. I have Excel workbooks that live in SharePoint. I would like the Power Query to refresh when a new row of data is added to the source data. The file where the Power Query will live will not be opened every day. I would like this process to occur in the background without having to open the file. Please let me know if you have a solution.
After change of list title Power Automate does not recognize it through environment variable
Dear ladies and gentlemen,
I have created a list in SharePoint Online using only Latin characters, because I would like to have a nice URL like this: “Abkuerzungen”.
I then changed the list title to “Abkürzungen”.
When I use the SharePoint trigger “When an item is created or modified” (in German: “Wenn ein Element erstellt oder geändert wird”) and select this list directly by its changed name, everything works fine.
But when I try to reference the changed list name through an environment variable, I receive error 404 “List not found”.
I have reproduced this behavior with more lists.
I would say it is a bug.
Thank you
Ladislav Stupak
Announcing UNISTR and || operator in Azure SQL Database – preview
We are excited to announce that the UNISTR intrinsic function and ANSI SQL concatenation operator (||) are now available in public preview in Azure SQL Database. The UNISTR function allows you to escape Unicode characters, making it easier to work with international text. The ANSI SQL concatenation operator (||) provides a simple and intuitive way to combine characters or binary strings. These new features will enhance your ability to manipulate and work with text data.
What is UNISTR function?
The UNISTR function takes a text literal or an expression of characters and Unicode values that resolves to character data, and returns it as a UTF-8 or UTF-16 encoded string. This function allows you to use Unicode codepoint escape sequences together with other characters in the string. The escape sequence for a Unicode character can be specified in the form \xxxx or \+xxxxxx, where xxxx is a valid UTF-16 codepoint value and xxxxxx is a valid Unicode codepoint value. This is especially useful for inserting data into NCHAR columns.
The syntax of the UNISTR function is as follows:
UNISTR ( 'character_expression' [ , 'unicode_escape_character' ] )
The data type of character_expression could be char, nchar, varchar, or nvarchar. For char and varchar data types, the collation should be a valid UTF-8 collation only.
unicode_escape_character is a single character representing a user-defined Unicode escape sequence. If not supplied, the default value is \ (backslash).
Examples
Example #1:
For example, the following query returns the Unicode character for the specified value:
-- All the queries below will produce the same output.
SELECT UNISTR(N'Hello \D83D\DE00');
SELECT UNISTR(N'\0048ello \+01F603');
SELECT UNISTR(N'\0048\0065\006C\006C\006F \+01F603');
Results:
-------------------------------
Hello 😀
Example #2:
In this example, the UNISTR function is used with a user-defined escape character ($) and a VARCHAR data type with UTF-8 collation.
SELECT UNISTR ('I $2665 Azure SQL.' COLLATE Latin1_General_100_CI_AS_KS_SC_UTF8, '$');
Results:
——————————-
I ♥ Azure SQL.
The legacy collations with code page can be identified using the query below:
SELECT DISTINCT p.language, p.codepage
FROM sys.fn_helpcollations() AS c CROSS APPLY (VALUES(LEFT(c.name, CHARINDEX('_', c.name)-1), COLLATIONPROPERTY(c.name, 'codepage'))) AS p(language, codepage)
WHERE p.codepage NOT IN (0 /* Unicode Only collation */, 65001 /* UTF-8 code page */);
What is ANSI SQL concatenation operator (||)?
The ANSI SQL concatenation operator (||) concatenates two or more characters or binary strings, columns, or a combination of strings and column names into one expression. The || operator does not honor the SET CONCAT_NULL_YIELDS_NULL option and always behaves as if the ANSI SQL behavior is enabled. This operator will work with character strings or binary data of any supported SQL Server collation. The || operator supports compound assignment ||= similar to +=. If the operands are of incompatible collation, then an error will be thrown. The collation behavior is identical to the CONCAT function of character string data.
The syntax of the string concatenation operator is as follows:
expression || expression
The expression is a character or binary expression. Both expressions must be of the same data type, or one expression must be able to be implicitly converted to the data type of the other expression. If one operand is of binary type, then an unsupported operand type error will be thrown.
Examples
Example #1:
For example, the following query concatenates two strings and returns the result:
SELECT 'Hello ' || 'World!';
Results:
——————————-
Hello World!
Example #2:
In this example, multiple character strings are concatenated. If at least one input is a character string, non-character strings will be implicitly converted to character strings.
SELECT 'Josè' || ' Doe' AS full_name,
'Order-' || CAST(1001 AS VARCHAR) || '~TS~' || current_timestamp || '~' || NEWID() AS order_details,
'Item-' || NULL AS item_desc;
Results:
——————————-
full_name order_details item_desc
Josè Doe Order-1001~TS~Jun 1 2024 6:25AM~442A4706-0002-48EC-84FC-8AF27XXXX NULL
Example #3:
The example below concatenates two or more binary strings and also uses the compound T-SQL assignment operator ||=.
DECLARE @v1 VARBINARY(10) = 0x1a;
SET @v1 ||= 0x2b;
SELECT @v1 AS V1, 0x || 0x4E AS B1, CAST(NEWID() AS VARBINARY) || 0xa5 AS B2;
Results:
——————————-
V1 B1 B2
0x1A2B 0x4E 0xAE8C602E951AC245ADE767A23C834704A5
Example #4:
As shown in the example below, using the || operator with only non-character types or combining binary data with other types is not supported.
SELECT 1 || 2;
SELECT 1 || 'a' || 0x4e;
The queries above fail with the following error messages:
The data types int and int are incompatible in the concat operator.
The data types varchar and varbinary are incompatible in the concat operator.
Conclusion
In this blog post, we have introduced the UNISTR function and ANSI SQL concatenation operator (||) in Azure SQL Database. The UNISTR function allows you to escape Unicode characters, making it easier to work with international text. ANSI SQL concatenation operator (||) provides a simple and intuitive way to combine characters or binary data. These new features will enhance your ability to manipulate and work with text data efficiently.
We hope you will explore these enhancements, apply them in your projects, and share your feedback with us to help us continue improving. Thank you!
Microsoft Tech Community – Latest Blogs –Read More
Using patternCustom to plot antenna Radiation Pattern in one figure
I’m using MATLAB R2020a with the Antenna Toolbox to generate 2-D plots of the measured radiation pattern (power in dBi) of an antenna, saved in an Excel file. I wish to plot the Z-X plane and Z-Y plane slices of the radiation pattern, but can’t seem to understand how to combine the "right side" and "left side" patterns to create one plot. As I understand it, this has to do with using the polarpattern class, but I can’t seem to use it correctly. It would also be great to know how to fix the legend and add a plot title.
clearvars; % clear all workspace variables
close all; % close all figures
%% ------ Get data from Excel sheet ---------
% % Get antenna power (dBi) data, 13x17 array
% filename = 'RawAntennaData.xlsx';
% H = table2array(readtable(filename,'Sheet','Horizontal','Range','B2:R14'));
% % Get Theta (Elevation) and Phi (Azimuth) data, respectively a 13x1 and 1x17 array
% El = table2array(readtable(filename,'Sheet','Sheet1','Range','A2:A14')); % Elevation, "vertical" angle from 0 to pi
% Az = table2array(readtable(filename,'Sheet','Sheet1','Range','B1:R1')); % Azimuth, "horizontal" angle -pi to pi
%% ---- Get data for helix-type antenna radiation pattern ----
h = helix;
[H,Az,El] = pattern(h,2e9);
theta = El;
phi = Az';
MagE = H';
%% 3-D plot of antenna radiation pattern
figure;
patternCustom(MagE, theta, phi); % 3-D plot in polar coordinates
%% ----- 2-D plot of Z-X plane (theta) cut, in polar coordinates -------
figure; % plot Z-X RP data points (for Y=0, phi=0 deg) at +X axis (right side)
A = patternCustom(MagE, theta, phi,'CoordinateSystem','polar','Slice','phi','SliceValue',0);
% p = polarpattern('gco');
% P = polarpattern(p,'TitleTop','Polar Pattern of Monopole');
hold on
% figure; % plot Z-X RP data points (for Y=0, phi=180 deg) at -X axis (left side)
B = patternCustom(MagE, theta, phi,'CoordinateSystem','polar','Slice','phi','SliceValue',180);
hold off
legend('A', 'B');
MATLAB Answers — New Questions
Adams, Simulink, and Reinforcement Learning
Build a model using Adams and combine it with MATLAB’s Simulink for reinforcement learning. However, at the beginning of each training round, the Adams model always continues from the previous state. How can I make the model start from the initial state in each round?
MATLAB Answers — New Questions
Creating a grid on an image at an angle with a specific spacing
Hey All
The summary says it all. Basically, I have an image and code that draws two "axes" lines, a red one and a blue one. I want to form a grid using lines parallel to those in order to find their intersections and plot circles where they intersect.
I have provided an image for clarification!
MATLAB Answers — New Questions
unable to reactivate windows due to windows insider programme
I am unable to reactivate my Windows after I had to replace my motherboard with a like-for-like motherboard. Because I was part of the Windows Insider Programme, it is not allowing me to transfer my activated Windows, which is bound to my email address. I spoke to Microsoft and they said it could be fixed from their end by tech support, but they have directed me here and told me to somehow remove the Windows Insider Programme from my Windows account so they can transfer my activated Windows. I don’t understand how I’m meant to do that, as I don’t have access to all Windows features and cannot transfer Windows because it was an Insider version, and my current Windows, which is the same install, does not have access to any of the Insider features. I’m very frustrated because I was originally told that this could be fixed at Microsoft’s end and that they should be able to transfer my Windows activation to my PC, but they have now directed me here because they said I needed to remove Insider first.
Booking – No availability on this date. Choose another.
Good morning,
I use Office 365 (online) and have a Bookings page. I have booking availability on Tuesdays and Thursdays between 14:00 and 16:00, with a minimum booking lead time of 24 hours and a maximum of 365 days. However, when people try to book appointments, they find that no time slots are available. One point worth noting is that it worked for approximately one month; after that period we started having this problem. Could you help us?
As shown in the attached image for the team member, the working hours match the hours available in the calendar:
Vulnerabilities List from Defender API not filtering by My Organization
Hi all,
when I query the API for the list of vulnerabilities detected by Defender using this URL: GET https://api.securitycenter.microsoft.com/api/Vulnerabilities, I get a list of >250K vulnerabilities, but when I look in the Defender portal (Vulnerability Management -> Weaknesses) the list is 11K.
It seems that the API is not filtering vulnerabilities to “my organization” but returning the complete list of vulnerabilities known to Defender.
Is there any filter I can apply to the API URL to return only the vulnerabilities in my organization?
Thanks and best regards,
Alberto Medina
I cannot delete an old org in teams
Hello,
Unfortunately, I cannot delete an old org in my personal Teams, for which I used to work years ago. The switch to make it invisible is grayed out. When I go to my settings and “My Groups”, there is no sign of that org.
What did I miss? There must be a simple explanation 🙂
Additional Standard Database Template (MS Access)
I was inspired by the Northwind database and decided to work on an alternative package to complement the standard MS Access templates. What I am proposing is an additional accounting package template built in MS Access that goes all the way to balance sheet, profit and loss, debtors’ and creditors’ statements, aging analysis, project codes, work in progress, inventory valuation, invoicing, credit notes, and general ledger transaction history and analysis. It also comes with user logon, assignment of access rights, locking of transacting periods, inventory movement reports, sales analysis by product, by customer, and by sales rep, and much more. I believe it could be a great addition to the pre-loaded MS Access databases and increase the uptake of Office MIS products.
CEF Collector ingesting logs to ‘Syslog’ table instead of ‘CommonSecurityLog’
I am forwarding Palo Alto and Fortinet firewall logs to the CEF collector, but in Sentinel the logs appear in the ‘Syslog’ table instead of ‘CommonSecurityLog’. What could be the issue? Everything is in place, including the DCR.
Exception for Forwarding External Email to Group with External Senders Blocked
Is it possible to create an exception to allow a single external email address to a group which blocks external senders?
Missing entries in custom log table
We are writing to a custom log table in a few Log Analytics workspaces – these workspaces are targeted by a few different instances of our application (beta/staging/prod, etc.).
Interestingly, three of these workspaces are missing certain logs while the other five or so do have them. There are also no exceptions thrown in our ASP.NET Core code where we do a SendMessage to OMS.
Any ideas if something like this is possible and how to troubleshoot/fix?
Thanks
Create error message when Currency Field exceeds maximum
I have a SharePoint form for which I want to set a maximum allowed value and have an error message appear before the form is saved, telling the requester that the field exceeds the maximum allowed value. I have tried a variety of things but am not having any luck. The form can’t be saved, but it doesn’t inform the user as to why.
AI+API better together: Benefits & Best Practices using APIs for AI workloads
This blog post will give you an overview of benefits and best practices you will get harnessing APIs and an API Manager solution when integrating AI into your application landscape.
Adding artificial intelligence (AI) to existing applications is becoming an important part of application development. Correct integration of AI is vital to meet business goals and functional and non-functional requirements, and to build applications that are efficient to maintain and enhance. APIs (Application Programming Interfaces) play a key part in this integration, and an API manager is fundamental to keeping control of the usage, performance, and versioning of APIs – especially in enterprise landscapes.
Quick Refresher: What is an API & API Manager?
An API is a connector between software components. It promotes the separation of components by adding an abstraction layer, so someone can interact with a system and use its capabilities without understanding its internal complexity. Every AI service we leverage is accessed via an API.
An API manager is a service that manages an API’s lifecycle, acts as a single point of entry for all API traffic, and provides a place to observe APIs. For AI workloads it is an API gateway that sits between your intelligent app and the AI endpoint. Adding an API gateway in front of your AI endpoints is a best practice for adding functionality without increasing the complexity of your application code. You also create a continuous development environment that increases the agility and speed of bringing new capabilities into production while maintaining older versions.
This blog post will show the benefits and best practices of AI + APIs in 5 key areas:
Performance & Reliability
Security
Caching
Sharing & Monetization
Continuous Development
The best practices in bold are universal and apply to any technology. The detailed explanation focuses on the features of Azure API Management (APIM) and the Azure services surrounding it.
1. Performance & Reliability
If you aim to add AI capability to an existing application, it seems easiest to connect an AI endpoint directly to the existing app. In fact, a lot of tutorials use this scenario.
While this is a faster setup at the beginning, it eventually leads to challenges and code complexity once application requirements increase or multiple applications use the same AI service. With more calls targeting an AI endpoint, performance, reliability, and latency become requirements. Azure AI services have limits and quotas, and exceeding those limits will lead to error responses or unresponsive applications. To ensure a good user experience in production workloads, placing an API manager between the intelligent app and the AI endpoint is a best practice.
Azure APIM, acting as an AI Gateway, provides load balancing and monitoring of AI Endpoints to guarantee consistent and reliable performance of your deployed AI models and your intelligent apps. For the best result, multiple instances of an AI model should be deployed in parallel so requests can be distributed evenly (see Figure 2). The number of instances depends on your business requirements, use cases, and forecasted peak traffic scenarios. You can route the traffic randomly or via round robin to load balance it evenly, or apply more targeted routing rules to distribute traffic deliberately.
Distributing requests across multiple AI instances is more than just load balancing. Using built-in policies or writing custom policies in Azure APIM enables you to route traffic to selected Azure AI Endpoints or forward traffic to a regional endpoint closer to the user’s location. For more complex workloads, the use of backend pools can add value (see Figure 3). A backend pool defines a group of resources which can be targeted depending on their availability, response time, or workload. APIM can distribute incoming requests across them based on patterns like the circuit breaker pattern, preventing applications from repeatedly trying to execute an operation that’s likely to fail. Both ways of distribution are a good practice to ensure optimal performance and reliability during planned outages (upgrades, maintenance), unplanned outages (power outages, natural disasters), high-traffic scenarios, or when data residency requirements apply.
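To make the backend-pool behaviour concrete, here is a minimal Python sketch of round-robin selection combined with a circuit breaker. The backend names, failure threshold, and cooldown values are illustrative only; in a real deployment this logic is expressed as APIM backend-pool and circuit-breaker configuration rather than application code.

```python
import time

class Backend:
    def __init__(self, name):
        self.name = name
        self.failures = 0
        self.open_until = 0.0  # circuit is "open" (backend skipped) until this time

class BackendPool:
    """Round-robin over healthy backends with a simple circuit breaker:
    after 3 consecutive failures a backend is skipped for 30 seconds."""
    FAILURE_THRESHOLD = 3
    COOLDOWN_SECONDS = 30

    def __init__(self, names):
        self.backends = [Backend(n) for n in names]
        self._i = 0

    def next_backend(self, now=None):
        now = time.time() if now is None else now
        for _ in range(len(self.backends)):
            b = self.backends[self._i]
            self._i = (self._i + 1) % len(self.backends)
            if b.open_until <= now:          # circuit closed -> backend usable
                return b
        raise RuntimeError("no healthy backend available")

    def report_failure(self, backend, now=None):
        now = time.time() if now is None else now
        backend.failures += 1
        if backend.failures >= self.FAILURE_THRESHOLD:
            backend.open_until = now + self.COOLDOWN_SECONDS  # trip the circuit
            backend.failures = 0

    def report_success(self, backend):
        backend.failures = 0
```

The gateway would call `next_backend()` per request and report the outcome, so a failing AI instance is automatically taken out of rotation until its cooldown expires.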
Another method to keep performance high and requests under control is adding a rate-limiting pattern to throttle traffic to AI models. Limiting access by time, IP address, registered API consumer, or API key allows you to protect the backend against volume bursts as well as potential denial-of-service attacks. Applying an AI token-based limit as a policy is a good practice to define a tokens-per-minute throttle and restrict noisy neighbours (see Figure 4).
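A token-based limit can be pictured as a per-key budget of tokens per minute. The Python sketch below is a simplified illustration using fixed one-minute buckets; the key names and limit are invented, and in APIM this behaviour is configured as a policy rather than written in your application.

```python
from collections import defaultdict

class TokenRateLimiter:
    """Per-key tokens-per-minute budget, in the spirit of an APIM
    token-limit policy. Window handling is simplified to fixed
    one-minute buckets; limits and key names are illustrative."""

    def __init__(self, tokens_per_minute):
        self.tpm = tokens_per_minute
        self.used = defaultdict(int)   # (key, minute) -> tokens consumed

    def try_consume(self, api_key, tokens, now_seconds):
        minute = int(now_seconds // 60)
        bucket = (api_key, minute)
        if self.used[bucket] + tokens > self.tpm:
            return False               # caller should respond with HTTP 429
        self.used[bucket] += tokens
        return True
```

Each consumer spends from its own budget, so one noisy neighbour exhausting its quota does not affect the others.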
But rate limiting and load balancing are not enough to ensure high performance. Consistent monitoring of workloads is a fundamental part of operational excellence. This includes health checks of endpoints, connection attempts, request times, and failure counts. Azure API Management can help keep all information in one place by storing analytics and insights of requests in a Log Analytics workspace (see Figure 4). This allows you to gain insights into the usage and performance of the APIs and API operations, how they perform over time, and how they behave in different geographic regions. Adding Azure Monitor to the Log Analytics workspace allows you to visualize, query, and archive data coming from APIM, as well as trigger corresponding actions. These actions can be anomaly alerts sent to API operators via push notification, email, SMS, or voice message on any critical event.
2. Security
Protecting an application is a key requirement for every business to prevent data loss, denial-of-service attacks, or unauthorized data access. Security is a multi-layer approach spanning infrastructure, application, and data. APIs act as one security layer, providing input security, key management, access management, and output validation in a central place.
While there is no single right way of adding security, adding input validation at a central point is beneficial for easy maintenance and fast adjustment when new vulnerabilities emerge. For external-facing applications this should include an application firewall in front of APIM. With input validation in Azure, APIM scans all incoming requests against rules and regular expressions to protect the backend against malicious activities and vulnerabilities such as SQL injection or cross-site scripting, so that only valid requests are processed by the AI Endpoints. Validation is not limited to input; it can also be used for output security, preventing data from being exposed to external or unauthorized resources or users (see Figure 5).
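As a rough illustration of this kind of pattern-based screening, the following Python sketch checks a request body against a small deny list before it would be forwarded to the AI endpoint. The patterns and size limit are examples only; real WAF and APIM rule sets are far more extensive.

```python
import re

# Illustrative deny-list patterns; a production rule set would be
# far more comprehensive than these two examples.
SUSPICIOUS_PATTERNS = [
    re.compile(r"(?i)\b(union\s+select|drop\s+table)\b"),  # SQL injection hints
    re.compile(r"(?i)<\s*script\b"),                        # cross-site scripting
]

def is_request_allowed(body: str, max_length: int = 4000) -> bool:
    """Return True only if the request body passes basic size and
    pattern checks before being forwarded to the AI endpoint."""
    if len(body) > max_length:
        return False
    return not any(p.search(body) for p in SUSPICIOUS_PATTERNS)
```

The same check, run on responses instead of requests, illustrates how output validation can stop sensitive content from leaving the gateway.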
Access management is another pillar of security. To authenticate access to Azure AI endpoints you could hand the API keys to the developers and give them direct access to the AI Endpoints. This, however, leaves you with no control over who is accessing your AI models. A better option is to store API keys in a central place like Azure APIM and create an input policy, so that access is restricted to authorized users and applications.
Microsoft Entra ID (formerly Azure Active Directory) is a cloud-based identity and access management solution that can authenticate users and applications via single sign-on (SSO) credentials, passwords, or an Azure managed identity. For more fine-grained access, and as part of a defence-in-depth strategy, adding backend authorization with OAuth 2.0 and JWT (JSON Web Token) validation is a good practice.
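To illustrate what JWT validation checks, the sketch below decodes a token's payload and verifies the `exp` (expiry) and `aud` (audience) claims. Note that it deliberately skips signature verification, which APIM's validate-jwt policy (or a cryptographic library) must perform in practice; the audience value and helper names are hypothetical.

```python
import base64
import json
import time

def _b64url_decode(segment: str) -> bytes:
    padding = "=" * (-len(segment) % 4)
    return base64.urlsafe_b64decode(segment + padding)

def make_token(claims: dict) -> str:
    """Build an unsigned demo token (for illustration only)."""
    enc = lambda d: base64.urlsafe_b64encode(json.dumps(d).encode()).decode().rstrip("=")
    return enc({"alg": "none"}) + "." + enc(claims) + ".sig"

def check_jwt_claims(token: str, expected_audience: str, now=None) -> bool:
    """Inspect a JWT payload and check the `exp` and `aud` claims.
    NOTE: this sketch does NOT verify the signature; a real gateway
    (e.g. APIM validate-jwt) performs full cryptographic validation."""
    now = time.time() if now is None else now
    try:
        _header, payload, _signature = token.split(".")
        claims = json.loads(_b64url_decode(payload))
    except ValueError:
        return False        # malformed token or payload
    if claims.get("aud") != expected_audience:
        return False
    return claims.get("exp", 0) > now
```

Rejecting tokens for the wrong audience keeps a credential issued for one API from being replayed against another.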
Within Azure API Management you can also fine-tune access rights per user or user group by adding role-based access control (RBAC). It is good practice to use the built-in roles as a starting point to keep the number of roles as low as possible. If the default roles do not match your company’s needs, custom roles can be created and assigned. Adding users to groups and maintaining access rights at the group level is another good practice, as it minimizes maintenance effort and adds structure.
3. Caching
Do you have an FAQ page that covers the most common questions? If you do, you likely created it to lower costs for the company and save time for the user. A response cache works the same way: it stores previously requested information in memory for a predefined time and scope. Information that does not change frequently and does not contain sensitive data can be stored and reused. When using a cache, every request from the front end is analysed semantically to check whether an answer is available in the cache. If the semantic search is successful, the response from the cache is used; otherwise the request is forwarded to the AI model, and the response is sent to the requesting application and stored in the cache if the caching requirements are met.
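Conceptually, a semantic cache compares the embedding of a new request against the embeddings of stored requests and reuses a cached response when the similarity exceeds a threshold. The Python sketch below illustrates the idea with plain vectors standing in for real embedding-model output; the 0.95 threshold is an arbitrary example.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

class SemanticCache:
    """Store (embedding, response) pairs and reuse a cached response
    when a new request's embedding is similar enough. In practice the
    embeddings come from an embedding model; here they are plain vectors."""

    def __init__(self, threshold=0.95):
        self.threshold = threshold
        self.entries = []  # list of (embedding, response) pairs

    def lookup(self, embedding):
        best, best_score = None, 0.0
        for emb, response in self.entries:
            score = cosine_similarity(embedding, emb)
            if score > best_score:
                best, best_score = response, score
        return best if best_score >= self.threshold else None

    def store(self, embedding, response):
        self.entries.append((embedding, response))
```

A near-identical rewording of a cached question scores above the threshold and is answered from the cache, while an unrelated question falls through to the AI model.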
There are different caching options (see Figure 7): (1) the built-in Azure APIM cache for simple use cases, or (2) an external cache like Redis Cache for more control over the cache configuration.
To get insights into frequently asked questions and cache usage, analytics data can be collected with Application Insights and visualized in real time using Grafana dashboards. This allows you to identify trends in your intelligent apps, share insights for application improvement with decision makers, and inform model fine-tuning with engineering teams.
4. Sharing, Chargeback & Monetization
Divide and conquer is a common IT paradigm which can help you with your AI use cases. Sharing content and learnings across divisions, rather than working in isolation and repeating similar work, increases the speed of innovation and decreases the cost of developing new IP (intellectual property). While this is not possible in every company, most organizations would welcome a more collaborative approach, especially when developing and testing new AI use cases. Developing tailored AI components in a central team and reusing them throughout the company will add speed and agility. But how do you track the usage across all divisions and share costs?
Once you have overcome the difficult cultural aspect of sharing information across divisions, charging back costs is mainly an engineering problem. With APIM, you can bill and charge back per API usage. Depending on how you want to charge back or monetize your AI capability, you have different billing methods to choose from: subscription and metered. With subscription billing, the user pays a fixed fee upfront and uses the service according to the terms and conditions, like a video streaming service. This billing model gives you, as the API owner, predictable income and simpler capacity planning. Conversely, with metered billing, the user pays according to the frequency of their activity, similar to an energy bill. This option gives the user more freedom to pay only for what they use, but it is better suited to organisations with highly scalable infrastructure set-ups, as metered billing can make scaling out AI instances more complex.
Monitoring the analytics of each call can help with scaling and optimization. Without accessing the content itself, monitoring gives you a powerful tool to track real-time analytics. Through outbound policies the analytics data can be streamed with Event Hub to Power BI to create real-time dashboards, or to Application Insights to view the token usage for each client (see Figure 8). This information can help to automate internal chargeback or generate revenue by monetizing IP. An optional integration of third-party payment providers facilitates the payments. This solves the cost question. But once you share your IP widely, how can you ensure high performance for all users?
Limiting the requests per user, token, or time (as explained in the Performance section) controls how many requests a user can send based on policies. This gives each project the right performance for the APIs they use. Sending requests to a specific AI instance based on the project’s development stage helps you balance performance and costs. For example, dev/test workloads can be directed to less expensive pay-as-you-go instances when latency is not critical, while production workloads can be directed to AI Endpoints that use provisioned throughput units (PTUs) (see Figure 9). This allocated infrastructure is ideal for production applications that need consistent response times and throughput. By using the capacity planner to size your PTU deployment, you will have a reserved AI instance that suits your workloads. Future increases in traffic can be routed to either another PTU instance or a pay-as-you-go instance in the same region or another one.
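This routing idea can be sketched as a simple environment-to-backend mapping with an overflow rule. The endpoint URLs below are placeholders, and in practice this decision would be an APIM routing policy rather than application code.

```python
# Hypothetical endpoint names; the mapping mirrors the idea of sending
# latency-tolerant dev/test traffic to pay-as-you-go deployments and
# production traffic to provisioned-throughput (PTU) deployments.
ROUTES = {
    "dev": "https://payg-eu.example.net/openai",
    "test": "https://payg-eu.example.net/openai",
    "prod": "https://ptu-eu.example.net/openai",
}

def route_request(environment: str, ptu_at_capacity: bool = False) -> str:
    """Pick a backend by environment; spill production overflow to
    pay-as-you-go when the PTU instance is at capacity."""
    if environment == "prod" and ptu_at_capacity:
        return ROUTES["dev"]  # overflow to the cheaper PAYG endpoint
    return ROUTES.get(environment, ROUTES["dev"])
```

The overflow branch reflects the common pattern of reserving PTU capacity for baseline production traffic and absorbing spikes on pay-as-you-go instances.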
5. Continuous Development
Keeping up with quickly evolving AI models is challenging, as new models come to the market within months. With every new model release, companies need to decide whether to stay on a former version or use the newest one for their use case. To keep the development lifecycle efficient, it is a good practice to have separate teams focusing on parts of the application: divide and conquer. This can mean parallel development of the consuming application and the corresponding AI capability within a project team, or a central AI team sharing their AI capability with the wider company. For either model, using APIs to link the parts is paramount. But the more APIs created, the more complex the API landscape becomes.
A single API Manager is a best practice to manage and monitor all created APIs and provide one source of information for sharing APIs with your developers, allowing them to test the API operations and request access. The information shared should include an overview of the available API versions, revisions, and their status, so developers can track changes and switch to a newer API version when needed or convenient for their development. A roadmap is a nice-to-have feature if your development team is comfortable sharing their plans.
While such an overview of APIs can be created anywhere and is still often kept in wikis, it is best to keep the documentation directly linked to your APIs so it stays up to date. Azure APIM automatically creates a so-called Developer Portal, a customizable webpage containing all the details about the APIs in one place and reflecting changes made in APIM immediately, as the two services are linked (see Figure 10). This additional free-of-charge portal provides significant benefits to API developers and API consumers. The API consumer can view the APIs and documentation and test all API operations visible to them. The API developer can share additional business information, set up fine-grained access management for the portal, and track API usage to see which API versions are actively used and when it is safe to retire older versions or provide long-term support.
Application development is usually brownfield, with existing applications or APIs deployed in different environments or on multiple clouds. APIM supports importing existing OpenAPI specifications and other APIs, making it easy to bring all APIs into one API Management instance. The APIM instances can then be deployed on Azure or other cloud environments as a managed service. This allows you and your team to decide when to move workloads, if wanted or needed.
Summary
AI-led applications usher in a new era of working, and we’re still in its early stages. This blog post gave insights why AI and APIs are a powerful combination, and how an API Manager can enhance your application to make it more agile, efficient, and reliable. The best practices I covered on performance, security, caching, sharing, and continuous development are based on Microsoft’s recommendations and customer projects I’ve worked on across various industries in the UK and Europe. I hope this guide will help you to design, develop, and maintain your next AI-led application.