Category Archives: Microsoft
Shengshi Company online customer service inquiry lx6789122
To open and register an account with Shengshi Group (lx6789122), you would typically follow these general steps. Prepare the relevant documents, which may include personal or business identification, contact details, and proof of address. Contact Shengshi Group through its official website or customer service channels to learn the specific registration process and document requirements. Fill in the forms with the requested information. Submit the prepared documents to Shengshi Group, either online or offline. Wait for review: the company reviews the submitted documents. Complete the review: once approved, the account is successfully registered. Note that the exact steps and requirements may vary according to Shengshi Group's rules, so you are advised to communicate with the company in detail to ensure registration goes smoothly.
VBA code copies an Excel template into a specific workbook which is in the cloud – hangs randomly
My client has a small Access VBA application that copies an Excel template to a systematically named Excel file, populates it, and then saves the destination workbook. It hangs "randomly" at copy time (one file-system CopyFile statement), claiming it cannot find the renamed destination file, before the file has even been populated. Access runs on one PC on a LAN, but the Excel files are stored in the cloud on OneDrive. I suspect that the short process is sometimes too fast for the cloud to create or find the destination Excel file. I also discovered that a newly arrived Excel advice banner was a problem; I "upgraded" the original Excel template as advised and that issue disappeared. But the "can't find" issue is not new. Any similar experiences?
Working view vs print view
Hi all, I'm just trying to figure out how to work on my sheets (invoices) in the exact view in which they will print. For example, I have a wrapped cell, and when I enter a description that is too long for the cell it wraps onto a second line. That's all good, and I can adjust the row height to fit if necessary. The problem I'm having is that when I go to Print Preview, the line that previously wrapped to the next line actually fits on one line and prints on one line too. I am forever checking print previews to see if the text fits or if adjustments need to be made before emailing invoices out. So how can I just use the active worksheet at the actual size it will print, to avoid the wasted time? Hopefully this makes sense. C
This PC can’t run Windows
I am building a new computer from scratch and I’m getting an error when trying to install Windows 11.
“This PC can’t run Windows.”
Motherboard: Asus Prime B450M-A II
CPU: AMD Ryzen 3 4100
GPU: GEFORCE GTX 1650
Hard drive: FIKWOT FS810 250GB SSD
RAM: 16GB
Secure boot: Windows UEFI mode
AMD fTPM configuration: Enable Firmware TPM
I’ve tried updating the BIOS, but I still can’t figure out why Windows 11 won’t install with these parts.
Looking for a safe, ad-free YouTube to MP4 converter for Windows 11
I’m currently looking for recommendations on the best YouTube to MP4 converter for my Windows 11 Dell laptop. I need a tool that is both reliable and efficient, preferably with options to adjust the quality and format of the video. My main concern is finding software that produces high-quality output and isn’t too complicated. I’ve tried a few online services, but they usually come with intrusive ads or limit the length of the download unless you pay for a subscription. Can anyone recommend a good YouTube to MP4 converter that combines functionality and ease of use? I’d be grateful.
Column Format – Time!
Good Afternoon
I’m trying to format a DateTime field with a bit of verve. I got JSON from GitHub, but I need to modify it because I want to include the time. The only example that includes what I’m looking for is this:
https://github.com/pnp/List-Formatting/tree/master/column-samples/generic-world-time
But I can’t get it quite to work and I don’t understand why.
This is what I’m aiming for but the time represented is 21 June – 2:30 pm! I’m 14 hours out (Regional Settings are Australia, UTC + 10)
As the original code comes from something that does a time conversion, I thought to take out the bit that adds the conversion factor. I have tried removing “+Number([$UTC]” (and why isn’t it screaming at me anyway for including a reference to a non-existing field?), adding 0, multiplying by one …
"elmType": "div",
"children": [
    {
        "elmType": "span",
        "txtContent": "=padStart(toString(floor(((Number(@currentField)+Number([$UTC])*3600000)%86400000)/3600000)),2,'0')"
    },
    {
        "elmType": "span",
        "txtContent": ":"
    },
    {
        "elmType": "span",
        "txtContent": "=padStart(toString(floor(((Number(@currentField)+Number([$UTC])*3600000)%86400000)%3600000/60000)),2,'0')"
    }
]
They all result in something like this:
It’s all a foreign language to me but I’ve worked out:
padStart pads the string at the beginning with 0 to a length of 2 – won’t actually need that for the hour, happy with 1:30 pm
Number I’m guessing turns the date time string into a number which we can then manipulate
floor does some rounding, not sure why that’s needed
toString turns it back into a string for display, I imagine
But I don’t understand why it would need the UTC calculation. And I think it’s the UTC calculation that messes with my times.
If anybody understands this or can point me to some documentation that will help me understand, I would appreciate it.
Thanks
Christine
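For reference, the arithmetic inside those txtContent expressions can be reproduced in a few lines of Python: in these formulas Number(@currentField) behaves like milliseconds since the Unix epoch, and [$UTC] supplies an offset in hours. The timestamp below is purely illustrative, not taken from the post:

MS_PER_HOUR = 3_600_000
MS_PER_DAY = 86_400_000

def hour_and_minute(ms_since_epoch: int, utc_offset_hours: int) -> tuple[int, int]:
    """Mirror the column-formatting maths: shift by the offset, wrap to one day,
    then floor-divide out the hours and minutes."""
    shifted = (ms_since_epoch + utc_offset_hours * MS_PER_HOUR) % MS_PER_DAY
    hour = shifted // MS_PER_HOUR               # the floor(.../3600000) part
    minute = (shifted % MS_PER_HOUR) // 60_000  # remaining milliseconds -> minutes
    return hour, minute

ts = 1_718_944_200_000  # 2024-06-21 04:30:00 UTC, as milliseconds since the epoch
print(hour_and_minute(ts, 0))   # (4, 30)  -> the raw UTC hour and minute
print(hour_and_minute(ts, 10))  # (14, 30) -> the same instant shifted to UTC+10

Dropping the offset term therefore shows the stored UTC hour, while adding a whole-hour offset shifts it; floor is only there to chop the division down to whole hours, so the minutes are never lost.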
Intune
Dear All,
All my endpoints have joined Azure AD, and I have pushed a configuration profile and removed all the existing administrator permissions.
The issue now is that users on one project need limited permissions on their devices to change their network settings, such as changing their IPs and settings based on their project.
I need to fix this through Intune. Kindly help: is there any way to resolve this issue?
All the endpoints are joined to Azure AD only, and we do not have any existing company domain.
AFP program session
I serve on the board of the Association of Fundraising Professionals Collier-Lee Chapter in Florida. We’re planning a session on AI and its use and place in philanthropy. We’d like to invite an organization that helps nonprofits use AI to the panel discussion.
The session will be from 8:30-9:30 a.m. July 23rd.
Microsoft Travel/ Real Estate Community
Is there a travel or real estate group/community on Microsoft?
Windows Server 2022 May updates have failed
Hello,
I have a Windows Update issue: the May updates have failed. I ran the Windows Update troubleshooter and it said it made some changes, but the updates still fail. I have several screenshots I will add at the bottom. I have tried the updates several times, but nothing worked.
After the reboots to remove the software that could not be updated, the Windows Update screen had an error code to research:
0x8007054f
There is plenty of room on the C: drive.
I don’t know if the info on the second screen points to the issue or not; it was with the errors in the Event Log area.
Any assistance and suggestions would be greatly appreciated.
My post was marked as spam incorrectly by someone…what do I do?
First of all, I am a loyal Windows user, and this is my first time posting here… I tried to create a post to seek help on how to download my own YouTube videos to my Windows 10 computer.
I am a self-media person, and I often need to make short videos to earn a living. My post received many enthusiastic responses and help from netizens, which touched me a lot. But not long after, my post was marked as spam by someone and could no longer be viewed: https://techcommunity.microsoft.com/t5/windows-10/how-to-download-youtube-videos-in-laptop-windows-10/m-p/4152498#M11810
How does this community work, and how do I prevent this from happening? Please help in removing the post from the spam list, as we need support on this. Thanks!
NEW Digital Event: Intro to Copilot Partner-led Immersion Experience | June 19
Join us on June 19th to learn how to use the recently launched Partner-Led Copilot Immersion Experience, providing you the ability to demo and help customers with hands-on experiences.
In this session, we will walk you through each of the assets to drive adoption of Copilot for Microsoft 365 by role/persona and provide guidance about the best way to utilize and demo with your customers – showcasing how Copilot can help businesses solve common problems and achieve more.
Please register for your preferred time zone:
Introduction to Copilot Partner Led Immersion Experience | Digital Event Americas/EMEA
June 19, 2024 8:00 AM (America/Los Angeles) | Register here
Introduction to Copilot Partner-Led Immersion Experience | Digital Event APAC
June 19, 2024 5:00 PM (America/Los Angeles) | Register here
REGISTER TODAY!
Database mail and connection pool depletion
Database Mail (DBMail) in SQL Server is a feature that allows you to send emails directly from SQL Server, but it has limitations when sending a large volume of email. The issues you may be experiencing with DBMail failing to send mail at high volume can be due to several factors.
DBMail relies on the .NET Framework’s SmtpClient class, and any issues with this underlying component could affect DBMail’s functionality.
The SmtpClient class implementation pools SMTP connections so that it can avoid the overhead of re-establishing a connection for every message to the same server. An application may re-use the same SmtpClient object to send many different emails to the same SMTP server and to many different SMTP servers. As a result, there is no way to determine when an application is finished using the SmtpClient object and it should be cleaned up.
It is also important to note that SmtpClient is deprecated in .NET Core and .NET 5.0 and later versions. While it is still available for compatibility reasons, it is recommended to use alternative libraries for new development.
As a side effect in SQL Server Database Mail, we may see the following error:
The mail could not be sent to the recipients because of the mail server failure. (Sending Mail using Account 7 (2024-01-08T16:07:10). Exception Message: 1) Exception Information =================== Exception Type: Microsoft.SqlServer.Management.SqlIMail.MailFramework.Exceptions.BaseMailFrameworkException Message: Cannot send mails to mail server. (The operation has timed out.) Data: System.Collections.ListDictionaryInternal TargetSite: Void Send(Microsoft.SqlServer.Management.SqlIMail.MailFramework.Framework.IMessage) HelpLink: NULL Source: DatabaseMailProtocols HResult: -2146232832 StackTrace Information =================== at Microsoft.SqlServer.Management.SqlIMail.MailFramework.Smtp.SmtpMailSender.Send(IMessage msg) at Microsoft.SqlServer.Management.SqlIMail.Server.Controller.ProfileMailSender.SendMailToAccount(Account a, IMessageSender ms, OutMailItem si) 2) Exception Information =================== Exception Type: System.Net.Mail.SmtpException StatusCode: GeneralFailure Message: The operation has timed out. Data: System.Collections.ListDictionaryInternal TargetSite: Void Send(System.Net.Mail.MailMessage) HelpLink: NULL Source: System HResult: -2146233088 StackTrace Information =================== at System.Net.Mail.SmtpClient.Send(MailMessage message) at Microsoft.SqlServer.Management.SqlIMail.MailFramework.Smtp.SmtpMailSender.Send(IMessage msg). )
As a workaround, we can take the following actions:
Try to follow all limitations that the mail server imposes on your mail account and do not exceed them. This avoids possible exceptions in the SmtpClient <-> mail server layer.
For example, if the Exchange server is configured with a maximum number of concurrent connections, make sure that your script or application does not send more emails than that limit allows; a throttling sketch is shown below.
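As an illustrative sketch (not a prescribed implementation) of that kind of throttling, the following Python example queues mail through msdb.dbo.sp_send_dbmail in small batches with a pause between batches; the connection string, profile name, batch size, and message contents are all assumptions made for the example:

import time
import pyodbc

# Assumed connection string and Database Mail profile; adjust for your environment.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=msdb;Trusted_Connection=yes"
)
PROFILE = "MyMailProfile"   # hypothetical Database Mail profile name
BATCH_SIZE = 20             # stay below the mail server's concurrency limit
PAUSE_SECONDS = 30          # breathing room between batches

def send_in_batches(recipients):
    """Queue one message per recipient via sp_send_dbmail, throttled in batches."""
    conn = pyodbc.connect(CONN_STR, autocommit=True)
    try:
        cursor = conn.cursor()
        for i, recipient in enumerate(recipients, start=1):
            cursor.execute(
                "EXEC msdb.dbo.sp_send_dbmail "
                "@profile_name = ?, @recipients = ?, @subject = ?, @body = ?",
                PROFILE, recipient, "Status report", "Body text goes here",
            )
            if i % BATCH_SIZE == 0:
                time.sleep(PAUSE_SECONDS)  # let Database Mail drain its queue
    finally:
        conn.close()

Keeping the send rate below the server's limits is the point; the batch size and pause are knobs to tune against your own mail server configuration.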
If you find that sending mail with DatabaseMail starts to fail, please restart Databasemail.exe. DatabaseMail will resend all failed mails upon restart.
Finally, please note that SmtpClient doesn’t support many modern protocols; it is now maintained for compatibility only and doesn’t scale to the requirements of modern protocols.
More information is available at the link below:
SmtpClient Class (System.Net.Mail) | Microsoft Learn
Deployment rules for Notebooks: Enhancing Efficiency with Microsoft Fabric.
Introduction.
Fabric Notebooks, an integral component of the Microsoft Fabric ecosystem, offer a powerful platform for interactive data exploration, analysis, and collaboration. Designed to enhance productivity and streamline workflows, Fabric Notebooks provide users with a versatile environment to write, execute, and share code and visualizations alongside workspaces.
Fabric Notebooks empower users to engage in interactive data analysis using programming languages such as Python and R. By seamlessly integrating code execution with explanatory text and visualizations, Fabric Notebooks streamline the workflow for data exploration and interpretation. Moreover, notebooks support collaborative work environments, enabling multiple users to collaborate on the same notebook simultaneously.
Utilizing source control through Fabric’s integration with Git facilitates transparency and documentation within teams. This integration enhances the ability to replicate analyses and share findings with stakeholders effectively.
In this article, we describe the relationship between Notebooks and Lakehouses in Fabric, the advantages of implementing source control for notebooks, and delve into the utilization of deployment rules for Microsoft Fabric Notebooks. These rules serve to expedite deployment processes and aid in the compartmentalization of knowledge pertaining to the components comprising an analytical solution.
Notebooks and Lakehouses.
The Microsoft Fabric Notebook is a primary code item for developing Apache Spark jobs and machine learning experiments. It’s a web-based interactive surface used by data scientists and data engineers to write code benefiting from rich visualizations and Markdown text. Data engineers write code for data ingestion, data preparation, and data transformation. Data scientists also use notebooks to build machine learning solutions, including creating experiments and models, model tracking, and deployment.
You can either create a new notebook or import an existing notebook.
Like other standard Fabric item creation processes, you can easily create a new notebook from the Fabric Data Engineering homepage, the workspace New option, or the Create Hub.
You can create new notebooks or import one or more existing notebooks from your local computer to a Fabric workspace from the Data Engineering or the Data Science homepage. Fabric notebooks recognize the standard Jupyter Notebook .ipynb files, and source files like .py, .scala, and .sql.
To learn more about notebooks creation see How to use notebooks – Microsoft Fabric | Microsoft Learn
The next figure shows Notebook1 in a workspace named "Learn".
The items contained in this workspace can be added to source control if you have integrated it with an ADO Repo, using this feature of Fabric explained at Overview of Fabric Git integration – Microsoft Fabric | Microsoft Learn.
With Git integration, you can back up and version your notebook, revert to previous stages as needed, collaborate or work alone using Git branches, and manage your notebook content lifecycle entirely within Fabric.
Fabric notebooks now support close interactions with Lakehouses.
Microsoft Fabric Lakehouse is a data architecture platform for storing, managing, and analyzing structured and unstructured data in a single location.
You can easily add a new or existing lakehouse to a notebook from the Lakehouse explorer:
We can also create the notebook from the Lakehouse; in that case, that Lakehouse becomes the default Lakehouse of the notebook.
Notebooks may also already exist with different code to analyze data, and we can select which notebook (or notebooks) the Lakehouse is going to be analyzed with.
Here is an example of a notebook and its associated Lakehouse.
So a Lakehouse can be analyzed with one or more notebooks and, vice versa, a notebook can be used to analyze one or more lakehouses; however, a notebook can have a pinned Lakehouse, which is the default Lakehouse where the notebook’s code stores, transforms, and visualizes data.
You can navigate to different lakehouses in the Lakehouse explorer and set one lakehouse as the default by pinning it. Your default is then mounted to the runtime working directory, and you can read or write to the default lakehouse using a local path.
The default lakehouse in a notebook is typically managed in the configuration settings of the notebook’s code. You can set or overwrite the default lakehouse for the current session programmatically using a configuration block in your notebook. Here’s an example of how you might configure it:
%%configure
{
    "defaultLakehouse": {
        "name": "your-lakehouse-name",  # The name of your lakehouse
        # "id": "<(optional) lakehouse-id>",  # The ID of your lakehouse (optional)
        # "workspaceId": "<(optional) workspace-id-that-contains-the-lakehouse>"  # The workspace ID if it's from another workspace (optional)
    }
}
This code snippet should be placed at the beginning of your notebook to set the default lakehouse for the session. If you’re using a relative path to access data from the lakehouse, the default lakehouse will serve as the root folder at runtime.
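As an illustration, once the default lakehouse is mounted, notebook code can address it with relative paths. The following minimal PySpark sketch assumes a hypothetical Files/sales.csv file and an OrderId column, neither of which comes from this article:

# Read a CSV file from the default lakehouse using a relative path
# ("Files/..." resolves against the pinned/default lakehouse at runtime).
df = spark.read.option("header", "true").csv("Files/sales.csv")

# Basic clean-up, then save the result back to the same default lakehouse
# as a managed Delta table.
df_clean = df.dropDuplicates().na.drop(subset=["OrderId"])
df_clean.write.mode("overwrite").format("delta").saveAsTable("sales_clean")

Here spark is the session that Fabric notebooks provide out of the box; no absolute OneLake path is needed because the default lakehouse acts as the root folder.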
Change the default lakehouse of a notebook using Fabric User Interface.
Using just the UI, in the Lakehouse list, the pin icon next to the name of a Lakehouse indicates that it’s the default Lakehouse in your current notebook.
After several lakehouses have been added to the notebook, you can pin the lakehouse that you want to manage by default. Click on the pin and the previously added lakehouses appear:
To switch to a different default lakehouse, move the pin icon.
Notebooks inside Deployment Pipelines.
You can define a deployment pipeline in the “Learn” workspace, considering this workspace as the Development stage.
To learn more about Fabric’s deployment pipelines you can read Microsoft Fabric: Integration with ADO Repos and Deployment Pipelines – A Power BI Case Study.
Creating a deployment pipeline looks like this:
When you select “Create”, you can assign the desired workspace to the Development Stage.
After being assigned, you can see three stages, and begin to deploy the items to the next stages. Deploying creates a new workspace if it doesn’t exist. See the next three images.
How can you use “deployment rules” for notebooks in Test and in Production stages?
In relation to notebooks, Fabric lets users define deployment rules associated with specific notebooks. These rules allow users to customize the default lakehouse where the notebook is utilized.
Here are the steps to follow.
1. Select the icon at the upper right corner of the workspace, as seen in the next figure.
2. You see the notebooks created in your workspace after deploying content from the Development workspace:
3. Select the notebook you want to deploy to Production while changing the default lakehouse it will work with:
4. Add a rule to change the original default lakehouse this notebook has in the Test workspace to another lakehouse that already exists in Production. When you choose to adopt other lakehouses in the target environment, the Lakehouse ID is a must-have. You can find the ID of a lakehouse in the lakehouse URL.
5. Press Deploy. This action lets the Production stage run the code of the notebook to which you applied a deployment rule against a different Lakehouse, which will serve as the basis for all the definitive analyses viewed by members of the organization, stakeholders, and authorized users.
You can also deploy content backwards, from a later stage in the deployment pipeline to an earlier one, so you can use deployment rules on a notebook in a Production workspace to change its default Lakehouse if you need the notebook to be deployed from Production to Test.
Summary:
Microsoft Fabric notebooks are highly significant for data scientists.
They provide a comprehensive environment for completing the entire data science process, from data exploration and preparation to experimentation, modeling, and serving predictive insights.
With tools like Lakehouse, data scientists can easily attach to a notebook to browse and interact with data, streamlining the process of reading data into data frames for analysis.
With Git integration, you can back up and version your notebook, revert to previous stages as needed, collaborate or work alone using Git branches, and manage your notebook content lifecycle entirely within Fabric.
Deployment rules applied to notebooks in Microsoft Fabric are used to manage the application lifecycle, particularly when deploying content between different stages such as development, test, and production. This feature streamlines the development process and ensures quality and consistency during deployment.
You can find more information in the following resources:
How To Create NOTEBOOK in Microsoft Fabric | Apache Spark | PySpark – YouTube
How to use notebooks – Microsoft Fabric | Microsoft Learn
Develop, execute, and debug notebook in VS Code – Microsoft Fabric | Microsoft Learn
Explore the lakehouse data with a notebook – Microsoft Fabric | Microsoft Learn
Notebook source control and deployment – Microsoft Fabric | Microsoft Learn
Solved: How to set default Lakehouse in the notebook progr… – Microsoft Fabric Community
AI-as-a-Service: Architecting GenAI Application Governance with Azure API Management and Fabric
The past year has seen explosive growth for Azure OpenAI and large language models in general. With models reliant on a token-based approach for processing requests, ensuring prompt engineering is done correctly, tracking which models and APIs are being used, load balancing across multiple instances, and creating chargeback models have become increasingly important. The use of Azure API Management (APIM) is key to solving these challenges. There were several announcements specific to the integration of Azure OpenAI and APIM during Microsoft Build 2024 to make them easier to use together.
As the importance of evaluating analytics and performing data science against Azure Open AI based workloads grows, storing usage information is critical. That’s where adding Microsoft Fabric and the Lakehouse to the architecture comes in. Capturing the usage data in an open format for long term storage while enabling fast querying rounds out the overall solution.
We must also consider that not all use cases will require the use of a Large Language Model (LLM). The recent rise of Small Language Models (SLM), such as Phi-3, for use cases that do not require LLMs, means there will very likely be multiple types of Generative AI (GenAI) models in use for a typical enterprise and they will all be exposed through a centrally secured and governed set of APIs enabling every GenAI use case for rapid onboarding and adoption. Having an AI Center of Enablement framework providing “AI-as-a-Service” will be incredibly important for organizations to safely enable different GenAI models quickly and their numerous versions all within the allocated budget or by using a chargeback model that can span across the enterprise regardless of the number of teams consuming the AI services and the number of subscriptions or environments they end up requiring.
This model will also allow organizations to have complete consumption visibility if they purchase Provisioned Throughput Units (PTU) for their GenAI workloads in production (at scale with predictable latency and without having to worry about noisy neighbors) when each of the individual AI use cases/business units are not able to purchase it entirely on their own. This true economy of scale can be achieved with this same architecture where PTU is purchased for a particular Azure OpenAI model deployment and is shared among all Business-critical production use cases.
The overall architecture for this “AI-as-a-Service” solution is as follows:
Flow:
A client makes a request to an AI model through Azure API Management using a subscription key that is unique to them. This allows multiple clients to share the same AI model instance and yet we can uniquely identify each one of them. Clients could be different Business Units or Internal/External consumers or product lines.
Azure API Management forwards the request to the AI model and receives the output of the model.
Azure API Management logs the subscription details and request/response data to Event Hubs using a log-to-eventhub policy.
Using the Real-Time Intelligence experience in Microsoft Fabric, an Eventstream processor reads the data from Event Hubs.
The output of the stream is written to a managed Delta table in a Lakehouse.
After creating a view over the Delta table in the SQL analytics endpoint for the Lakehouse, it can now be queried by Power BI. We can also use a notebook to meet any data science requirements against the prompt data.
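To make step 1 concrete, here is a minimal Python sketch of a client calling an Azure OpenAI chat completion through the API Management gateway with its own subscription key; the gateway URL, deployment name, API version, and key below are illustrative placeholders rather than values from this article:

import requests

# Hypothetical APIM gateway front-ending Azure OpenAI; each client gets its own key.
APIM_BASE = "https://contoso-apim.azure-api.net/openai"
DEPLOYMENT = "gpt-4o"
SUBSCRIPTION_KEY = "<client-specific-apim-subscription-key>"

def chat(prompt: str) -> str:
    """Send a chat completion through APIM; the api-key header identifies the caller."""
    url = f"{APIM_BASE}/deployments/{DEPLOYMENT}/chat/completions?api-version=2024-02-01"
    payload = {"messages": [{"role": "user", "content": prompt}]}
    resp = requests.post(url, json=payload, headers={"api-key": SUBSCRIPTION_KEY}, timeout=60)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

print(chat("Summarize last quarter's sales highlights."))

This assumes the API’s subscription key header has been configured as api-key, which is what the logging policy in the build-out section below reads, so every request, response, and token count can be attributed to the calling client.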
Build out
Create an Event Hub logger in API Management.
In the API that exposes the AI backend, add a policy that sends the data to the event hub. This example shows Azure OpenAI as the backend.
<policies>
    <inbound>
        <base />
        <authentication-managed-identity resource="https://cognitiveservices.azure.com" output-token-variable-name="msi-access-token" ignore-error="false" />
        <set-header name="Authorization" exists-action="override">
            <value>@("Bearer " + (string)context.Variables["msi-access-token"])</value>
        </set-header>
        <set-variable name="requestBody" value="@(context.Request.Body.As<string>(preserveContent: true))" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <choose>
            <when condition="@(context.Response.StatusCode == 200)">
                <log-to-eventhub logger-id="ai-usage">@{
                    var responseBody = context.Response.Body?.As<string>(true);
                    var requestBody = (string)context.Variables["requestBody"];
                    return new JObject(
                        new JProperty("EventTime", DateTime.UtcNow),
                        new JProperty("AppSubscriptionKey", context.Request.Headers.GetValueOrDefault("api-key", string.Empty)),
                        new JProperty("Request", requestBody),
                        new JProperty("Response", responseBody)
                    ).ToString();
                }</log-to-eventhub>
            </when>
        </choose>
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
Build an Eventstream in Fabric that lands the data into the Delta table.
The data comes across a bit too raw to use for analytics, but with the SQL analytics endpoint we can create views on top of the table.

CREATE OR ALTER VIEW [dbo].[AIUsageView] AS
SELECT CAST(EventTime AS DateTime2) AS [EventTime],
       [AppSubscriptionKey],
       JSON_VALUE([Response], '$.object') AS [Operation],
       JSON_VALUE([Response], '$.model') AS [Model],
       [Request],
       [Response],
       CAST(JSON_VALUE([Response], '$.usage.completion_tokens') AS INT) AS [CompletionTokens],
       CAST(JSON_VALUE([Response], '$.usage.prompt_tokens') AS INT) AS [PromptTokens],
       CAST(JSON_VALUE([Response], '$.usage.total_tokens') AS INT) AS [TotalTokens]
FROM [YOUR_LAKEHOUSE_NAME].[dbo].[AIData]
We can now create a report using a DirectLake query from Power BI.
We can also load the data into a Spark dataframe to perform data science analysis on the prompts and responses.
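A rough sketch of that notebook step, assuming the AIData Lakehouse table populated by the Eventstream and the built-in spark session in a Fabric notebook, might look like this:

# Load the raw usage table from the Lakehouse into Spark.
df = spark.read.table("AIData")

# Register it for ad-hoc SQL and pull out token counts plus prompt/response pairs.
df.createOrReplaceTempView("ai_usage")
prompts = spark.sql("""
    SELECT AppSubscriptionKey,
           get_json_object(Response, '$.usage.total_tokens') AS total_tokens,
           Request,
           Response
    FROM ai_usage
""")

# Convert to pandas for lightweight exploration of the prompt text.
pdf = prompts.toPandas()
print(pdf.head())

From here, any text analysis of the prompts and completions (length distributions, topic clustering, and so on) can proceed on the dataframe.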
You can find more detailed instructions on building this on our GitHub sample.
A Landing Zone Accelerator is also available that shows how to build the underlying foundation infrastructure in an enterprise way.
Alternative Designs
1. Azure Cosmos DB for NoSQL to persist chat history – If your application is already storing chat history (prompts and completions) in Azure Cosmos DB for NoSQL, you don’t need to log the requests and responses to Event Hub from the APIM policy again. In that case, you can simply log the key metrics to Event Hub (e.g. client identifier, deployment type, tokens consumed, etc.) and source the prompts and completions from Cosmos DB for advanced analytics. The new preview feature of mirroring a Cosmos DB can simplify this process.
Here is a code sample to parse the response body and log the token consumption through APIM policies.
<log-to-eventhub logger-id="ai-usage">@{
    return new JObject(
        new JProperty("TotalTokens", context.Response.Body.As<JObject>(preserveContent: true).SelectToken("usage.total_tokens").ToString())
    ).ToString();
}</log-to-eventhub>
Once the raw token counts and API consumer info (e.g. different Business Units using the AI-as-a-Service model) are logged into Event Hub and make their way into the Fabric Lakehouse, aggregate measures can be created directly on top of the semantic model (default or custom) and displayed in a Power BI dashboard of your choice. An example of such an aggregate measure is as follows:
TokensByBU = CALCULATE(SUMX(
aoaichargeback,
VALUE(MAX(aoaichargeback[TotalTokens]))
),
ALLEXCEPT(aoaichargeback, aoaichargeback[BusinessUnitName]))
Here aoaichargeback is the name of the Lakehouse table where all events emitted from APIM are stored. TokensByBU measure calculates the sum of the maximum TotalTokens value for each BusinessUnitName in the aoaichargeback table.
Since both the chat history data and the key usage/performance metrics are in the Lakehouse, they can be combined and used for any advanced analytical purposes. Similar approaches (earlier in the article) of utilizing the Fabric Lakehouse SQL analytics endpoint can be used for analyzing and governing the persisted data.
2. Azure OpenAI Emit Token Metric Policy – With the recent announcement of GenAI Gateway capabilities in Azure API Management – a set of features designed specifically for GenAI use cases – we can now get key Azure OpenAI consumption metrics straight out of our Application Insights namespace when this feature is enabled and implemented. A new policy <azure-openai-emit-token-metric> can now be used for sending the Azure OpenAI token count metrics to Application Insights along with User ID, Client IP, and API ID as dimensions.
Merge cells containing the same information in multiple tables with macro
Hi everyone, I’m new to this and I’m trying to figure out how to do the following.
I have an Excel file; the first sheet is the main table, and from this table different subtables are formed, with each table on a different sheet.
For a better visualization of the data, the user needs to see these excel tables with the records grouped together.
All tables have the same columns only their data changes.
As you can see in the first image, the table has each record separated on its own row, but the user wants to view it like the second image, in which cells are merged for records that contain the same information.
I already tried to do this in Power Automate with an Office Scripts connector, but I couldn’t get it to work. The macro I have is the following, but it only merges the range of cells that I specify, and only in a single table.
Sub MergeSameCell()
    Dim myRange As Range
    Dim cell As Range
    Set myRange = Range("DataDesk")

    Application.DisplayAlerts = False   ' suppress the "keep only the upper-left value" merge prompt
MergeSame:
    For Each cell In myRange
        If cell.Value = cell.Offset(1, 0).Value And Not IsEmpty(cell) Then
            Range(cell, cell.Offset(1, 0)).Merge
            cell.VerticalAlignment = xlCenter
            GoTo MergeSame   ' restart the loop: merging changes the cells in the range
        End If
    Next cell
    Application.DisplayAlerts = True
End Sub
I would greatly appreciate your help.
Adding DevOps Wiki content to M365 Copilot
We have a lot of knowledge in our DevOps wiki, and I want that to be exposed to M365 Copilot, so that when I ask Copilot a question in Teams/M365, it uses the knowledge from the wikis.
I have already created a connection to the DevOps wiki using Search & Intelligence >> Data Sources >> Connections. Is that enough to add the DevOps wiki to Copilot’s knowledge?
Azure Licence – Bringing it Back to In-House
Hi
Our company is using an Azure P1 licence, and underneath there are Power Automate, SharePoint, Exchange, and Business Premium licences for users.
I am trying to find a way to bring it all back so Microsoft charges it directly instead of having it invoiced from our managed service provider.
I looked at O365 Admin Center and I can add a payment method.
If I were to add a company credit card, do I then link all the user licences (Automate, SharePoint, Business Premium) to the credit card?
We also have the Azure P1 licence, and I can’t see it under the licences tab in the O365 admin center, so I don’t know whether that can be linked to the company credit card.
I did a search online and everything seems to point at licences under the O365 admin center, but I don’t see the Azure P1 licence.
Unable to access partner account due to lost 2FA phone
I lost my phone used for 2FA for my Microsoft Partner account. I’ve tried repeatedly to recover my account, providing multiple supporting documentation, but have not been able to do so. I’ve not yet been able to find a phone number to talk with a real person, and all the ‘Contact Us’ links I can find require you log onto your account in order to access the content, but I cannot log on due to 2FA method lost. Is this what’s considered Catch 22? Is anybody here able to provide a method of contacting a Microsoft Partner representative without 2FA?