Month: June 2024
Variables cannot be recognized in App Designer
I wrote a program in a script and it works, but it doesn’t work in App Designer. The unrecognizable code is shown below (part of the calling function). Thank you very much for your time. appdesigner MATLAB Answers — New Questions
How to change the rigid transform parameters between two frames during simulation in Simscape Multibody?
Hi, is there any way to change the rigid transform parameters between two frames during simulation in Simscape Multibody? Ideally, I would need a rigid transform block containing a signal input to change the translational offset. I tried to set the offset as a parameter, but I couldn’t figure out how to update it during simulation, let’s say using an S-function. Or is there any other trick I could use? Thank you! rigid transform, multibody simscape, update transform parameters MATLAB Answers — New Questions
AFP program session
I serve on the board of the Association of Fundraising Professionals Collier-Lee Chapter in Florida. We’re planning a session on AI and its use and place in philanthropy. We’d like to invite an organization that helps nonprofits use AI to the panel discussion.
The session will be from 8:30-9:30 a.m. July 23rd.
Read More
How can I draw NSIDC’s Polar Stereographic Projection using M_Map?
When I use the m_map toolkit to draw polar stereographic projections, how do I set up the projection information to get NSIDC’s Polar Stereographic Projection (EPSG:3411)?
This is how I currently have it set up, but it’s not clear whether there is a bias:
m_proj('azimuthal equal-area','latitude',90,'longitude',-45,'radius',45,'rectbox','on');
Can m_map be used in conjunction with PROJ4?
For example, using PROJ4: +proj=stere +lat_0=90 +lat_ts=70 +lon_0=-45 +k=1 +x_0=0 +y_0=0 +a=6378273 +b=6356889.449 +units=m +no_defs m_map, nsidc’s polar stereographic projection, proj4 MATLAB Answers — New Questions
How to get a RoadRunner license?
I followed the instructions at https://ww2.mathworks.cn/matlabcentral/answers/533728-roadrunner to find the RoadRunner license, but there is only MATLAB in my license list.
Then I tried to get a trial at https://ww2.mathworks.cn/products/roadrunner.html, but after 48 hours nobody had contacted me through my email.
I would like to know how to get a RoadRunner license and how to install it, please. roadrunner, license MATLAB Answers — New Questions
Microsoft Travel/ Real Estate Community
Is there a travel or real estate group/community on Microsoft?
Read More
Windows Server 2022 May updates have failed
Hello,
I have a Windows update issue: the May updates have failed. I ran the Windows Update troubleshooter and it said it made some changes, but the updates still fail. I have several screenshots I will add at the bottom. I have tried the updates several times, but nothing worked.
After the reboots to remove the software that could not be updated, the Windows Update screen showed an error code to research:
0x8007054f
There is plenty of room on the C: drive.
I don’t know if the info on the second screen points to the issue or not; it was in the errors area of the Event log.
Any assistance and suggestions would be greatly appreciated.
Read More
My post was marked as spam incorrectly by someone…what do I do?
First of all, I am a loyal Windows user, and this is my first time posting here… I tried to create a post to seek help on how to download my own YouTube videos to my Windows 10 computer.
I am a self-media person, and I often need to make short videos to earn a living. My post received many enthusiastic responses and help from netizens, which touched me a lot. But not long after, my post was marked as spam by someone and could not be viewed: https://techcommunity.microsoft.com/t5/windows-10/how-to-download-youtube-videos-in-laptop-windows-10/m-p/4152498#M11810. Please help remove it from the spam list, as we need support on this. Thanks!
How does this community work? How do I prevent this from happening? Please help remove the post from the spam list, as we need support on this. Thanks!
Read More
NEW Digital Event: Intro to Copilot Partner-led Immersion Experience | June 19
Join us on June 19th to learn how to use the recently launched Partner-Led Copilot Immersion Experience, providing you the ability to demo and help customers with hands-on experiences.
In this session, we will walk you through each of the assets to drive adoption of Copilot for Microsoft 365 by role/persona and provide guidance about the best way to utilize and demo with your customers – showcasing how Copilot can help businesses solve common problems and achieve more.
Please register for your preferred time zone:
Introduction to Copilot Partner-Led Immersion Experience | Digital Event Americas/EMEA
June 19, 2024 8:00 AM (America/Los Angeles) | Register here
Introduction to Copilot Partner-Led Immersion Experience | Digital Event APAC
June 19, 2024 5:00 PM (America/Los Angeles) | Register here
REGISTER TODAY!
Read More
Database mail and connection pool depletion
Database Mail (DBMail) in SQL Server is a feature that allows you to send emails directly from SQL Server, but it has limitations when sending a large volume of emails. The issues you’re experiencing with DBMail not being able to send a large volume of emails could be due to several factors.
DBMail relies on the .NET Framework’s SmtpClient class, and any issues with this underlying component could affect DBMail’s functionality.
The SmtpClient class implementation pools SMTP connections so that it can avoid the overhead of re-establishing a connection for every message to the same server. An application may re-use the same SmtpClient object to send many different emails to the same SMTP server and to many different SMTP servers. As a result, there is no way to determine when an application is finished using the SmtpClient object and it should be cleaned up.
It is also important to note that SmtpClient is deprecated in .NET Core and .NET 5.0 and later versions. While it is still available for compatibility reasons, it is recommended to use alternative libraries for new development.
As a side effect, SQL Server Database Mail may report the error below:
The mail could not be sent to the recipients because of the mail server failure. (Sending Mail using Account 7 (2024-01-08T16:07:10). Exception Message: 1) Exception Information =================== Exception Type: Microsoft.SqlServer.Management.SqlIMail.MailFramework.Exceptions.BaseMailFrameworkException Message: Cannot send mails to mail server. (The operation has timed out.) Data: System.Collections.ListDictionaryInternal TargetSite: Void Send(Microsoft.SqlServer.Management.SqlIMail.MailFramework.Framework.IMessage) HelpLink: NULL Source: DatabaseMailProtocols HResult: -2146232832 StackTrace Information =================== at Microsoft.SqlServer.Management.SqlIMail.MailFramework.Smtp.SmtpMailSender.Send(IMessage msg) at Microsoft.SqlServer.Management.SqlIMail.Server.Controller.ProfileMailSender.SendMailToAccount(Account a, IMessageSender ms, OutMailItem si) 2) Exception Information =================== Exception Type: System.Net.Mail.SmtpException StatusCode: GeneralFailure Message: The operation has timed out. Data: System.Collections.ListDictionaryInternal TargetSite: Void Send(System.Net.Mail.MailMessage) HelpLink: NULL Source: System HResult: -2146233088 StackTrace Information =================== at System.Net.Mail.SmtpClient.Send(MailMessage message) at Microsoft.SqlServer.Management.SqlIMail.MailFramework.Smtp.SmtpMailSender.Send(IMessage msg). )
As a workaround, we can take the following actions:
Follow all limitations that the mail server imposes on your mail account and do not exceed them. This avoids possible exceptions in the SmtpClient <-> mail server layer.
For example, if the Exchange server is configured with a maximum number of concurrent connections, make sure that your script or application does not send more emails than that limit allows.
If you find that sending mail with Database Mail starts to fail, restart DatabaseMail.exe. Database Mail will resend all failed mails upon restart.
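A minimal T-SQL sketch of that restart step (assuming you have permission to run the msdb Database Mail system procedures):
-- Check the current status of the Database Mail queue
EXEC msdb.dbo.sysmail_help_status_sp;
-- Stop and restart the Database Mail external program (DatabaseMail.exe);
-- unsent items left in the queue are retried after the restart
EXEC msdb.dbo.sysmail_stop_sp;
EXEC msdb.dbo.sysmail_start_sp;
-- Optionally inspect the mail queue to confirm items are being picked up again
EXEC msdb.dbo.sysmail_help_queue_sp @queue_type = 'mail';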
Finally, please note that SmtpClient doesn’t support many modern protocols; it is now maintained for compatibility only and doesn’t scale to modern protocol requirements.
More information is available at the link below:
SmtpClient Class (System.Net.Mail) | Microsoft Learn
Microsoft Tech Community – Latest Blogs –Read More
What is the multiplier in the “tank” section of the Reverberator system object
Hi there, I’m looking into the Reverberator system object to see how we can use it to produce a controlled IR with a known RT per frequency.
In the Algorithm section of the MathWorks help for the object, there is a "tank" sub-section. We would like to know why the output signal is multiplied by 0.6. There’s no explanation for this in the documentation; perhaps it is obvious, but I don’t want to make assumptions.
If someone could explain this multiplication factor that would be most helpful.
Many thanks. reverberator, signal processing, audio toolbox MATLAB Answers — New Questions
Deployment rules for Notebooks: Enhancing Efficiency with Microsoft Fabric.
Introduction.
Fabric Notebooks, an integral component of the Microsoft Fabric ecosystem, offer a powerful platform for interactive data exploration, analysis, and collaboration. Designed to enhance productivity and streamline workflows, Fabric Notebooks provide users with a versatile environment to write, execute, and share code and visualizations alongside workspaces.
Fabric Notebooks empower users to engage in interactive data analysis using programming languages such as Python and R. By seamlessly integrating code execution with explanatory text and visualizations, Fabric Notebooks streamline the workflow for data exploration and interpretation. Moreover, notebooks support collaborative work environments, enabling multiple users to collaborate on the same notebook simultaneously.
Utilizing source control through Fabric’s integration with Git facilitates transparency and documentation within teams. This integration enhances the ability to replicate analyses and share findings with stakeholders effectively.
In this article, we describe the relationship between Notebooks and Lakehouses in Fabric, the advantages of implementing source control for notebooks, and how to use deployment rules for Microsoft Fabric Notebooks. These rules speed up deployment and help compartmentalize knowledge about the components that make up an analytical solution.
Notebooks and Lakehouses.
The Microsoft Fabric Notebook is a primary code item for developing Apache Spark jobs and machine learning experiments. It’s a web-based interactive surface used by data scientists and data engineers to write code benefiting from rich visualizations and Markdown text. Data engineers write code for data ingestion, data preparation, and data transformation. Data scientists also use notebooks to build machine learning solutions, including creating experiments and models, model tracking, and deployment.
You can either create a new notebook or import an existing notebook.
Like other standard Fabric item creation processes, you can easily create a new notebook from the Fabric Data Engineering homepage, the workspace New option, or the Create Hub.
You can create new notebooks or import one or more existing notebooks from your local computer to a Fabric workspace from the Data Engineering or the Data Science homepage. Fabric notebooks recognize the standard Jupyter Notebook .ipynb files, and source files like .py, .scala, and .sql.
To learn more about notebook creation, see How to use notebooks – Microsoft Fabric | Microsoft Learn.
The next figure shows Notebook1 in a workspace named “Learn”.
The items contained in this workspace can be added to source control if you have integrated it with an ADO Repo, using this feature of Fabric explained at Overview of Fabric Git integration – Microsoft Fabric | Microsoft Learn.
With Git integration, you can back up and version your notebook, revert to previous stages as needed, collaborate or work alone using Git branches, and manage your notebook content lifecycle entirely within Fabric.
Fabric notebooks now support close interactions with Lakehouses.
Microsoft Fabric Lakehouse is a data architecture platform for storing, managing, and analyzing structured and unstructured data in a single location.
You can easily add a new or existing lakehouse to a notebook from the Lakehouse explorer:
We can also create the notebook from the Lakehouse; in that case, that Lakehouse becomes the notebook’s default.
Notebooks with different code to analyze data may already exist, and we can select which notebook, or notebooks, the Lakehouse is going to be analyzed with.
Here is an example of a notebook and its associated Lakehouse.
So a Lakehouse can be analyzed with one or more notebooks and, vice versa, a notebook can analyze one or more lakehouses; however, the notebook can have a pinned Lakehouse, the default Lakehouse where the notebook’s code stores, transforms, and visualizes data.
You can navigate to different lakehouses in the Lakehouse explorer and set one lakehouse as the default by pinning it. Your default is then mounted to the runtime working directory, and you can read or write to the default lakehouse using a local path.
The default lakehouse in a notebook is typically managed in the configuration settings of the notebook’s code. You can set or overwrite the default lakehouse for the current session programmatically using a configuration block in your notebook. Here’s an example of how you might configure it:
%%configure
{
    "defaultLakehouse": {
        "name": "your-lakehouse-name",  # The name of your lakehouse
        # "id": "<(optional) lakehouse-id>",  # The ID of your lakehouse (optional)
        # "workspaceId": "<(optional) workspace-id-that-contains-the-lakehouse>"  # The workspace ID if it's from another workspace (optional)
    }
}
This code snippet should be placed at the beginning of your notebook to set the default lakehouse for the session. If you’re using a relative path to access data from the lakehouse, the default lakehouse will serve as the root folder at runtime.
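As an illustrative sketch of working with the default lakehouse through relative paths (the table names Sales and SalesClean, the Amount column, and the notebook’s built-in spark session are assumptions for this example):
# Read a Delta table from the default (pinned) lakehouse using a relative path
df = spark.read.format("delta").load("Tables/Sales")
# Write results back to the default lakehouse, again with a relative path
df.filter("Amount > 0").write.mode("overwrite").format("delta").save("Tables/SalesClean")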
Change the default lakehouse of a notebook using Fabric User Interface.
Using just the UI, in the Lakehouse list, the pin icon next to the name of a Lakehouse indicates that it’s the default Lakehouse in your current notebook.
After several lakehouses have been added to the notebook, you can pin the lakehouse that you want to manage by default. Click on the pin and the previously added lakehouses appear:
To switch to a different default lakehouse, move the pin icon.
Notebooks inside Deployment Pipelines.
You can define a deployment pipeline in the “Learn” workspace, considering this workspace as the Development stage.
To learn more about Fabric’s deployment pipelines you can read Microsoft Fabric: Integration with ADO Repos and Deployment Pipelines – A Power BI Case Study.
Creating a deployment pipeline looks like this:
When you select “Create”, you can assign the desired workspace to the Development Stage.
After being assigned, you can see three stages, and begin to deploy the items to the next stages. Deploying creates a new workspace if it doesn’t exist. See the next three images.
How can you use “deployment rules” for notebooks in Test and in Production stages?
In relation to notebooks, Fabric lets users define deployment rules associated with specific notebooks. These rules allow users to customize the default lakehouse where the notebook is utilized.
Here are the steps to follow.
1. Select the icon at the upper right corner of the workspace, as seen in the next figure.
2. You see the notebooks created in your workspace after deploying content from the Development workspace:
3. Select the notebook you want to deploy to Production while changing the default lakehouse it will work with:
4. Add a rule to change the original default lakehouse this notebook has in the Test workspace to another lakehouse that already exists in Production. When you choose to adopt other lakehouses in the target environment, the Lakehouse ID is a must-have. You can find the ID of a lakehouse in the lakehouse URL link.
5. Press Deploy. This lets the Production stage run the notebook to which you applied a deployment rule against a different Lakehouse, which will serve as the basis for all the definitive analyses viewed by members of the organization, stakeholders, and authorized users.
You can also deploy content backwards, from a later stage in the deployment pipeline to an earlier one, so you can use deployment rules for a notebook in a Production workspace to change its default Lakehouse if you need the notebook to be deployed from Production to Test.
Summary:
Microsoft Fabric notebooks are highly significant for data scientists.
They provide a comprehensive environment for completing the entire data science process, from data exploration and preparation to experimentation, modeling, and serving predictive insights.
With tools like the Lakehouse, which data scientists can easily attach to a notebook to browse and interact with data, the process of reading data into data frames for analysis is streamlined.
With Git integration, you can back up and version your notebook, revert to previous stages as needed, collaborate or work alone using Git branches, and manage your notebook content lifecycle entirely within Fabric.
Deployment rules applied to notebooks in Microsoft Fabric are used to manage the application lifecycle, particularly when deploying content between different stages such as development, test, and production. This feature streamlines the development process and ensures quality and consistency during deployment.
You can find more information in the following resources:
How To Create NOTEBOOK in Microsoft Fabric | Apache Spark | PySpark – YouTube
How to use notebooks – Microsoft Fabric | Microsoft Learn
Develop, execute, and debug notebook in VS Code – Microsoft Fabric | Microsoft Learn
Explore the lakehouse data with a notebook – Microsoft Fabric | Microsoft Learn
Notebook source control and deployment – Microsoft Fabric | Microsoft Learn
Solved: How to set default Lakehouse in the notebook progr… – Microsoft Fabric Community
Microsoft Tech Community – Latest Blogs –Read More
AI-as-a-Service: Architecting GenAI Application Governance with Azure API Management and Fabric
The past year has seen explosive growth for Azure OpenAI and large language models in general. With models reliant on a token-based approach for processing requests, ensuring prompt engineering is being done correctly, tracking which models and APIs are being used, load balancing across multiple instances, and creating chargeback models have become increasingly important. The use of Azure API Management (APIM) is key to solving these challenges. Several announcements during Microsoft Build 2024 were specific to the integration of Azure OpenAI and APIM and make them easier to use together.
As the importance of evaluating analytics and performing data science against Azure OpenAI-based workloads grows, storing usage information is critical. That’s where adding Microsoft Fabric and the Lakehouse to the architecture comes in. Capturing the usage data in an open format for long-term storage while enabling fast querying rounds out the overall solution.
We must also consider that not all use cases will require the use of a Large Language Model (LLM). The recent rise of Small Language Models (SLM), such as Phi-3, for use cases that do not require LLMs, means there will very likely be multiple types of Generative AI (GenAI) models in use for a typical enterprise and they will all be exposed through a centrally secured and governed set of APIs enabling every GenAI use case for rapid onboarding and adoption. Having an AI Center of Enablement framework providing “AI-as-a-Service” will be incredibly important for organizations to safely enable different GenAI models quickly and their numerous versions all within the allocated budget or by using a chargeback model that can span across the enterprise regardless of the number of teams consuming the AI services and the number of subscriptions or environments they end up requiring.
This model will also allow organizations to have complete consumption visibility if they purchase Provisioned Throughput Units (PTU) for their GenAI workloads in production (at scale with predictable latency and without having to worry about noisy neighbors) when each of the individual AI use cases/business units are not able to purchase it entirely on their own. This true economy of scale can be achieved with this same architecture where PTU is purchased for a particular Azure OpenAI model deployment and is shared among all Business-critical production use cases.
The overall architecture for this “AI-as-a-Service” solution is as follows:
Flow:
A client makes a request to an AI model through Azure API Management using a subscription key that is unique to them. This allows multiple clients to share the same AI model instance and yet we can uniquely identify each one of them. Clients could be different Business Units or Internal/External consumers or product lines.
Azure API Management forwards the request to the AI model and receives the output of the model.
Azure API Management logs the subscription details and request/response data to Event Hubs using a log-to-eventhub policy.
Using the Real-Time Intelligence experience in Microsoft Fabric, an Eventstream processor reads the data from Event Hubs.
The output of the stream is written to a managed Delta table in a Lakehouse.
After creating a view of the Delta table in the SQL Analytics endpoint for the Lakehouse, it can now be queried by Power BI. We can also use a Notebook to perform any data science requirements against the prompt data.
Build out
Create an Event Hub logger in API Management.
In the API that exposes the AI backend, add a policy that sends the data to the event hub. This example shows Azure OpenAI as the backend.
<policies>
    <inbound>
        <base />
        <authentication-managed-identity resource="https://cognitiveservices.azure.com" output-token-variable-name="msi-access-token" ignore-error="false" />
        <set-header name="Authorization" exists-action="override">
            <value>@("Bearer " + (string)context.Variables["msi-access-token"])</value>
        </set-header>
        <set-variable name="requestBody" value="@(context.Request.Body.As<string>(preserveContent: true))" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <choose>
            <when condition="@(context.Response.StatusCode == 200)">
                <log-to-eventhub logger-id="ai-usage">@{
                    var responseBody = context.Response.Body?.As<string>(true);
                    var requestBody = (string)context.Variables["requestBody"];
                    return new JObject(
                        new JProperty("EventTime", DateTime.UtcNow),
                        new JProperty("AppSubscriptionKey", context.Request.Headers.GetValueOrDefault("api-key", string.Empty)),
                        new JProperty("Request", requestBody),
                        new JProperty("Response", responseBody)
                    ).ToString();
                }</log-to-eventhub>
            </when>
        </choose>
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
Build an Eventstream in Fabric that lands the data into the Delta table.
The data comes across a bit too raw to use for analytics, but with the SQL Analytics endpoint, we can create views on top of the table.
CREATE OR ALTER VIEW [dbo].[AIUsageView] AS
SELECT CAST(EventTime AS DateTime2) AS [EventTime],
    [AppSubscriptionKey],
    JSON_VALUE([Response], '$.object') AS [Operation],
    JSON_VALUE([Response], '$.model') AS [Model],
    [Request],
    [Response],
    CAST(JSON_VALUE([Response], '$.usage.completion_tokens') AS INT) AS [CompletionTokens],
    CAST(JSON_VALUE([Response], '$.usage.prompt_tokens') AS INT) AS [PromptTokens],
    CAST(JSON_VALUE([Response], '$.usage.total_tokens') AS INT) AS [TotalTokens]
FROM
    [YOUR_LAKEHOUSE_NAME].[dbo].[AIData]
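As a quick, illustrative check of the view from the SQL Analytics endpoint (column names come from the view above; this query is only a sketch of how per-consumer token usage could be aggregated for chargeback):
-- Total token consumption per API consumer (subscription key) and model
SELECT [AppSubscriptionKey],
    [Model],
    SUM([TotalTokens]) AS [TotalTokens],
    SUM([PromptTokens]) AS [PromptTokens],
    SUM([CompletionTokens]) AS [CompletionTokens]
FROM [dbo].[AIUsageView]
GROUP BY [AppSubscriptionKey], [Model]
ORDER BY SUM([TotalTokens]) DESC;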
We can now create a report using a DirectLake query from Power BI.
We can also load the data into a Spark dataframe to perform data science analysis on the prompts and responses.
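A minimal PySpark sketch of that step (the table name AIData and the built-in spark session are assumptions carried over from the Eventstream output above):
# Load the raw usage table into a Spark dataframe
df = spark.read.table("AIData")
# Example: extract token counts and model from the response JSON for exploratory analysis
from pyspark.sql import functions as F
usage = (df
    .withColumn("TotalTokens", F.get_json_object("Response", "$.usage.total_tokens").cast("int"))
    .withColumn("Model", F.get_json_object("Response", "$.model"))
    .groupBy("Model")
    .agg(F.sum("TotalTokens").alias("TotalTokens")))
usage.show()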
You can find more detailed instructions on building this on our GitHub sample.
A Landing Zone Accelerator is also available that shows how to build the underlying foundation infrastructure in an enterprise way.
Alternative Designs
1. Azure Cosmos DB for NoSQL to persist Chat History – If your application is already storing Chat history (prompts & completions) in Azure Cosmos DB for NoSQL, you don’t need to log the requests and responses to Event Hub from APIM policy again. In that case, you can simply log the key metrics to Event Hub (e.g. Client Identifier, Deployment Type, Tokens consumed etc.) and source the prompts and completions from Cosmos DB for advanced analytics. The new preview feature of Mirroring a Cosmos DB can simplify this process.
Here is a code sample to parse the response body and log the token consumption through APIM policies.
<log-to-eventhub logger-id="ai-usage">@{
    return new JObject(
        new JProperty("TotalTokens", context.Response.Body.As<JObject>(preserveContent: true).SelectToken("usage.total_tokens").ToString())
    ).ToString();
}</log-to-eventhub>
Once the raw token counts and API consumer (e.g. different Business Units using the AI-as-a-Service model) info is logged into Event Hub and it makes its way into Fabric Lakehouse, aggregate measures can be created directly on top of the Semantic model (default or custom) and displayed in a Power BI dashboard of your choice. An example of such aggregate measure is as follows:
TokensByBU = CALCULATE(SUMX(
aoaichargeback,
VALUE(MAX(aoaichargeback[TotalTokens]))
),
ALLEXCEPT(aoaichargeback, aoaichargeback[BusinessUnitName]))
Here aoaichargeback is the name of the Lakehouse table where all events emitted from APIM are stored. TokensByBU measure calculates the sum of the maximum TotalTokens value for each BusinessUnitName in the aoaichargeback table.
Since both the chat history data and the key usage/performance metrics are in the Lakehouse, they can be combined and used for any advanced analytical purposes. Similar approaches to those earlier in the article, using the Fabric Lakehouse SQL Analytics endpoint, can be used for analyzing and governing the persisted data.
2. Azure OpenAI Emit Token Metric Policy – With the recent announcement of GenAI Gateway capabilities in Azure API Management – a set of features designed specifically for GenAI use cases – we can now get key Azure OpenAI consumption metrics straight out of our Application Insights resource when this feature is enabled and implemented. A new policy, <azure-openai-emit-token-metric>, can now be used to send the Azure OpenAI token count metrics to Application Insights along with User ID, Client IP, and API ID as dimensions.
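An illustrative fragment of how such a policy might look in the inbound section (the namespace value and the exact dimension names are assumptions; check the APIM policy reference for the precise syntax):
<inbound>
    <base />
    <!-- Emit Azure OpenAI token counts as metrics to Application Insights (illustrative sketch) -->
    <azure-openai-emit-token-metric namespace="openai-usage">
        <dimension name="API ID" />
        <dimension name="User ID" />
        <dimension name="Client IP address" />
    </azure-openai-emit-token-metric>
</inbound>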
Microsoft Tech Community – Latest Blogs –Read More
All pins reserved by servo. I can't use a DC motor at the same time.
Hi everyone, I have just started to use MATLAB with arduino for a university project.
I’m trying to use both a servo and a DC motor with an Arduino, but when I initialise the servo motor, all the pins (or at least the PWM ones) become reserved for the servo.
a = arduino('COM4', 'Uno', 'Libraries', 'Servo')
s = servo(a, 'D11', 'MinPulseDuration', 0.00055, 'MaxPulseDuration', 0.0024)
writePWMDutyCycle(a, "D9", 1)
The error I get is:
"Digital pin D9 is reserved by Servo in 'Reserved' mode. To change the pin configuration, clear all variables holding onto this resource."
I have tried other pins, but they are all reserved. If I don’t initialise the servo, it works.
How can I reserve just one pin for the servo and not all of them?
Thanks arduino, servomotor, dc, dc motor, servo, servo motor, pwm pins, power_electronics_control, electric_motor_control, power_conversion_control MATLAB Answers — New Questions
How can I save a 32 bit image (uint32) as a tiff?
Hello,
I am currently working on a post-processing HDR script which requires that I use a 32-bit data type. The images start at 14 bit; once they are imported I convert them to int32, and after I do my calculations I convert them back to uint32. After I create my HDR image the max pixel value is about 181,000 counts. I am currently using,
imwrite(FinalImage, 'HDR Result.tif')
which works for 16 bit images, but does not seem to work for 32 bit images. If you could give me any help or insight I would greatly appreciate it.
Thank you,
Jason hdr, imwrite, tiff MATLAB Answers — New Questions
Is there an option to avoid generating translations in the generated code from a Simulink model that is created by importing an ARXML with interfaces mapped to COMPU METHODS?
I have generated a Simulink model from an ARXML that has the interfaces associated with Application Data Types and Compu Methods.
Added the following logic shown here. Is there a way to avoid the translation that is generated in the code? autosar compu methods MATLAB Answers — New Questions
matlab code for spectrum sensing in cognitive radio
I am doing my thesis on cognitive radios, so I need MATLAB code for spectrum sensing. cognitive radio, spectrum sensing MATLAB Answers — New Questions
Merge cells containing the same information in multiple tables with macro
Hi everyone, I’m new to this and I’m trying to figure out how to do the following.
I have an Excel file; the first sheet is the main table, and from this table different subtables are formed, each on a different sheet.
For a better visualization of the data, the user needs to see these Excel tables with the records grouped together.
All tables have the same columns; only their data changes.
As you can see in the first image, the table has each record in a separate row, but the user wants to view it like the second image, in which cells containing the same information are merged.
I already tried to do this in Power Automate with an Office Scripts connector, but I couldn’t get it to work. The macro I have is the following, but it only merges the range of cells that I specify and only in a single table.
Sub MergeSameCell()
    Dim myRange As Range, cell As Range
    Set myRange = Range("DataDesk")
MergeSame:
    For Each cell In myRange
        If cell.Value = cell.Offset(1, 0).Value And Not IsEmpty(cell) Then
            Application.DisplayAlerts = False   ' suppress the merge warning dialog
            Range(cell, cell.Offset(1, 0)).Merge
            Application.DisplayAlerts = True
            cell.VerticalAlignment = xlCenter
            GoTo MergeSame                      ' restart the loop because merging changes the range contents
        End If
    Next cell
End Sub
I would greatly appreciate your help.
Read More
Adding DevOps Wiki content to M365 Copilot
We have a lot of knowledge in our DevOps wiki, and I want that to be exposed to M365 Copilot, so that when I ask Copilot a question in Teams/M365, it uses the knowledge from the wikis.
I have already created a connection to the DevOps wiki using Search & Intelligence >> Data Sources >> Connections. Is that enough to add the DevOps wiki to Copilot’s knowledge?
Read More
Azure Licence – Bringing it Back to In-House
Hi
Our company is using an Azure P1 licence, and underneath there are Power Automate, SharePoint, Exchange, and Business Premium licences for users.
I am trying to find a way to bring it all back so Microsoft charges it directly instead of having it invoiced from our managed service provider.
I looked at O365 Admin Center and I can add a payment method.
If I were to add a company credit card, do I then link all the user licences (Automate, SharePoint, Business Premium) to that credit card?
We also have the Azure P1 licence, and I can’t see it under the licences tab in the O365 admin center, so I don’t know whether that can be linked to the company credit card.
I did a search online and everything seems to point at licences under the O365 admin center, but I don’t see the Azure P1 licence.
Read More