Month: June 2024
SharePoint – No download option
Hi Everyone,
Greetings to all,
I have a new requirement for a SharePoint site where the download option should be completely disabled. This means that no users, including Site owners, Site members, and Site visitors, should be able to see or use the download option.
Thanks & Regards
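One approach often suggested for this requirement is the site-level block download policy. The sketch below is illustrative only: it assumes the SharePoint Online Management Shell is installed, that the tenant has the licensing this policy requires, and the URLs are placeholders.

# Requires the Microsoft.Online.SharePoint.PowerShell module and SPO admin rights.
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Block download of files from this site for all users, including site owners;
# files remain viewable in the browser only.
Set-SPOSite -Identity "https://contoso.sharepoint.com/sites/YourSite" -BlockDownloadPolicy $true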
VSS Migration to Azure TFVC Repo
Earlier, when creating a new project in Azure DevOps, there were two options for version control:
1. Git
2. Team Foundation Version Control (TFVC)
Now I am seeing only one option, Git.
I need to create a project with TFVC as the version control system for a VSS migration to an Azure TFVC repo.
Any help would be greatly appreciated. Thank you!
Regards,
DavidSamuellJohnRice
Problem with storing duplicates in SharePoint
Hi everyone,
It is a mixed problem involving SharePoint and Power Apps, but I was led here from the Power Apps community.
I am trying to figure out whether I can store multiple duplicates of the same person in a SharePoint list. I have a separate person column for each department, plus one column that stores all persons together from all of the department columns. The problem occurs when the same person appears in more than one department: when I store them all together, that person shows up only once, not twice or more.
I need all copies from all departments so that my workflow works properly. It is a problem when an item should be sent to three departments but is sent to only one (because the duplicate copies are missing), since this person works in all three of them and can give a different approval answer depending on the department.
Additional info: I take the person data from combo boxes, collect their values into a collection, and then patch this collection into the “All Person” column.
New Outlook not showing link URLs
Old Outlook would show you the URL if you hovered over a link. New Outlook does not show anything when you hover, but the link still works.
Is there a setting that changes this, so you can see where you are going before you click a link?
Integration Environment Update: Introducing Unified Monitoring and Business Process Tracking
This post is a collaboration between Divya Swarnkar and Kent Weare
We are excited to announce a new capability in Integration Environment that allows you to monitor Azure Integration Services, including Azure Logic Apps, Azure Service Bus, and Azure API Management (APIM), from a single pane of glass. This enhancement is designed to streamline your monitoring experience, providing a comprehensive view of your integration workflows and services. Here’s why this is important and how it will benefit you.
Integration Environment and Grouping AIS Resources
The Integration Environment was launched with a vision to provide a dedicated landing zone for Azure Integration Services. Alongside this, we introduced the capability to group AIS resources as an application. The unified monitoring experience leverages the Integration Environment as this landing zone, utilizing the grouping capability to organize AIS resources effectively as an application. This approach enhances the monitoring experience by aligning it with the application-centric view of your resources, making management and troubleshooting more intuitive and efficient.
Why Unified Monitoring Matters
Simplified Operations: With telemetry from Logic Apps, Service Bus, and APIM consolidated in a single Application Insights instance, you can view and manage your integration services more efficiently. This reduces the complexity of switching between multiple monitoring tools, allowing you to focus on what matters most – your business processes.
Enhanced Visibility: A unified monitoring dashboard offers a holistic view of your integration landscape. This comprehensive visibility helps in quickly identifying and diagnosing issues across different services, ensuring that your workflows run smoothly and without interruptions.
Improved Performance Monitoring: By centralizing telemetry, you can better analyze performance metrics and detect anomalies across all your integration services. This enables proactive troubleshooting and optimization, leading to improved reliability and performance of your applications.
Streamlined Troubleshooting: When issues arise, having all relevant data in one place accelerates the troubleshooting process. You can trace the root cause of problems across your Logic Apps, Service Bus, and APIM without having to piece together information from disparate sources.
How to Get Started
To start using this new capability, ensure that telemetry from all your resources in Azure Integration Services is directed to the same Application Insights instance. Select the appropriate instance from the dropdown menu in your monitoring dashboard, and you’re ready to go.
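For example, for Standard logic apps the Application Insights destination is controlled by an app setting, so a sketch like the following can point several apps at one shared instance. The resource names are placeholders, and this assumes the az logicapp command group available in recent Azure CLI versions; Service Bus and API Management are pointed at the same instance through their own monitoring settings.

# Point a Standard logic app at the shared Application Insights instance.
az logicapp config appsettings set \
  --name my-logic-app \
  --resource-group my-rg \
  --settings "APPLICATIONINSIGHTS_CONNECTION_STRING=<shared-appinsights-connection-string>"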
Here are the steps –
In the Azure portal, navigate to your Integration Environment resource and an application in it. For the selected application, go to Application monitoring > Insights.
Before you use the dashboard, select the Application Insights workspace from the drop-down menu.
There are three key tabs on this dashboard, organized by the type of service in AIS. Each tab has subtabs organized by the categories relevant for that service. Let's look at each of them:
Logic Apps
The Overview tab provides an aggregated view of the Logic Apps and workflows in your application. You can look into workflow-level details in the Workflows tab. For the workflows that need more analysis, you can use the Runs tab to drill down into the run history and action failures. Finally, the Compute tab gives you the underlying compute utilization for these workloads.
Service Bus
For Service Bus, you have tabs to analyze the aggregated data for the entities in your application. The Requests tab gives you more information on the requests received by the entities (queues/topics) and their status. The Messages tab likewise shows you detailed information on the messages in the entities and their status.
API Management
The Timeline tab provides time charts of the overall trends for requests and response times. You can get detailed information about the status and processing time of each request, by API and operation, through the APIs and Operations tabs.
We believe this new feature will greatly enhance your monitoring experience and provide significant value to your integration projects. Stay tuned for more updates and enhancements as we continue to improve the unified monitoring experience for Azure Integration Services.
Feedback opportunity
If you are interested in participating in a 1:1 feedback session regarding the operation and monitoring of integration applications, we would love to hear from you. Please fill in this form (https://aka.ms/unifiedais-preview) and we’ll reach out to you.
To see the recorded demo of this capability, please watch this YouTube video.
Business Process Tracking Update
In addition to the unified monitoring investments made to Integration Environment, we also have some Business Process Tracking updates to share with you. The updates to Business Process Tracking are the direct result of customer feedback from organizations using our features launched at Ignite 2023.
Updates to Business Process Tracking include:
Business Process Tracking is a top level Azure Resource
Business Processes can be added to an Integration Application as part of our unified experience
Business Process stages now have two statuses
Successful
Failed
Enhanced Token Picker
Transaction ID (formerly Business Id) now supports expressions
Variable support
Emit Boolean values at runtime
To try out the new Business Process Tracking capabilities, please perform the following steps:
In the Azure Portal, search for Business Process Tracking and click on the name of the service.
Click on Create to begin the provisioning process
Provide the required configuration details including the Transaction ID and the Azure Data Explorer (ADX) details in the Storage tab.
Note: We are now referring to the Business ID as the Transaction ID to allow for more consistency in the offering. We have also moved the ADX configuration details from the Integration Application provisioning to the Business Process Tracking experience.
We now have an Overview page for your business process which acts as a landing page for the resource. To open up the designer, click on the Editor link.
Build out your Business Process by clicking on the + to add stages.
For each stage, provide data properties that you would like to track at that particular stage. You have the option to include both Success and Failure properties based upon your needs.
Note: Each stage must have a Success status modeled. Adding a Failure status is optional.
Save your work as you proceed in building your business process.
Once you have finished defining your stages, you can now begin the process of mapping your business process to the underlying Logic Apps implementation by toggling Show data source settings on.
For each stage you will need to select an Azure Subscription, Logic App, and workflow. Once you have configured these values, click on the Select data sources link.
Wherever you have a Success or Failed status defined, you need to select a point-in-time action, which represents the trigger/action where you want to emit data to Azure Data Explorer. These point-in-time actions should be different for the Success and Failure statuses.
As you complete your mappings, Save and Validate your work.
Once you have completed your mappings and have no validation errors, you can deploy your business process by clicking on the Deploy link.
After you have deployed your business process, tracking profiles will be injected into the Artifacts/Tracking Profiles directory in the Logic App(s) that are used in your mappings. Your Logic App will also restart and any new transactions should have their data tracked. You can view this data by clicking on the Transactions link.
You can drill into a specific transaction by clicking on the applicable Transaction ID (Ticket Number). When you do so, you will see an overlay of your business process and the underlying data.
To view the data for a particular stage, click on it and you will see the data that was tracked during that stage.
With your solution now tracking your data, you can extend your experience by consuming this data in other analytics tools like Power BI or Azure Monitor Workbooks.
If you would like to see a recorded demo of this content, please watch the following YouTube video.
Interpolating Phased Array Toolbox patterns
I have an antenna modeled in Phased Array Toolbox, and after updating its steering I’m using the pattern() function to store the AZ/EL directivity pattern (just the forward hemisphere). I then want, in another chunk of code, to interpolate the gain to a (large) number of grid points, where in a loop over each I’m calculating the observed AZ/EL. I figure a griddedInterpolant object is what I should build, which I can then query a bunch of times. However, the X1,X2 ordering for interpolation has the legacy of meshgrid/ndgrid where some dimensions are transposed. The EL and AZ arrays are both 1×181 elements, which helps add to the confusion (so the pattern is 181×181). What’s the best formulation, or what is the most efficient way to store/retrieve/interpolate a computed pattern?
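Since pattern() documents its output as an EL-by-AZ matrix (rows follow elevation), the grid already matches griddedInterpolant's ndgrid convention, where the first grid vector runs down the rows. A minimal sketch, assuming pat is the 181×181 pattern and az/el are the 1×181 angle vectors (the query-point names are illustrative):

% Build once: grid vectors in ndgrid order {rows, columns} = {el, az},
% so no transpose is needed for pattern()'s EL-by-AZ output.
F = griddedInterpolant({el, az}, pat, 'linear', 'nearest');

% Query all points in one vectorized call instead of looping.
gains = F(elQuery(:), azQuery(:));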
How to accelerate frequent interpolation?
For example,
I have a point set A, where each point corresponds to a value in Value_A. We can use the scatteredInterpolant function to obtain the values Value_B on a point set B. However, this process is very time-consuming for a large number of points.
In my code, this interpolation is performed frequently, but the point sets A and B do not change; only the values (Value_A) at the points change.
In this case, can we extract the weight matrix A2B from A to B and directly obtain Value_B as A2B * Value_A? This way, the overall efficiency would be very high. I tried to check the underlying code of the scatteredInterpolant function, but it was not available.
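For the linear method this is possible, because linear scattered interpolation is a fixed linear map once the triangulation is known: each query value is a barycentric combination of its enclosing triangle's vertex values. A minimal sketch, assuming 2-D points with A as N×2, B as M×2, and Value_A as a column vector:

% One-time setup: triangulate A and locate every query point in B.
DT = delaunayTriangulation(A);
[ti, bc] = pointLocation(DT, B);      % enclosing triangle + barycentric weights
ok = ~isnan(ti);                      % points outside the convex hull return NaN
verts = DT.ConnectivityList(ti(ok), :);
rows = repmat(find(ok), 1, size(verts, 2));
W = sparse(rows, verts, bc(ok, :), size(B, 1), size(A, 1));

% Every subsequent update is just a sparse matrix-vector product:
Value_B = W * Value_A;                % matches scatteredInterpolant(A, Value_A) with 'linear'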
Edit poles and zeros in root locus design editor or bode editor
While I am trying to get the Bode plot for my input and output from my Simulink model, I am getting the error mentioned below in the image.
Need help writing code for the given equation attached below for a plate (in 2D).
Below is a Reynolds equation. I need help writing code for the given equation, attached below, for a plate such that the plate length is divided into 50 nodes in the x and y directions (i = 50 and j = 50).
Filter using a drop-down list.
I have a sheet that I want to filter using a drop-down list. Column A, which contains the appropriate week number, is what I want to filter the sheet by, and the drop-down list I want to use is in cell B2. I tried using an advanced filter, but it only filtered once: whenever I changed my selection in the drop-down list, the filter wouldn’t update. I am a complete beginner, and all the answers I have seen were too complicated, with code.
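If your Excel version supports dynamic arrays, one formula-only alternative to the Advanced Filter is the FILTER function, which recalculates automatically whenever the drop-down changes. A sketch, assuming the data sits in A5:D100 with week numbers in column A and the drop-down in B2:

=FILTER(A5:D100, A5:A100=B2, "No rows for this week")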
Device Preparation Policies
I am trying to use the device preparation policies within our company.
After signing in and waiting for it to reach 100%, it says it can’t complete the setup because the apps can’t be installed, but that it will keep trying.
None of the apps have been installed.
I go into the logs and I see these errors:
The apps which I have selected to install are these:
Some of these apps are assigned to an “All Devices” group.
Others, like Chrome, Teams, and Dialpad, are assigned to a group called “Autopilot Devices without Hash”.
This group is the one I have used for the device preparation policy profile, with the Intune provisioning client as the owner.
The funny thing is that the apps are then deployed once the user gets to the desktop screen, which defeats the purpose of this exercise.
I’m not sure if I’m doing things right but I’ve followed two guides on how to get this working.
Any help would be appreciated.
Can’t make a group calendar read-only anymore with PowerShell (again…)
About a year ago the command Set-UnifiedGroup -Identity “MYGROUP” -CalendarMemberReadonly was giving problems. See this post: Can’t make a goup calendar readonly anymore with Powershell – Microsoft Community Hub
This lasted from June 2023 to August 2023, and then it was fixed.
After that, aside from the error message it gave, it used to work fine. But for about a week or two the command has been giving problems again. I get this error message:
Write-ErrorMessage : |Microsoft.Exchange.Configuration.Tasks.TaskException|We failed to update the unified group. Please try again later.
At C:\Users\some.user\AppData\Local\Temp\tmpEXO_gny3w2ea.ldk\tmpEXO_gny3w2ea.ldk.psm1:1204 char:13
+ Write-ErrorMessage $ErrorObject
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Set-UnifiedGroup], TaskException
+ FullyQualifiedErrorId : [Server=DBBPR04MB7929,RequestId=<xxx>,TimeStamp=Mon, 10 Jun 2024 06:36:03 GMT],Write-ErrorMessage
Do other people get this error too? How can I fix this?
Kind regards,
Paul
Numbering in square brackets [1] in Word
Good morning!
I have a document with auto numbering with the following formatting:
[1] Text …
[1.1] Text …
My question is, the third level of numbering [1.1.1] is also on auto, but the incremental number (in bold) is not correct – how do I fix this?
Any help will be much appreciated!
Debi
Setting up a Hybrid Email Configuration with MS Exchange Online and a Third-Party On-Prem Mail Server (Non-Microsoft)
I have a client with a third-party mail server hosted on-premises. They are planning a hybrid setup where part of the mailboxes are hosted on Exchange Online and the remainder stay on-premises with the third party.
How can we implement such a scenario and ensure email routing works properly?
(New to this kind of hybrid setup)
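One common pattern for a shared SMTP domain, sketched below under stated assumptions (the domain and smart host are placeholders, and the right settings depend on which side holds the MX record), is to mark the domain as an internal relay in Exchange Online and add an outbound connector that routes unresolved recipients to the on-premises server:

# Connect with the Exchange Online PowerShell V3 module.
Connect-ExchangeOnline

# Unresolved recipients for the shared domain relay onward instead of bouncing.
Set-AcceptedDomain -Identity "contoso.com" -DomainType InternalRelay

# Route mail for recipients not found in Exchange Online to the third-party server.
New-OutboundConnector -Name "To-OnPrem-MailServer" `
    -ConnectorType Partner `
    -RecipientDomains "contoso.com" `
    -SmartHosts "mail.onprem.contoso.com" `
    -UseMXRecord $false `
    -TlsSettings EncryptionOnly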
Adding Textbox in SharePoint Page for External users to enter their comments
Hello all,
We are trying to set up a SharePoint site and share it with external users, so that they can add comments to certain items. To do that, we need to add a Text Area in various sections of the page, so an external user can enter their thoughts related to a certain item. Customizing the SharePoint page allows us to add a Text box, but that is not editable once the page is published. How can we allow external users to type in the comments in the page?
Thanks,
Karthik
Seamless Recovery: How to Automate Starting Up Evicted Azure VMs with Azure Functions
Introduction
Azure has some incredible services that we can use for all business sizes and even budgets. One of these amazing services is a highly discounted virtual machine called a spot instance. A spot instance, in essence, is a special kind of virtual machine that can be evicted at any time when capacity is required for standard (pay-as-you-go) virtual machines. Why would I want to run it then? Because it is super cheap! How cheap? In some cases, up to 90% off.
Here are a few key points to consider:
Cost Savings: Spot VMs can provide discounts of up to 90% compared to regular pricing, though typical savings are often in the range of 60-80%.
Pricing Fluctuation: The price of Spot VMs fluctuates based on supply and demand in the Azure data centers. When demand is low, prices drop, making them extremely cost-effective.
Availability: Spot VMs can be evicted when Azure needs the capacity for other VMs, so they are best suited for non-critical workloads that can handle interruptions.
Use Cases: Spot VMs are ideal for batch processing, testing and development, stateless applications, and any other workloads that can be paused and resumed.
Now, in the case of these Spot VMs, how can I try to start my VMs back up once they have been evicted from their current host? Well, I can use Azure Functions to help me with this!
Requirements
A spot instance Virtual Machine
An Azure Function with a managed identity that has the VM Contributor role
Access to the Activity Log so we can create an Alert
Access to Action Groups in Azure Monitor to create the action that will call the Function
Let’s go!
Here is my Spot VM, already created.
If you do not know which of your VMs are Spot VMs, feel free to use the Resource Graph query below, which lists them. You can also find it in GitHub: RallTheory/AzureFunctions/AllSpotVMsGraphQuery.KQL at main · WernerRall147/RallTheory (github.com)
resources
| where type == 'microsoft.compute/virtualmachines'
| where properties has 'evictionPolicy'
| project
    resourceGroup,
    vmName = name,
    vmSize = tostring(properties.hardwareProfile.vmSize),
    osType = tostring(properties.storageProfile.osDisk.osType),
    evictionPolicy = tostring(properties.evictionPolicy),
    powerState = tostring(properties.extended.instanceView.powerState.code)
Let’s create our Azure alerts for Spot VMs. This requires an extra step, because we first need to see what an eviction even looks like.
To simulate an eviction, we will run this command in the Azure Cloud Shell, but you can run it anywhere:
az vm simulate-eviction -g <yourresourcegroup> -n <yourserver>
If we wait 5–10 minutes, we can see what an eviction looks like in the Activity Log of the virtual machine.
Next, we click on the EvictSPOTVM activity and create an alert for this activity.
We do have to modify some of the parameters, because I want my function to run every time.
Click Review + create for now. We will come back here once our function has been created.
I created an Azure Function app with an HttpTrigger function. The only requirement here is that your Azure Function is a PowerShell function. More can be seen here: Create a PowerShell function using Visual Studio Code – Azure Functions | Microsoft Learn
The code I used can be found in the GitHub repo: RallTheory/AzureFunctions/AzFunction_SpotVM_Evicted_StartUp.ps1 at main · WernerRall147/RallTheory (github.com)
Now that our function has been created successfully, we need to instruct the alert and action group to call the function. We go back to our resource group, find our action group, and make the changes in the Actions section of the action group: we point it to our Azure Function and save.
Now we are all set!
So what happens now?
In the event of an eviction, the Azure alert will call the action group.
The action group will call the function.
The function will check whether the VMs have been evicted and will try to bring the servers back up, along the lines of the sketch below. (In the future I will be working on some backoff or circuit-breaker patterns for this.)
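The full script is in the repo linked above; as a rough illustration only, a minimal PowerShell HTTP-trigger function along these lines could do the job. The alert-payload parsing assumes the common alert schema, and all names are illustrative rather than the repo's exact code:

using namespace System.Net
param($Request, $TriggerMetadata)

# Common alert schema: the evicted VM's resource ID arrives in
# data.essentials.alertTargetIDs (an assumption; adjust to your payload).
$targetId      = $Request.Body.data.essentials.alertTargetIDs[0]
$parts         = $targetId -split '/'
$resourceGroup = $parts[4]
$vmName        = $parts[-1]

# The function app's managed identity (VM Contributor) authorizes these calls.
$vm = Get-AzVM -ResourceGroupName $resourceGroup -Name $vmName -Status
$powerState = ($vm.Statuses | Where-Object Code -like 'PowerState/*').Code

if ($powerState -ne 'PowerState/running') {
    Start-AzVM -ResourceGroupName $resourceGroup -Name $vmName -NoWait
    $message = "Start requested for $vmName."
} else {
    $message = "$vmName is already running."
}

Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = $message
})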
What does it look like on our VM?
First of all the VM is running.
Secondly, we can see in the Activity Log that the Azure Function is what started my VM.
This opens many other possibilities for us and allows us to think in a cloud way: only provisioning the resources we need, at the time we need them. It also shows the power of Azure Functions, and that you do not need to be some Level 400 developer to write advanced scripts.
Disclaimer
The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts or Power BI Dashboards are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts or Power BI Dashboards be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages. This blog post was written with the help of generative AI.
Visual Studio Code AI Toolkit: Run LLMs locally
The generative AI landscape is in a constant state of flux, with new developments emerging at a breakneck pace. In recent times, along with LLMs, we have also seen the rise of SLMs. From virtual assistants to chatbots, SLMs are revolutionizing how we interact with technology through conversation. As the backbone of many conversational models, SLMs enable natural language understanding and generation, leading to more engaging user experiences.
The deployment of large language models (LLMs) and smaller language models (SLMs) on local infrastructure has emerged as a critical area of discussion due to several compelling factors. These factors include maintaining stringent data privacy regulations, achieving cost-effectiveness over time, and enabling greater flexibility for customization and integration.
The AI Toolkit (earlier known as Windows AI Studio) is here to address such problems. Some of the major problems it solves are:
Onboarding LLMs/SLMs on our local machines. This toolkit lets us easily download the models to a local machine.
Evaluation of the model. Whenever we need to evaluate a model to check its feasibility for a particular application, this tool lets us do it in a playground environment, which is what we will be seeing in this blog.
Fine-tuning. This mainly deals with training the model further to do the tasks that we specifically want it to do. Usually a model performs a generic task and has generic data; with fine-tuning we can give it a particular flavor to perform a particular task.
The best part is that it runs on Windows machines and offers models optimized for them. The AI Toolkit lets the models run locally and makes them capable of working offline. This opens up a plethora of scenarios for organizations in sectors like healthcare, education, banking, and government.
Bring AI development into your VS Code workflow with the AI Toolkit extension. It empowers you to:
Run pre-optimized AI models locally: Get started quickly with models designed for various setups, including Windows 11 running with DirectML acceleration or direct CPU, Linux with NVIDIA GPUs, or CPU-only environments.
Test and integrate models seamlessly: Experiment with models in a user-friendly playground or use a REST API to incorporate them directly into your application.
Fine-tune models for specific needs: Customize pre-trained models (like popular SLMs Phi-3 and Mistral) locally or in the cloud to enhance performance, tailor responses, and control their style.
Deploy your AI-powered features: Choose between cloud deployment or embedding them within your device applications.
Alright! Now let’s experience this amazing extension on our machines using Visual Studio Code. Since this is available as a VS Code extension, Visual Studio Code is a direct prerequisite for using this tool. Use this link to download VS Code on your machine.
We can run the AI Toolkit preview directly on a local machine. However, certain tasks might only be available on Windows or Linux depending on the chosen model. Mac support is on the way!
For a local run on Windows + WSL, WSL Ubuntu distro 18.04 or greater should be installed and set as the default prior to using the AI Toolkit. Learn more about how to install the Windows Subsystem for Linux and change the default distribution, or see the step-by-step explanation in a previous blog where I demonstrated the installation of Windows AI Studio; you can find it here. The WSL installation steps remain the same as explained in that blog. Windows AI Studio is deprecated and has been rebranded as the AI Toolkit. For the latest documentation, and to download and use the AI Toolkit, please visit the GitHub page.
Once WSL is installed, launch the Ubuntu terminal and type the following:
code .
This should launch Visual Studio Code. Since this is the first launch, it will collect a few things.
Now the Visual Studio Code window will be launched.
Note that this will be a remote connection with the session name WSL: Ubuntu. The extensions we install now will be installed in WSL.
On the activity bar of the Visual Studio Code window, there is an “Extensions” option. Click on it, search for “AI Toolkit”, and install the extension; once it is installed, we can see an extra icon on the activity bar.
Once it is installed, a new extension will be visible on the left-side menu; when clicked, a pop-up notification comes up showcasing the port-forwarding capabilities and auto-assigning one port for the toolkit.
Also, two fresh sections are shown under AI-toolkit namely Models and Resources.
Models section contains the following,
Model Catalog
Resources section contains the following,
Model Playground
Model Finetuning
The Models section contains the Model Catalog, which is basically the list of all available AI models. This is where we can choose and download a model that fits our use case. The AI Toolkit offers a collection of publicly available AI models already optimized for Windows. The models are stored in different locations, including Hugging Face and GitHub, but we can browse all of them in one place, ready for downloading and use in a Windows application.
We can also find a model card for each model, to check its various parameters in order to decide which one to choose for a particular application. A few more details, like the number of parameters the model was pre-trained on, its dependency on CPU or GPU, and the size of the model, are all available here. Finally, upon deciding, the model can be downloaded using the “Download” button for each model. Any number of models can be downloaded.
For the purpose of this demonstration, I will download Mistral-mistral-7b-v02-int4-gpu and one of Microsoft’s recent SLMs, Phi-3-mini-128k-cuda-int4-onnx.
Note: For optimized performance on Windows devices that have at least one GPU, select model versions that only target Windows. This ensures you have a model optimized for the DirectML accelerator. The model names are in the format of {model_name}-{accelerator}-{quantization}-{format}.
To check whether you have a GPU on your Windows device, open Task Manager and then select the Performance tab. If you have GPU(s), they will be listed under names like “GPU 0” or “GPU 1”.
The next interesting part is the Model Playground, which is available in the Resources section. For the models that we have evaluated using the model card and downloaded, it’s now time to test them out using the Playground!
The Playground has multiple sections; let’s see each one.
Model: This is the placeholder which lets us load the model. In this case I will be using Phi-3-mini-128k-cuda-int4-onnx.
Context Instructions: This is the system prompt for the model. It guides how the model behaves in a particular scenario. For example, we can ask it to respond in a Shakespearean tone, and it will respond accordingly. I will input “Respond in Shakespearean accent” as the context instruction.
Inference Parameters: These are the adjustment parameters for the model. Under this section we have Max response length (tokens), Temperature, Top P, Frequency penalty, and Presence penalty. Hovering over the small “i” icon explains each parameter.
Chat Area: This is where we type in our messages and engage in a chat conversation with the model. The model responds based on its pretraining data.
Note: Some machines might show the following error while loading the model in the playground:
Failed loading model Phi-3-mini-128k-cuda-int4-onnx: /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1426 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_cuda.so with error: libcudnn.so.8: cannot open shared object file: No such file or directory
This is mostly due to some missing libraries. To fix it, execute the following commands one at a time in the Ubuntu terminal or the VS Code terminal of the WSL session.
• pip install onnxruntime
• pip install onnxruntime-gpu
• cd /usr/local/cuda/lib64
• ls
• sudo apt install nvidia-cudnn
• sudo apt update
• apt list --upgradable
• sudo apt upgrade
• sudo apt update
• sudo apt update --fix-missing
• sudo apt-get install libcudnn8
• sudo apt update
This should help you resolve the error!
The model now responds in a Shakespearean accent because of the context instructions.
We can further evaluate the model based on our needs, and the best part is that it is absolutely free and now runs on the local machine! The AI Toolkit thus solves a major problem and helps us streamline the development of GenAI applications. In further blogs, let’s see how to interact with the model using Python and build some cool applications. Stay tuned!
Implementing Mission Critical Solutions with the Azure Logic Apps Rules Engine
Business rules engines offer a low-code environment that lets you build applications faster and more easily, reducing dependencies on programming. Rules engines help you create and change business logic without having to write code or restart the applications that use it. Also, in a world of microservices that promotes decoupling, rules engines provide consistency, clarity, and compliance across different services and domains. Those are some of the benefits of using a rules engine.
BizTalk Server includes a Business Rules Engine. We have incorporated the RETE runtime included in that product, along with support for .NET and XML facts, into Azure Logic Apps. This means that customers migrating from BizTalk Server can now leverage their existing BRE implementations in Azure Logic Apps. This includes our customers looking to migrate their BizTalk Server SWIFT solutions to Azure Logic Apps.
The Microsoft Rules Composer
To help you create rules for use with your Azure Logic Apps Rules Engine project, the Microsoft Rules Composer provides a visual tool for authoring and versioning rulesets and vocabularies. It is a standalone application that can be downloaded from https://www.microsoft.com/en-us/download/details.aspx?id=106092.
Rules
Rules are declarative statements that include a condition and actions where the condition is evaluated. If the result is true, the rules engine performs one or more actions. The following diagram shows the relationship between Rulesets, Rules, Facts, Conditions and Actions:
What are Vocabularies?
Vocabularies are collections of definitions consisting of friendly names for the facts used in rule conditions and actions. They make the rules easier to read, understand, and share by people in a particular business domain. For instance: “Status”. Vocabularies can be of the following types:
Constant Value
Range of Values
Set of Values
Control functions and Forward Chaining
Control functions help applications control the facts in the engine’s working memory. Facts in working memory drive the conditions that the engine evaluates and the actions that execute. An example of this is forward-chaining inference, which finds all true statements given the knowledge base and a new set of facts. It uses the control function “Update”.
Testing Rules
The Microsoft Rules Composer follows a shift-left approach: as you build the rulesets that integrate business logic with your Standard workflows, you can test each ruleset incrementally. This feature is recommended for long or complex rules, to avoid lengthy troubleshooting.
To test .NET facts, you should build a fact creator (sketched below). You don’t need fact creators for XML facts.
The outcome is a trace window with the results of the evaluation of the rules.
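As a rough illustration, a fact creator is a class that implements the engine's IFactCreator interface and hands the test feature the .NET fact instances to assert. The sketch below follows the shape this interface has had since the BizTalk BRE; the namespace may differ in the Logic Apps engine, and the Order fact and its values are hypothetical:

using System;
using Microsoft.RuleEngine;

// Hypothetical .NET fact referenced by the ruleset.
public class Order
{
    public decimal Total { get; set; }
    public string Status { get; set; }
}

// Supplies test fact instances when the ruleset is tested in the Rules Composer.
public class OrderFactCreator : IFactCreator
{
    public object[] CreateFacts(RuleSetInfo ruleSetInfo)
    {
        // Facts asserted into working memory for the test run.
        return new object[] { new Order { Total = 1500m, Status = "New" } };
    }

    public Type[] GetFactTypes(RuleSetInfo ruleSetInfo)
    {
        return new Type[] { typeof(Order) };
    }
}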
Migrating from BizTalk Server
Because the Azure Logic Apps Rules Engine is an evolution of the BizTalk Business Rules Engine (BRE), BRE rules can be used in Logic Apps. As policies no longer exist, you should export each policy individually.
As DBFacts are not supported in this release, you need to remove them from your policies or refactor them.
Creating Rules in VSCode
You can create Rules Engine projects using VS Code. You should create a Logic Apps workspace and then a logic app with a rules engine project.
For a complete demonstration on how to use the Azure Logic Apps Rules Engine, watch the following video:
Trajectory transfer from MATLAB to RobotStudio
I want to transfer my robot path exactly to RAPID (RobotStudio).
How is this possible?
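Since RAPID modules are plain text, one straightforward approach is to format each waypoint as a MoveL instruction and write out a module that RobotStudio can load. A minimal sketch, assuming p is an N×3 matrix of positions (mm), q is an N×4 matrix of quaternions, and the tool/speed/zone names are illustrative:

% Write trajectory waypoints as a RAPID module with one MoveL per point.
fid = fopen('path.modx', 'w');
fprintf(fid, 'MODULE MatlabPath\n  PROC Path10()\n');
for k = 1:size(p, 1)
    % robtarget layout: [[x,y,z],[q1,q2,q3,q4],[confdata],[external axes]]
    fprintf(fid, ['    MoveL [[%.2f,%.2f,%.2f],[%.6f,%.6f,%.6f,%.6f],' ...
        '[0,0,0,0],[9E9,9E9,9E9,9E9,9E9,9E9]], v100, z10, tool0;\n'], ...
        p(k,:), q(k,:));
end
fprintf(fid, '  ENDPROC\nENDMODULE\n');
fclose(fid);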
How to replace the diagonal entries of a square matrix with entries from a vector of equal length?
I have an n×n matrix M and a vector v of length n. I want to swap the diagonal of M with the vector v, without a for loop. Is that possible? Thanks!
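A minimal sketch of the standard linear-indexing trick (assuming M is n×n and v has length n):

% Diagonal elements sit at linear indices 1, n+2, 2n+3, ..., i.e. 1:n+1:end.
n = size(M, 1);
M(1:n+1:end) = v;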