Category: News
Adding Textbox in SharePoint Page for External users to enter their comments
Hello all,
We are trying to set up a SharePoint site and share it with external users, so that they can add comments to certain items. To do that, we need to add a Text Area in various sections of the page, so an external user can enter their thoughts related to a certain item. Customizing the SharePoint page allows us to add a Text box, but that is not editable once the page is published. How can we allow external users to type in the comments in the page?
Thanks,
Karthik
Seamless Recovery: How to Automate Azure Spot VM Start-Ups After Eviction with Azure Functions
Introduction
Azure has some incredible services that we can use for businesses of all sizes and budgets. One of these amazing services is a highly discounted virtual machine called a spot instance. A spot instance is, in essence, a special kind of virtual machine that can be evicted at any time when capacity is required for standard, pay-as-you-go virtual machines. Why would I want to run it then? Because it is super cheap! How cheap? In some cases, up to 90% off.
Here are a few key points to consider:
Cost Savings: Spot VMs can provide discounts of up to 90% compared to regular pricing, though typical savings are often in the range of 60-80%.
Pricing Fluctuation: The price of Spot VMs fluctuates based on supply and demand in the Azure data centers. When demand is low, prices drop, making them extremely cost-effective.
Availability: Spot VMs can be evicted when Azure needs the capacity for other VMs, so they are best suited for non-critical workloads that can handle interruptions.
Use Cases: Spot VMs are ideal for batch processing, testing and development, stateless applications, and any other workloads that can be paused and resumed.
Now, in the case of these Spot VMs, how can I start my VMs back up once they have been evicted from their current host? Well, I can use Azure Functions to help me with this!
Requirements
A spot instance Virtual Machine
An Azure Function with a managed Identity that has VM Contributor Role
Access to the Activity Log so we can create an Alert
Access to Action Groups in Azure Monitor to create the action that will call the Function
Let’s go!
Here is my Spot VM, already created.
If you do not know which of your VMs are Spot VMs, feel free to use the Resource Graph query below, which lists them. You can also find it on GitHub: RallTheory/AzureFunctions/AllSpotVMsGraphQuery.KQL at main · WernerRall147/RallTheory (github.com)
resources
| where type == 'microsoft.compute/virtualmachines'
| where properties has 'evictionPolicy'
| project
    resourceGroup,
    vmName = name,
    vmSize = tostring(properties.hardwareProfile.vmSize),
    osType = tostring(properties.storageProfile.osDisk.osType),
    evictionPolicy = tostring(properties.evictionPolicy),
    powerState = tostring(properties.extended.instanceView.powerState.code)
Let’s create our Azure Alerts for Spot VMs. This requires an extra step, because first we need to see what an eviction actually looks like in the logs.
To simulate an eviction, we will run this command in the Azure Cloud Shell, but you can run it anywhere the Azure CLI is available.
az vm simulate-eviction -g <yourresourcegroup> -n <yourserver>
If we wait 5 to 10 minutes, we can see what the eviction looks like in the Activity Log of the Virtual Machine.
Next, we click on the EvictSpotVM activity and create an alert for this activity.
We do have to modify some of the parameters, because I want my function to always run.
Click Review + create for now. We will come back here once our function has been created.
I created an Azure Function app with an HttpTrigger function. The only requirement here is that your Azure Function is a PowerShell function. More details can be found here: Create a PowerShell function using Visual Studio Code – Azure Functions | Microsoft Learn
The code I used can be found in the GitHub repo: RallTheory/AzureFunctions/AzFunction_SpotVM_Evicted_StartUp.ps1 at main · WernerRall147/RallTheory (github.com)
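The repo's PowerShell function essentially identifies which VM the alert fired for and issues a start request. One step it has to get right is parsing the resource group and VM name out of the alert's resource ID. Here is that step sketched in Python for illustration; the resource-ID shape is standard Azure, but this is not the repo's actual code:

```python
def parse_vm_resource_id(resource_id):
    """Extract (resource_group, vm_name) from a full Azure resource ID.

    Expected shape:
    /subscriptions/<sub>/resourceGroups/<rg>/providers/
        Microsoft.Compute/virtualMachines/<name>
    """
    parts = resource_id.strip("/").split("/")
    lowered = [p.lower() for p in parts]          # segment names are case-insensitive
    rg = parts[lowered.index("resourcegroups") + 1]
    name = parts[lowered.index("virtualmachines") + 1]
    return rg, name

if __name__ == "__main__":
    rid = ("/subscriptions/0000/resourceGroups/rg-spot/providers/"
           "Microsoft.Compute/virtualMachines/spot-vm-01")
    print(parse_vm_resource_id(rid))  # ('rg-spot', 'spot-vm-01')
```

With the pair in hand, the function only needs to call the start operation against that VM using its managed identity.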
Now that our function has been created successfully, we need to instruct the Alert and Action Group to call the Function. We go back to our Resource Group and find our Action Group and make the changes in the actions section of our action group. We point it to our Azure Function and save.
Now we are all set!
So what happens now?
In the event of an eviction happening the Azure Alert will call the action group.
The action group will call the function.
The Function will check whether the VMs have been evicted and will try to bring the servers back up. (In the future I will be working on some backoff or circuit breaker patterns for this.)
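A backoff pattern like the one mentioned could take roughly this shape: retry the start call with exponentially growing delays and give up after a few attempts. This is a sketch of the idea, not the repo's code; `start_vm` is a stand-in for whatever actually issues the start request:

```python
import time

def start_with_backoff(start_vm, max_attempts=5, base_delay=2.0):
    """Retry start_vm() with exponential backoff; return True on success."""
    for attempt in range(max_attempts):
        try:
            start_vm()
            return True
        except Exception:
            if attempt == max_attempts - 1:
                return False          # capacity never came back; give up
            # wait 2s, 4s, 8s, ... between attempts
            time.sleep(base_delay * (2 ** attempt))
    return False
```

A circuit breaker would add one more piece of state: stop calling entirely for a cool-down window once several attempts in a row have failed.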
What does it look like on our VM?
First of all the VM is running.
Secondly, we can see in the Activity Log that the Azure Function is what started my VM.
This opens many other possibilities for us and allows us to think in a cloud-native way: provisioning only the resources we need, at the time we need them. It also shows the power of Azure Functions, and that you do not need to be some Level 400 developer to write advanced scripts.
Disclaimer
The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts or Power BI Dashboards are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts or Power BI Dashboards be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages. This blog post was written with the help of generative AI.
Microsoft Tech Community – Latest Blogs – Read More
Visual Studio Code AI Toolkit: Run LLMs locally
The generative AI landscape is in a constant state of flux, with new developments emerging at a breakneck pace. In recent times, along with large language models (LLMs), we have also seen the rise of small language models (SLMs). From virtual assistants to chatbots, SLMs are revolutionizing how we interact with technology through conversation. As the backbone of many conversational models, SLMs enable natural language understanding and generation, leading to more engaging user experiences.
The deployment of large language models (LLMs) and smaller language models (SLMs) on local infrastructure has emerged as a critical area of discussion due to several compelling factors. These factors include maintaining stringent data privacy regulations, achieving cost-effectiveness over time, and enabling greater flexibility for customization and integration.
AI Toolkit (earlier known as Windows AI Studio) is here to address these problems. Some of the major problems it solves are:
Onboarding LLMs/SLMs on our local machines. The toolkit lets us easily download models to a local machine.
Evaluation of the model. Whenever we need to evaluate a model to check its feasibility for a particular application, this tool lets us do it in a playground environment, which is what we will be seeing in this blog.
Fine-tuning. This mainly deals with training the model further to do the tasks that we specifically want it to do. Usually a model performs generic tasks on generic data; with fine-tuning we can give it a particular flavor to perform a particular task.
The best part is that it runs on Windows machines and includes models optimized for them. The AI Toolkit lets the models run locally and makes them offline capable. It opens up a plethora of scenarios for organizations in sectors like healthcare, education, banking, government, and so on.
Bring AI development into your VS Code workflow with the AI Toolkit extension. It empowers you to:
Run pre-optimized AI models locally: Get started quickly with models designed for various setups, including Windows 11 running with DirectML acceleration or direct CPU, Linux with NVIDIA GPUs, or CPU-only environments.
Test and integrate models seamlessly: Experiment with models in a user-friendly playground or use a REST API to incorporate them directly into your application.
Fine-tune models for specific needs: Customize pre-trained models (like popular SLMs Phi-3 and Mistral) locally or in the cloud to enhance performance, tailor responses, and control their style.
Deploy your AI-powered features: Choose between cloud deployment or embedding them within your device applications.
Alright! Now let’s experience this amazing extension on our machines using Visual Studio Code. Since this is available as a VS Code extension, Visual Studio Code is a direct prerequisite for using this tool. Use this link to download VS Code on your machine.
We can run the AI Toolkit Preview directly on a local machine. However, certain tasks might only be available on Windows or Linux depending on the chosen model. Mac support is on the way!
For a local run on Windows + WSL, WSL Ubuntu distro 18.04 or greater should be installed and set as the default prior to using the AI Toolkit. Learn more about how to install Windows Subsystem for Linux and change the default distribution, or see the step-by-step explanation in one of my previous blogs, where I demonstrated the installation of Windows AI Studio. You can find it here. The WSL installation steps remain the same as explained in that blog. Windows AI Studio is deprecated and has been rebranded as AI Toolkit. For the latest documentation, and to download and use the AI Toolkit, please visit the GitHub page.
Once the WSL is installed, launch the Ubuntu terminal and type the following,
code .
This should launch Visual Studio Code. Since this is the first launch, it will download a few components first.
Now the Visual Studio Code window will be launched.
Note that this will be a remote connection with the session name WSL: Ubuntu, so the extensions we install next will be installed in WSL.
On the Activity Bar of the Visual Studio Code window, click the Extensions icon, search for “AI Toolkit”, and install the extension. Once it is installed, a new icon appears on the Activity Bar. Clicking it brings up a notification showcasing the port-forwarding capabilities, and a port is auto-assigned for the Toolkit.
Also, two fresh sections are shown under AI-toolkit namely Models and Resources.
The Models section contains the following:
Model Catalog
The Resources section contains the following:
Model Playground
Model Finetuning
Models contains the Model Catalog, which is basically the list of all available AI models. This is where we can choose and download a model that fits our use case. AI Toolkit offers a collection of publicly available AI models already optimized for Windows. The models are stored in different locations, including Hugging Face, GitHub, and others, but we can browse them all in one place, ready for downloading and use in Windows applications.
We can also find model cards for each model, to check the model’s various parameters and decide which one to choose for a particular application. A few more details, such as the number of parameters the model is pre-trained on, its dependency on CPU or GPU, and the size of the model, are all available here. Finally, upon deciding, the model can be downloaded using the “Download” button for each model. Any number of models can be downloaded.
For the purpose of this demonstration, I will download Mistral-mistral-7b-v02-int4-gpu and one of Microsoft’s recent SLMs, Phi-3-mini-128k-cuda-int4-onnx.
Note: For optimized performance on Windows devices that have at least one GPU, select model versions that only target Windows. This ensures you have a model optimized for the DirectML accelerator. The model names are in the format of {model_name}-{accelerator}-{quantization}-{format}.
To check whether you have a GPU on your Windows device, open Task Manager and then select the Performance tab. If you have GPU(s), they will be listed under names like “GPU 0” or “GPU 1”.
The next interesting part is the Model Playground, which is available in the Resources section. For the models that we have evaluated using model cards and downloaded, it’s time now to test them out using the Playground!
The Playground has multiple sections; let’s look at each one.
Model: This is the placeholder which lets us load the model. In this case I will be using the Phi-3-mini-128k-cuda-int4-onnx.
Context Instructions: This is the system prompt for the model. It guides how the model behaves in a particular scenario. For example, we can ask it to respond in a Shakespearean tone, and it will respond accordingly. I will input “Respond in Shakespearean accent” as the Context Instruction.
Inference Parameters: These are the adjustment parameters for the model. Under this section we have Max response length (tokens), Temperature, Top P, Frequency Penalty, Presence penalty. Hovering over the small “i” icon explains about each parameter.
Chat Area: This is where we type in our messages and finally engage in a chat conversation with the model. The model responds on the pretrained data.
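These same knobs (system prompt, max response length, temperature, top P) carry over when you later call the model over the REST API mentioned earlier instead of the playground. A minimal Python sketch of assembling such a request body; the field names follow the common OpenAI-style chat schema and are an assumption here, not confirmed AI Toolkit API details:

```python
def build_chat_request(user_message,
                       system_prompt="Respond in Shakespearean accent",
                       max_tokens=256,
                       temperature=0.7,
                       top_p=0.95):
    """Assemble an OpenAI-style chat-completion request body (assumed schema)."""
    return {
        "model": "Phi-3-mini-128k-cuda-int4-onnx",
        "messages": [
            {"role": "system", "content": system_prompt},  # Context Instructions
            {"role": "user", "content": user_message},     # Chat Area input
        ],
        "max_tokens": max_tokens,    # Max response length
        "temperature": temperature,  # Inference Parameters
        "top_p": top_p,
    }

body = build_chat_request("Tell me about the weather")
print(body["messages"][0]["content"])  # Respond in Shakespearean accent
```

You would POST this body to the local port the Toolkit forwards; check the Toolkit's own documentation for the exact endpoint path.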
Note: Some machines might show the error as follows while loading the model in the playground.
Failed loading model Phi-3-mini-128k-cuda-int4-onnx: /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1426 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_cuda.so with error: libcudnn.so.8: cannot open shared object file: No such file or directory
This is mostly due to some missing libraries. To fix it, execute the following commands one at a time in the Ubuntu terminal or in the VS Code terminal of the WSL session.
pip install onnxruntime
pip install onnxruntime-gpu
cd /usr/local/cuda/lib64
ls
sudo apt install nvidia-cudnn
sudo apt update
apt list --upgradable
sudo apt upgrade
sudo apt update
sudo apt update --fix-missing
sudo apt-get install libcudnn8
sudo apt update
This should help you resolve the error!
It is now responding in Shakespearean accent because of the Context information.
We can further evaluate the model based on our needs, and the best part is that it is absolutely free and runs on the local machine! The AI Toolkit thus solves a major problem and helps us streamline the development of GenAI applications. In further blogs, let’s see how to interact with the model using Python and build some cool applications. Stay tuned!
Implementing Mission Critical Solutions with the Azure Logic Apps Rules Engine
Business rules engines offer a low-code environment that lets you build applications faster and easier, reducing dependencies on programming. A rules engine helps you create and change business logic without having to write code or restart the applications that use it. Also, in a world of microservices that promotes decoupling, rules engines provide consistency, clarity, and compliance across different services and domains. These are some of the benefits of using a rules engine.
BizTalk Server includes a Business Rules Engine. We have incorporated the RETE runtime included in that product, along with support for .NET and XML facts, into Azure Logic Apps. This means that customers migrating from BizTalk Server can now leverage their existing BRE implementations in Azure Logic Apps. This includes customers looking to migrate their BizTalk Server SWIFT solutions to Azure Logic Apps.
The Microsoft Rules Composer
To help you create rules for your Azure Logic Apps Rules Engine project, the Microsoft Rules Composer provides a visual tool for authoring and versioning rulesets and vocabularies. It is a standalone application that can be downloaded from https://www.microsoft.com/en-us/download/details.aspx?id=106092.
Rules
Rules are declarative statements consisting of a condition and actions. The condition is evaluated, and if the result is true, the rules engine performs one or more actions. The following diagram shows the relationship between rulesets, rules, facts, conditions, and actions:
What are Vocabularies?
Vocabularies are collections of definitions consisting of friendly names for the facts used in rule conditions and actions. They make the rules easier to read, understand, and share by people in a particular business domain. For instance: “Status”. Vocabularies can be of the following types:
Constant Value
Range of Values
Set of Values
Control functions and Forward Chaining
Control functions let applications control the facts in the engine’s working memory, and thereby its behavior. Facts in working memory drive the conditions that the engine evaluates and the actions that execute. An example of this is forward chaining inference: given the knowledge base and a new set of facts, forward chaining finds all statements that become true. It uses the control function “Update”.
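Stripped of the engine specifics, forward chaining is a fixpoint loop: fire every rule whose premises hold, add the facts its action asserts (the Update step), and repeat until nothing new appears. A toy illustration of that idea, independent of the actual Rules Engine API:

```python
def forward_chain(facts, rules):
    """facts: a set of fact names; rules: list of (premises, conclusion) pairs.
    Fire rules until a fixpoint is reached: no rule adds a new fact."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)   # the "Update" step re-triggers evaluation
                changed = True
    return facts

rules = [
    ({"order_total_high"}, "flag_for_review"),
    ({"flag_for_review", "customer_trusted"}, "auto_approve"),
]
print(sorted(forward_chain({"order_total_high", "customer_trusted"}, rules)))
# ['auto_approve', 'customer_trusted', 'flag_for_review', 'order_total_high']
```

Note how the second rule only fires because the first rule asserted a new fact: that cascading is exactly what forward chaining provides.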
Testing Rules
The Microsoft Rules Composer follows a shift-left approach: as you build the rulesets that integrate business logic into your Standard workflows, you can test each ruleset incrementally. This is recommended for long or complex rules, to avoid lengthy troubleshooting.
To test .NET facts, you should build a fact creator. You don’t need fact creators for XML facts.
The outcome is a trace window with the results of the evaluation of the rules.
Migrating from BizTalk Server
Because the Logic Apps Rules Engine is an evolution of the BizTalk Business Rules Engine (BRE), BRE rules can be used in Logic Apps. Since policies no longer exist, you should export each policy individually.
Because DBFacts are not supported in this release, you need to remove them from your policies or refactor them.
Creating Rules in VSCode
You can create Rules Engine projects using VS Code. First create a Logic Apps workspace, and then a Logic App with a rules engine project.
For a complete demonstration on how to use the Azure Logic Apps Rules Engine, watch the following video:
trajectory matlab transfer to robotstudio
I want to transfer my robot path exactly to RAPID (RobotStudio). How is this possible?
robotstudio, trajectory, transferring MATLAB Answers — New Questions
How to replace the diagonal entries of a square matrix with entries from a vector of equal length?
I have an n×n matrix M, and a vector v of length n. I want to swap the diagonal of M with the vector v, without a for loop. Is that possible? Thanks!
matrix manipulation MATLAB Answers — New Questions
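For what it's worth, MATLAB linear indexing answers this in one line: M(1:n+1:end) = v, because the diagonal of an n×n matrix sits at stride n+1 in the flattened array. The same stride trick, sketched in plain Python for illustration (the helper name is made up):

```python
def set_diagonal(M, v):
    """Replace the diagonal of square matrix M (list of lists) with v,
    using stride-(n+1) slice assignment instead of an explicit element loop."""
    n = len(M)
    flat = [x for row in M for x in row]   # flatten row-major
    flat[0::n + 1] = v                     # diagonal lives at stride n+1
    return [flat[i * n:(i + 1) * n] for i in range(n)]

print(set_diagonal([[1, 2], [3, 4]], [9, 8]))  # [[9, 2], [3, 8]]
```

In NumPy terms the same thing is usually spelled with a diagonal-filling helper rather than manual strides.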
High packet latency in WLAN system level simulation example
Hi,
I am new to the Wi-Fi protocol and simulations. I have started working with the basic WLAN system level simulation example of MATLAB – Link
The latency in this example, even after commenting out the wireless channel fading part and changing "MACFrameAbstraction" to "false" and "PHYAbstractionMethod" to "none", comes out above or near 0.2 sec for both the AP and STA.
How can I model the system for an ideal scenario and reduce these latency values?
Thanks,
Garvit
wlan, wi-fi, wireless communication MATLAB Answers — New Questions
Microsoft To Do (in Teams) goes blank occasionally
See image: the Private tasks (from Microsoft To Do) go blank occasionally.
You can still view other tasks from Planner, but the private tasks disappear and won’t come back until you click away from the Planner window completely and sometimes after multiple tries.
Microsoft Teams version 24124.2315.2911.3357. Client version is 49/24050307617
Could this be fixed please?
I am unable to extend the C drive volume. The ‘Extend Volume’ option is disabled in Windows 11 Home.
I am unable to extend the C drive volume. The ‘Extend Volume’ option is disabled in Windows 11 Home. I tried using DISKPART, but I hit the same issue.
📢 Announcement!! Azure OpenAI and Azure AI Search connectors are now Generally Available (GA)
Announcing General Availability: Azure OpenAI and AI Search Connectors for Logic Apps
We are thrilled to announce the General Availability of the Azure OpenAI and AI Search connectors for Logic Apps. These new connectors integrate the power of Azure OpenAI’s natural language processing with Azure AI Search’s intelligent search capabilities, enabling developers to build intelligent, AI-driven applications seamlessly.
Innovate where you Integrate
Data is the cornerstone of any AI application, unique to each organization. Business processes, whether in the cloud or within a VNET, rely on this data and can be managed by modern or legacy applications. Regardless of where your data resides, Azure Logic Apps offers the ability to easily infuse AI into both new and existing business processes.
With over 1000 connectors to various applications and services, Logic Apps simplifies the integration of AI, enabling the development of Retrieval-Augmented Generation (RAG) applications. This seamless integration enhances the functionality and intelligence of your business processes, ensuring that your applications are both innovative and efficient.
By leveraging these connectors alongside AI services, organizations can transform their operations and generate intelligent insights like never before. Whether it’s automating routine tasks, enhancing customer interactions and support, or generating insights, Azure Logic Apps provides a robust platform for embedding AI into your enterprise’s fabric.
RAG-based Patterns for AI Applications using Azure Logic Apps
Using Logic Apps and these AI connectors, you can quickly build and productionize AI applications based on the RAG pattern. Retrieval-Augmented Generation (RAG) combines retrieval and generative models to improve the accuracy and relevance of AI-generated content. This pattern is particularly useful in scenarios where precise information retrieval is crucial, such as:
Customer Support Automation
Automate customer support by generating accurate responses using Azure OpenAI and retrieving relevant information from knowledge bases with Azure AI Search.
Document Processing
Streamline document-heavy processes by summarizing, extracting key information, and translating documents with Azure OpenAI, while Azure AI Search enables quick document retrieval.
Content Generation
Create high-quality content for blogs and social media with the help of Azure OpenAI, and ensure its accuracy and relevance using Azure AI Search.
Personalized Recommendations
Build personalized recommendation systems by analyzing user behavior with Azure OpenAI and retrieving matching products, services, or content using Azure AI Search.
Healthcare Data Management
Enhance healthcare data management by summarizing patient records and extracting critical information with Azure OpenAI, while Azure AI Search helps in quickly locating specific patient information or research articles.
Financial Analysis and Reporting
Improve financial analysis by generating comprehensive financial reports using Azure OpenAI and retrieving relevant financial data and market trends with Azure AI Search.
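Across all of these scenarios the underlying RAG flow is the same: retrieve the most relevant context, then hand it to the generative model alongside the question. A toy sketch of that flow in Python, with naive word-overlap retrieval standing in for Azure AI Search and a prompt string standing in for the Azure OpenAI call:

```python
def retrieve(query, docs, k=2):
    """Toy retrieval: rank documents by word overlap with the query."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: -len(q & set(d.lower().split())))
    return scored[:k]

def build_prompt(query, docs):
    """Augment the user's question with the retrieved context (the 'RAG' step)."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Refunds are processed within 5 business days.",
    "Our office is closed on public holidays.",
    "Contact support by email for refund status.",
]
print(build_prompt("when will my refund be processed", docs))
```

In a Logic Apps workflow, the AI Search connector performs the retrieval step (with vector or hybrid search rather than word overlap), and the OpenAI connector performs the generation step on the augmented prompt.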
Key Features and Capabilities
The connectors simplify backend processes with a codeless setup, reducing the complexity of integrating AI capabilities into workflows.
The Azure OpenAI Connector provides powerful AI functionalities such as generating embeddings, summarization, and chat completion, which are pivotal for creating sophisticated AI applications. Meanwhile, the Azure AI Search Connector enhances data retrieval with advanced vector and hybrid search operations.
You can ingest and retrieve both structured and unstructured data in various formats, such as documents, PDFs, JSON, text, and more.
Getting Started
To get started with the Azure OpenAI and Azure AI Search connector for Logic Apps, visit the MS Learn documentation. Here, you will find detailed guides on how to set up and integrate these powerful tools into your workflows.
What’s Next
The General Availability of the Azure OpenAI and Azure AI Search connectors for Logic Apps marks a significant step forward in the integration of AI into business processes. By leveraging these cutting-edge technologies, businesses can unlock new levels of efficiency, intelligence, and innovation. We are excited to see how you will use these capabilities to transform your workflows and drive success. We’d love to hear from you – please fill out this form (aka.ms/raglogicapps) so that we can connect with you.
Stay tuned for more updates and case studies showcasing the impact of Azure OpenAI and Azure AI Search on various industries. As always, we are committed to supporting your journey towards a more intelligent and automated future.
Error publishing lane data as custom message
I am trying to publish XY lane points in world coordinates as a custom ROS message. No problem adding the custom message to MATLAB; I can see it in the rosmsg list. The error occurs in data assignment. I get the following error:
‘A signal can be assigned only once, but the assigned bus signal ‘Xleft’ fully covers the assigned signal ‘Xleft.Data_SL_Info.CurrentLength’ in block ‘LKA0110/Subsystem/Bus Assignment1’
I could not find a suitable explanation of what this error means. Has anyone encountered this before?
Thank you
ros, custom messages, ros publisher, bus assignment MATLAB Answers — New Questions
Error (misclassification probability or MSE)
Hi all. I am trying to use the Error (misclassification probability or MSE) function. In the help documentation it shows an example of usage as err = error(B,TBLnew,Ynew). I tried this with err = error(Mdl,Inps,Outs), with Mdl being the LSBoost ensemble, Inps being the matrix of vars, and Outs being the target. When I try this I get ‘Error using error: Too many output arguments.’ I presume I’m missing something here but can’t seem to find an example.
mse, lsboost, ensemble, error MATLAB Answers — New Questions
An error occurred (‘RTW:buildProcess:fatalBuildError’) when calling ‘sim’: error while running SIL Simulation in Test manager
I am running SIL in the Test Manager environment and my test cases are failing randomly with the error:
An error occurred (‘RTW:buildProcess:fatalBuildError’) when calling ‘sim’:
Error(s) encountered while building "XXX_rtwlib"
Meanwhile, the code for the model is generated without any issues.
Could you please assist me in resolving the issue?
(‘rtw:buildprocess:fatalbuilderror’) when calling MATLAB Answers — New Questions
Moved auth db from localdb to mssql now I get a cert error
Here is the error:
SqlException: A connection was successfully established with the server, but then an error occurred during the login process. (provider: SSL Provider, error: 0 – The certificate chain was issued by an authority that is not trusted.)
I did an export in SSMS from LocalDB to the new SQL database, then changed the default connection. Read More
Templates for Azure Logic Apps Standard: Seeking Your Feedback on UI Wireframes
Preview the New Templates for Azure Logic Apps: Seeking Your Feedback on UI Wireframes
Introduction
Azure Logic Apps continues to add powerful features that simplify complex integrations and automations. As we prepare to release support for templates, we are excited to share our UI wireframes with you. Your feedback is crucial in helping us refine and perfect this new feature.
What Are Templates?
Templates in Azure Logic Apps are pre-built workflow solutions designed to address common integration scenarios. They cover a wide range of use cases, from simple data transfers to complex, multi-step automations, and event-driven processes. Templates provide a solid foundation, allowing users to quickly set up and deploy workflows without starting from scratch.
Why Are Templates Important?
Templates play a crucial role in integration projects by offering several benefits:
Time-Saving: They significantly reduce the time required to set up complex workflows, providing ready-made solutions that can be deployed quickly.
Best Practices: Templates ensure that workflows adhere to industry standards and best practices, resulting in more reliable and efficient integrations.
Ease of Use: By simplifying the integration process, templates make it accessible to users with varying levels of technical expertise.
Consistency: Using templates helps maintain consistency across different workflows and projects, reducing errors and improving maintainability.
Data Mapping: Developers often spend considerable time creating maps to transform data between formats. With this release, templates will include support for maps, allowing you to leverage these pre-configured maps. When updates are required, the tools available in Logic Apps enable you to easily customize and adapt the maps to meet your specific business needs.
Key Features Highlighted in Wireframes:
Template Library: Explore a comprehensive collection of pre-built templates.
Search and Filter: Quickly find the templates you need with powerful search and filtering tools.
Preview: Review the entire workflow, prerequisites, connectors used, and more before you decide to select a template.
Configuration: Easily configure the connections and parameters via an intuitive wizard.
Call to Action: We invite you to review our wireframes and share your feedback. Your insights are essential to us.
What’s in the Roadmap?
Open-Source Templates Library: We are excited to announce the development of an open source templates library (details will be shared soon). This initiative will allow the community to collaborate with Microsoft to enrich our templates gallery. You can contribute your own templates, helping to build a diverse and comprehensive repository that benefits all users.
Customer’s Templates: In addition to using templates from our gallery, you will soon be able to generate and publish templates tailored specifically for your organization. This feature will enable you to create custom templates that address your unique business needs and share them within your organization, promoting standardization and efficiency.
Advanced Scenarios: For more complex integration needs, we are working on support for advanced scenarios that expand across multiple workflows. These advanced templates will facilitate sophisticated integrations, providing robust solutions for enterprises with complex workflow requirements.
Thank you for being a part of our community and helping us shape the future of Azure Logic Apps. We are committed to delivering features that meet your needs and exceed your expectations. We appreciate your support and contributions.
Microsoft Tech Community – Latest Blogs –Read More
📢 Announcement!! Azure OpenAI and Azure AI Search connectors are now Generally Available (GA)
Announcing General Availability: Azure OpenAI and Azure AI Search Connectors for Logic Apps
We are thrilled to announce the General Availability of the Azure OpenAI and Azure AI Search connectors for Logic Apps. These new connectors integrate the power of Azure OpenAI’s natural language processing with Azure AI Search’s intelligent search capabilities, enabling developers to build intelligent, AI-driven applications seamlessly.
Innovate where you Integrate
Data is the cornerstone of any AI application, unique to each organization. Business processes, whether in the cloud or within a VNET, rely on this data and can be managed by modern or legacy applications. Regardless of where your data resides, Azure Logic Apps offers the ability to easily infuse AI into both new and existing business processes.
With over 1000 connectors to various applications and services, Logic Apps simplifies the integration of AI, enabling the development of Retrieval-Augmented Generation (RAG) applications. This seamless integration enhances the functionality and intelligence of your business processes, ensuring that your applications are both innovative and efficient.
By leveraging these connectors alongside AI services, organizations can transform their operations and generate intelligent insights like never before. Whether it’s automating routine tasks, enhancing customer interactions and support, or generating insights, Azure Logic Apps provides a robust platform for embedding AI into your enterprise’s fabric.
RAG-based Patterns for AI Applications using Azure Logic Apps
Using Logic Apps and these AI connectors, you can quickly build and productionize AI applications based on the RAG pattern. Retrieval-Augmented Generation (RAG) combines retrieval and generative models to improve the accuracy and relevance of AI-generated content. This pattern is particularly useful in scenarios where precise information retrieval is crucial, such as:
Customer Support Automation
Automate customer support by generating accurate responses using Azure Open AI and retrieving relevant information from knowledge bases with Azure AI Search.
Document Processing
Streamline document-heavy processes by summarizing, extracting key information, and translating documents with Azure Open AI, while Azure AI Search enables quick document retrieval.
Content Generation
Create high-quality content for blogs and social media with the help of Azure Open AI, and ensure its accuracy and relevance using Azure AI Search.
Personalized Recommendations
Build personalized recommendation systems by analyzing user behavior with Azure Open AI and retrieving matching products, services, or content using Azure AI Search.
Healthcare Data Management
Enhance healthcare data management by summarizing patient records and extracting critical information with Azure Open AI, while Azure AI Search helps in quickly locating specific patient information or research articles.
Financial Analysis and Reporting
Improve financial analysis by generating comprehensive financial reports using Azure Open AI and retrieving relevant financial data and market trends with Azure AI Search.
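To make the shared shape of these scenarios concrete, here is a minimal sketch of the retrieve-then-generate flow. The retriever and generator below are toy placeholder functions standing in for Azure AI Search and Azure OpenAI respectively — they are not the actual connector APIs:

```python
# Minimal sketch of the Retrieval-Augmented Generation (RAG) pattern:
# first retrieve the most relevant document, then ground the generated
# answer in that retrieved context. Both steps are toy placeholders.

def retrieve(query: str, documents: list[str]) -> str:
    """Return the document sharing the most words with the query
    (stand-in for a vector/hybrid search service)."""
    q_words = set(query.lower().split())
    return max(documents, key=lambda d: len(q_words & set(d.lower().split())))

def generate(query: str, context: str) -> str:
    """Stand-in for an LLM chat-completion call: produce an answer
    grounded in the retrieved context."""
    return f"Q: {query}\nGrounded answer based on: {context}"

docs = [
    "Refunds are processed within 5 business days.",
    "Support hours are 9am to 5pm on weekdays.",
]
query = "When are refunds processed?"
answer = generate(query, retrieve(query, docs))
print(answer)
```

In a Logic Apps workflow, the retrieval step would be an Azure AI Search action and the generation step an Azure OpenAI action, but the control flow is the same: fetch relevant enterprise data first, then generate from it.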
Key Features and Capabilities
The connectors simplify backend processes with a codeless setup, reducing the complexity of integrating AI capabilities into workflows.
The Azure OpenAI Connector provides powerful AI functionalities such as generating embeddings, summarization, and chat completion, which are pivotal for creating sophisticated AI applications. Meanwhile, the Azure AI Search Connector enhances data retrieval with advanced vector and hybrid search operations.
You can ingest and retrieve both structured and unstructured data in various formats such as documents, PDFs, JSON, text, and more.
Getting Started
To get started with the Azure OpenAI and Azure AI Search connector for Logic Apps, visit the MS Learn documentation. Here, you will find detailed guides on how to set up and integrate these powerful tools into your workflows.
What’s Next
The General Availability of the Azure OpenAI and Azure AI Search connector for Logic Apps marks a significant step forward in the integration of AI into business processes. By leveraging these cutting-edge technologies, businesses can unlock new levels of efficiency, intelligence, and innovation. We are excited to see how you will use these capabilities to transform your workflows and drive success.
Stay tuned for more updates and case studies showcasing the impact of Azure OpenAI and Azure AI Search on various industries. As always, we are committed to supporting your journey towards a more intelligent and automated future.
Microsoft Tech Community – Latest Blogs –Read More
Error: Function definition not supported in this context. Create functions in code file.
Hello,
I was trying to create a simple function in MATLAB. I already have a file named "AddOne.m", so it should work; however, at the first line of code, MATLAB throws the following error:
"Error: Function definition not supported in this context. Create functions in code file."
Any ideas how to sort this out?
Thanks in advance! function, matlab function, error MATLAB Answers — New Questions
how do i plot the following signal in matlab? x(n)=4u(n)-u(n-1)-u(n-2)-2u(n-3)
Please include steps so that I will be able to do it properly. continuous time signal MATLAB Answers — New Questions
Steps to Manually Add PowerShell Modules in Function App
Function App lets you leverage the PowerShell Gallery for managing dependencies. The PowerShell modules defined in your requirements.psd1 file are downloaded automatically if you enable dependency management. However, if your Function App runs on a Consumption plan and the dependent modules are too large, chances are the download will fail. During installation, the modules are unzipped and saved in "D:\local\Temp", which is the temp storage on the plan worker. Due to the 500 MB limit of temp storage on the Consumption plan, the installation fails with a not-enough-space error:
System Log: { Log-Level: Warning; Log-Message: Save-Module(‘Microsoft.Graph’, ‘2.18.0’): Package ‘Microsoft.Graph.Files’ failed to be installed because: There is not enough space on the disk. : ‘C:\local\Temp\282636592\Microsoft.Graph.Files.2.18.0\bin\Microsoft.Graph.Files.private.dll’ }
The easiest workaround is to scale the Function App up to a Premium plan, which has a much larger temp storage size. However, this is not the most cost-saving option. Instead, you can install the modules yourself, without relying on the dependency management feature, either via a local deployment or manually from the portal. Here are the steps for the second option: adding the PowerShell modules directly using “Advanced Tools”.
Find and download modules from PowerShell Gallery to your local system. You should choose “manual download” and download the .nupkg files of the modules.
Unzip the .nupkg files and rename the folder to the module official name without the version number.
Open “App Service Editor” and create a new folder with name “Modules” under “wwwroot” folder.
Go to the SCM site of the Function App and navigate to the “C:\home\site\wwwroot\Modules” path on the file system.
Drag and drop the module package that you have downloaded before to the “Modules” folder on the SCM site.
Go to “App Files” and modify “requirements.psd1” to list the module with its corresponding version number.
In the “host.json” file, verify that the managedDependency “enabled” setting is set to “false”, so the runtime does not attempt to download the modules again.
Restart the app, which is very important.
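For reference, the two configuration files from the steps above might look like the following. The module name and version here are placeholders taken from the earlier log example — substitute the module you actually copied into the “Modules” folder:

```powershell
# requirements.psd1 — list the manually installed module with its version.
# 'Microsoft.Graph' / '2.18.0' are placeholders; use your module's name and version.
@{
    'Microsoft.Graph' = '2.18.0'
}
```

And the relevant host.json fragment, disabling managed dependency download so the runtime loads the modules you placed under wwwroot instead:

```json
{
  "managedDependency": {
    "enabled": false
  }
}
```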
Now you have successfully added the PowerShell modules to a Function App running on the Consumption plan, without the “not enough space” error.
Microsoft Tech Community – Latest Blogs –Read More
How to make sure that the horizontal axes of two figures align?
I have two figures, which I want to put side by side in my thesis.
These two figures have different x and y labels and ticks; therefore, in general their horizontal axes are not at the same height.
That is ugly.
How do I make sure the horizontal axes align? plot MATLAB Answers — New Questions