Category: News
Enhancing Microsoft Build with MVP Insights and Excitement
It’s hard to believe, but it has already been two and a half months since this year’s Microsoft Build, where we highlighted the contributions of Microsoft MVPs in various roles in the blog post Microsoft Build 2024 with MVP Communities – Microsoft Community Hub. Some people have taken the initiative to test new AI features themselves, while others have relied on community content to understand the latest technological advancements and consider how to apply them to their own businesses.
MVPs have been actively catching up with the newly announced information, writing blog posts and sharing insights on social media. They added a layer of excitement to the official releases by delivering valuable information to the community in their own words.
Microsoft Build 2024 Watch Party in the heart of New York City, United States
Peter Smulovics, a Developer Technologies MVP, hosted the Microsoft Build 2024 Watch Party in the heart of New York City, where community members gathered to deepen their knowledge together by watching the live conference.
This event allowed attendees to experience the excitement of watching the keynote address from Microsoft Build and learn about the latest announcements. They also enjoyed networking opportunities and engaged in many conversations about the newly unveiled technologies.
For those who couldn’t attend Microsoft Build in person, a Watch Party like this offers the chance not only to share in the spirit and excitement of this milestone event, but also to connect with other tech enthusiasts locally. Check out Peter’s blog post Watching Microsoft Build in Good Company | Dotneteers.net for a recap of the event.
Microsoft Build: Developer Learning Day in London, the United Kingdom
Microsoft Build: Developer Learning Day was co-hosted with NVIDIA and drew over 300 attendees. The focus of the event was to share post-Build insights and viewpoints from the community and Microsoft employees.
Participants had the opportunity to deepen their knowledge of Dev Tools, Cloud Platform, AI Development, Copilot, and Azure Data & Analytics through technical sessions. Four out of the five sessions were co-presented by MVPs including Sue Bayes, Callum Whyte, Daniel Roe, and Alpa Buddhabhatti. Additionally, in the Lab space, there were two 60-minute Lab sessions offering practical learning experiences on Deep Learning and Transformer-Based Natural Language Processing. Furthermore, at the Ask the Expert Booth, MVPs interacted with participants and engaged in discussions based on their expertise.
Read the event recap posted on LinkedIn by Marcel Lupo, a Developer Technologies and Microsoft Azure MVP, who was one of the experts at the Ask the Expert Booth, here.
2024 Microsoft Build After Party & Microsoft Build 2024 – After Party with MVP in Seoul, Korea
At the Microsoft Korea office in Seoul, Korean MVPs hosted two community-led post-Build events over two days, helping enthusiastic participants learn about the latest AI technologies to enhance their knowledge.
On the first day, at the 2024 Microsoft Build After Party, five MVPs/RDs shared insights on the latest Azure AI technologies with over 100 attendees through hands-on sessions and panel discussions. They covered Copilot Studio, multi-modal generative AI, responsible AI, low-code development, and the cloud platform. On the second day, the Microsoft Build 2024 – After Party with MVP featured diverse speakers, including Microsoft Learn Student Ambassadors and community leaders, who provided new insights on emerging technologies, such as Power Platform and Microsoft 365, to the participants.
One of the MVP hosts of the Microsoft Build 2024 – After Party with MVP, Inhee Lee, expressed what they were able to convey to the participants through this event: “We shared insights on the benefits of GenAI, which is a disruption and democratisation of conventional technology. Anybody can access and utilise the technology easily at affordable costs.”
Post Microsoft Build and AI Day in Shanghai, Beijing, and Greater Bay Area (Shenzhen and Hong Kong)
In collaboration with Microsoft Asia and Microsoft Reactor, and 16 technology communities, the Post-Microsoft Build and AI Day event series was held in June 2024. The three events in Shanghai, Beijing, and the Greater Bay Area (Shenzhen and Hong Kong) attracted significant interest from those eager to learn about the latest AI solutions. Over 400 participants attended the events in person, and online streaming reached more than 140,000 viewers.
This series of hybrid events offered opportunities to learn about the latest technology topics announced at Microsoft Build, including Azure AI, .NET, Copilot, Power Platform, Azure OpenAI, LLM, SLM, Data Analytics and Integration, and edge computing. Eleven MVPs participated as speakers, providing technical insights and engaging in discussions with attendees at the booths, enhancing the event experience for both in-person and online participants.
The event in Shanghai coincided with Children’s Day, leading to a special “Hour of Code · Children’s Day Edition.” Hao Hu, an AI Platform MVP, conducted an AI hands-on workshop themed around marine conservation for children. This initiative demonstrated how AI can be utilized across different generations.
Microsoft Build Japan 2024 – Virtual event from Tokyo, Japan
As an opportunity for Japanese-speaking users to learn the latest technologies announced at Microsoft Build, Microsoft Japan held Microsoft Build Japan 2024. Eight MVPs contributed to this two-day virtual event as speakers. On Day 1, four MVPs shared important updates on Azure AI and developer-focused topics. On Day 2, another four MVPs covered Microsoft 365, including Copilot, as well as the latest devices such as Copilot+ PC and Surface.
Tomokazu Kizawa participated as a speaker at a Microsoft-hosted event for the first time. Reflecting on his experience and the key messages he wanted to convey to the participants, he shared his thoughts: “From my perspective as a user outside of Microsoft, I focused on specifically conveying how new technologies like Copilot+ PC can positively impact daily life and work. Additionally, I aimed to highlight the positioning of Surface and the technological trends of Copilot+ PC not only from Microsoft’s standpoint but also within the broader PC industry, in order to generate more interest in Windows Devices.”
You can watch the session videos from this event, including the presentations by the Microsoft MVPs, at the following link.
Microsoft Build Japan 2024 – YouTube
– MVP session Day 1: Microsoft MVP による “推し最新技術情報” Azure/AI + Dev 編 (youtube.com)
– MVP session Day 2: Microsoft MVP による “推し最新技術情報” Microsoft 365 + Windows Device 編 (youtube.com)
Starting November 19, another major Microsoft global conference, Microsoft Ignite, is set to take place both in Chicago and online. It’s exciting to think about the innovative technology announcements that will be revealed in just three months. If you’d like to stay updated on the latest conference information, we recommend signing up for email notifications via the [Join the email list] link on the official page (note: this is not for event registration).
Also, be sure to look forward to learning opportunities provided by the Microsoft MVP community during and after the Microsoft Ignite event!
——
To learn more about Microsoft Build, please visit the following websites.
Microsoft Build (Official website)
Microsoft Build 2024 Book of News (Announcements)
Build on Microsoft Learn (Microsoft Learn)
Microsoft Build 2024 (Playlist on Microsoft Developer YouTube)
The top 5 Microsoft MVP demos presented at this year’s event are now available on the Microsoft Developer YouTube channel.
– Extend your Copilot with plugins using Copilot Studio by M365 MVP Manpreet Singh
– Create a fully functional support bot in less than 10 minutes by Business Applications MVP Em D’Arcy
– Golden path with GitHub Actions by Microsoft Azure MVP/Regional Director Michel Hubert
– Optimize Azure Infrastructure as Code Deployments with VS Code by Windows and Devices, Microsoft Azure MVP
– AI-Powered Personalized Learning by AI Platform MVP Noelle Russell
Appdesigner: Remove Simulink dependency
Hi everybody,
I have accidentally added a Simulink component to my app, and now I cannot compile it anymore.
I have the MATLAB Compiler but not Simulink Compiler. To overcome the problem I removed the Simulink component, but the dependency is still there, so App Designer does not allow me to compile the app.
Is there any trick that I can use to fix this problem?
Best regards,
Giulio
MATLAB Answers — New Questions
Anova-N output question
Hello community!
The closest prompt I could find that is similar to this would be: https://www.mathworks.com/matlabcentral/answers/1876737-anova-n-outputs-as-not-full-rank-returns-nan-p-value?s_tid=sug_su , but the reason theirs showed NaN was that there were not enough values. Mine has 5.4 thousand entries, so I’m not sure that is the problem.
To reduce clutter of the code, I am going to attach the .mat files and the one-line of code.
pTHalf = anovan(stats(:,1), {Patho CellLine MW}, 'model', 'interaction', 'varnames', ...
    {'Pathology', 'Cell Line', 'Molecular Weights'});
I ensured the data is of the types allowed per the anovan documentation page. The only thing I could think of was that there are more than two groups within the Cell Line and Molecular Weights groupings, but the only one that worked was the molecular weights, so I also doubt that is the reason.
This is the output below. Why do I have missing sections associated with a ‘not full rank’ message? I do not see anything on the anovan page that discusses this.
Thanks community!
Nick
Facing error in this example openExample(‘whdl/WHDLOFDMTransmitterExample’)
>> openExample('whdl/WHDLOFDMTransmitterExample')
>> runOFDMTransmitterModel
Error using runOFDMTransmitterModel>trBlkSize
Too many output arguments.
Error in runOFDMTransmitterModel (line 26)
    trBlkSize(txParam(n).modOrder,txParam(n).codeRateIndex)*txParam(n).numFrames,...
I am getting this error now, but I was not getting it before. Can you help me?
Bear tool box and input excel
While using the BEAR toolbox to run a time-varying parameter model, I am a bit confused about how to make my Excel sheet recognized by the BEAR app. I import my own Excel sheet that I used to collect data, and every time I click the run button it says something like “exo mean priors not found” or similar. I assume there is an Excel sheet format that I am supposed to use with the BEAR toolbox, but I cannot really find it anywhere. Can anyone help me out?
For your reference: https://www.ecb.europa.eu/press/research-publications/working-papers/html/BEAR_toolbox_v5_1.pdf – slide 22 talks about the input Excel data file, but I am confused.
$filter by multiple properties
Hi all,
Unfortunately, I can’t manage to filter according to several properties.
I’m currently filtering for a specific value, but I would like to combine one or two “or” operators or filter on other properties:
$filter=assignmentState+eq+'Delivered'
e.g.:
filter where assignmentState is ‘Delivered’ or ‘Delivering’, etc.
When I follow the documentation I run into errors.
https://learn.microsoft.com/de-de/graph/filter-query-parameter?tabs=http
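Per that page, the syntax I have been trying is along these lines (property name taken from my working single-value query above), but it still errors for me:

```
$filter=assignmentState eq 'Delivered' or assignmentState eq 'Delivering'
```

URL-encoded, that becomes $filter=assignmentState+eq+'Delivered'+or+assignmentState+eq+'Delivering'.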
Does anyone have experience with multiple filters?
Regards
Trouble adding Personal Calendar to Work Calendar, Both MS 365 accounts
I am an accountant with my own company and have a long-term contracting relationship with another company. In addition to my own company’s MS365 account, the other company has issued me an MS365 account for my work on their behalf. The client is implementing Bookings to allow clients to self-book my services and assistance. This requires Bookings to receive information from both MS365 calendars to accurately convey my availability.
Both companies use Microsoft 365 for software services. When I try to add the calendar from the other company as a personal calendar, I receive an error message. I can add it as a shared calendar, but Bookings doesn’t consider the availability of the shared calendar. It also doesn’t work when I try to set up a shared Booking page and add the other email as staff.
I can’t imagine I am the first person to need this integration. Please let me know what you suggest.
Download Spotify Podcasts/Free Podcasts to MP3
Here is a detailed guide on how to download Spotify podcasts to MP3 without Spotify Premium via Tidabie Music Go:
STEP 1 Select Spotify as the Downloading Source
With Tidabie Music Go, you can download Spotify podcasts to MP3 format from your Spotify library, no matter if you are using a Spotify Free or Premium account. Once you start Tidabie on your computer, please select “Spotify” as the audio source. And then log in to your Spotify account to access your library.
Note: When choosing the Spotify source, you have the choice to capture Spotify podcasts from either the Spotify app or the Spotify web player. You can toggle between the two options by clicking on the switching icon. If you opt to record podcasts from the Spotify app, the operation will function at speeds of up to 10 times faster while preserving the best audio quality.
STEP 2 Customize the Output Settings of Downloaded Spotify Podcasts
When you finish choosing “Spotify” as the downloading audio source, you will see the “Music” interface like the picture below. Simply select the output format as “MP3” under the “Convert Settings” module in this interface. If needed, you can also adjust parameters like the bit rate and sample rate. Additional settings such as the output folder path and file naming can be customized in the full settings pop-up window, which is accessible by clicking the “More settings” button.
STEP 3 Find Spotify Podcasts to Download
Back to the Spotify app or the Spotify web player after choosing the output settings. Then you need to find the Spotify podcasts you want to download on Spotify. As you locate the podcast page, you will see a blue “Click to add” button in the lower right corner. Just tap on it to start parsing the Spotify podcast episodes.
As the podcasts are processed, all the downloadable items will be listed on a small pop-up window. What you need to do next is to choose the episodes you want to download by ticking the square box next to the episode title. When comparing downloading podcasts using Spotify and Tidabie, the advantages of using Tidabie are more noticeable. Tidabie enables you to download Spotify podcasts in batches, while on Spotify, you need to click on the download icon to get episodes one by one.
STEP 4 Start Downloading Spotify Podcasts
Before downloading, you have the option to add more podcasts to download by hitting the “Add More” icon plus the chance to modify the output settings again by tapping on the settings icon on this interface. If everything is all set, just click on the “Convert” button to start downloading. Then Tidabie will run up to 10x faster to get your favorite Spotify podcasts downloaded in MP3 format to the local PC. All you need to do now is to wait patiently.
STEP 5 Check the Downloaded Spotify Podcasts on Your Local PC
The output folder that keeps the downloaded Spotify podcasts will pop up by default when the downloading is completed. You can check the downloaded podcasts in the pop-up folder. Or go to the specific podcast files by hitting the folder icon near each song under the “Converted” module.
What is the best spotify music converter for Windows PC?
I enjoy listening to Spotify music offline on various devices, but I don’t have a Spotify Premium account. As a technical writer and web designer, I often deal with different software and tools, and I’m familiar with various conversion processes.
I’ve heard about several Spotify music converters, but I’m looking for one that stands out in terms of quality, ease of use, and reliability. Ideally, it should support saving Spotify tracks as MP3 files without compromising on sound quality. I’m also interested in any additional features that might enhance the overall experience.
Making private meeting in shared mailbox calendar
Hello,
In the calendar of a shared mailbox we can’t set a meeting to private; the lock tile is greyed out. In the user’s own calendar the tile is available. Is there a setting for shared mailboxes that makes it possible to set meetings to private?
Kind regards,
Arjan
Building HyDE powered RAG chatbots using Microsoft Azure AI Models & Dataloop
Customer service is undergoing an AI revolution, driven by the demand for smarter, more efficient solutions. HyDE-powered RAG chatbots offer a breakthrough technology that combines vast knowledge bases with real-time data retrieval and hypothetical document embeddings (HyDE) to deliver superior accuracy and context-specific responses. Yet, building and managing these complex systems remains a significant challenge due to the intricate integration of diverse AI components, real-time processing requirements, and the need for specialized expertise in AI and data engineering.
Simplifying GenAI solutions with Microsoft and Dataloop
The Microsoft-Dataloop partnership abstracts the deployment of powerful chatbot applications. By integrating Microsoft’s PHI-3-MINI foundation model with Dataloop’s data platform, we’ve made HyDE-powered RAG chatbots accessible to a wider developer community with minimal coding. Developers can leave the documentation behind and start utilizing these capabilities instantly, accelerating time to value.
This announcement follows our successful integration with Microsoft Azure AI Model as a Service and Azure AI Video Indexer, further enhancing our ability to deliver advanced AI solutions. These integrations enable developers to seamlessly incorporate state-of-the-art AI models into their workflows, significantly accelerating development cycles.
About Dataloop AI development platform
Dataloop is an enterprise-grade end-to-end AI development platform designed to streamline the creation and deployment of powerful GenAI applications. The platform offers a comprehensive suite of tools and services, enabling efficient AI model development and management.
Key features include:
Orchestration: Dataloop provides seamless pipeline management, access to a marketplace for AI models, and a serverless architecture to simplify deployment and scalability.
Data Management: The Dataloop platform supports extensive dataset exploration, allowing users to query, visualize, and curate data efficiently.
Human Knowledge: Dataloop facilitates knowledge-based ground truth creation through tools for annotation, review, and monitoring, ensuring high-quality data labeling.
MLOps: With reliable model management capabilities, Dataloop ensures efficient inference, training, and evaluation of AI models.
Dataloop is also available on Azure Marketplace.
About Azure AI Models as a Service
Azure AI Models as a Service (MaaS) offers developers and businesses access to a robust ecosystem of powerful AI models. This service includes a wide range of models, from pre-trained and custom models to foundation models, covering tasks such as natural language processing, computer vision, and more. The service is backed by Azure’s stringent data privacy and security commitments, ensuring that all data, including prompts and responses, remains private and secure.
Figure: HyDE-powered RAG Chatbot Workflow – This pipeline, created using the Dataloop platform, demonstrates the process of transforming user queries into hypothetical answers, generating embeddings, and retrieving relevant documents from a vector store. This internal Slack chatbot is optimizing information retrieval to ensure that users receive accurate and contextually relevant responses, enhancing the chatbot’s ability to search for answers in the documentation.
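As a rough illustration of the HyDE retrieval step described in the figure, the sketch below stands in for the real pipeline: generate_hypothetical_answer is a placeholder for the PHI-3-MINI call, and the bag-of-words embed/cosine pair stands in for a proper embedding model and vector store. All names here are hypothetical, not part of the Dataloop or Azure APIs.

```python
from collections import Counter
import math

def embed(text):
    # Toy bag-of-words "embedding"; the real pipeline would call an
    # embedding model and look vectors up in the vector store.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def generate_hypothetical_answer(query):
    # Placeholder for the LLM call: the model writes a plausible answer
    # whose wording resembles the documentation being searched.
    return "open account settings and use the security tab to reset your password"

def hyde_retrieve(query, documents):
    # Core HyDE step: embed the hypothetical answer, not the raw query,
    # then return the closest document.
    hypo_vec = embed(generate_hypothetical_answer(query))
    return max(documents, key=lambda d: cosine(hypo_vec, embed(d)))

docs = [
    "Billing: invoices are emailed on the first of each month.",
    "Account settings let you reset your password under the security tab.",
]
best = hyde_retrieve("How do I change my password?", docs)
print(best)  # the password-reset document
```

The key design point is that the index is searched with the embedding of the hypothetical answer rather than the raw query, since a plausible answer tends to sit closer to the relevant documentation in embedding space than the question does.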
This is how we do it!
Powering Efficient AI Inference at Scale: Microsoft’s AI tools build upon a powerful foundation of inference engines like Azure Machine Learning and ONNX Runtime. This robust toolkit ensures smooth, high-performance AI inferencing at scale. These tools specifically fine-tune neural networks for exceptional speed and efficiency, making them ideal for demanding applications like large language models (LLMs). This translates to rapid inference and scalable AI deployment across various environments.
End-to-End AI Development with Drag-and-Drop Ease: Dataloop empowers users to build and manage advanced AI capabilities entirely within its intuitive no-code interface. Simply drag and drop models provided or developed by Microsoft through our marketplace to seamlessly integrate them into your workflows. Pre-built pipeline templates specifically designed for RAG chatbots further streamline development. This eliminates the need for additional tools, making Dataloop your one-stop shop for building next-generation RAG-based chatbots.
A Node-by-Node Look at a RAG-based Document Assistant Chatbot with Microsoft and Dataloop
This section takes you behind the scenes of our RAG-based document assistant chatbot creation, utilizing Microsoft’s AI tools and the Dataloop platform. This breakdown will help you understand each component’s role and how they work together to deliver efficient and accurate responses. Below is a detailed node-by-node explanation of the system.
Node 1: Slack (or Messaging App) – Prompt Entry Point
Description: This node acts as the interface between users and the chatbot system. It integrates with a messaging platform like Slack and receives user interactions (messages, queries, commands) and starts the pipeline.
Functionality: It captures and processes the user input to be forwarded to the predictive model.
Configuration:
Integration:
Specify the target messaging platform (e.g., Slack API token, login credentials for other messaging apps).
Define event types to handle (e.g., messages, direct mentions, specific commands).
Message Handling:
Define how to pre-process messages (e.g., removing emojis, formatting, language detection).
Configure how to identify user intent and extract relevant information from the message.
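As a concrete illustration of the message-handling step, a minimal pre-processing function might look like the sketch below. The specific rules (emoji shortcodes, Slack mentions, whitespace) are illustrative stand-ins, not Dataloop’s actual handlers:

```python
import re

def preprocess(message):
    """Toy Slack-message pre-processing: strip emoji shortcodes and user
    mentions, then collapse whitespace. Stand-ins for the node's real rules."""
    message = re.sub(r":[a-z_]+:", "", message)     # emoji shortcodes like :smile:
    message = re.sub(r"<@[A-Z0-9]+>", "", message)  # Slack user mentions like <@U123>
    return re.sub(r"\s+", " ", message).strip()
```

A real configuration would add language detection and intent extraction on top of this cleanup step.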
Node 2 – PHI-3-MINI – Predict Model
Description: This node utilizes a generative prediction model, PHI-3-MINI, optimized with Microsoft’s AI tools.
Functionality: The node takes input from the Slack node and generates hypothetical responses. Research in Zero-Shot Learning suggests that this approach, leveraging contextual understanding and broad knowledge, can often outperform traditional methods.
Configuration:
Model Selection: Choose any LLM optimized using Microsoft’s AI tools. In our chatbot, we leverage PHI-3-MINI, specifically optimized for efficient resource usage.
System Prompt Configuration: A system prompt guides the AI’s behavior by setting tone, style, and content rules, ensuring consistent, relevant, and appropriate responses. For our case, we configure the LLM to give a hypothetical and concise answer.
Parameters: Set parameters for the model (e.g., beam search size, temperature for sampling).
Node 3 – Embed Item
Description: This node is responsible for embedding items, transforming text or data into a format that can be easily used for further processing or retrieval.
Functionality: It generates vector embeddings from the text. These embeddings represent the text in a high-dimensional space, allowing for efficient similarity searches in the next node.
Configuration:
Embedding Model: Choose the model for generating vector embeddings from text (e.g., pre-trained Word2Vec, Sentence Transformers). You can also utilize Microsoft’s embedding tools. Each embedding model comes with its own dimensionality of the vectors.
Normalization: Specify the normalization technique for the embeddings (e.g., L2 normalization).
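To make the embedding and normalization steps concrete, here is a minimal, dependency-free Python sketch. The trigram-hash "embedding" is a toy stand-in for a real model such as Sentence Transformers; only the L2 normalization mirrors the configuration above:

```python
import math

def l2_normalize(vec):
    """Scale a vector to unit length so that a dot product equals cosine similarity."""
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def embed(text, dim=8):
    """Toy embedding: hash character trigrams into a fixed-size vector.
    A stand-in for a real embedding model; real models fix `dim` themselves."""
    vec = [0.0] * dim
    for i in range(len(text) - 2):
        vec[hash(text[i:i + 3]) % dim] += 1.0
    return l2_normalize(vec)
```

Normalizing up front keeps the similarity search in the next node a plain dot product.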
Node 4 – Retriever Prompt (Search)
Description: This node acts as a retrieval mechanism, responsible for fetching relevant information or context based on the embedded item.
Functionality: It uses the embeddings to search a database or knowledge base, retrieving information that is relevant to the query or input provided by the user. It could use various retrieval techniques, including vector searches, to find the best matching results.
Configuration:
Dataset: Specify your dataset, with all the existing chunks and embeddings.
Similarity Metric: Define the metric for measuring similarity between the query embedding and candidate items (e.g., cosine similarity, dot product).
Retrieval Strategy: Choose the retrieval strategy. In our case, we used our feature store based on SingleStore, a database optimized for fast searches. This allows for efficient vector-based search to quickly retrieve the most relevant information.
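A minimal sketch of the vector-search step, assuming L2-normalized embeddings so the dot product equals cosine similarity. The in-memory list is a stand-in for a real feature store such as SingleStore:

```python
def cosine(a, b):
    # Embeddings are assumed L2-normalized, so the dot product is the cosine similarity.
    return sum(x * y for x, y in zip(a, b))

def retrieve(query_vec, store, top_k=2):
    """Return the top_k (score, chunk) pairs from an in-memory vector store.
    `store` is a list of (embedding, chunk) pairs; a real deployment would
    issue the same ranking as a query against a vector database."""
    scored = [(cosine(query_vec, vec), chunk) for vec, chunk in store]
    return sorted(scored, reverse=True)[:top_k]
```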
Node 5 – PHI-3-MINI – (Refine)
Description: Similar to the earlier PHI-3-MINI node, this node also involves a predictive model, another instance of the PHI-3-MINI model optimized by Microsoft.
Functionality: Processes the retrieved information using the predictive model to generate a response or further refine the data, ensuring a contextually accurate output for the user.
Configuration:
Model Selection: Specify another instance of the PHI-3-MINI model optimized with Microsoft’s AI tools.
Task Definition: Instruct the model to take all chunks of documentation and reply accurately to the user’s question.
System Prompt Configuration: Instruct the chatbot on how to respond. In our case, we configured it to respond kindly, act as a helpful documentation assistant, clearly state when it doesn’t know an answer, and avoid inventing information.
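Putting the nodes together, the HyDE flow (draft answer → embed → retrieve → refine) can be sketched in a few lines of plain Python. Here `fake_llm()` and the word-overlap "embeddings" are toy stand-ins for the PHI-3-MINI calls and real vector search:

```python
def fake_llm(prompt):
    """Stand-in for a PHI-3-MINI call; the real node hits a model endpoint."""
    return "Hypothetical answer: " + prompt.splitlines()[0]

def embed(text):
    """Toy bag-of-words 'embedding' so the sketch stays dependency-free."""
    return set(text.lower().split())

def retrieve(query_vec, store, top_k=1):
    """Rank documentation chunks by word overlap with the hypothetical answer."""
    scored = sorted(store, key=lambda chunk: len(query_vec & embed(chunk)), reverse=True)
    return scored[:top_k]

def answer(question, store):
    draft = fake_llm(question)                          # Node 2: HyDE draft answer
    context = "\n".join(retrieve(embed(draft), store))  # Nodes 3-4: embed + search
    return fake_llm(f"Answer using only:\n{context}\nQ: {question}")  # Node 5: refine
```

The point of the HyDE step is visible even in this toy: the draft answer shares more vocabulary with the right chunk than the bare question does.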
Accelerate AI Development with Dataloop’s Integration of Microsoft Foundation Models
Discover a vast ecosystem of pre-built solutions, models, and datasets tailored to your specific needs. Easily filter options by provider, media type, and compatibility to find the perfect fit. Build and customize AI workflows with easy-to-use pipeline tools and out-of-the-box end-to-end AI and GenAI workflows. We are incredibly excited to see what you can create with your new capabilities!
Microsoft Tech Community – Latest Blogs – Read More
GitHub Model Catalog – Getting Started
Welcome to GitHub Models! We’ve got everything fired up and ready for you to explore AI models hosted on Azure AI. As a student developer, you already have access to amazing GitHub resources like Codespaces and Copilot through http://education.github.com, and now you can get started developing with generative AI and language models using the Model Catalog.
For more information about the Models available on GitHub Models, check out the GitHub Model Marketplace
Each model has a dedicated playground, with sample code available in a Codespaces environment.
There are a few basic examples that are ready for you to run. You can find them in the samples directory within the codespaces environment.
If you want to jump straight to your favorite language, you can find the examples in the following Languages:
Python
JavaScript
cURL
The dedicated Codespaces Environment is an excellent way to get started running the samples and models.
Below are example code snippets for a few use cases. For additional information about Azure AI Inference SDK, see full documentation and samples.
Create a personal access token. You do not need to give the token any permissions. Note that the token will be sent to a Microsoft service.
To use the code snippets below, create an environment variable to set your token as the key for the client code.
If you’re using bash:
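The export command itself appears to have been dropped during formatting; it should look like this (the placeholder value is yours to fill in):

```shell
export GITHUB_TOKEN="<your-github-token-goes-here>"
```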
Install the Azure AI Inference SDK using pip (Requires: Python >=3.8):
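The pip command was likewise lost in formatting; based on the `azure.ai.inference` imports in the samples below, it is:

```shell
pip install azure-ai-inference
```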
This sample demonstrates a basic call to the chat completion API. It is leveraging the GitHub AI model inference endpoint and your GitHub token. The call is synchronous.
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

endpoint = "https://models.inference.ai.azure.com"
# Replace with your model name
model_name = "Phi-3-small-8k-instruct"
token = os.environ["GITHUB_TOKEN"]

client = ChatCompletionsClient(
    endpoint=endpoint,
    credential=AzureKeyCredential(token),
)

response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="What is the capital of France?"),
    ],
    model=model_name,
    temperature=1.0,
    max_tokens=1000,
    top_p=1.0,
)

print(response.choices[0].message.content)
This sample demonstrates a multi-turn conversation with the chat completion API. When using the model for a chat application, you’ll need to manage the history of that conversation and send the latest messages to the model.
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import AssistantMessage, SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

token = os.environ["GITHUB_TOKEN"]
endpoint = "https://models.inference.ai.azure.com"
# Replace with your model name
model_name = "Phi-3-small-8k-instruct"

client = ChatCompletionsClient(
    endpoint=endpoint,
    credential=AzureKeyCredential(token),
)

messages = [
    SystemMessage(content="You are a helpful assistant."),
    UserMessage(content="What is the capital of France?"),
    AssistantMessage(content="The capital of France is Paris."),
    UserMessage(content="What about Spain?"),
]

response = client.complete(messages=messages, model=model_name)

print(response.choices[0].message.content)
For a better user experience, you will want to stream the response of the model so that the first token shows up early and you avoid waiting for long responses.
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

token = os.environ["GITHUB_TOKEN"]
endpoint = "https://models.inference.ai.azure.com"
# Replace with your model name
model_name = "Phi-3-small-8k-instruct"

client = ChatCompletionsClient(
    endpoint=endpoint,
    credential=AzureKeyCredential(token),
)

response = client.complete(
    stream=True,
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Give me 5 good reasons why I should exercise every day."),
    ],
    model=model_name,
)

for update in response:
    if update.choices:
        print(update.choices[0].delta.content or "", end="")

client.close()
Install Node.js.
Copy the following lines of text and save them as a file package.json inside your folder.
{
  "type": "module",
  "dependencies": {
    "@azure-rest/ai-inference": "latest",
    "@azure/core-auth": "latest",
    "@azure/core-sse": "latest"
  }
}
Note: @azure/core-sse is only needed when you stream the chat completions response.
Open a terminal window in this folder and run npm install.
For each of the code snippets below, copy the content into a file sample.js and run with node sample.js.
This sample demonstrates a basic call to the chat completion API. It is leveraging the GitHub AI model inference endpoint and your GitHub token. The call is synchronous.
import ModelClient from "@azure-rest/ai-inference";
import { AzureKeyCredential } from "@azure/core-auth";

const token = process.env["GITHUB_TOKEN"];
const endpoint = "https://models.inference.ai.azure.com";
// Replace with your model name
const modelName = "Phi-3-small-8k-instruct";

export async function main() {
  const client = new ModelClient(endpoint, new AzureKeyCredential(token));

  const response = await client.path("/chat/completions").post({
    body: {
      messages: [
        { role: "system", content: "You are a helpful assistant." },
        { role: "user", content: "What is the capital of France?" }
      ],
      model: modelName,
      temperature: 1.0,
      max_tokens: 1000,
      top_p: 1.0
    }
  });

  if (response.status !== "200") {
    throw response.body.error;
  }
  console.log(response.body.choices[0].message.content);
}

main().catch((err) => {
  console.error("The sample encountered an error:", err);
});
This sample demonstrates a multi-turn conversation with the chat completion API. When using the model for a chat application, you’ll need to manage the history of that conversation and send the latest messages to the model.
import ModelClient from "@azure-rest/ai-inference";
import { AzureKeyCredential } from "@azure/core-auth";

const token = process.env["GITHUB_TOKEN"];
const endpoint = "https://models.inference.ai.azure.com";
// Replace with your model name
const modelName = "Phi-3-small-8k-instruct";

export async function main() {
  const client = new ModelClient(endpoint, new AzureKeyCredential(token));

  const response = await client.path("/chat/completions").post({
    body: {
      messages: [
        { role: "system", content: "You are a helpful assistant." },
        { role: "user", content: "What is the capital of France?" },
        { role: "assistant", content: "The capital of France is Paris." },
        { role: "user", content: "What about Spain?" },
      ],
      model: modelName,
    }
  });

  if (response.status !== "200") {
    throw response.body.error;
  }
  for (const choice of response.body.choices) {
    console.log(choice.message.content);
  }
}

main().catch((err) => {
  console.error("The sample encountered an error:", err);
});
For a better user experience, you will want to stream the response of the model so that the first token shows up early and you avoid waiting for long responses.
import ModelClient from "@azure-rest/ai-inference";
import { AzureKeyCredential } from "@azure/core-auth";
import { createSseStream } from "@azure/core-sse";

const token = process.env["GITHUB_TOKEN"];
const endpoint = "https://models.inference.ai.azure.com";
// Replace with your model name
const modelName = "Phi-3-small-8k-instruct";

export async function main() {
  const client = new ModelClient(endpoint, new AzureKeyCredential(token));

  const response = await client.path("/chat/completions").post({
    body: {
      messages: [
        { role: "system", content: "You are a helpful assistant." },
        { role: "user", content: "Give me 5 good reasons why I should exercise every day." },
      ],
      model: modelName,
      stream: true
    }
  }).asNodeStream();

  const stream = response.body;
  if (!stream) {
    throw new Error("The response stream is undefined");
  }
  if (response.status !== "200") {
    stream.destroy();
    throw new Error(`Failed to get chat completions, http operation failed with ${response.status} code`);
  }

  const sseStream = createSseStream(stream);
  for await (const event of sseStream) {
    if (event.data === "[DONE]") {
      return;
    }
    for (const choice of (JSON.parse(event.data)).choices) {
      process.stdout.write(choice.delta?.content ?? "");
    }
  }
}

main().catch((err) => {
  console.error("The sample encountered an error:", err);
});
Paste the following into a shell (the endpoint and path match the SDK samples above):
curl -X POST "https://models.inference.ai.azure.com/chat/completions" \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $GITHUB_TOKEN" \
    -d '{
        "messages": [
            {
                "role": "system",
                "content": "You are a helpful assistant."
            },
            {
                "role": "user",
                "content": "What is the capital of France?"
            }
        ],
        "model": "Phi-3-small-8k-instruct"
    }'
Call the chat completion API and pass the chat history:
curl -X POST "https://models.inference.ai.azure.com/chat/completions" \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $GITHUB_TOKEN" \
    -d '{
        "messages": [
            {
                "role": "system",
                "content": "You are a helpful assistant."
            },
            {
                "role": "user",
                "content": "What is the capital of France?"
            },
            {
                "role": "assistant",
                "content": "The capital of France is Paris."
            },
            {
                "role": "user",
                "content": "What about Spain?"
            }
        ],
        "model": "Phi-3-small-8k-instruct"
    }'
This is an example of calling the endpoint and streaming the response.
curl -X POST "https://models.inference.ai.azure.com/chat/completions" \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $GITHUB_TOKEN" \
    -d '{
        "messages": [
            {
                "role": "system",
                "content": "You are a helpful assistant."
            },
            {
                "role": "user",
                "content": "Give me 5 good reasons why I should exercise every day."
            }
        ],
        "stream": true,
        "model": "Phi-3-small-8k-instruct"
    }'
The rate limits for the playground and free API usage are intended to help you experiment with models and prototype your AI application. For use beyond those limits, and to bring your application to scale, you must provision resources from an Azure account, and authenticate from there instead of your GitHub personal access token. You don’t need to change anything else in your code. Use this link to discover how to go beyond the free tier limits in Azure AI.
Microsoft Tech Community – Latest Blogs – Read More
How can I update the base body transformation matrix during visualization using the 'show' function?
To visualize the rigid body tree model from a URDF file, the 'show' function can be used. However, it only supports setting the base body position (x, y, z) and yaw angle. I want to set the base body position and orientation arbitrarily.
For an internal function in the Robotics System Toolbox, there is a comment about using 6-DOF xyz and roll, pitch, yaw to obtain the whole base body transformation matrix, but strict support for this is not provided or activated.
How can I set an arbitrary base body transformation matrix for the rigid body tree? I have to simulate the base body transform in almost real time, so I don't want to add and remove bodies during the simulation. robotics toolbox, base body transformation update MATLAB Answers — New Questions
How should I structure the neural net based on my given input and output training data
I am trying to design a feedforward network that trains on a 4×5 matrix (5 samples of 4 separate inputs into the neural network) and whose outputs are represented by a 4×5×1000 matrix (5 samples of 4 outputs, where each component of the 4×1 output vector has 1000 points). This neural net is used to determine an optimal trajectory for a given terminal condition from a set of the same initial conditions. The code for this project is below:
%% Neural Net Training Process
% Initial State
x1 = [0;0]; % Initial Positions
x2 = [1;1]; % Initial Velocities
xo = [x1;x2]; % 4×1 Initial State Vector
% Parsing Training Input Data
x_input = [xf1,xf2,xf4,xf5,xf6]; % 4×5 Terminal State Vector (each xf (4×1) represents a different terminal condition)
% Parsing Training Output Data
x_output = [];
for i=1:4
x_output(i,1,:) = x1(:,i);
x_output(i,2,:) = x2(:,i);
x_output(i,3,:) = x4(:,i);
x_output(i,4,:) = x5(:,i);
x_output(i,5,:) = x6(:,i);
end % 4x5x1000 Terminal State Matrix
% Parsing Validation Data
xf_valid = xf3;
x_valid = x3';
% Neural Net Architecture Initialization
netconfig = 40;
net = feedforwardnet(netconfig);
net.numInputs = 4;
% Training the Network
for j=1:5
curr_xin = x_input(:,j);
curr_xout = x_output(:,j,:);
net = train(net,curr_xin,curr_xout);
end
From here, I receive an error at line 89, where I get the following: Error using nntraining.setup>setupPerWorker (line 96)
Targets T is not two-dimensional. Any advice from here would be appreciated. Thanks. neural network, feedforwardnet, control, matlab MATLAB Answers — New Questions
Simscape – Source component – Input
I created a hydraulic system with my own custom components (pipes, elbows, tees, orifices, …) in order to calculate the flow rate at the outlets.
The fluid temperature is changed with the help of a parameter on a "source component". This temperature is used in two lookup tables to determine the corresponding viscosity and density of the fluid.
Viscosity and density are domain parameters which are changed by the "source component" in order to provide them to the rest of the model.
Now I want to change the temperature at run time with the help of a "ramp block". Therefore, I created an input on the "source component" and connected the "ramp block" via a "simulink-ps-converter".
The issue is that, of course, I can't use an input for the lookup tables in the parameters section of the "source component".
I can move the lookup tables to the equations section, but this creates new issues with missing variables for viscosity and density.
Any idea how I can solve this problem?
This is my source component…
component(Propagation=source) oil_properties
% Oil properties
%
% Temperature range: -15°C – 110°C
%
%
% Fluid type:
%
%Viscosity | CLP | CLP-PG | CLP-HC | *according Rickmeier – Viscosity/Temperature diagram
%
%——————————————————
%
% 100 | 1 | – | – |
%
% 150 | 2 | 6 | 10 |
%
% 220 | 3 | 7 | 11 |
%
% 320 | 4 | 8 | 12 |
%
% 460 | 5 | 9 | 13 |
%
%——————————————————
parameters
fluid_type = 11;
temperature = {20,'1'};
end
parameters (Access=private)
% Temperature-viscosity diagram from Rickmeier
% CLP – mineral oil
temp_CLP_100 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_100 = {[907 904 902 899 896 893 890 887 884 881 878 876 873 870 867 864 861 858 855 852 850 847 844 841 838 835 ], ‘kg/m^3’};
visk_CLP_100 = {[9300 5000 3000 1800 1100 750 500 350 240 170 130 100 70 60 46 38 32 27 22 19 16.5 14.5 12.5 11 9.6 8.6 ], ‘mm^2/s’};
temp_CLP_150 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_150 = {[908 905 903 900 897 894 891 888 885 882 879 877 874 871 868 865 862 859 856 853 850 848 845 842 839 836 ], ‘kg/m^3’};
visk_CLP_150 = {[18000 10000 5500 3400 2000 1200 800 550 380 280 200 150 110 85 65 54 44 36 30 25 22 18.5 16.5 14.5 12.5 11 ], ‘mm^2/s’};
temp_CLP_220 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_220 = {[912 910 907 904 901 898 895 892 889 886 883 880 878 875 872 869 866 863 860 857 854 851 848 846 843 840 ], ‘kg/m^3’};
visk_CLP_220 = {[34000 18000 10000 5500 3400 2000 1300 820 600 420 300 220 165 130 100 80 60 51 42 36 30 26 22 19 17 15 ], ‘mm^2/s’};
temp_CLP_320 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_320 = {[920 917 914 911 908 905 902 899 896 893 890 887 884 881 879 876 873 870 867 864 861 858 855 852 849 846 ], ‘kg/m^3’};
visk_CLP_320 = {[60000 30000 16000 9000 5500 3400 2200 1400 900 600 440 320 240 180 140 110 90 70 58 46 40 32 28 24 21 18.5 ], ‘mm^2/s’};
temp_CLP_460 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_460 = {[920 917 914 911 908 905 902 899 896 893 890 887 884 881 879 876 873 870 867 864 861 858 855 852 849 846 ], ‘kg/m^3’};
visk_CLP_460 = {[110000 50000 28000 16000 9000 5500 3400 2100 1400 900 650 460 340 260 200 150 120 95 75 60 50 44 36 31 27 23 ], ‘mm^2/s’};
% CLP PG – synthetic oil based on polyglycols
temp_CLP_PG_150 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_PG_150 = {[1084 1080 1077 1073 1070 1066 1063 1060 1056 1053 1049 1046 1042 1039 1035 1032 1028 1025 1022 1018 1015 1011 1008 1004 1001 997 ], ‘kg/m^3’};
visk_CLP_PG_150 = {[7500 3900 2420 1650 1200 850 610 440 340 260 210 150 130 105 90 71 61 52 44 38 32 29 25.5 22.5 20 18.5 ], ‘mm^2/s’};
temp_CLP_PG_220 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_PG_220 = {[1084 1080 1077 1073 1070 1066 1063 1060 1056 1053 1049 1046 1042 1039 1035 1032 1028 1025 1022 1018 1015 1011 1008 1004 1001 997 ], ‘kg/m^3’};
visk_CLP_PG_220 = {[6100 4100 2800 2000 1400 1020 750 550 440 340 260 220 170 140 115 100 95 70 60 50 44 40 34 30 27 24 ], ‘mm^2/s’};
temp_CLP_PG_320 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_PG_320 = {[1091 1087 1084 1080 1077 1073 1070 1067 1063 1060 1056 1053 1049 1046 1042 1039 1035 1032 1028 1025 1021 1018 1014 1011 1007 1004 ], ‘kg/m^3’};
visk_CLP_PG_320 = {[5600 4000 3000 2200 1600 1220 950 775 600 480 400 320 272 225 190 165 140 120 105 92 80 70 62 55 50 45 ], ‘mm^2/s’};
temp_CLP_PG_460 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_PG_460 = {[1081 1077 1074 1070 1067 1063 1060 1057 1053 1050 1046 1043 1039 1036 1032 1029 1026 1022 1019 1015 1012 1008 1005 1001 998 995 ], ‘kg/m^3’};
visk_CLP_PG_460 = {[7300 5400 3900 2900 2200 1700 1300 1050 810 650 550 460 360 310 260 222 190 165 140 120 108 95 85 75 67 60 ], ‘mm^2/s’};
% CLP HC – synthetic oil based on polyalphaolefins
temp_CLP_HC_150 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_HC_150 = {[871 868 865 862 860 857 854 851 848 846 843 840 837 835 832 829 826 823 821 818 815 812 810 807 804 801 ], ‘kg/m^3’};
visk_CLP_HC_150 = {[6900 4000 2450 1650 1120 800 560 420 310 230 180 150 110 90 71 60 50 42 35 31 27 23.3 20.3 18.2 16 14.5 ], ‘mm^2/s’};
temp_CLP_HC_220 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_HC_220 = {[876 873 870 867 865 862 859 856 853 851 848 845 842 839 837 834 831 828 825 823 820 817 814 812 809 806 ], ‘kg/m^3’};
visk_CLP_HC_220 = {[6900 4000 2600 1800 1300 950 680 510 380 300 230 190 150 120 100 81 70 60 50 44 38 32 29 26 23 21 ], ‘mm^2/s’};
temp_CLP_HC_320 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_HC_320 = {[879 876 873 870 868 865 862 859 856 854 851 848 845 842 840 837 834 831 828 826 823 820 817 814 812 809 ], ‘kg/m^3’};
visk_CLP_HC_320 = {[14500 9000 6000 4000 2700 1900 1350 960 720 540 420 320 260 205 165 137 115 95 80 65 59 50 43 38 33 29.5 ], ‘mm^2/s’};
temp_CLP_HC_460 = {[-15.00 -10.00 -5.00 0.00 5.00 10.00 15.00 20.00 25.00 30.00 35.00 40.00 45.00 50.00 55.00 60.00 65.00 70.00 75.00 80.00 85.00 90.00 95.00 100.00 105.00 110.00], ‘1’};
dens_CLP_HC_460 = {[881 878 875 872 870 867 864 861 858 856 853 850 847 844 842 839 836 833 830 827 825 822 819 816 813 811 ], ‘kg/m^3’};
visk_CLP_HC_460 = {[25000 14500 9500 6000 4000 2800 1900 1400 1000 720 550 420 340 260 210 170 140 115 95 80 68 59 51 44 38 34 ], ‘mm^2/s’};
temp = [temp_CLP_100; temp_CLP_150; temp_CLP_220; temp_CLP_320; temp_CLP_460; temp_CLP_PG_150; temp_CLP_PG_220; temp_CLP_PG_320; temp_CLP_PG_460; temp_CLP_HC_150; temp_CLP_HC_220; temp_CLP_HC_320; temp_CLP_HC_460];
dens = [dens_CLP_100; dens_CLP_150; dens_CLP_220; dens_CLP_320; dens_CLP_460; dens_CLP_PG_150; dens_CLP_PG_220; dens_CLP_PG_320; dens_CLP_PG_460; dens_CLP_HC_150; dens_CLP_HC_220; dens_CLP_HC_320; dens_CLP_HC_460];
visk = [visk_CLP_100; visk_CLP_150; visk_CLP_220; visk_CLP_320; visk_CLP_460; visk_CLP_PG_150; visk_CLP_PG_220; visk_CLP_PG_320; visk_CLP_PG_460; visk_CLP_HC_150; visk_CLP_HC_220; visk_CLP_HC_320; visk_CLP_HC_460];
density = tablelookup(temp(fluid_type,:) ,dens(fluid_type,:) ,temperature, interpolation = smooth);
viscosity_kin = tablelookup(temp(fluid_type,:) ,visk(fluid_type,:) ,temperature, interpolation = smooth);
end
%inputs
%  temperature = {1, '1'}; % :left
%end
nodes
G = NORD.Hydraulics.Domain.hydraulic(density=density, viscosity_kin=viscosity_kin); % :right
end
end
… and this is my custom domain:
domain hydraulic
% Hydraulic Domain
variables % Across
p = {value={1,'bar'}, imin={0,'bar'}}; % Pressure
end
variables(Balancing = true) % Through
q = {0,'lpm'}; % Flow rate
end
parameters
viscosity_kin = {0,'mm^2/s'}; % Kinematic viscosity
density = {0,'kg/m^3'}; % Oil density
bulk = {0.8e9,'Pa'}; % Bulk modulus at atm. pressure and no gas
alpha = {0.005,'1'}; % Relative amount of trapped air
range_error = {2,'1'}; % Pressure below absolute zero
RD_18 = {0.015,'m'}; % Pipe diameter NW18
RD_10 = {0.008,'m'}; % Pipe diameter NW10
RD_12_5 = {0.0125,'m'}; % Pipe diameter for DMO
RD_06 = {0.006,'m'}; % Pipe diameter for DMO
BR_18 = {0.060,'m'}; % Bend radius of pipe NW18
BR_10 = {0.027,'m'}; % Bend radius of pipe NW10
Zeta_R_AURO = {1,'1'}; % Zeta value for outflow (AURO)
Zeta_U_90_18 = {0.110,'1'}; % Zeta value for 90° pipe bend
Zeta_U_90_10 = {0.117,'1'}; % Zeta value for 90° pipe bend
Zeta_U_WV = {1.07,'1'}; % Zeta value for WV
end
end
domain parameters, inputs, source component, simscape MATLAB Answers — New Questions
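A note on the question above: Simscape domain parameters cannot change during simulation, which is why routing a ramp into the parameters section fails. One commonly suggested restructuring is to keep the lookup in the equations section and expose temperature as a physical signal input, with density and viscosity as component variables rather than domain parameters. The following is only a sketch under assumptions, not a tested component; the table vectors here are CLP-HC 220 values subsampled from the Rickmeier tables above, and names like oil_properties_runtime are hypothetical:

```matlab
% Hypothetical sketch: temperature arrives as a physical signal at run time,
% and the table lookups move from the parameters section into equations.
component oil_properties_runtime
  inputs
    T = {20, '1'};  % :left  temperature from a Simulink-PS Converter
  end
  parameters (Access=private)
    % Subsampled CLP-HC 220 rows (see the full tables above)
    temp_vec = {[-15 0 20 40 60 80 110], '1'};
    dens_vec = {[876 867 856 845 834 823 806], 'kg/m^3'};
    visk_vec = {[6900 1800 510 190 81 44 21], 'mm^2/s'};
  end
  variables (Access=private)
    rho = {900, 'kg/m^3'};  % density, now a variable instead of a parameter
    nu  = {100, 'mm^2/s'};  % kinematic viscosity, likewise
  end
  equations
    rho == tablelookup(temp_vec, dens_vec, T, interpolation = smooth);
    nu  == tablelookup(temp_vec, visk_vec, T, interpolation = smooth);
  end
end
```

Whether downstream components can consume rho and nu then depends on reworking the domain so that these quantities are carried as variables (or recomputed locally per component from a shared temperature signal) instead of domain parameters; the sketch only shows the lookup side.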
How to simulate the IO interface model for HIL testing with Simulink?
I have already simulated the entire vehicle model, but I don't know how to simulate the IO interface model that can interact with hardware data. simulink, io MATLAB Answers — New Questions
How to find the bit allocation factor?
How does the bit allocation factor affect the quality of a video? How can this factor be used to find the quality of a video, and how can it be calculated if I have a video? bit allocation factor MATLAB Answers — New Questions
How to copy a built-in Data connector in Global region to China region?
A lot of Sentinel solutions/data connectors are not available in China. For example: the Dynamics 365 connector. Is it possible to get the source code of a built-in data connector (like the Dynamics 365 connector) and create a custom data connector in China? Read More
Business Central Post Deployment Offer
Hi all,
I would like to know about this new Business Central deployment offer: does the partner need to get all the licenses subscribed by the customer in the very first go, or is the ACR calculated based on the subscriptions in the first year? In other terms, are the ACR incentives eligible only when all the licenses are subscribed at the outset, or is it the sum total over the first year? Read More
How to generate a custom bitstream for the ZedBoard to deploy a neural network model?
imds = imageDatastore('Result_fish_images(NA)', ...
    'IncludeSubfolders',true, ...
    'LabelSource','foldernames');
%%
[imdsTrain,imdsValidation,imdsTest] = splitEachLabel(imds,0.7,0.15,0.15,"randomized");
%%
numTrainImages = numel(imdsTrain.Labels);
idx = randperm(numTrainImages,16);
figure
for i = 1:16
subplot(4,4,i)
I = readimage(imdsTrain,idx(i));
imshow(I)
end
%%
classNames = categories(imdsTrain.Labels);
numClasses = numel(classNames)
%%
net = imagePretrainedNetwork("alexnet",NumClasses=numClasses);
net = setLearnRateFactor(net,"fc8/Weights",20);
net = setLearnRateFactor(net,"fc8/Bias",20);
%%
inputSize = net.Layers(1).InputSize
%%
pixelRange = [-30 30];
imageAugmenter = imageDataAugmenter( ...
    'RandXReflection',true, ...
    'RandXTranslation',pixelRange, ...
    'RandYTranslation',pixelRange);
augimdsTrain = augmentedImageDatastore(inputSize(1:2),imdsTrain, ...
    'DataAugmentation',imageAugmenter);
%%
augimdsValidation = augmentedImageDatastore(inputSize(1:2),imdsValidation);
%%
options = trainingOptions("sgdm", ...
    MiniBatchSize=10, ...
    MaxEpochs=6, ...
    Metrics="accuracy", ...
    InitialLearnRate=1e-4, ...
    Shuffle="every-epoch", ...
    ValidationData=augimdsValidation, ...
    ValidationFrequency=3, ...
    Verbose=false, ...
    Plots="training-progress");
%%
net = trainnet(augimdsTrain,net,"crossentropy",options);
%%
scores = minibatchpredict(net,augimdsValidation);
YPred = scores2label(scores,classNames);
%%
idx = randperm(numel(imdsValidation.Files),4);
figure
for i = 1:4
subplot(2,2,i)
I = readimage(imdsValidation,idx(i));
imshow(I)
label = YPred(idx(i));
title(string(label));
end
%%
YValidation = imdsValidation.Labels;
accuracy = mean(YPred == YValidation)
After the above, we performed quantization and saved the network in the quantizedNet variable. We flashed the memory card with the Linux image for the ZedBoard using the SoC Blockset support package. We tested the communication between our ZedBoard and laptop via the zynq() command and were able to retrieve its IP address. Now we want to deploy the trained model on the ZedBoard platform using:
hTarget = dlhdl.Target('Xilinx','Interface','Ethernet');
hW = dlhdl.Workflow('Network',quantizedNet,'Bitstream','zcu102_int8','Target',hTarget);
dn = hW.compile;
hW.deploy;
output = hW.predict(InputImg);
This should give us the prediction result by performing the operation on the FPGA and fetching back the result.
But here the pre-built bitstreams like zcu102 or zc706 are not available for the ZedBoard. How can we generate a custom bitstream targeting the ZedBoard?

zedboard, fpga, bitstream, neural network, alexnet MATLAB Answers — New Questions
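On the custom-bitstream part of the question above: Deep Learning HDL Toolbox documents a dlhdl.ProcessorConfig / dlhdl.buildProcessor flow for generating a bitstream for boards without a shipped one. The following is a hedged sketch, not a verified build script; the Vivado path is a placeholder for your install, the device name should be checked against your board revision (the ZedBoard carries a Zynq-7020, commonly xc7z020clg484-1), and the flow additionally requires HDL Coder and the Xilinx tools:

```matlab
% Sketch: build a custom int8 deep learning processor bitstream for the
% ZedBoard's Zynq-7020 (assumes Deep Learning HDL Toolbox + HDL Coder).
hdlsetuptoolpath('ToolName','Xilinx Vivado', ...
    'ToolPath','C:\Xilinx\Vivado\2023.1\bin\vivado.bat');  % adjust to your install

hPC = dlhdl.ProcessorConfig;                % default processor configuration
hPC.ProcessorDataType       = 'int8';       % match the quantized network
hPC.SynthesisToolChipFamily = 'Zynq';
hPC.SynthesisToolDeviceName = 'xc7z020clg484-1';  % ZedBoard part (verify for your board)
hPC.TargetFrequency         = 100;          % MHz; reduce if timing fails on the 7020

dlhdl.buildProcessor(hPC);                  % runs synthesis and produces a bitstream
% Then point dlhdl.Workflow at the generated bitstream instead of 'zcu102_int8'.
```

Note that the 7020 is much smaller than the ZCU102's device, so the default processor configuration may need to be slimmed down (fewer threads per module) before it fits.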