Month: May 2024
How to fix QuickBooks Payroll direct deposit pending issue after new update?
I’m experiencing an issue with QuickBooks Payroll where my direct deposit is stuck as ‘pending.’ How can I resolve this and ensure my employees get paid on time?
Block Copy/Paste in MS Teams
Hello,
Despite finding this question asked frequently online, I haven’t been able to find a definitive answer.
We are trying to completely disable the copy/paste function in the Microsoft Teams application when users are logged in from a Windows 10/11 PC. We’ve successfully implemented this restriction for users accessing Teams via a web browser on a PC, but the same solution doesn’t seem to apply to the desktop application.
Could anyone provide guidance on whether this is possible and, if so, share any references or helpful links?
Thanks in advance!
How to Resolve QuickBooks Error 15103 When Updating or Installing?
I’m facing a frustrating issue with QuickBooks Error 15103 while trying to update. It’s hindering my workflow. What could be causing this problem, and how can I fix it quickly?
How can I make x and y axis dates with contour?
Hello,
I’m trying to make a Porkchop Plot for a journey from Earth to Mars. The purpose here is to plot the C3 values that will coincide with the date of departure from Earth and the date of arrival on Mars as a contour function. In other words, there should be departure dates from Earth on the x-axis and arrival dates on Mars on the y-axis, but the contour function does not accept input as datetime. How can I solve this problem?
contour, plot, datetime, xaxis, yaxis, porkchopplot MATLAB Answers — New Questions
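One possible workaround, sketched here with hypothetical date windows and a placeholder in place of the real C3 values: convert the datetimes to serial date numbers so contour accepts them, then relabel the ticks as dates with datetick.

```matlab
% Sketch: contour over numeric date values, with date tick labels.
depDates = datetime(2026,1,1) + days(0:5:100);   % hypothetical departure window
arrDates = datetime(2026,8,1) + days(0:5:150);   % hypothetical arrival window
[X, Y] = meshgrid(datenum(depDates), datenum(arrDates));
C3 = (X - mean(X(:))).^2 + (Y - mean(Y(:))).^2;  % placeholder for the real C3 grid
contour(X, Y, C3);
datetick('x', 'dd-mmm-yyyy', 'keepticks');       % relabel x ticks as dates
datetick('y', 'dd-mmm-yyyy', 'keepticks');       % relabel y ticks as dates
xlabel('Departure date'); ylabel('Arrival date');
```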
Excel Formula Help
Hi everyone, I need some formula help. The formula I am using in the sheet attached isn’t the right one. In the attached sheet, row 3 automatically shows U/A if any ENGB or ENOP in row 6 equals 0. So you can see in columns D and E I have 1 ENGB and 1 ENOP, so U/A does not show up. But if I have 2 ENGB and 0 ENOP, like in columns F and G, I need those to show blank. But if ENOP = 2 or 1 and ENGB = 0, I need it to show U/A. I hope this makes sense. I am probably overthinking this but can’t figure it out. Any help is appreciated. Thank you.
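Without seeing the attached sheet the cell layout is a guess, but the described logic (show U/A only when the ENGB count is 0 and the ENOP count is at least 1, otherwise blank) could be sketched like this, assuming hypothetically that the ENGB count for a column pair sits in D6 and the ENOP count in E6:

```
=IF(AND(D6=0, E6>0), "U/A", "")
```

Adjust the cell references to wherever the counts actually live in row 6 of your sheet.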
Managing a Distributed API Estate Efficiently with Azure API Management and Self-Hosted Gateways
Azure API Management (APIM) is a cloud-based service that enables you to create, publish and manage your APIs behind a secure, consistent façade. You can create and manage policies that control access to your APIs, enforce usage quotas, and transform requests and responses.
Azure APIM gateways are the proxy for handling API requests. Self-hosted gateways (SHGW) are a capability of Azure APIM which allow you to deploy an instance of the API gateway component of APIM outside of Azure, such as on-premises or on a different cloud platform. Self-hosted gateways are unique to Azure APIM.
API Center is an Azure service that provides a central point of discovery, reuse and governance for APIs in Azure, on-premises, or in other clouds. New features coming include API synchronisation from APIM and Git repos, and API compliance monitoring.
Introduction
Increasing demand for system integration and interoperability is a universal requirement that has driven a huge growth in API development and has led to a proliferation of APIs. APIs enable different systems to easily speak to one another and have become the building blocks of product ecosystems, monetising data assets and driving greater service agility and innovation. There is also a move to migrate legacy APIs into the cloud to reduce cost and provide additional resiliency to priority workloads.
Challenges
However, this increase in API dependency has come with its own challenges. APIs built across teams and over time lack consistency, are implemented in different technologies, and are deployed on a multitude of different hosting platforms, from on-premises to cloud. This makes the job of managing an API estate efficiently extremely difficult. The lack of a single management view increases the effort required to operate and protect your API inventory, and prevents effective discovery, leading to low rates of reuse and driving duplication.
I was recently working with a customer that was looking to simplify management of their API estate. They had a large number of mission critical APIs scattered across cloud and on-premises. They didn’t know where they all lived, how many duplicates existed, and really needed to consolidate, standardise, and have a view of all their APIs via a single pane of glass.
Requirements
The high-level requirements for the customer to address their challenges included:
Support for distributed APIs with minimal latency overhead.
A secure façade for APIs, which hides the underlying API differences from the end user.
And, importantly, a single management plane for the entire estate.
Solution
Azure APIM with SHGW provides a solution which meets all of these requirements. The following diagram is based on the Azure APIM landing zone accelerator, but adds SHGW elements.
Azure APIM instance, comprising Developer Portal (a fully configurable website that provides a central location for API discovery, experimentation, testing, and reuse), Gateway, and Management Plane (an interface for managing your APIM instance, including how your APIs are exposed, protected, and versioned). The APIM is configured in Internal Mode to prevent the instance from being directly publicly accessible. The instance is only accessible via the configured App Gateway, or via a peered network.
SHGW container* deployed on a third-party cloud platform (e.g. AWS, GCP).
SHGW container* deployed on on-premises resources.
Management connectivity between SHGW and APIM management plane, allowing transmission of SHGW heartbeat, configuration updates, log shipment. Connection is outbound from SHGW on port 443.
ExpressRoute dedicated on-premises to Azure connection (optional).
Network peering between APIM subnet and on-premises network, allowing direct connectivity between the APIM instance and on-premises services, including the SHGW.
Public point of ingress for all APIM services.
API consumer, accessing Azure, on-premises, and third-party cloud APIs via Azure GW and SHGW. For access to non-Azure APIs traffic goes direct to the closest gateway, not via Azure.
*SHGW is provided as a downloadable Linux container image which can be configured and hosted on your own [high availability] infrastructure. By hosting the gateway near to your APIs, users of the APIs go directly to them (via the SHGW instance) without the need to pass through Azure. This reduces latency and supports data sovereignty, while still being centrally managed via the Azure hosted APIM instance management plane. SHGW is only supported on APIM Premium and Developer tiers.
For Enterprise API inventory and discovery see API Center (not covered here) which is Generally Available.
Things to consider when designing your SHGW implementation
Connectivity between the SHGW and Azure. Outbound connectivity is required from a SHGW to APIM and to certain Azure services such as Azure Storage and Application Insights in order to pull configuration changes, ship logs, send heartbeats, and perform other operational necessities. Will you go via the internet, or remain on a private network? Private networking between an on-premises SHGW and Azure can be achieved via network peering and a dedicated circuit such as ExpressRoute. It is also possible to peer a third-party cloud platform and Azure via a dedicated connection from the third party to on-premises and back up to Azure.
Authentication. To authenticate with APIM, SHGW presents an authentication key which by default is stored in the SHGW container. However, this is a poor solution which risks exposing the key and increases administration effort – it needs to be rotated every 30 days, and if rotation is forgotten the SHGW will lose its connectivity. A better solution is to use Entra authentication, see https://learn.microsoft.com/en-us/azure/api-management/self-hosted-gateway-enable-azure-ad.
Scalability. If using SHGW you are responsible for scaling the gateway, consider using Kubernetes horizontal pod autoscaler to scale out the gateway.
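As an illustrative sketch only (the Deployment name, replica counts, and CPU target are assumptions, not part of any reference architecture), a horizontal pod autoscaler for a Kubernetes-hosted SHGW might look like this:

```yaml
# Sketch: scale a hypothetical SHGW Deployment on CPU utilisation.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: apim-shgw-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: apim-shgw          # assumed name of the SHGW Deployment
  minReplicas: 2             # keep at least two gateways for availability
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```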
Conclusion
Azure APIM and SHGWs provide a complete solution for managing a distributed, diverse API estate efficiently. SHGWs offer flexibility, control, and customization options for organisations managing APIs, particularly those with specific performance, compliance, or integration requirements.
Microsoft Tech Community – Latest Blogs – Read More
How to use Cosmos DB at extreme scale with large document sizes
Introduction
First, let’s provide an overview of how Cosmos DB is being used in this scenario. As per the architectural overview shown below, there is an ecosystem of customer systems that emit data; these hit a pipeline and are then inserted into Cosmos DB via a batch process.
The original design from 2019 had only a single container, Data, which contained all of the documents that were ingested into Cosmos DB. As the Data container grew, query performance and RU/s costs became prohibitive, so an Index container was introduced to function as an ‘index’ for the Data container. This design change mandated the following data access pattern:
Query the Index container for the Document ID
Retrieve the Document from the Data Container via a Query using the Document ID
If the Document is >2MB in size, the Document payload will have a Blob Storage Pointer
Retrieve the Document from Blob Storage via GET Operation
At minimum, it’s always two Cosmos DB operations to retrieve a document with a GET Blob operation being needed if the Document Size is >2MB.
The Index Container then underwent further changes by introducing Composite Indexes to optimise the RU/s cost of running queries against it as its size increased in correlation to the Data Container. Several challenges are plain when reviewing this architecture:
2 or 3 operations are needed to read a single document.
The Index Container necessarily has a duplication of data within the Data Container.
The size of the Data Container and therefore the Index Container are inextricably linked – as the Data container grows, so must the Index container.
A document does not exist until it can be found, meaning the Index Container write must succeed for data consistency to be achieved.
The overall evolvability of this architecture is limited.
It’s clear that this architecture is running out of evolutionary capacity. How many more times can it change, reasonably, within this design – not many. Introducing Composite Indexes was the last change that could be made within this architecture to have a reasonable performance and RU/s benefit. A series of architectural changes are now needed to move this architecture back to a baseline position and allow for further evolution.
Recommendations
When evaluating what to change, we had the benefit of being able to take a fresh look at the capabilities of Cosmos DB and the current and future needs of the customer – Cosmos DB has evolved significantly since 2019 and has features and companion services that solve the challenges presented.
The changes that were recommended are as follows:
Enable Dynamic Autoscaling
After enabling this feature the Customer experienced an immediate and significant cost reduction across their Dev/Test/Pre-Prod and Production Cosmos DB accounts.
Dynamic Autoscaling allows partitions to scale independently, improving cost efficiency for non-uniform, large-scale workloads with multiple partitions.
Smooth out Batch Insert/Update/Delete data processing
Cosmos DB is billed by the Hour – it’s important to remember that how you use Cosmos DB influences cost. In this scenario, a Batch process executed on both containers to update data.
This, however, set the RU cost 45% to 55% higher than the typical data access pattern.
Smoothing Batch out at the expense of update speed would help mitigate this cost increase.
Enable Cosmos DB Integrated Cache
Using Cosmos DB Integrated Cache would mean there is no RU cost for a repeated query.
Cosmos DB Integrated Cache requires no administration or management.
Each Node can store up to 64GB of Independent Cache.
High-RU complex queries and point reads larger than 16 KB would be expected to benefit from using Cosmos DB Integrated Cache.
Enable Hierarchical Partition Keys
Hierarchical Partition Keys allow data to be sub-partitioned, up to 3 levels deep – beneficial for large, read-heavy documents.
Queries are efficiently routed to only the subset of physical partitions that contain the relevant data when specifying either the full or partial sub-partitioned key path.
Introduce a Container per Document Type
This provides greater clarity over the composite indexes that can be applied per document type – especially useful if documents have a variety of property combinations.
This also isolates high throughput document types that may not benefit enough from Hierarchical Partition Keys.
Combined with Dynamic Autoscaling, these high throughput document type containers can scale independently.
Retire the Index Container
Adopting the correct container strategy allows the Index Container to be retired – enabling single-write consistency, mitigating extra costs, and removing 38TB of duplicated data.
Implement Cosmos DB Analytical Store
Cosmos DB Analytical Store is a fully isolated column store, optimised for analytical queries – as opposed to running queries of this type on the Transactional Store and incurring an RU/s cost.
Analytical Store queries are served at 0 RU/s.
Synapse Link seamlessly replicates data from the Transactional Store to the Analytical Store within 2 minutes.
Underpins a more efficient data discovery strategy as custom partitioning is supported.
Replicate data using the Cosmos DB Change Feed
If there is a need to replicate data from Cosmos DB to another database e.g., Azure SQL, use the Cosmos DB Change feed with Azure Functions.
This pattern is capable of 100,000+ requests per second.
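The change feed pattern above can be sketched with an Azure Function using the Cosmos DB trigger. Everything here is an assumption for illustration – the database, container, and connection-setting names are hypothetical, and older versions of the Cosmos DB extension use different parameter names (e.g. collectionName and ConnectionStringSetting instead of containerName and Connection):

```csharp
// Sketch: an Azure Function that reads the Cosmos DB change feed and
// forwards changed documents to another store such as Azure SQL.
using System.Collections.Generic;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ChangeFeedReplicator
{
    [FunctionName("ChangeFeedReplicator")]
    public static void Run(
        [CosmosDBTrigger(
            databaseName: "SourceDb",            // assumed database name
            containerName: "Data",               // assumed container name
            Connection = "CosmosConnection",     // app setting holding the connection
            LeaseContainerName = "leases",
            CreateLeaseContainerIfNotExists = true)]
        IReadOnlyList<dynamic> changes,
        ILogger log)
    {
        foreach (var doc in changes)
        {
            // Replace this with a write to the target database.
            log.LogInformation($"Replicating document: {doc.id}");
        }
    }
}
```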
Use Premium Blob Storage
Storing Documents >2MB on Azure Blob Storage is a recommended design pattern for Cosmos DB.
Premium Blob Storage will help with accessing these documents faster given their size.
Conclusion
We’ve covered a lot of ground, but hopefully you now have an understanding of the Cosmos DB features that can be enabled to solve challenging requirements at scale.
When designing an architecture, it’s critical to ensure it remains evolvable. The way to achieve this is to conduct regular reviews of the architecture against ‘current’ best practices, product capabilities and the needs of the business – and refactor/rearchitect where necessary.
Microsoft invests in the Well-Architected Framework, and we can conduct these reviews in partnership with you, or you can complete them independently. Conduct these reviews ideally every six months, and at least once a year.
The more you can adopt the features of the product to solve your requirements, the more evolvable your architecture will be over the longer term. One of the key aims, in this scenario, was to move the Customer to a ‘baseline’ position.
We wanted to reset their Cosmos DB architecture and enable more Cosmos DB features to solve their requirements as opposed to bespoke implementations and workarounds within the incumbent architecture.
How to get details on “Unable to load interface library” when using clib
We’re building interface modules for a C++ library with clibgen – in different Linux setups (server, local PC, cloud). We have a build that works, but not on all setups – and we use MATLAB R2021a. The error message on the failing computer is:
Unable to load interface library:
‘/home/de11/sim_interfaces/matlab_to_cpp/cpp_lib/WI_MatlabInterface/WI_MatlabInterfaceInterface.so’.
Reason: The specified module could not be found.
Ensure the C++ dependent libraries for the interface library are added to
run-time path.
is telling me something, but not the full picture. As far as I can see, all the C++ dependent libraries are on LD_LIBRARY_PATH, and calling ldd on the command line tells me all dependencies can be resolved.
But how can I check the same within the MATLAB session?
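One way to check this from inside the session, sketched here as a starting point: MATLAB may launch with a modified environment, so inspect the path and run ldd from within MATLAB so it inherits the same environment the library load is attempted in.

```matlab
% Sketch: inspect library resolution from inside the MATLAB session.
getenv('LD_LIBRARY_PATH')   % the search path as the MATLAB process sees it
% Run ldd from within MATLAB so it inherits MATLAB's environment:
!ldd /home/de11/sim_interfaces/matlab_to_cpp/cpp_lib/WI_MatlabInterface/WI_MatlabInterfaceInterface.so
```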
system call to another compiled exe in compiled script does not work
I have a compiled exe from another script in MATLAB. I wrote a wrapper to run it in parallel, also in MATLAB, and compiled that to an exe file, but when I run it, the system function cannot call it.
system('my_prog.exe')
it does not work 🙁 when I compile the script.
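A hedged sketch of one thing to try (the path here is purely hypothetical): a deployed wrapper may not inherit the PATH or working directory you expect, so call the helper exe by absolute path and capture the status and output to see why the call fails.

```matlab
% Sketch: call the helper exe by absolute path and surface the failure reason.
exePath = 'C:\full\path\to\my_prog.exe';      % hypothetical location of the exe
[status, cmdout] = system(['"' exePath '"']); % quote the path in case of spaces
if status ~= 0
    disp(cmdout)                              % show what the OS reported
end
```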
spline error: The first input must contain unique values.
I am defining a semicircle profile and then finding the x and y coordinate using interpolation at gauss legendres points
r = 0.35;
theta= linspace(pi/2,-pi/2,51);
xq = r*cos(theta);
yq = r*sin(theta)+r;
L = max(xq);
xn = xq/L;
x = spline(xn,xq,xx); %xx are gauss legendres quadrature points along x axis
y = spline(xq,yq,x);
While running the code it shows this error:
Error using chckxy
The first input must contain unique values.
Error in spline
[x,y,sizey,endslopes] = chckxy(x,y);
x = spline(xn,xq,xx);
Since the profile is a semicircle there are duplicate x coordinates. How do I solve this error?
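One common way around duplicate x values, sketched under the assumption that the quadrature points can be mapped into the parameter domain: interpolate x and y each against theta, which is unique and monotonic, instead of against the non-unique x coordinates.

```matlab
% Sketch: parametric interpolation over theta avoids the duplicate-x error.
r = 0.35;
theta = linspace(pi/2, -pi/2, 51);
xq = r*cos(theta);
yq = r*sin(theta) + r;
tt = linspace(pi/2, -pi/2, 200);  % stand-in for Gauss-Legendre points mapped into theta
x = spline(theta, xq, tt);        % theta values are unique, so spline accepts them
y = spline(theta, yq, tt);
plot(x, y)                        % traces the semicircle profile
```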
Version 2024b and ServiceHost
Hello,
From the following question/response here (https://www.mathworks.com/matlabcentral/answers/2111226-what-is-the-mathworks-service-host-and-why-is-it-running), it would appear that this and related services cannot currently be disabled.
Is there a setting we can use to write this data and run these services from a different directory other than ~/.MathWorks?
The issue we are facing in our HPC environment is that users can run MATLAB from any number of compute nodes, and each run creates a node-specific directory. Our user home directories are limited with quotas and we’re starting to receive complaints.
-Chris
I don’t know what’s going on and why is this happening.
So, my MATLAB app worked perfectly fine when I had the 2023b version. I decided to update it to a newer version. When the new version was installed I ran MATLAB and it didn’t run as usual. It didn’t have the default layout, and when I tried to make a script it returned me this error: I don’t know what to do. Can someone please help me?
Power Plan Settings changes revert back automatically after reboot
I set the Power Plan to High Performance, but it changes back to Balanced automatically after rebooting. How do I make it permanent?
About Simulink Solar Energy Model
I have an assignment to do with Simulink, but I don’t know how to do it. I want to use solar energy to produce energy, then use a DC-DC converter to electrolyze water and generate hydrogen, which will be used in a PEM fuel cell to produce 9 kW of energy. In the final stage, I need to convert this energy to city electricity using a DC-AC converter. How can I do this? I would appreciate it if you could explain it through modeling without specific values.
Can I link a library or object file built with one compiler to a MEX function built with a different compiler?
I am using MATLAB R2024a on a Windows machine to build and link a C++ MEX function to a C++ static library.
I am compiling the static library outside of MATLAB using a supported version of the MinGW C/C++ compiler. In MATLAB, I am using the "mex" function and a supported version of the Microsoft Visual C++ compiler.
Is this a supported workflow? And if not, what workaround is available for me?
How to pass a struct to a model reference instance?
Hi all,
I have a Simulink Parameter containing a struct that I would like to pass on to a model reference instance.
Using the parameter from the base workspace is no problem, but I need to use different parameters for the individual instances.
My idea was to use the instance parameters, but it seems like I cannot use a struct there.
Data type "auto" is not allowed and struct is not available.
The parameters I’d like to pass on look like this:
Is there a way to use a struct as instance parameter?
Or is there maybe a totally different approach?
Any help would be highly appreciated. 🙂
Thanks!
Christopher
Set the network range to which session control applies
Hello,
Does MDCA’s session control ignore the network (IP range) settings in a Microsoft Entra ID Conditional Access policy?
I understand that the following items are required for the session control policy.
– a conditional access policy that matches the traffic.
– a session policy in defender for cloud apps.
However, downloads are blocked in Microsoft Edge even if there is no CA policy applied.
Getting x axis to cross both y axes at zero
Hi. First time caller to community…
I can’t work out on an excel chart with 2 y axes how to get the x axis to cross at zero for both axes. I can do it for the left y axis but not the right y axis.
Many thanks!
Conditional Formatting based on two ranges of data (postcodes)
Hi All,
I’m looking to format one range of postcode data based on the values given in a range in another sheet. I’m trying to match up businesses to postcodes, based on postcodes assigned to the businesses. See the screenshot below. In this example I am looking to highlight any postcodes in sheet 1 that are the same as the postcodes in sheet 2’s data set.
I’ve been searching for a conditional formatting formula but haven’t found any that work, please help!
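A hedged sketch of one approach, assuming hypothetically that the Sheet2 postcodes sit in A2:A100 and the conditional-formatting rule is applied to the Sheet1 postcode range starting at A2 (adjust both references to your actual layout):

```
=COUNTIF(Sheet2!$A$2:$A$100, A2) > 0
```

Note that some older Excel versions do not allow direct cross-sheet references in conditional formatting; in that case, define a named range for the Sheet2 postcodes and reference the name in the formula instead.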