Category: News
Looking to be able to filter multiple words in a table separately
Hi, I am currently making a roster for my Esports team in excel and was just curious if there is a way to include multiple bits of information in one cell (games in this case like the top cell) and have this result – for AuB Dusty in this case – show up for each of the individual games. So for instance when filtering for “Brawlhalla” the bottom 3 show up, as well as the top “Dusty” cell.
I get that you can search under the filter, but was just curious whether you can include the player with multiple options under each individual heading.
Testovate X7 Reviews (Official 2024!) How it Work?
→→→ Click here for the latest Updated Season Sale 35% Discount Price.
One of the products that has been warmly received and generated a great deal of excitement on social media is Testovate X7 Reviews by Peak Labs. With that in mind, we decided to examine it and draw our own conclusions.
Save Up To 70% OFF — “OFFICIAL WEBSITE”
Change the owner of a Program
Hiya,
I am setting up a project and program structure to allow some of our leadership to have an overview of what’s happening in projects.
I set a lot of this up under my account but would like to change the ownership of the programs to the leader involved. I can change the “manager” but not the owner. How do I change the owner?
This is Project Online
Exchange Hybrid Wizard won’t run
Hi all,
2 x Exchange 2019 (CU14 and the April SU) in a DAG
Windows Server 2022
I’m trying to run the Hybrid Configuration Wizard but nothing happens.
From one of the servers I go to https://aka.ms/HybridWizard. Using Edge, it offers to open Microsoft.Online.CSE.Hybrid.Client.application, but nothing happens when I click Open.
I downloaded the application and tried to run it manually, but absolutely nothing happens, and there is nothing in the event logs to say why. I had a look, and the ClickOnce Application Deployment Support Library is the default handler for .application files. I tried using the old IE to run it (as some posts online suggest), but again nothing happens.
If I run the application from another PC (Windows 10), it attempts to run but fails and I get the error:
Deployment and application do not have matching security zones
Has anyone else had this issue and been able to resolve it?
thanks
justin
How can I take screenshots of two or more scopes for a single test case while running simulations?
Hello,
I'm running simulations that show their outputs in scopes. I was able to take a screenshot of one scope using the commands below:
shh = get(0,'ShowHiddenHandles'); % finds the open figures
set(0,'ShowHiddenHandles','on');
set(gcf,'PaperPositionMode','auto'); % gcf = current figure handle
set(gcf,'InvertHardcopy','off');
saveas(gcf,sprintf('TC%d.png',j));
set(0,'ShowHiddenHandles',shh);
However, I am not able to take screenshots of more than one scope for a single test case, since I'm using the gcf command. Is there another command I should use?
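One possible direction (a sketch, untested; the figure Tag below is an assumption that applies to classic Simulink Scope windows — check yours with get(h,'Tag')): gather all open scope figures with findall and loop over them instead of relying on gcf:

```matlab
% Sketch: save every open scope window, not just the current figure (gcf)
shh = get(0,'ShowHiddenHandles');
set(0,'ShowHiddenHandles','on');
scopes = findall(0,'Type','figure','Tag','SIMULINK_SIMSCOPE_FIGURE'); % assumed Tag
for k = 1:numel(scopes)
    set(scopes(k),'PaperPositionMode','auto','InvertHardcopy','off');
    saveas(scopes(k), sprintf('TC%d_scope%d.png', j, k)); % one file per scope
end
set(0,'ShowHiddenHandles',shh);
```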
Any inputs highly appreciated!!
Thank you in advance!! 🙂
matlab, simulation, simulink, script, screenshot MATLAB Answers — New Questions
Negative sign produced by “functionalDerivative” – why?
I am asking this question both to gain some understanding and to provide a possible workaround to a problem that I have seen posted before with no solutions.
I have written some code to determine the equations of motion for a somewhat simple system (in preparation for the analysis of a much more complex system) and have noticed some odd behavior when using the "functionalDerivative" command; I am curious whether anyone has ideas on why the behavior detailed below occurs.
Background:
The Lagrangian is given by:
The constraint equations are given by:
I have already developed the EOM’s using the method of Lagrange multipliers, and would now like to develop them by simply substituting the constraint equations into the Lagrangian. Note however that is a function of time, therefore is a composite function:
Note that the generalized coordinate is acted on by a non-conservative generalized force, which I account for on the rhs of the equation of motion (line 20).
Initial Attempts
My initial attempts utilized the "diff" command to try to determine the generalized forces and momenta; however, I got the following error message for the generalized force for :
"Error using symengine, First argument must not contain functionals"
This makes sense, as "diff" needs a symbolic variable, not a function. Since this failed, I decided to utilize the functionalDerivative command (which coincidentally requires much less code and appears to be more efficient overall), as shown in the code below:
syms theta_c(t) theta_hc(t) z_h(t) I_eq I_h m_h k z_hi Q_c f(t)
%setting assumptions
assume(I_eq,'positive');
assume(I_h,'positive');
assume(m_h,'positive');
assume(z_hi,'positive');
assume(k,'positive');
%constraint equations
theta_h=theta_c+theta_hc;
g=compose(f,theta_hc);
%Lagrangian
L=0.5*I_eq*diff(theta_c,t)^2+0.5*I_h*diff(theta_h,t)^2+0.5*m_h*diff(z_h,t)^2-0.5*k*(z_hi+z_h)^2;
%substitution
L=subs(L,z_h,g); %note that theta_h substitution occurs automatically
%Equations of Motion
EOM_theta_c=functionalDerivative(L,theta_c) == Q_c;
EOM_theta_hc=functionalDerivative(L,theta_hc) == 0;
%isolating accelerations
theta_c_2dot=isolate(EOM_theta_c,diff(theta_c,t,t));
theta_hc_2dot=isolate(EOM_theta_hc,diff(theta_hc,t,t));
Question
The resulting equations of motion were given by MATLAB as:
When applying the Euler-Lagrange equation by hand the equation for is correct, but the equation for has an incorrect sign for as it should be positive. Checking just the functional derivative gives:
test=functionalDerivative(L,theta_c)
test(t)=
I am confused as to why these terms have a negative sign, as according to the standard form of the Euler-Lagrange equation these terms should be positive. In cases where the functional derivative is equal to zero (conservative systems), this negative sign has no bearing on the result, as it can simply be divided out. However, as I have shown, in my (non-conservative) case the negative sign does affect the result. Why is this negative sign added when the functional derivative is calculated?
Possible Fix
I have managed to mitigate this behavior by adjusting my code:
%Equations of Motion
EOM_theta_c=-functionalDerivative(L,theta_c) == Q_c;
EOM_theta_hc=-functionalDerivative(L,theta_hc) == 0;
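For context, this sign appears to follow the documented convention: functionalDerivative(L,y) returns the first variation dL/dy - d/dt(dL/dy'), so the acceleration term comes out negated relative to the d/dt(dL/dq') - dL/dq = Q arrangement. A minimal one-DOF check (a hypothetical example, not the system above):

```matlab
% Hypothetical one-DOF sanity check of the sign convention
syms q(t) m k
Lq = 0.5*m*diff(q,t)^2 - 0.5*k*q^2;
fd = functionalDerivative(Lq,q)
% fd = dL/dq - d/dt(dL/dq') should give -k*q - m*q'', so moving a
% generalized force to the rhs requires the negation shown above:
% -functionalDerivative(Lq,q) == Q
```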
My only worry when moving to a more complex system is whether the behavior of the functionalDerivative command will be consistent as far as signs are concerned.
lagrangian, functional derivative, composite functions MATLAB Answers — New Questions
In system identification, what if the input is a PWM signal?
Hello !
I'm using the System Identification Toolbox.
I already got the step response from an RC servo motor.
The problem is that the input signal type is PWM:
I gave a step signal through PWM (1 ms to 2 ms).
How can I identify this 1 ms to 2 ms PWM signal for use with the System Identification Toolbox?
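One possible approach (a sketch under assumptions; variable names are hypothetical): use the commanded pulse width (1–2 ms), sampled at the servo frame rate, as the model input instead of the raw PWM waveform, and package it with iddata:

```matlab
% Sketch: pulse width as input u, servo position as output y (hypothetical names)
Ts = 0.02;                 % assumed 50 Hz servo frame period
u  = pulse_width_ms(:);    % commanded pulse width, stepping from 1 ms to 2 ms
y  = servo_angle(:);       % measured servo response
data = iddata(y, u, Ts);   % dataset for the System Identification Toolbox
```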
system identification, pwm, power_electronics_control, electric_motor_control, power_conversion_control MATLAB Answers — New Questions
SHINE Toolbox mssim Error
Hello,
When I use the luminance-match code in the SHINE Toolbox, I get the ssim error shown below. What can I do to fix the error?
Output argument "mssim" (and possibly others) not assigned a value in the execution with "ssim_index" function.
Error in SHINE (line 400)
mssim = ssim_index(images_orig{im},images{im});
shine, shinetoolbox, ssim, mssim, ssim_index, luminance, luminancematch MATLAB Answers — New Questions
Unable to generate RLAgent
Hi,
I am new to the Reinforcement Learning Toolbox and its utilities. I have created a continuous observation and action environment based on my requirements.
%% Code for Setting up Environment
% Author : Srivatsank P
% Created : 05/12/2024
% Edited : 05/13/2024
clear;
clc;
%% Setup Dynamics
% learned dynamics
model_path = 'doubleInt2D_net_longtime.mat';
% Load neural dynamics
load(model_path);
nnet = FCNReLUDyns([6, 5, 5, 4], 'state_idx', 1:4, 'control_idx', 5:6);
nnet = nnet.load_model_from_SeriesNetwork(net);
% Load black box agent
A = doubleInt2D_NN(nnet);
%% Create RL Environment
% States
ObservationInfo = rlNumericSpec([8 1]);
ObservationInfo.Name = 'DoubleInt2D_States';
ObservationInfo.Description = 'x,y,xdot,ydot,x_goal,y_goal,xdot_goal,ydot_goal';
ObservationInfo.LowerLimit = [A.min_x;A.min_x];
ObservationInfo.UpperLimit = [A.max_x;A.max_x];
% Control Variables
ActionInfo = rlNumericSpec([2 1]);
ActionInfo.Name = 'DoubleInt2D_Control';
ActionInfo.Description = 'u1,u2';
ActionInfo.LowerLimit = A.min_u;
ActionInfo.UpperLimit = A.max_u;
% Functions that define Initialization and Reward Calculation
reset_handle = @() reset_dynamics(A);
reward_handle = @(Action,LoggedSignals) dynamics_and_reward(Action,LoggedSignals,A);
% Environment
doubleInt2D_env = rlFunctionEnv(ObservationInfo,ActionInfo,reward_handle,reset_handle);
doubleInt2D_NN is an equivalent NN model I am using for the dynamics, the details of which I unfortunately can't share.
After this, I tried using "Reinforcement Learning Designer" to generate an agent. Agent details are shown below:
I run into the following error on my console:
Warning: Error occurred while executing the listener callback for event ButtonPushed defined for class matlab.ui.control.Button:
Error using dlnetwork/connectLayers (line 250)
Dot indexing is not supported for variables of this type.
Error in rl.util.default.createSingleChannelOutNet (line 12)
Net = connectLayers(Net,BridgeOutputName,OutputLayerName);
Error in rl.function.rlContinuousDeterministicActor.createDefault (line 200)
[actorNet,actionLayerName] = rl.util.default.createSingleChannelOutNet(inputGraph,bridgeOutputName,numOutput);
Error in rlTD3Agent (line 99)
Actor = rl.function.rlContinuousDeterministicActor.createDefault(ObservationInfo, ActionInfo, InitOptions);
Error in rl.util.createAgentFactory (line 16)
Agent = rlTD3Agent(Oinfo,Ainfo,AgentInitOpts);
Error in rl.util.createAgentFromEnvFactory (line 11)
Agent = rl.util.createAgentFactory(Type,Oinfo,Ainfo,AgentInitOpts);
Error in rl.internal.app.dialog.NewAgentFromEnvironmentDialog/createAgent (line 297)
Agent = rl.util.createAgentFromEnvFactory(AgentType,Env,AgentInitOpts);
Error in rl.internal.app.dialog.NewAgentFromEnvironmentDialog/callbackOK (line 274)
Agent = createAgent(obj);
Error in rl.internal.app.dialog.NewAgentFromEnvironmentDialog>@(es,ed)callbackOK(obj) (line 183)
addlistener(obj.OKButton,'ButtonPushed',@(es,ed) callbackOK(obj));
Error in appdesservices.internal.interfaces.model.AbstractModel/executeUserCallback (line 282)
notify(obj, matlabEventName, matlabEventData);
Error in matlab.ui.control.internal.controller.ComponentController/handleUserInteraction (line 442)
obj.Model.executeUserCallback(callbackInfo{:});
Error in matlab.ui.control.internal.controller.PushButtonController/handleEvent (line 95)
obj.handleUserInteraction('ButtonPushed', event.Data, {'ButtonPushed', eventData});
Error in appdesservices.internal.interfaces.controller.AbstractController>@(varargin)obj.handleEvent(varargin{:}) (line 214)
obj.Listeners = addlistener(obj.ViewModel, 'peerEvent', @obj.handleEvent);
Error in
viewmodel.internal.factory.ManagerFactoryProducer>@(src,event)callback(src,viewmodel.internal.factory.ManagerFactoryProducer.convertStructToEventData(event))
(line 79)
proxyCallback = @(src, event)callback(src, ...
> In appdesservices.internal.interfaces.model/AbstractModel/executeUserCallback (line 282)
In matlab.ui.control.internal.controller/ComponentController/handleUserInteraction (line 442)
In matlab.ui.control.internal.controller/PushButtonController/handleEvent (line 95)
In appdesservices.internal.interfaces.controller.AbstractController>@(varargin)obj.handleEvent(varargin{:}) (line 214)
In viewmodel.internal.factory.ManagerFactoryProducer>@(src,event)callback(src,viewmodel.internal.factory.ManagerFactoryProducer.convertStructToEventData(event)) (line 79)
>>
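One thing that might be worth trying (a sketch, assuming the ObservationInfo and ActionInfo specs from the code above; it may reproduce the same error if the fault is in the default-network builder, but it at least isolates the problem from the app):

```matlab
% Sketch: create the TD3 agent programmatically instead of via the Designer
initOpts = rlAgentInitializationOptions('NumHiddenUnit',64);
agent = rlTD3Agent(ObservationInfo, ActionInfo, initOpts);
```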
I run into the same error even with the default provided environments. I am currently on R2024a and have tried the same on Linux(ubuntu) and Windows 11. I do not know how to proceed and am unable to create agents. TIA for help!Hi,
I am new to Reinforcement Learning Toolbox and its utilities. I have created a continuous Observation and Action Environment basis my requirements.
%% Code for Setting up Environment
% Author : Srivatsank P
% Created : 05/12/2024
% Edited : 05/13/2024
clear;
clc;
%% Setup Dynamics
% learned dynamics
model_path = ‘doubleInt2D_net_longtime.mat’;
% Load neural dynamics
load(model_path);
nnet = FCNReLUDyns([6, 5, 5, 4], ‘state_idx’, 1:4, ‘control_idx’, 5:6);
nnet = nnet.load_model_from_SeriesNetwork(net);
% Load black box agent
A = doubleInt2D_NN(nnet);
%% Create RL Environment
% States
ObservationInfo = rlNumericSpec([8 1]);
ObservationInfo.Name = ‘DoubleInt2D_States’;
ObservationInfo.Description = ‘x,y,xdot,ydot,x_goal,y_goal,xdot_goal,ydot_goal’;
ObservationInfo.LowerLimit = [A.min_x;A.min_x];
ObservationInfo.UpperLimit = [A.max_x;A.max_x];
% Control Variables
ActionInfo = rlNumericSpec([2 1]);
ActionInfo.Name = ‘DoubleInt2D_Control’;
ActionInfo.Description = ‘u1,u2’;
ActionInfo.LowerLimit = A.min_u;
ActionInfo.UpperLimit = A.max_u;
% Functions that define Initialization and Reward Calculation
reset_handle = @() reset_dynamics(A);
reward_handle = @(Action,LoggedSignals) dynamics_and_reward(Action,LoggedSignals,A);
%Environment
doubleInt2D_env = rlFunctionEnv(ObservationInfo,ActionInfo,reward_handle,reset_handle);
doubleInt2D_NN is an equivalent NN model I am using for the dynamics, the details of which I can’t share unfortunately.
After this, I tried using "Reinforcement Learning Designer" to generate an agent. Agent details are shown below:
I run into the following error on my console:
Warning: Error occurred while executing the listener callback for event ButtonPushed defined for class matlab.ui.control.Button:
Error using dlnetwork/connectLayers (line 250)
Dot indexing is not supported for variables of this type.
Error in rl.util.default.createSingleChannelOutNet (line 12)
Net = connectLayers(Net,BridgeOutputName,OutputLayerName);
Error in rl.function.rlContinuousDeterministicActor.createDefault (line 200)
[actorNet,actionLayerName] = rl.util.default.createSingleChannelOutNet(inputGraph,bridgeOutputName,numOutput);
Error in rlTD3Agent (line 99)
Actor = rl.function.rlContinuousDeterministicActor.createDefault(ObservationInfo, ActionInfo, InitOptions);
Error in rl.util.createAgentFactory (line 16)
Agent = rlTD3Agent(Oinfo,Ainfo,AgentInitOpts);
Error in rl.util.createAgentFromEnvFactory (line 11)
Agent = rl.util.createAgentFactory(Type,Oinfo,Ainfo,AgentInitOpts);
Error in rl.internal.app.dialog.NewAgentFromEnvironmentDialog/createAgent (line 297)
Agent = rl.util.createAgentFromEnvFactory(AgentType,Env,AgentInitOpts);
Error in rl.internal.app.dialog.NewAgentFromEnvironmentDialog/callbackOK (line 274)
Agent = createAgent(obj);
Error in rl.internal.app.dialog.NewAgentFromEnvironmentDialog>@(es,ed)callbackOK(obj) (line 183)
addlistener(obj.OKButton,’ButtonPushed’,@(es,ed) callbackOK(obj));
Error in appdesservices.internal.interfaces.model.AbstractModel/executeUserCallback (line 282)
notify(obj, matlabEventName, matlabEventData);
Error in matlab.ui.control.internal.controller.ComponentController/handleUserInteraction (line 442)
obj.Model.executeUserCallback(callbackInfo{:});
Error in matlab.ui.control.internal.controller.PushButtonController/handleEvent (line 95)
obj.handleUserInteraction(‘ButtonPushed’, event.Data, {‘ButtonPushed’, eventData});
Error in appdesservices.internal.interfaces.controller.AbstractController>@(varargin)obj.handleEvent(varargin{:}) (line 214)
obj.Listeners = addlistener(obj.ViewModel, ‘peerEvent’, @obj.handleEvent);
Error in
viewmodel.internal.factory.ManagerFactoryProducer>@(src,event)callback(src,viewmodel.internal.factory.ManagerFactoryProducer.convertStructToEventData(event))
(line 79)
proxyCallback = @(src, event)callback(src, …
> In appdesservices.internal.interfaces.model/AbstractModel/executeUserCallback (line 282)
In matlab.ui.control.internal.controller/ComponentController/handleUserInteraction (line 442)
In matlab.ui.control.internal.controller/PushButtonController/handleEvent (line 95)
In appdesservices.internal.interfaces.controller.AbstractController>@(varargin)obj.handleEvent(varargin{:}) (line 214)
In viewmodel.internal.factory.ManagerFactoryProducer>@(src,event)callback(src,viewmodel.internal.factory.ManagerFactoryProducer.convertStructToEventData(event)) (line 79)
>>
I run into the same error even with the default provided environments. I am currently on R2024a and have tried the same on Linux(ubuntu) and Windows 11. I do not know how to proceed and am unable to create agents. TIA for help! Hi,
I am new to Reinforcement Learning Toolbox and its utilities. I have created a continuous Observation and Action Environment basis my requirements.
%% Code for Setting up Environment
% Author : Srivatsank P
% Created : 05/12/2024
% Edited : 05/13/2024
clear;
clc;
%% Setup Dynamics
% learned dynamics
model_path = ‘doubleInt2D_net_longtime.mat’;
% Load neural dynamics
load(model_path);
nnet = FCNReLUDyns([6, 5, 5, 4], ‘state_idx’, 1:4, ‘control_idx’, 5:6);
nnet = nnet.load_model_from_SeriesNetwork(net);
% Load black box agent
A = doubleInt2D_NN(nnet);
%% Create RL Environment
% States
ObservationInfo = rlNumericSpec([8 1]);
ObservationInfo.Name = ‘DoubleInt2D_States’;
ObservationInfo.Description = ‘x,y,xdot,ydot,x_goal,y_goal,xdot_goal,ydot_goal’;
ObservationInfo.LowerLimit = [A.min_x;A.min_x];
ObservationInfo.UpperLimit = [A.max_x;A.max_x];
% Control Variables
ActionInfo = rlNumericSpec([2 1]);
ActionInfo.Name = ‘DoubleInt2D_Control’;
ActionInfo.Description = ‘u1,u2’;
ActionInfo.LowerLimit = A.min_u;
ActionInfo.UpperLimit = A.max_u;
% Functions that define Initialization and Reward Calculation
reset_handle = @() reset_dynamics(A);
reward_handle = @(Action,LoggedSignals) dynamics_and_reward(Action,LoggedSignals,A);
%Environment
doubleInt2D_env = rlFunctionEnv(ObservationInfo,ActionInfo,reward_handle,reset_handle);
doubleInt2D_NN is an equivalent NN model I am using for the dynamics, the details of which I can’t share unfortunately.
After this, I tried using "Reinforcement Learning Designer" to generate an agent. Agent details are shown below:
I run into the following error on my console:
Warning: Error occurred while executing the listener callback for event ButtonPushed defined for class matlab.ui.control.Button:
Error using dlnetwork/connectLayers (line 250)
Dot indexing is not supported for variables of this type.
Error in rl.util.default.createSingleChannelOutNet (line 12)
Net = connectLayers(Net,BridgeOutputName,OutputLayerName);
Error in rl.function.rlContinuousDeterministicActor.createDefault (line 200)
[actorNet,actionLayerName] = rl.util.default.createSingleChannelOutNet(inputGraph,bridgeOutputName,numOutput);
Error in rlTD3Agent (line 99)
Actor = rl.function.rlContinuousDeterministicActor.createDefault(ObservationInfo, ActionInfo, InitOptions);
Error in rl.util.createAgentFactory (line 16)
Agent = rlTD3Agent(Oinfo,Ainfo,AgentInitOpts);
Error in rl.util.createAgentFromEnvFactory (line 11)
Agent = rl.util.createAgentFactory(Type,Oinfo,Ainfo,AgentInitOpts);
Error in rl.internal.app.dialog.NewAgentFromEnvironmentDialog/createAgent (line 297)
Agent = rl.util.createAgentFromEnvFactory(AgentType,Env,AgentInitOpts);
Error in rl.internal.app.dialog.NewAgentFromEnvironmentDialog/callbackOK (line 274)
Agent = createAgent(obj);
Error in rl.internal.app.dialog.NewAgentFromEnvironmentDialog>@(es,ed)callbackOK(obj) (line 183)
addlistener(obj.OKButton,’ButtonPushed’,@(es,ed) callbackOK(obj));
Error in appdesservices.internal.interfaces.model.AbstractModel/executeUserCallback (line 282)
notify(obj, matlabEventName, matlabEventData);
Error in matlab.ui.control.internal.controller.ComponentController/handleUserInteraction (line 442)
obj.Model.executeUserCallback(callbackInfo{:});
Error in matlab.ui.control.internal.controller.PushButtonController/handleEvent (line 95)
obj.handleUserInteraction(‘ButtonPushed’, event.Data, {‘ButtonPushed’, eventData});
Error in appdesservices.internal.interfaces.controller.AbstractController>@(varargin)obj.handleEvent(varargin{:}) (line 214)
obj.Listeners = addlistener(obj.ViewModel, ‘peerEvent’, @obj.handleEvent);
Error in
viewmodel.internal.factory.ManagerFactoryProducer>@(src,event)callback(src,viewmodel.internal.factory.ManagerFactoryProducer.convertStructToEventData(event))
(line 79)
proxyCallback = @(src, event)callback(src, …
> In appdesservices.internal.interfaces.model/AbstractModel/executeUserCallback (line 282)
In matlab.ui.control.internal.controller/ComponentController/handleUserInteraction (line 442)
In matlab.ui.control.internal.controller/PushButtonController/handleEvent (line 95)
In appdesservices.internal.interfaces.controller.AbstractController>@(varargin)obj.handleEvent(varargin{:}) (line 214)
In viewmodel.internal.factory.ManagerFactoryProducer>@(src,event)callback(src,viewmodel.internal.factory.ManagerFactoryProducer.convertStructToEventData(event)) (line 79)
>>
I run into the same error even with the default provided environments. I am currently on R2024a and have tried the same on both Linux (Ubuntu) and Windows 11. I do not know how to proceed and am unable to create agents. Thanks in advance for any help! reinforcement learning MATLAB Answers — New Questions
How to copy-paste heading and its content in MS word from Matlab?
Hi everyone, I’m working on a MATLAB script that writes a report in MS Word as follows (the following code was written using the Office VBA Reference and AI tools as references):
clc; clear all; close all;
% Create a new Word document from the template
word = actxserver('Word.Application'); % COM automation server
word.Visible = 1;
doc = word.Documents.Add(tmpl_path);
headings = doc.GetCrossReferenceItems(1) % here I get all the headings of my .docx
% Find the index of the heading you want to copy
heading_index = find(strcmp(headings, ' 2.1 TR-x-xxx'));
% Get the range of the heading and its content
start_range = doc.Paragraphs.Item(heading_index).Range.Start;
end_range = doc.Paragraphs.Item(heading_index + 1).Range.End; % end of the next paragraph after the heading
content_range = doc.Range(start_range, end_range);
% Copy the content
content_range.Copy;
Now I would like to copy a heading (TR-x-xxx) and its content (see the following image), as I would do in MS Word if I weren’t using MATLAB:
Once selected and copied, I would like to get the following result (I would like to be able to copy-paste as many headings and their content as I want):
Is my code right? How do I proceed? actxserver, word, vba MATLAB Answers — New Questions
Computer Account Log in problem, help!
When I set up my new laptop, I signed in using my Microsoft Account, email address removed for privacy reasons, along with a strong password. Initially, I started using a PIN for convenience. I considered switching to a Passkey but decided against it. Now, whenever I try to log in, it displays my name and photo, but instead of showing my Gmail address, it lists an account as outlook_a long string of numbers.com.
How do I download video from blob link on Windows 11 ?
I’m currently facing an issue with downloading a video from a website that uses a blob URL. I noticed that the video content I’m trying to download is embedded in the webpage and is accessed via a blob link, which typically starts with ‘blob:’. Unlike conventional links, I’m unable to directly save the video by right-clicking on it. I’ve tried several methods such as looking through the page source and inspecting elements using developer tools, but I still can’t figure out how to retrieve the actual video file from the blob URL. Could anyone provide guidance or recommend tools that could help with downloading a video file from a blob link?
How can I download a video from bilibili to my PC?
I’m currently facing a challenge with downloading videos from Bilibili for personal offline viewing on my Windows 11 PC. Despite following several guides and tutorials found online, I’ve yet to find a method that works consistently. Most tools I’ve tried either fail to capture the videos in their full quality or simply don’t work after a few uses. I am particularly interested in a solution that allows me to download videos at their highest available resolution and ideally, one that is easy to use regularly. Thanks in advance for your insights and assistance!
Demoted domain controller problem with agents
I ran into an issue with two agents on certificate authorities failing to start with LDAP connection errors. The AD site they are in had all of its domain controllers replaced with new servers a few weeks ago. The agent logs showed they were trying to connect to the old DC server names. There is no trace of those servers in DNS or anywhere else in AD that could make them discoverable, and no trace of those server names in the config files or registry, yet somehow the agent wasn’t forgetting them. I had to reinstall the agent to resolve the issue. It seems like this should follow the standard DC discovery process, or a standalone agent should use the existing secure-channel server as the DC it connects to.
Error Code: 3399614467
Correlation Id: 0593b61d-aab6-40a7-bcd1-ce7c5246ccc6
Timestamp: 2024-05-14T00:54:51.000Z
Error Tag: 7q20j
Vortex Genesis AI Reviews™| The official site updated{2024}- Vortex Genesis AI!!
The AI algorithms within the platform are trained to detect patterns, recognize market trends, and identify profitable trading opportunities. This level of accuracy drastically reduces the risk of making poor investment choices and increases the likelihood of achieving substantial returns. Traditionally, trading was primarily reserved for financial institutions and professional traders. However, Vortex Genesis AI is on a mission to democratize trading by making it accessible to everyone. Whether you are a seasoned trader or a novice investor, the platform caters to traders of all skill levels. The user-friendly interface and intuitive design make it easy for anyone to navigate and utilize the powerful features of Vortex Genesis AI.
Get More Information to click here
Better “probabilities”?
I created a small script that presents two random colors to the user. See the attached script.
As you can see, on the left are the two RGB colors generated with the "rand" function; on the right, the colors are plotted on a CIE a*b* diagram. I intend to use this script with my students to discuss "Color Harmonies". This is a "dyad"; next up is a "triad".
First question
What I’d like to know is how "effective" the rand function is. Would there be more "advanced" ways of coming up with those two colors?
The motivation behind this script is to get the students thinking "outside of the box", to move them away from getting "inspiration" from all kinds of "real-world" objects and to look only at the "sensation" produced by the colors.
Second question
I would like to explore generating random colors from my Munsell Book of Color’s 1600 measured CIE Lab colors. This is how I bring in my Munsell color data (a TAB-delimited text file exported from Excel):
% Import TAB-delimited file
file_path = 'Munsell Glossy All Colors Extracted (2024 03 21) TAB.txt';
my_table = readtable(file_path, 'Delimiter', '\t', 'ReadVariableNames', false);
MunsellNotationTMP = my_table{:, 1}; % Text data
MunsellNotation = string(MunsellNotationTMP);
HVCcolumns_Lab = my_table{:, 2:4}; % Numeric data
HVC_Lab = [HVCcolumns_Lab(:, 1), HVCcolumns_Lab(:, 2), HVCcolumns_Lab(:, 3)]; random, munsell MATLAB Answers — New Questions
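Both questions above come down to sampling: the first is uniform sampling of the RGB cube, and the second is picking random rows from the 1600-row measured table. A minimal sketch of both ideas, written in Python purely for illustration (the MATLAB equivalents would be rand(1,3) and randperm; the lab_rows table below is a hypothetical stand-in for the real Munsell data):

```python
import random

def random_rgb():
    """Uniform sample from the RGB cube, like rand(1,3) in MATLAB."""
    return [random.random() for _ in range(3)]

def random_measured_pair(lab_rows):
    """Pick two distinct rows from a table of measured Lab colors."""
    i, j = random.sample(range(len(lab_rows)), 2)  # distinct indices
    return lab_rows[i], lab_rows[j]

# Hypothetical stand-in for the 1600-row measured Lab table
lab_rows = [(i * 0.0625, (i % 40) - 20, (i % 50) - 25) for i in range(1600)]
c1, c2 = random_measured_pair(lab_rows)
```

As far as "effectiveness" goes, the default generators behind rand are statistically uniform and more than adequate for picking classroom color pairs; sampling rows of the measured table simply trades uniformity over the RGB cube for uniformity over the physically realized Munsell colors.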
fmincon get the wrong answer
I have an optimization problem where fmincon gets (what looks like) the wrong answer. There is a power demand (2) that can be generated or bought at different prices; the code is as follows:
a = 0.005;
b = 6;
c = 100;
% Define the cost function for generating energy (quadratic cost function) for the current microgrid
Cg_i = @(Ec, a, b, c) a * Ec^2 + b * Ec + c;
% Given prices
prices = [33, 26];
mprice = 27; % Given value of mprice
% Define the cost function for buying energy
Cb_i = @(Ec, price) price * Ec;
% Given total energy constraint
Ec_total = 2; % Given value of Ec_total
% Define the objective function
objective = @(X) Cg_i(X(1), a, b, c) + Cb_i(X(2), prices(1)) + Cb_i(X(3), prices(2)) + Cb_i(X(4), mprice);
% Initial guess for Ec_g, Ec1, Ec2
initial_guess = [Ec_total / 3, Ec_total / 3, Ec_total / 3, Ec_total / 3];
% initial_guess = [0, 0, Ec_total, 0];
% Linear equality constraint for the sum of Ec_g, Ec1, Ec2
Aeq = [1, 1, 1, 1];
beq = Ec_total;
% Options for fmincon
options = optimoptions('fmincon', 'Display', 'iter');
% Define solver options
% Call fmincon with specified options
X = fmincon(objective, initial_guess, [], [], Aeq, beq, [0, 0, 0,0], [], [], options)
MATLAB’s answer is X = [2 0 0 0], i.e., generate 2 and buy nothing. That looks wrong, because buying at price 26 costs 26*2 = 52 while generating costs 112! fmincon MATLAB Answers — New Questions
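One thing worth noting when checking the two alternatives by hand: the objective always includes Cg_i(X(1)), whose constant term c = 100 is paid even when X(1) = 0, so "buy everything" still incurs the fixed generation cost. A quick check of both candidate solutions, sketched here in Python since it is plain arithmetic:

```python
a, b, c = 0.005, 6, 100
prices = [33, 26]
mprice = 27

def objective(X):
    # Generation cost (quadratic, with fixed term c) plus the three purchase costs
    Cg = a * X[0] ** 2 + b * X[0] + c
    return Cg + prices[0] * X[1] + prices[1] * X[2] + mprice * X[3]

print(objective([2, 0, 0, 0]))  # generate everything -> 112.02
print(objective([0, 0, 2, 0]))  # buy everything at 26 -> 152.0 (c is still paid)
```

Under this objective, 112.02 < 152, so the reported X may actually be the true minimizer; the hand comparison of 52 vs. 112 omits the constant c that is charged regardless of how much is generated.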
I WANT AN ANSWER FOR THIS DIVIDED DIFFERENCE PROBLEM.
#include <stdio.h>
#include <conio.h>

void main()
{
    int x[10], y[10], p[10];
    int k, f, n, i, j = 1, f1 = 1, f2 = 0;

    printf("\nEnter the number of observations:\n");
    scanf("%d", &n);
    printf("\nEnter the different values of x:\n");
    for (i = 1; i <= n; i++)
        scanf("%d", &x[i]);
    printf("\nThe corresponding values of y are:\n");
    for (i = 1; i <= n; i++)
        scanf("%d", &y[i]);
    f = y[1];
    printf("\nEnter the value of 'k' in f(k) you want to evaluate:\n");
    scanf("%d", &k);
    do {
        for (i = 1; i <= n - 1; i++) {
            p[i] = ((y[i+1] - y[i]) / (x[i+j] - x[i]));
            y[i] = p[i];
        }
        f1 = 1;
        for (i = 1; i <= j; i++) {
            f1 *= (k - x[i]);
        }
        f2 += (y[1] * f1);
        n--;
        j++;
    } while (n != 1);
    f += f2;
    printf("\nf(%d) = %d", k, f);
    getch();
}
divided difference, matlab MATLAB Answers — New Questions
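For reference, the same algorithm (Newton's divided-difference interpolation) in floating point; note that the integer division in the C code above truncates every difference quotient, which is one likely source of wrong answers. This is a Python sketch of the technique, not a line-by-line fix of the original program:

```python
def divided_difference(xs, ys, k):
    """Evaluate Newton's divided-difference interpolating polynomial at k."""
    n = len(xs)
    coef = list(map(float, ys))           # coef[i] ends up as f[x_0, ..., x_i]
    for j in range(1, n):                 # build the divided-difference table in place
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    result = coef[-1]                     # Horner-style evaluation of the Newton form
    for i in range(n - 2, -1, -1):
        result = result * (k - xs[i]) + coef[i]
    return result

# Quadratic data y = x^2 is reproduced exactly by the interpolant: f(4) = 16
print(divided_difference([1, 2, 3], [1, 4, 9], 4))  # -> 16.0
```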
Can I increase visibility of text insertion cursor in Matlab Editor?
Using Matlab 2016a on a Windows 10 system:
In the Editor, I use a black background with light text of various colors. The text is easily visible, but the text insertion cursor (the little vertical line segment) is thin and dim, and hard for me to see unless I’m staring right at it. The Win10 OS (under "ease of access" or "accessibility") has a setting for width of the text insertion cursor line, but this only affects the insertion line in Matlab’s command window; the insertion line in the editor is unaffected. Matlab preferences appear to have no setting for this. Any suggestions? text insertion cursor MATLAB Answers — New Questions