Using tapped delays when training an artificial neural network for dynamic system modeling.
Hello, I’m trying to use an artificial neural network to create a model for the system:
From what I have gathered so far, I should first record the response of the system to an arbitrary input and use that data to train my network. After discretisation, this second-order difference equation () would describe the dynamic behaviour of the system.
Now, what is the method for implementing these delays in my neural network? Should I just feed these delayed outputs and inputs of the plant as inputs to my neural network, or is there an easier way to achieve this?
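Feeding the delayed plant signals directly to the network is indeed the standard tapped-delay (NARX-style) approach: each training sample is the vector of past outputs and inputs, and the target is the current output. A minimal sketch of building such a training set, using a hypothetical second-order plant (the coefficients below are invented for illustration, not the asker's system) and a linear least-squares fit standing in for the network:

```python
import numpy as np

# Hypothetical second-order plant for illustration:
# y[k] = 1.5*y[k-1] - 0.7*y[k-2] + 0.5*u[k-1] + 0.2*u[k-2]
rng = np.random.default_rng(0)
u = rng.uniform(-1.0, 1.0, 500)  # arbitrary excitation input
y = np.zeros_like(u)
for k in range(2, len(u)):
    y[k] = 1.5*y[k-1] - 0.7*y[k-2] + 0.5*u[k-1] + 0.2*u[k-2]

# Tapped delays: stack the delayed outputs and inputs as regressors.
# Row k holds [y[k-1], y[k-2], u[k-1], u[k-2]]; the target is y[k].
X = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])
t = y[2:]

# A linear least-squares fit stands in for the network here; a small
# feedforward net trained on the same (X, t) pairs follows the identical
# recipe, just with a nonlinear map instead of the weight vector w.
w, *_ = np.linalg.lstsq(X, t, rcond=None)
print(np.round(w, 3))  # recovers the plant coefficients [1.5, -0.7, 0.5, 0.2]
```

The same windowed (X, t) pairs can be handed to any static network trainer, which is why no special recurrent machinery is needed for this form of model; MATLAB automates exactly this delay bookkeeping in its time-series network tooling.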