Neural network for regression task: Circuit modeling
Hi everyone, I'm experimenting with using a DNN to model an electrical circuit. I have tried many configurations, and so far I have always used one sample as input (sequenceInputLayer(1)) and one sample as output (since this is a regression task). Now, because the circuit I am modeling has memory, I would like to extend the input so that several input samples are fed in at once while keeping the output as a single sample. I am running into problems because my input_data vector is 1167×1, i.e. a single column, so sequenceInputLayer always expects a dimension of 1 and I am stuck with that.
On the other hand, I could transpose my vectors and use 1167 as both the input and output dimension, but I would like more freedom: for example, a network with an input layer of size 300, then two hidden layers of size 50/100 (using tanh, etc.), and finally an output layer of size 1 (again, since this is a regression task). I have already read the MATLAB documentation but did not find what I am looking for. This is the code I am using now; any suggestions would be very welcome!
Maybe I have to use mini-batches? What changes do I need to make to build the network described above?
(Yes, I know that CNNs and RNNs are better suited to dynamical systems with memory, but there are also papers in the literature that use plain DNNs for this type of modeling, i.e. feeding several past samples into a DNN to account for memory effects.)
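To make the idea concrete, here is a rough sketch of the windowing I have in mind, using the input_data and output_data vectors created in the code below (this is only my assumption: a window of 300 past input samples mapped to one output sample, with the window length and the alignment of the output chosen arbitrarily):
winLen = 300; % window length: 300 is just the example size I mentioned
N = numel(input_data) - winLen + 1; % number of windowed examples
X = zeros(N, winLen); % each row = one window of past input samples
Y = zeros(N, 1); % each row = the single output sample for that window
for k = 1:N
X(k,:) = input_data(k : k+winLen-1); % 300 consecutive input samples
Y(k) = output_data(k+winLen-1); % output aligned with the last sample of the window (my choice)
end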
%load Vout.mat %LTspice data
t = (0:5.145e-5:60e-3)'; % creates 1167 samples
input_data=10*sin(200*2*pi*t);
output_data=Vout;
% plot(input_data);hold on
% plot(output_data);
%B = reshape(input_data,[],4); % reshape into a specific number of columns (4)
% Create the deep neural network
layers = [ ...
sequenceInputLayer(1)
fullyConnectedLayer(10)
reluLayer
%lstmLayer(100)
fullyConnectedLayer(10)
sigmoidLayer
fullyConnectedLayer(10)
tanhLayer
fullyConnectedLayer(1) % output layer with a single output
%tanhLayer
];
% Create the dlnetwork
net = dlnetwork(layers);
% Set the training options
options = trainingOptions('adam', ...
'MaxEpochs',1000, ...
'InitialLearnRate',1e-2, ...
'Verbose',false, ...
'Plots','training-progress');
% Train the neural network
net = trainnet(input_data, output_data, net, "huber", options);
%Predict
output_test = predict(net,input_data);
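And this is roughly how I imagine the network and training call would change for the windowed X/Y data sketched above (again only a sketch: I am not sure featureInputLayer is the right layer here, and the hidden sizes 100/50 and the mini-batch size of 64 are just the examples I mentioned):
layersFF = [ ...
featureInputLayer(300) % one example = a window of 300 past input samples
fullyConnectedLayer(100)
tanhLayer
fullyConnectedLayer(50)
tanhLayer
fullyConnectedLayer(1)]; % single regression output
netFF = dlnetwork(layersFF);
optionsFF = trainingOptions('adam', ...
'MaxEpochs',1000, ...
'InitialLearnRate',1e-2, ...
'MiniBatchSize',64, ... % mini-batches over the windowed examples
'Verbose',false, ...
'Plots','training-progress');
netFF = trainnet(X, Y, netFF, "huber", optionsFF); % X is N-by-300, Y is N-by-1
Is this the right direction, or do I need a different input layer / data layout for trainnet?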