How can I implement a dual LSTM network in MATLAB?
I want to implement a dual LSTM network in MATLAB. How can I do it? When I run this code, I get the error:
"Network: Invalid input layers. Network must have at most one sequence input layer".
How can I solve it? I would be grateful for a quick solution.
My objective is to train different types of features with separate LSTM models and concatenate their outputs into a fully connected layer to get a single classification output.
Is this possible in MATLAB?
inputSize1 = 4;
inputSize2 = 20;
numClasses = 5;
layers1 = [ ...
    sequenceInputLayer(inputSize1, 'Name', 'input1')
    lstmLayer(64, 'OutputMode', 'last', 'Name', 'lstm1')
    dropoutLayer(0.2, 'Name', 'dropout1')
    fullyConnectedLayer(64, 'Name', 'fc1')];
layers2 = [ ...
    sequenceInputLayer(inputSize2, 'Name', 'input2')
    lstmLayer(64, 'OutputMode', 'last', 'Name', 'lstm2')
    dropoutLayer(0.2, 'Name', 'dropout2')
    fullyConnectedLayer(64, 'Name', 'fc2')];
combinedLayers = [ ...
    concatenationLayer(1, 2, 'Name', 'concat')
    fullyConnectedLayer(64, 'Name', 'fc_combined')
    reluLayer('Name', 'relu')
    fullyConnectedLayer(numClasses, 'Name', 'fc_final')
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'classification')];
lgraph = layerGraph();
lgraph = addLayers(lgraph, layers1);
lgraph = addLayers(lgraph, layers2);
lgraph = addLayers(lgraph, combinedLayers);
lgraph = connectLayers(lgraph, 'fc1', 'concat/in1');
lgraph = connectLayers(lgraph, 'fc2', 'concat/in2');
plot(lgraph);
options = trainingOptions('adam', ...
    'InitialLearnRate', 0.001, ...
    'MaxEpochs', 10, ...
    'MiniBatchSize', 32, ...
    'Shuffle', 'once', ...
    'Plots', 'training-progress', ...
    'Verbose', false);
net = trainNetwork(Normalized_data, y_train, lgraph, options);
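For what it's worth, the error message suggests that `trainNetwork` in this release only accepts networks with a single sequence input layer. One possible workaround (a sketch, not a definitive fix, and it assumes both feature sets share the same sequence length per observation) is to stack the two feature sets along the channel dimension and feed them to a single sequence input:

```matlab
% Sketch: merge the two feature sets into one 24-channel sequence input.
% X1 and X2 are assumed to be cell arrays of C-by-T sequences (4-by-T and
% 20-by-T respectively) with matching sequence lengths.
inputSize = inputSize1 + inputSize2;   % 4 + 20 = 24 features per time step
XCombined = cellfun(@(a, b) [a; b], X1, X2, 'UniformOutput', false);

layers = [ ...
    sequenceInputLayer(inputSize, 'Name', 'input')
    lstmLayer(64, 'OutputMode', 'last', 'Name', 'lstm')
    dropoutLayer(0.2, 'Name', 'dropout')
    fullyConnectedLayer(numClasses, 'Name', 'fc')
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'classification')];

net = trainNetwork(XCombined, y_train, layers, options);
```

This trades the two separate LSTM branches for one LSTM over the concatenated features, so it is not an exact equivalent of the dual-branch design. To keep genuinely separate branches, newer releases allow multiple network inputs when the training data is supplied as a combined datastore, or the model can be built as a `dlnetwork` and trained with a custom training loop; checking the documentation for your specific release would confirm which option applies.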