How can I accelerate deep learning training using a GPU?
I've made a simple neural network. It classifies MNIST handwritten digits using fully connected layers:
lgraph_2 = [ ...
    imageInputLayer([28 28 1])
    fullyConnectedLayer(512)
    reluLayer
    fullyConnectedLayer(256)
    reluLayer
    fullyConnectedLayer(128)
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];
And the training options are:
miniBatchSize = 10;
valFrequency = 5;
options = trainingOptions('sgdm', ...
    'MiniBatchSize',miniBatchSize, ...
    'MaxEpochs',5, ...
    'InitialLearnRate',3e-4, ...
    'Shuffle','every-epoch', ...
    'ValidationData',augimdsValidation, ...
    'ValidationFrequency',valFrequency, ...
    'Verbose',true, ...
    'Plots','training-progress', ...
    'ExecutionEnvironment','parallel');
I expected that using a GPU would make training much faster.
But when I train this network on my MacBook (single CPU), it takes about 1 hour for around 2500 iterations, and when I train on my desktop with an RTX 2080 Ti, it takes even longer.
MATLAB detects my GPU properly (I checked the GPU information using gpuDevice).
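For reference, this is roughly the check I ran (the exact property values shown are from my machine; output fields can vary by release):

```matlab
% Confirm that MATLAB (Parallel Computing Toolbox) sees the CUDA device
g = gpuDevice;               % selects and returns the current GPU
disp(g.Name)                 % shows the device name, e.g. the RTX 2080 Ti
disp(g.ComputeCapability)    % CUDA compute capability of the device
```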
I don't know how to accelerate the training process.
Thank you in advance!