How do I load pretrained LSTM models (in .mat format) into a MATLAB Function Block in Simulink?
I am trying to create a Simulink model for SoC and SoH prediction of a battery, to be run hardware-in-the-loop (HIL) on an OP4512 simulator. Using the Stateful Predict block in MATLAB R2021a did not help, as RT-LAB did not support that block during real-time simulation. So I have tried using a user-defined MATLAB Function block to load two LSTM networks, which were trained in MATLAB code and achieve an RMSE of 0.2365%. I have used the following code for my function block:
function [soc, soh] = prediction(voltage, current, temperature, cycles)
% Load both networks once and cache them across time steps
persistent lstmNet lstmNet1;
if isempty(lstmNet)
    lstmNet = coder.loadDeepLearningNetwork('net25_3layer.mat');
end
if isempty(lstmNet1)
    lstmNet1 = coder.loadDeepLearningNetwork('12_5_23_SOH_optimised_3layer.mat');
end
% z-score normalisation with the training-set statistics
norm_voltage = (voltage - 3.9102)/0.137;
norm_current = (current + 0.8334)/2.4459;
norm_temperature = (temperature - 24.5092)/0.3066;
norm_cycles = (cycles - 174.9916)/114.5728;
% SoC network: transpose so features form a column vector
X = [norm_voltage, norm_current, norm_temperature];
norm_soc = predict(lstmNet, X');
soc = norm_soc*21.612 + 63.0467;
% SoH network takes the normalised SoC and cycle count
X1 = [norm_soc, norm_cycles];
norm_soh = predict(lstmNet1, X1');
soh = norm_soh*33.6548 + 46.8586;
end
Also, I have set my simulation target language to C++.
When I run this model, I find that the output falls in a different range (for SoC, 63%–78%) than the output data in the dataset (for SoC, 28%–100%). I believe this is because loadDeepLearningNetwork does not carry any recurrent state across calls to predict (correct me if I am wrong).
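For context, one approach I have seen suggested for keeping recurrent state alive in a MATLAB Function block is to call predictAndUpdateState instead of predict, reassigning the returned network object to the persistent variable so that the hidden/cell state carries over to the next time step. The sketch below is only an illustration of that idea applied to my function, assuming predictAndUpdateState is supported for the loaded network type in C++ code generation; I have not verified this on the OP4512 target:

```matlab
function [soc, soh] = prediction_stateful(voltage, current, temperature, cycles)
% Sketch: carry LSTM state across time steps via predictAndUpdateState.
persistent socNet sohNet;
if isempty(socNet)
    socNet = coder.loadDeepLearningNetwork('net25_3layer.mat');
end
if isempty(sohNet)
    sohNet = coder.loadDeepLearningNetwork('12_5_23_SOH_optimised_3layer.mat');
end
% Same z-score normalisation as before, features as a column vector
x = [(voltage - 3.9102)/0.137; ...
     (current + 0.8334)/2.4459; ...
     (temperature - 24.5092)/0.3066];
% predictAndUpdateState returns the updated network along with the output;
% storing it back in the persistent variable preserves the recurrent state
[socNet, norm_soc] = predictAndUpdateState(socNet, x);
soc = norm_soc*21.612 + 63.0467;
x1 = [norm_soc; (cycles - 174.9916)/114.5728];
[sohNet, norm_soh] = predictAndUpdateState(sohNet, x1);
soh = norm_soh*33.6548 + 46.8586;
end
```

Whether this produces the full output range in my setup is exactly what I am unsure about, hence the question below.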
Is there any way I can load these two LSTM models into the function block such that I get the appropriate output?
Find the two .mat files (LSTM models) and the input variables in the soc_sim_ws.mat file.

simulink, lstm, function block MATLAB Answers — New Questions