LSTM network time series prediction error occurs at the initial time step
I have trained an LSTM network for time series regression. After training, I want to test its performance on the test dataset. The test result for a single sample (extracted from the minibatch results) is shown below:
The prediction result has a transient response at the beginning of the sequence. I think this issue is caused by the zero initial states (CellState and HiddenState) of the LSTM network. How can I resolve this zero-state problem when predicting time series?
lstm, deep learning, time series MATLAB Answers — New Questions
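One common way to shrink that initial transient is to "warm up" the network state on the first part of the test sequence before scoring the rest. The sketch below assumes a trained recurrent network `net`, a test sequence `XTest` (features x time steps), and the Deep Learning Toolbox functions `resetState` and `predictAndUpdateState`; the warm-up length `nWarmup` is an assumption you would tune for your data.

```matlab
% Sketch: seed the LSTM's CellState/HiddenState on a warm-up segment
% so predictions on the remaining steps start from a non-zero state.
nWarmup = 20;                      % assumed warm-up length; tune per dataset
net = resetState(net);             % start explicitly from zero states

% Feed the warm-up portion; only the updated state is kept
[net, ~] = predictAndUpdateState(net, XTest(:, 1:nWarmup));

% Predict the remainder step by step, carrying the state forward
nSteps = size(XTest, 2);
YPred = zeros(1, nSteps - nWarmup);
for t = nWarmup+1:nSteps
    [net, YPred(t - nWarmup)] = predictAndUpdateState(net, XTest(:, t));
end
```

The transient still occurs, but it is confined to the warm-up segment, whose outputs you discard; the reported predictions then start from a state consistent with the preceding data.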