Removing bias drift from integration of noisy signal
Hello,
I am working on a Simulink simulation where I need to perform a double integration of a noisy signal. The input signal is generated by exciting the system with a chirp signal that varies in frequency from 0.01 Hz to 10 Hz. To reduce noise before integration, I apply a moving mean filter.
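The excitation-plus-smoothing stage can be sketched offline like this (Python used purely for illustration; the sample rate, record length, noise level, and window length are assumptions, since the post does not state them):

```python
import numpy as np
from scipy.signal import chirp

fs = 100.0                                     # sample rate in Hz (assumed)
t = np.arange(0.0, 100.0, 1.0 / fs)            # 100 s record (assumed)
clean = chirp(t, f0=0.01, t1=t[-1], f1=10.0)   # 0.01 Hz -> 10 Hz sweep, as in the post
rng = np.random.default_rng(0)
noisy = clean + 0.1 * rng.standard_normal(t.size)  # additive noise (illustrative level)

# Moving-mean filter: convolution with a normalized boxcar window
N = 11                                         # window length (assumed)
smoothed = np.convolve(noisy, np.ones(N) / N, mode="same")
```

Note that a moving mean is a low-pass operation: it attenuates broadband noise but passes any DC offset untouched, which matters for the drift issue below.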
However, despite the filtering, the double-integrated signal exhibits an unwanted upward trend (bias drift) that should not be present. I suspect it comes from residual DC and low-frequency components, which the integrators accumulate into a growing trend, possibly combined with numerical drift.
What would be the best approach to mitigate this bias? Should I use a different filtering technique, remove the DC component, or apply a different integration method?
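One way to see why the drift appears, and one candidate mitigation, is sketched below (again an offline Python illustration, not the actual Simulink model; the test signal and its 0.01 DC bias are made up for the demonstration). A constant bias b integrates to b·t and then to b·t²/2, so even a tiny residual offset dominates the second integral. Removing the mean/linear trend before each integration stage, the offline analogue of detrending or high-pass filtering ahead of each Integrator block, keeps the output bounded:

```python
import numpy as np
from scipy.signal import detrend
from scipy.integrate import cumulative_trapezoid

fs = 100.0                                    # sample rate in Hz (assumed)
t = np.arange(0.0, 100.0, 1.0 / fs)
a = np.sin(2 * np.pi * 0.5 * t) + 0.01        # illustrative signal with a small DC bias

# Naive double integration: the 0.01 bias grows as 0.01 * t**2 / 2
v_naive = cumulative_trapezoid(a, t, initial=0.0)
d_naive = cumulative_trapezoid(v_naive, t, initial=0.0)

# Mitigation: detrend before *each* integration stage, so neither the input
# bias nor the offset introduced by the first integral gets integrated again
v = cumulative_trapezoid(detrend(a), t, initial=0.0)
d = cumulative_trapezoid(detrend(v), t, initial=0.0)
```

The same reasoning suggests that in the real model a high-pass stage before each integrator could work too, but its cutoff would have to sit below the 0.01 Hz start of the chirp, or the lowest excitation frequencies would be removed along with the bias.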
I have attached plots showing:
The signal before filtering
The signal after filtering
The output after single and double integration, highlighting the bias issue
Any advice would be greatly appreciated. Thank you!