Correct way to calculate the dominant frequency of a time series signal?
My dataset consists of two parameters: time (in microseconds) and amplitude (data file attached).
I am particularly interested in analyzing the fluctuation of the dominant frequency across the entire dataset. Additionally, I want to examine how the frequency content changes as a function of time. For instance, in time intervals such as 0–200 ms, 100–300 ms, 300–500 ms, and so on.
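For the whole-record part, this is the kind of calculation I have been using. It is a minimal sketch only; it assumes the data have already been loaded into column vectors t (in microseconds) and x, and that the sampling is uniform:

```matlab
% Assumed: t (microseconds) and x (amplitude) are column vectors from the attached file.
dt = mean(diff(t)) * 1e-6;      % sample interval in seconds (t is in microseconds)
Fs = 1/dt;                      % sampling frequency in Hz
x  = x - mean(x);               % remove DC offset so bin 0 does not dominate
N  = numel(x);
X  = fft(x);
f  = (0:N-1) * (Fs/N);          % frequency axis in Hz
half = 1:floor(N/2);            % keep only the one-sided spectrum
[~, k] = max(abs(X(half)));     % index of the largest spectral peak
fDominant = f(half(k));         % dominant frequency in Hz
```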
I attempted multiple approaches, but the results are not very consistent. I would be interested in getting a second opinion.
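For the windowed part, one of the approaches I tried uses spectrogram from the Signal Processing Toolbox. The 200 ms window and 100 ms overlap below are placeholders chosen to match the intervals above (0–200 ms, 100–300 ms, etc.), not tuned values:

```matlab
% Assumed: same t, x, and Fs as in the sketch above.
winLen  = round(0.2 * Fs);          % 200 ms analysis window
overlap = round(0.1 * Fs);          % 100 ms overlap -> windows 0-200, 100-300, ...
[S, F, T] = spectrogram(x, hann(winLen), overlap, [], Fs);
[~, idx] = max(abs(S), [], 1);      % peak frequency bin in each time slice
fDomVsTime = F(idx);                % dominant frequency per window, in Hz
plot(T, fDomVsTime)
xlabel('Time (s)'), ylabel('Dominant frequency (Hz)')
```

The dominant-frequency track this produces shifts noticeably when I change the window length, which is part of why I am unsure my results are trustworthy.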
Thanks in advance!