How do I fix the array-size error in this MATLAB PSD-averaging code?
deltat = 0.1;                        % sampling interval (s)
nsamples = 2^10;                     % samples per block
fs = 1/deltat;                       % sampling frequency (Hz)
time = 0 : deltat : deltat*(nsamples-1);
psdtot = zeros(nsamples/2+1, 1);     % preallocate as a column to match pxx
nblock = 10;                         % number of blocks to average
for k = 1:nblock
    % one realization per block: a fresh Gaussian sample at every time
    % point, not a single scalar repeated across the whole record
    ydata = 5 + 10*cos(2*pi*time + pi/6) + 15*randn(1, nsamples);
    [pxx, f] = periodogram(ydata, rectwin(nsamples), nsamples, fs);
    psdtot = psdtot + pxx/nblock;    % running average of the block PSDs
end
figure(1);
plot(time, ydata);                   % time series of the last block
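For reference, periodogram with a rectangular window is equivalent to a one-sided PSD computed directly from the FFT. A minimal sketch, assuming ydata, fs, and nsamples as defined above (pfft should match pxx from the last block):
Y = fft(ydata, nsamples);
pfft = abs(Y(1:nsamples/2+1)).^2 / (fs*nsamples);  % two-sided PSD scaling
pfft(2:end-1) = 2*pfft(2:end-1);                   % fold in negative frequencies (not DC or Nyquist)
ffft = (0:nsamples/2) * fs/nsamples;               % one-sided frequency axis (Hz)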
I’m trying to generate a noisy signal x(t) = 5 + 10*cos(2*pi*t + pi/6) + G(sigma,t), where G(sigma,t) is Gaussian noise with standard deviation sigma_G = 15, deltat = 0.1 s, and nsamples = 2^10. This is the first step in estimating the power spectral density (PSD) with the FFT. The overall variance of the signal should be sigma_G^2 + A^2/2, as expected for a sinusoid of amplitude A = 10 plus independent noise. The generated random signal should still look sinusoidal, because x(t) contains the cosine term.
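A quick numerical check of that variance claim, assuming the corrected ydata and psdtot from the code above: sigma_G^2 + A^2/2 = 15^2 + 10^2/2 = 275.
% sample variance of one realization; should be near 275
var(ydata)
% the integral of the averaged one-sided PSD gives the total mean-square
% power, i.e. the variance plus the DC term 5^2 = 25, so near 300
sum(psdtot) * fs/nsamples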