My sample equation is supposed to be
x(k+1) = x(k)*exp(-wT) + w(k), where wT = 0.005 and w(k) is a Gaussian random variable.
The algorithm is as follows:
The simulated spectrum is obtained by taking the squared magnitude of the DFT of the computer-generated process. The simulated curve is averaged over 100,000 sample functions, with each sample function containing 10,000 time points.
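A minimal sketch of that averaging procedure in Python/NumPy (the function name `simulate_spectrum` and the smaller default run counts are my own, chosen so the sketch runs quickly; the decay factor exp(-0.005) and unit-variance Gaussian driving noise are taken from the equation above):

```python
import numpy as np

def simulate_spectrum(n_runs=200, n_points=1024, a=np.exp(-0.005), seed=0):
    """Average |DFT|^2 over many realizations of x(k+1) = a*x(k) + w(k).

    The post uses n_runs=100000 and n_points=10000; smaller defaults
    are used here just to keep the sketch fast.
    """
    rng = np.random.default_rng(seed)
    avg = np.zeros(n_points)
    for _ in range(n_runs):
        w = rng.standard_normal(n_points)      # Gaussian driving noise w(k)
        x = np.empty(n_points)
        x[0] = w[0]
        for k in range(1, n_points):           # iterate the sample equation
            x[k] = a * x[k - 1] + w[k]
        avg += np.abs(np.fft.fft(x)) ** 2      # squared-magnitude DFT of this run
    return avg / n_runs                        # average over all sample functions
```

The key point of the procedure is that each sample function contributes a full n_points-long time series to the FFT, and the averaging happens across the squared-magnitude spectra of the separate runs.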
---------------------------------------------------------------------------------------------------
What I did was loop the sample equation 100,000 times, take the last value, and compute fft(X(100000), 10000). However, my graph turns out to be a straight line. Anyone care to help?