
Generate Noise in Continuous-Time OFDM Systems


David83

Hello,

I am trying to simulate an OFDM system where the transmitted symbols are BPSK. The system goes like this (a Python sketch of the whole chain follows the list):

  1. Generate a binary stream of N bits
  2. Convert it to a bipolar stream
  3. Take the oversampled IFFT (multiplied by K) with oversampling ratio Ns, where K = N*Ns, and add the CP
  4. D/A conversion and baseband transmission
  5. Add noise
  6. A/D conversion, CP removal, and the FFT (divided by K)
  7. Take the first N symbols
  8. Do ML detection
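
For concreteness, here is a minimal NumPy sketch of steps 1-8 under an identity channel with AWGN (my own illustration, not code from the thread; N, Ns, Lcp, the SNR value, and the convention of zero-padding at the tail of the spectrum are all assumptions):

```python
import numpy as np

# Minimal sketch of the chain above (identity channel + AWGN only).
# All parameter values and names here are placeholders, not from the post.
N   = 64            # bits / data subcarriers per OFDM symbol
Ns  = 4             # oversampling ratio
K   = N * Ns        # oversampled (I)FFT size
Lcp = 16            # cyclic-prefix length in oversampled samples
SNR = 10.0          # linear SNR

# 1-2. Binary stream -> bipolar (BPSK) symbols
bits = np.random.randint(0, 2, N)
X = 2.0 * bits - 1.0                          # 0/1 -> -1/+1

# 3. Oversampled IFFT (multiplied by K): zero-pad to K points, then add CP
Xpad = np.concatenate([X, np.zeros(K - N)])
x = K * np.fft.ifft(Xpad)
x_cp = np.concatenate([x[-Lcp:], x])

# 4-5. "D/A + baseband transmission" is the identity here; add the noise
#      with the per-sample scaling that the question below asks about
sigma = np.sqrt(N * Ns / SNR)
noise = sigma * (np.random.randn(x_cp.size) + 1j * np.random.randn(x_cp.size))
y_cp = x_cp + noise

# 6-7. Remove CP, FFT (divided by K), keep the first N subcarriers
Y = np.fft.fft(y_cp[Lcp:]) / K
Yn = Y[:N]

# 8. ML detection for BPSK in AWGN reduces to the sign of the real part
bits_hat = (np.real(Yn) > 0).astype(int)
print("bit errors:", np.sum(bits_hat != bits))
```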

My question is: how do I generate the noise such that the SNR on subcarrier k is |H_k|^2 * SNR? I do it like this:

\[ n = \sqrt{\frac{N\,N_s}{\mathrm{SNR}}}\,\left(n_R + j\,n_I\right) \]

where n_R and n_I are both independent, normalized (zero-mean, unit-variance) Gaussian random variables. Is this correct?
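
One way to sanity-check that scaling empirically (again a sketch of my own, with placeholder parameter values) is to push pure noise through the same FFT-divided-by-K receiver step and measure the noise power each subcarrier sees; for unit-power BPSK, the per-subcarrier SNR is then the reciprocal of that measured power:

```python
import numpy as np

# Empirical check: generate noise with the formula above, apply the
# receiver's FFT-divided-by-K step, and estimate the per-subcarrier
# noise power. N, Ns, SNR and the trial count are placeholders.
N, Ns, SNR = 64, 4, 10.0
K = N * Ns
trials = 10000

n = np.sqrt(N * Ns / SNR) * (np.random.randn(trials, K)
                             + 1j * np.random.randn(trials, K))
W = np.fft.fft(n, axis=1) / K        # noise as seen on the subcarriers
print("measured noise power per subcarrier:", np.mean(np.abs(W) ** 2))
print("value needed for per-subcarrier SNR = SNR:", 1.0 / SNR)
```

In a frequency-selective channel you would also multiply subcarrier k by its gain H_k before adding the noise, which is what makes the per-subcarrier SNR scale as |H_k|^2.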
 
