Symbol error rate versus average received power in AWGN for OFDM


zunayeed

Junior Member level 1
Joined
May 4, 2011
Messages
15
Helped
1
Reputation
2
Reaction score
1
Trophy points
1,283
Activity points
1,373


Can anybody tell me whether the graph I plotted is correct or not? If not, how do I correct it? Please explain.
I need your comments and feedback.

Thanks in advance
 

Could you please give us more details? How does the SER increase as the received power increases?
 

For example, I have 10 data blocks and took 5 symbols in each data block.
I then measured the average power of each data block in AWGN, and at the receiver side I plotted average received power vs. BER.
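To make that concrete, here is a rough Python sketch of the measurement I described. The QPSK mapping, the 64 subcarriers and the swept noise levels are just placeholders for illustration, not my actual setup:

Code:
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters (not from the post): 64-subcarrier OFDM with QPSK mapping.
n_fft = 64
n_blocks = 10        # 10 data blocks, as described
n_sym_per_block = 5  # 5 OFDM symbols per data block, as described
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

def run_block(noise_var):
    """Send one block over AWGN; return (average received power, SER)."""
    idx = rng.integers(0, 4, size=(n_sym_per_block, n_fft))
    X = qpsk[idx]                                # frequency-domain QPSK symbols
    x = np.fft.ifft(X, axis=1) * np.sqrt(n_fft)  # unit-average-power time signal
    noise = np.sqrt(noise_var / 2) * (rng.standard_normal(x.shape)
                                      + 1j * rng.standard_normal(x.shape))
    y = x + noise
    p_rx = np.mean(np.abs(y) ** 2)               # average received power of this block
    Y = np.fft.fft(y, axis=1) / np.sqrt(n_fft)   # back to the frequency domain
    detected = np.argmin(np.abs(Y[..., None] - qpsk) ** 2, axis=-1)  # nearest symbol
    return p_rx, np.mean(detected != idx)

# Sweep the noise level so the measured points spread out (an assumption about
# how the plot was produced).
for noise_var in (1.0, 0.5, 0.2, 0.1):
    for b in range(n_blocks):
        p_rx, ser = run_block(noise_var)
        print(f"noise_var={noise_var:4.2f}  block={b}  avg Rx power={p_rx:5.2f}  SER={ser:.3f}")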
 

I've never seen this before, zunayeed, but if you want to check your code, plot the conventional BER vs. SNR curve; it should be roughly 10^-5 at 10 dB SNR. Good luck
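Something along these lines would do as a check (a rough Python sketch; the FFT size, QPSK mapping and symbol counts are arbitrary choices of mine, not from your setup). The simulated BER should track the theoretical uncoded QPSK curve 0.5*erfc(sqrt(Eb/N0)):

Code:
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(1)

# Conventional sanity check: uncoded QPSK over OFDM in AWGN should give a BER of
# 0.5*erfc(sqrt(Eb/N0)), i.e. roughly 1e-5 around 9-10 dB Eb/N0.
# The FFT size and number of OFDM symbols below are assumptions, not from the thread.
n_fft = 64
n_ofdm_sym = 20000

for ebn0_db in range(0, 11, 2):
    ebn0 = 10 ** (ebn0_db / 10)
    n0 = 1 / (2 * ebn0)          # QPSK carries 2 bits/symbol with unit symbol energy
    bits = rng.integers(0, 2, size=(n_ofdm_sym, n_fft, 2))
    X = ((1 - 2 * bits[..., 0]) + 1j * (1 - 2 * bits[..., 1])) / np.sqrt(2)
    x = np.fft.ifft(X, axis=1) * np.sqrt(n_fft)              # OFDM modulation
    noise = np.sqrt(n0 / 2) * (rng.standard_normal(x.shape)
                               + 1j * rng.standard_normal(x.shape))
    Y = np.fft.fft(x + noise, axis=1) / np.sqrt(n_fft)       # OFDM demodulation
    bits_hat = np.stack([(Y.real < 0), (Y.imag < 0)], axis=-1).astype(int)
    ber_sim = np.mean(bits_hat != bits)
    ber_theory = 0.5 * erfc(np.sqrt(ebn0))
    print(f"Eb/N0 = {ebn0_db:2d} dB   simulated BER = {ber_sim:.2e}   theory = {ber_theory:.2e}")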
 
