bits/symbol meaning in OFDM


vreg:
Hello,
I was reading a paper on resource allocation in OFDM-based cognitive radios, in which the author states that the rate requirement of the CR users is uniformly set to 20 bits/symbol. What does a rate expressed in bits/symbol mean? Isn't rate usually given in bits/second?
Please help.
 

This is probably referring to the spreading factor. If you encode each data symbol using a 32-bit code, then you have 32 bits per symbol (or, equivalently, we can say the spreading factor is 32).

Normally, to clarify the difference between data bits and bits transmitted into the channel, we instead say there are 32 "chips" per symbol.
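As a side illustration of what spreading by a factor of 32 looks like (this is a generic direct-sequence sketch, not anything from the paper; the random code below is a placeholder for a designed sequence such as a Walsh or Gold code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 32-chip spreading code with values in {-1, +1}.
spreading_code = rng.choice([-1, 1], size=32)

def spread(data_bits):
    """Map each data bit onto 32 chips: bit 0 -> -code, bit 1 -> +code."""
    symbols = 2 * np.asarray(data_bits) - 1  # {0,1} -> {-1,+1}
    return np.concatenate([b * spreading_code for b in symbols])

chips = spread([1, 0, 1])
print(chips.shape)  # (96,): 32 chips per data symbol, i.e. spreading factor 32
```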
 

Hi, thanks for your reply.
But it's given that

\[ R_k = \sum_{n=1}^{N} \log_2\!\left(1 + p_{k,n} H_{k,n}\right), \]

with \(R_k = R_{\text{req}} = 10\) bits/symbol given as the fixed rate requirement (equality constraint) of the CR users. Here \(R_k\) is the rate of the k-th user and \(N\) is the number of subchannels.
So does \(R_k\) refer to the data rate, or to the spreading factor you mentioned? How do I interpret this?
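To see what that sum evaluates to, here is a quick numerical sketch of the constraint; the per-subchannel powers and channel gains below are made-up values for illustration, not from the paper:

```python
import numpy as np

# Hypothetical transmit powers p_{k,n} and channel gains H_{k,n}
# for one user k over N = 4 subchannels (illustrative values only).
p = np.array([1.0, 0.5, 2.0, 0.8])
H = np.array([3.2, 7.1, 1.5, 9.0])

# R_k = sum_n log2(1 + p_{k,n} * H_{k,n}), in bits per OFDM symbol
R_k = np.sum(np.log2(1 + p * H))
print(f"R_k = {R_k:.2f} bits/symbol")

# A resource-allocation algorithm would adjust the powers p until
# R_k equals the requirement, e.g. R_req = 10 bits/symbol.
```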
 

If R_k is measured in bits/symbol, then my guess would be that it's a spreading factor (or something like that). Of course, it will be linked to the data rate if the overall bit rate is constant (as is the case in many practical systems).
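To make the link between the two units explicit (the symbol rate below is an assumed example figure, not from the paper): the bit rate is the per-symbol rate times the symbol rate,

\[ R_{\text{bits/s}} = R_{\text{bits/symbol}} \times f_{\text{sym}}, \]

so with \(R_k = 10\) bits/symbol and, say, \(f_{\text{sym}} = 10^4\) OFDM symbols per second, the user's data rate would be \(10^5\) bits/s.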
 
