vreg
Hello,
I was reading a paper on resource allocation in OFDM-based cognitive radios, in which the author states that the rate requirement of the CR users is uniformly set to 20 bits/symbol. What does a rate expressed in bits/symbol mean? Isn't rate usually given in bits/second?
Please help.
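For context, my current understanding is that a per-symbol rate would convert to bits/second through the OFDM symbol rate, i.e. rate in bits/s = (bits per symbol) × (symbols per second). A quick sketch of that assumption is below; the symbol duration is a made-up example value, not something stated in the paper:

```python
# Sketch of how I assume bits/symbol relates to bits/second.
# The symbol duration is an assumed example value, not from the paper.

bits_per_symbol = 20          # rate requirement as stated in the paper
symbol_duration = 100e-6      # assumed OFDM symbol duration in seconds (example only)

symbol_rate = 1 / symbol_duration        # symbols per second
rate_bps = bits_per_symbol * symbol_rate # bits per second

print(f"{rate_bps / 1e3:.0f} kbit/s")    # 200 kbit/s for these example numbers
```

Is that the right way to read it, or does the author mean something else by bits/symbol?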