vreg
Member level 4

Hello,
I was reading a paper on resource allocation in OFDM-based cognitive radios, in which the author stated that the rate requirement of the CR users is uniformly set to 20 bits/symbol. What does a rate expressed in bits/symbol mean? Isn't rate usually given in bits/second?
Please help.