15th March 2007, 14:16 #1
Member level 3
The difference between bit duration, bit period and symbol duration, sample period
I have a basic doubt. Please explain the differences between bit duration, bit period, symbol duration, and sample period.
15th March 2007, 21:38 #2
Newbie level 3
1) and 2) are the same thing; you can also call it the bit interval, which means the duration of one bit.
A symbol may carry several bits: for a binary signal, one symbol contains one bit; for an M-ary signal, one symbol carries log2(M) bits. So symbol duration = bit duration × number of bits per symbol.
The sample period is the time between two successive samples; its inverse is the sampling rate (or sampling frequency).
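To make the relationships above concrete, here is a small sketch with made-up numbers (a 1 Mbit/s stream mapped onto a hypothetical 16-QAM constellation, sampled at 8 MHz):

```python
import math

# Illustrative numbers only: 1 Mbit/s bit stream, 16-ary modulation.
bit_rate = 1e6                # bits per second
bit_duration = 1 / bit_rate   # seconds per bit (1 microsecond)

M = 16                               # M-ary constellation size
bits_per_symbol = math.log2(M)       # log2(16) = 4 bits per symbol
symbol_duration = bit_duration * bits_per_symbol  # 4 microseconds

fs = 8e6                 # sampling frequency (samples per second)
sample_period = 1 / fs   # seconds between two successive samples

print(symbol_duration)   # symbol duration in seconds
print(sample_period)     # sample period in seconds
```

With these numbers each symbol lasts four bit durations, and the sample period is independent of both: it is set by the sampling clock, not by the modulation.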
18th March 2007, 14:38 #3
Newbie level 5
Also, to get the data rate you multiply the sampling frequency by the number of bits per sample: data rate = fs × bits per sample.
29th December 2011, 16:10 #4
Newbie level 1
Re: The difference between bit duration, bit period and symbol duration, sample period
I would also like to know where 'chips', the unit of the spreading codes in CDMA, fit in physically with respect to bits and symbols, and to the bit rate and symbol rate.