The difference between bit duration, bit period, symbol duration, and sample period


rmreddy

Hello everyone,
I have a basic doubt. Could someone please explain the differences between:
1. bit duration
2. bit period
3. symbol duration
4. sample period
 


1) and 2) are the same thing; you can also call it the bit interval, which is simply the time taken by one bit.

A symbol may carry several bits: for a binary signal, one symbol carries one bit; for an M-ary signal, one symbol carries log2(M) bits. So symbol duration = bit duration × (number of bits per symbol).
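A quick numeric sketch of that relation (the bit rate and modulation order below are made-up example values, not from this thread):

```python
import math

# Hypothetical example values, chosen only for illustration.
bit_rate = 1e6               # bits per second
bit_duration = 1 / bit_rate  # seconds per bit (= bit period / bit interval)

M = 16                          # M-ary modulation, e.g. 16-QAM
bits_per_symbol = math.log2(M)  # log2(M) bits carried by one symbol

symbol_duration = bit_duration * bits_per_symbol  # seconds per symbol
symbol_rate = 1 / symbol_duration                 # symbols per second (baud)

print(f"bit duration    = {bit_duration:.2e} s")
print(f"symbol duration = {symbol_duration:.2e} s")
print(f"symbol rate     = {symbol_rate:.2e} Bd")
```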

The sample period is the time between two consecutive samples; its inverse is the sampling rate (or sampling frequency).
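For example (the 8 kHz sampling rate here is just an illustrative value):

```python
# Hypothetical sampling rate, chosen only for illustration.
fs = 8000               # sampling frequency in Hz (samples per second)
sample_period = 1 / fs  # time between two consecutive samples, in seconds

print(f"sample period = {sample_period * 1e6:.1f} us")  # 125.0 us for 8 kHz
```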
 


Also, when you compute the data rate of a sampled signal, you multiply the sampling frequency fs by the number of bits per sample.
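A small sketch of that calculation, assuming a hypothetical 8 kHz, 16-bit-per-sample signal:

```python
# Hypothetical PCM-style example: 8 kHz sampling, 16 bits per sample.
fs = 8000             # samples per second
bits_per_sample = 16  # quantizer resolution

data_rate = fs * bits_per_sample  # bits per second
print(f"data rate = {data_rate} bit/s")  # 128000 bit/s
```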
 

I would also like to know where 'chips', the unit of the spreading codes in CDMA, fit in here physically with respect to bits and symbols, bit rate and symbol rate.
 
