copyfriend
Newbie level 1
Hi guys,
I'm working on multipath delay estimation.
I use PN sequence in the transmitter.
The chip duration of the PN sequence is Tc.
It is known that delays that are integer multiples of Tc can be estimated directly by auto-correlation.
But arrivals separated by less than Tc cannot be resolved this way.
My question is: how do I simulate a signal delay that is a fractional multiple of Tc?
I tried to add some points between Tc and 2Tc,
but in that case auto-correlation still works, which is not possible in practice.
I guess the reason is that adding points between Tc and 2Tc also increases the simulation sample rate, which increases the resolution.
Has anybody worked on this, or have any ideas about this problem?
Thanks
Xiao
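
One common way to generate a delay that is a fraction of a sample (here, a fraction of Tc at one sample per chip) without raising the simulation sample rate is a band-limited fractional-delay filter built from a windowed sinc. The sketch below is a minimal illustration assuming NumPy; the function name, tap count, and the 127-chip random sequence standing in for a real PN code are all hypothetical choices, not part of the original post:

```python
import numpy as np

def fractional_delay(x, delay, num_taps=81):
    """Delay x by a fractional number of samples using a
    windowed-sinc interpolation filter (band-limited delay).
    `delay` is in samples; at 1 sample/chip, it is in units of Tc."""
    n = np.arange(num_taps) - (num_taps - 1) / 2
    h = np.sinc(n - delay)        # ideal fractional-delay impulse response
    h *= np.hamming(num_taps)     # window to reduce truncation ripple
    h /= h.sum()                  # normalize DC gain
    return np.convolve(x, h, mode="same")

# Example: chips sampled at 1 sample/chip, delayed by 0.3 Tc
rng = np.random.default_rng(0)
pn = rng.choice([-1.0, 1.0], size=127)   # placeholder for a real PN code
delayed = fractional_delay(pn, 0.3)
```

With this approach the sample rate stays at one sample per chip, so the correlator sees a genuinely sub-chip offset instead of the artificially fine grid that upsampling introduces; correlating `pn` against `delayed` then shows the peak smeared across neighboring lags rather than landing cleanly on one.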