
How to model/estimate a channel in discrete time with a different sampling time?

Status
Not open for further replies.

siato
Newbie level 1
Joined: Mar 18, 2013
Hello,

I want to estimate a channel based on the LTE 3GPP EVA model, given its power delay profile (the set of average powers and delays of the channel taps):
tau = [0, 30e-9, 150e-9, 310e-9, 370e-9, 710e-9, 1090e-9, 1730e-9, 2510e-9]; % relative delay (s)
pdb = [0, -1.5, -1.4, -3.6, -0.6, -9.1, -7.0, -12.0, -16.9]; % avg. power (dB)

Unfortunately, the relative delays of the channel taps are not integer multiples of my sampling time \[T_s = 3.3333 \times 10^{-7}\] s.
Here, I assumed the number of taps to be estimated is \[ \frac{D_s}{T_s} \], where \[ D_s \] is the delay spread.
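As a quick sanity check on that tap count, here is a short Python computation. It assumes the delay spread \[ D_s \] is taken as the maximum excess delay of the profile and that the ratio is rounded up to cover the full spread; both are my assumptions, not something fixed by the EVA specification:

```python
import math

# LTE EVA power delay profile (values from the post)
tau = [0, 30e-9, 150e-9, 310e-9, 370e-9, 710e-9, 1090e-9, 1730e-9, 2510e-9]
pdb = [0, -1.5, -1.4, -3.6, -0.6, -9.1, -7.0, -12.0, -16.9]

Ts = 3.3333e-7          # sampling time (s)
Ds = max(tau)           # delay spread taken as the maximum excess delay
L = math.ceil(Ds / Ts)  # sample-spaced taps needed to cover the delay spread

print(Ds, L)            # Ds = 2.51e-6 s, L = 8
```

With these numbers \[ D_s / T_s \approx 7.53 \], so 8 sample-spaced taps span the delay spread (9 if the tap at delay 0 is counted separately).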

First of all, please let me know if there is anything wrong with my approach.
Also, how should I measure the estimation error in this case? Should I compare my estimates against the channel sampled at my sampling rate, or should I subtract the reconstructed frequency response of the estimate from that of the original channel?
Either way, because the tap delays do not fall on the sampling grid, there is an intrinsic error in my estimate no matter how I measure it.
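One common way to quantify that intrinsic mismatch is to map the continuous-delay taps onto the sampling grid by band-limited (sinc) interpolation, then compare the result against a naive nearest-sample rounding in the frequency domain. Below is a minimal Python/NumPy sketch under stated assumptions: static tap amplitudes (no fading realisation), a tap count `L` and an NMSE metric that are my choices, not part of the EVA definition:

```python
import numpy as np

# LTE EVA power delay profile (values from the post)
tau = np.array([0, 30e-9, 150e-9, 310e-9, 370e-9, 710e-9, 1090e-9, 1730e-9, 2510e-9])
pdb = np.array([0, -1.5, -1.4, -3.6, -0.6, -9.1, -7.0, -12.0, -16.9])
amp = 10 ** (pdb / 20)   # static tap amplitudes (no fading)
Ts = 3.3333e-7           # sampling time (s)

# Sample-spaced CIR via band-limited (sinc) interpolation of the tap delays
L = 16                   # a few extra taps to capture sinc leakage beyond the spread
n = np.arange(L)
h = (amp[None, :] * np.sinc(n[:, None] - tau[None, :] / Ts)).sum(axis=1)

# Naive alternative: round each delay to the nearest sample index
h_round = np.zeros(L)
np.add.at(h_round, np.round(tau / Ts).astype(int), amp)

# Compare the two in the frequency domain (normalised MSE)
H, H_round = np.fft.fft(h, 256), np.fft.fft(h_round, 256)
nmse = np.sum(np.abs(H - H_round) ** 2) / np.sum(np.abs(H) ** 2)
print(nmse)
```

The gap between the two discrete CIRs is one way to see the floor that off-grid delays impose on any sample-spaced estimate; comparing frequency responses (rather than tap vectors) avoids having to align taps that land between samples.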

Thanks in advance :)
 
