QPSK over tap delay model for TDMA

moona.sheikh

Newbie level 3, joined Dec 26, 2008
I am not from a communications background and would keenly appreciate advice on a few problems.

I am working on a TDMA system with a time slot of 14.167 ms per frame, in which 255 pi/4-QPSK modulated symbols are transmitted. The symbol time is about 55.55 µs.

The system has a 25 kHz bandwidth and a throughput of 36 kbps. My fading channel model is Rayleigh/Rician with a direct path at 0 s delay and 0 dB relative power, and a second path at 5 µs delay and -22.3 dB relative power.
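
For reference, this is how I check that the stated numbers hang together in MATLAB (the rolloff value is my own assumption, picked so the occupied bandwidth fits inside 25 kHz):

Tslot = 14.167e-3;       % slot duration (s)
Nsym  = 255;             % symbols per slot
Tsym  = Tslot/Nsym;      % ~55.55e-6 s, matches the stated symbol time
Rs    = 1/Tsym;          % ~18e3 symbols/s
Rb    = 2*Rs;            % pi/4-QPSK carries 2 bits/symbol -> ~36 kbps
alpha = 0.35;            % rolloff (assumed, not given)
BW    = Rs*(1+alpha);    % ~24.3 kHz occupied bandwidth, within the 25 kHz channel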

My design system flow is as follows:

Bit generation -> Bit mapper -> pi/4-QPSK symbol generation -> Square-root raised-cosine filter -> Fading channel -> Noise addition -> Estimation -> Symbol recovery
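
A minimal sketch of this flow as I picture it in MATLAB (legacy Communications Toolbox functions; the oversampling factor, rolloff, Doppler spread and SNR are placeholder values I chose, not system requirements):

Fd = 18e3;  OS = 8;  Fs = OS*Fd;          % symbol rate, oversampling (my choice), sample rate
alpha = 0.35;                             % rolloff (assumed)
num = rcosine(Fd, Fs, 'sqrt', alpha);     % square-root raised-cosine taps

data = randint(255, 1, 4);                % one slot of quaternary symbols
tx   = dpskmod(data, 4, pi/4);            % pi/4-DQPSK baseband symbols
txf  = upfirdn(tx, num, OS);              % upsample by OS and pulse-shape

chan = rayleighchan(1/Fs, 50, [0 5e-6], [0 -22.3]);  % 50 Hz Doppler is a placeholder
rx   = awgn(filter(chan, txf), 15, 'measured');      % 15 dB SNR is a placeholder

rxf = upfirdn(rx, num, 1);                % matched SRRC filter, still at Fs
gd  = (length(num)-1)/2;                  % group delay of one filter, in samples
sym = rxf(2*gd+1 : OS : 2*gd+OS*255);     % one sample per symbol
dem = dpskdemod(sym, 4, pi/4);            % differential demodulation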

My problems (very basic, but I am not clear on them):

What is my sampling frequency? What should Fs and Fd be among the square-root raised-cosine filter parameters? I am using the RCOSINE() function for this purpose.
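
My current understanding, which may well be wrong: Fd is the symbol rate, Fs is the simulation sampling rate, and RCOSINE requires Fs/Fd to be an integer. Something like:

Fd  = 255/14.167e-3;                  % symbol rate, ~18e3 sym/s
OS  = 8;                              % oversampling factor (my guess)
Fs  = OS*Fd;                          % ~144 kHz
num = rcosine(Fd, Fs, 'sqrt', 0.35);  % SRRC taps; rolloff 0.35 is assumed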

Would it be appropriate to use a Rayleigh channel object and pass my delay profile as input?
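
In code, what I have in mind is the following (the 50 Hz Doppler spread is a placeholder, since I have not fixed a carrier frequency or mobile speed):

ts   = 1/Fs;                                       % channel input sample period
chan = rayleighchan(ts, 50, [0 5e-6], [0 -22.3]);  % path delays (s), relative powers (dB)
y    = filter(chan, txf);                          % txf = pulse-shaped transmit samples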

Where should the downsampling block be implemented?
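
My guess is that it belongs after the receive-side matched filter, taking one sample per symbol once both filters' group delay has passed, e.g.:

rxf = upfirdn(rx, num, 1);        % matched SRRC filter, still at Fs
gd  = (length(num)-1)/2;          % group delay of one SRRC filter, in samples
sym = rxf(2*gd+1 : OS : end);     % keep one sample per symbol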

What are the general recommendations for implementing such a near-flat-fading system, where the path delay (5 µs) is only about one-eleventh of the symbol period?


moona.
 
