
Time-Interleaved ADC Filtering at Output


Puppet123
When building a time-interleaved ADC, you get gain and offset mismatches between the sub-ADC channels, along with bandwidth mismatch and timing skew.

How can these mismatches be suppressed using filtering, and how can I simulate such a filter using circuit simulation?
 

Hi,

Please describe your concern more exactly.

Klaus
 

To simulate dynamic mismatch and its possible compensation, you need
1. a mismatch model, e.g. specified as acquisition bandwidth and sampling-time skew
2. a hypothetical input signal applied to the ADC pair
3. a compensation filter prototype, e.g. an RC or LC low-pass
See the sketch below for points 1 and 2.
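As an illustration (not a verified design): a minimal NumPy sketch of points 1 and 2, assuming a two-way interleaved pair (N = 2) with a single-pole acquisition front end. All mismatch numbers are arbitrary illustration values, and the compensation filter of point 3 would be applied per channel before recombining.

```python
import numpy as np

fs = 1.0e9          # aggregate sample rate; two channels at fs/2 each
fin = 97.0e6        # input tone (arbitrary example value)
n = 4096
t = np.arange(n) / fs

# Point 1: per-channel mismatch model -- gain, offset, sampling-time skew,
# and acquisition bandwidth (single-pole front end). Arbitrary numbers.
gain = np.array([1.0, 0.995])     # 0.5 % gain mismatch
offset = np.array([0.0, 1e-3])    # 1 mV offset on channel 1
skew = np.array([0.0, 2e-12])     # 2 ps sampling-time skew on channel 1
bw = np.array([2.0e9, 1.9e9])     # acquisition bandwidths

# Point 2: hypothetical sine input, sampled alternately by the two channels.
x = np.zeros(n)
for ch in (0, 1):
    idx = np.arange(ch, n, 2)            # sample indices of this channel
    ts = t[idx] + skew[ch]               # skewed sampling instants
    h = 1.0 / (1.0 + 1j * fin / bw[ch])  # single-pole response at fin
    x[idx] = gain[ch] * abs(h) * np.sin(
        2 * np.pi * fin * ts + np.angle(h)) + offset[ch]

# Windowed spectrum: expect the tone at fin, a gain/skew/bandwidth image
# spur at fs/2 - fin, and an offset spur at fs/2.
spec = 20 * np.log10(np.abs(np.fft.rfft(x * np.hanning(n))) + 1e-12)
spec -= spec.max()
freqs = np.fft.rfftfreq(n, 1 / fs)
for f0 in (fin, fs / 2 - fin, fs / 2):
    k = np.argmin(np.abs(freqs - f0))
    print(f"{freqs[k] / 1e6:7.1f} MHz : {spec[k]:6.1f} dBc")
```

Running it puts the image spur at fs/2 - fin and the offset spur at fs/2, which is the spur pattern described in the next reply.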
 

In an interleaved ADC you get spurs at harmonics of Fs/N, where N is the interleave factor. The spurs from gain mismatch, timing skew and bandwidth mismatch are actually at k*Fs/N +/- Fin, so they depend on your Fin and Fs. In my view it will be difficult to filter out spurs that fall into your signal bandwidth. Usually, people try to match the sub-ADCs as closely as possible and use calibration for the residual mismatches.
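To make the spur locations concrete, here is a small sketch (Fs, Fin and N below are arbitrary example values, not from the thread) that folds k*Fs/N and k*Fs/N +/- Fin into the first Nyquist zone:

```python
# Fold the predicted interleaving spurs into the first Nyquist zone.
def fold(f, fs):
    """Alias frequency f into the first Nyquist zone [0, fs/2]."""
    f = f % fs
    return fs - f if f > fs / 2 else f

fs, fin, N = 1.0e9, 97.0e6, 4
print(f"signal tone at {fin / 1e6:.1f} MHz")
for k in range(1, N):
    off = fold(k * fs / N, fs)        # offset-mismatch spur
    lo = fold(k * fs / N - fin, fs)   # gain/skew/bandwidth spurs
    hi = fold(k * fs / N + fin, fs)
    print(f"k={k}: offset spur {off / 1e6:6.1f} MHz, "
          f"gain/skew/BW spurs {lo / 1e6:6.1f} / {hi / 1e6:6.1f} MHz")
```

Changing Fin moves the +/- Fin spurs but not the offset spurs, which is why calibration rather than fixed output filtering is the usual cure when they land in band.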
 
