hello there,
in VGA/PGA design, why is a constant bandwidth across the gain range required? What happens if the bandwidths at different gain settings are different? Is there a fundamental principle behind this? Any comment is welcome.
In a VGA, it is desirable to have a settling time that is independent of the input amplitude, so that the VGA operation completes within a fixed, predictable time budget. If this is not the case, consider a small input amplitude: the required gain is then high, and with a constant gain-bandwidth product the closed-loop bandwidth drops in proportion to the gain, so the stage takes much longer to settle before the next block can process the signal.
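A rough sketch of that scaling, for a single-pole stage with a constant gain-bandwidth product (the GBW value, settling tolerance, and single-pole model are illustrative assumptions, not from the post): closed-loop bandwidth is GBW/gain, and settling time grows in proportion to the gain.

```python
import math

GBW_HZ = 100e6      # gain-bandwidth product, 100 MHz (assumed for illustration)
SETTLE_TOL = 1e-3   # settle to within 0.1 % of the final value (assumed)

def settling_time(gain):
    """Settling time of a single-pole closed-loop stage.

    BW = GBW / gain, time constant tau = 1 / (2*pi*BW),
    and t_s = tau * ln(1/tol) for an exponential step response.
    """
    bw = GBW_HZ / gain
    tau = 1.0 / (2.0 * math.pi * bw)
    return tau * math.log(1.0 / SETTLE_TOL)

for gain in (1, 10, 100):
    bw_mhz = GBW_HZ / gain / 1e6
    print(f"gain = {gain:4d}: BW = {bw_mhz:7.2f} MHz, "
          f"t_s = {settling_time(gain) * 1e9:8.1f} ns")
```

With constant GBW, a gain of 100 settles 100x slower than a gain of 1, which is exactly the timing problem a constant-bandwidth VGA design avoids.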