Souljah44
Junior Member level 1
Hi All,
I'm simulating a transistor-level, continuous-time, first-order delta-sigma modulator and need some clarification on my gm-C filter.
All simulations are in Cadence.
The input frequency of the DSM is 10 kHz, so what should the 3-dB cutoff frequency (not the unity-gain frequency) of my gm-C be?
I was thinking that my f_3dB should be at least 10 kHz to effectively pass the analog signal, but my ENOB numbers don't support this assumption: they actually get better as the 3-dB bandwidth drops. Am I missing something here? At an f_3dB of 1 kHz, I get my desired ENOB. Any links or resources on this would be appreciated.
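For reference, here's a minimal Python behavioral sketch of the trend I'm describing. It models the gm-C stage as a lossy integrator, H(s) = wu/(s + w3dB) with wu = gm/C, in front of a 1-bit quantizer. The clock rate, unity-gain frequency, and the FFT-based ENOB extraction are all illustrative assumptions, not my actual transistor-level numbers.

```python
import numpy as np

# Behavioral sketch of a 1st-order continuous-time delta-sigma modulator.
# The gm-C stage is modeled as a lossy integrator H(s) = wu/(s + w3db),
# where wu = gm/C is the unity-gain frequency and w3db sets the DC gain
# wu/w3db. All parameter values below are illustrative assumptions.
fs  = 10e6            # 1-bit quantizer clock rate (assumed)
fin = 10e3            # input tone, 10 kHz as in my design
fb  = 10e3            # signal bandwidth for the ENOB calculation
N   = 2**16           # number of clock periods simulated
wu  = 2*np.pi*1e6     # integrator unity-gain frequency (assumed)

def enob(f3db):
    dt, w3 = 1.0/fs, 2*np.pi*f3db
    t = np.arange(N)*dt
    u = 0.5*np.sin(2*np.pi*fin*t)    # half-scale input tone
    x, v = 0.0, np.zeros(N)          # integrator state, output bitstream
    for n in range(N):
        v[n] = 1.0 if x >= 0 else -1.0
        # forward-Euler step of the lossy integrator inside the loop
        x += dt*(wu*(u[n] - v[n]) - w3*x)
    # in-band SNDR from a Hann-windowed FFT of the bitstream
    V = np.abs(np.fft.rfft(v*np.hanning(N)))**2
    f = np.fft.rfftfreq(N, dt)
    k = np.argmin(np.abs(f - fin))           # bin of the input tone
    sig = np.zeros_like(V, dtype=bool)
    sig[k-2:k+3] = True                      # tone + Hann leakage bins
    band = (f > 0) & (f <= fb) & ~sig        # in-band noise bins
    sndr = 10*np.log10(V[sig].sum()/V[band].sum())
    return (sndr - 1.76)/6.02

for f3db in (10e3, 1e3, 100.0):
    print(f"f_3dB = {f3db:7.0f} Hz -> ENOB ~ {enob(f3db):5.2f} bits")
```

With these assumptions the in-band quantization noise should drop as f_3dB drops, since a lower pole at a fixed unity-gain frequency means a higher integrator DC gain (wu/w3dB) and less leaky noise shaping. That matches the trend I'm seeing, but I'd still like to understand how to pick f_3dB properly.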
Thanks in advance.