Hello people,

I am having some trouble simulating the NF of a mixer I designed. I am simulating its performance with Spectre's Periodic Noise Figure analysis (combined with PSS, of course), following both Cadence's app note (http://www.ek.isy.liu.se/~jdab/SpectreRF_Mixer533AN.pdf) and a tutorial from a Swedish university (very similar instructions). My schematic is not quite identical to the one in the app note: I bias all transistors through current-mirror structures rather than directly through the ports that feed the LO and RF signals, which means I also include capacitors to AC-couple the RF and LO signals to the gates of the appropriate transistors. I believe the schematic is not hard to imagine. Anyway, whenever I simulate NF swept over frequency, at the appropriate LO level, I get NF values that span 30 to 40 dB (for IF frequencies from 10 kHz to 10 MHz). Those are absurdly high values, I believe.
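One thing worth double-checking with that AC-coupled bias arrangement is whether the coupling caps form a high-pass corner that sits anywhere near the signal frequencies. A quick sketch of the first-order check, using purely hypothetical component values (the 10 kΩ bias resistance and 10 pF cap below are assumptions, not values from my schematic):

```python
import math

def highpass_corner_hz(r_ohm, c_farad):
    """First-order RC high-pass corner frequency: f_c = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohm * c_farad)

# Hypothetical values: 10 pF coupling cap looking into a 10 kohm bias network
fc = highpass_corner_hz(10e3, 10e-12)
print(f"corner = {fc / 1e6:.2f} MHz")
```

If the corner lands well below the RF and LO frequencies, the coupling caps themselves should not be attenuating the signals, and the problem lies elsewhere.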

I also checked the input- and output-referred noise, and both values are very high (input noise around 20 nV/sqrt(Hz), output noise around 500 nV/sqrt(Hz)). The noise factor is also huge (as expected from the NF), somewhere in the tens or hundreds of kV/V. Furthermore, I printed the noise contribution summary: it indicates that the RF V-I transducer transistor contributes most of the noise, along with the mirror transistor responsible for its biasing.
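For what it's worth, those numbers are at least self-consistent: comparing the 20 nV/sqrt(Hz) input-referred noise against the thermal noise of a 50 ohm source gives an NF in the same ballpark as the 30-40 dB I'm seeing. A quick sanity check (assuming the input-referred figure already includes the source resistor's own noise, which may not match how Spectre reports it):

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K
T0 = 290.0                  # standard noise temperature, K

def nf_from_input_noise(v_in_v_per_rthz, r_source_ohm=50.0):
    """NF in dB from total input-referred noise density (V/sqrt(Hz)),
    assuming that density includes the source resistor's thermal noise."""
    v_source = math.sqrt(4.0 * K_BOLTZMANN * T0 * r_source_ohm)  # ~0.9 nV/sqrt(Hz) for 50 ohms
    noise_factor = (v_in_v_per_rthz / v_source) ** 2
    return 10.0 * math.log10(noise_factor)

print(f"NF = {nf_from_input_noise(20e-9):.1f} dB")  # ~27 dB for 20 nV/sqrt(Hz)
```

So the reported NF, input noise, and output noise all point at the same (wrong) answer, which makes me think the simulation setup is consistently measuring something unintended rather than producing a numerical glitch.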

I believe what is happening is some sort of coupling from the LO port to the RF port. I did not forget to turn off the RF port (by setting it to DC instead of sine), so that is not the problem. Maybe the LO signal is being coupled into the RF input, then downconverted to baseband and interpreted as noise. I don't know! Has anyone ever seen something like this?

Thanks in advance!