Welcome to EDAboard.com

Help for VCO phase noise simulation after bandpass filter by using functional block

Status
Not open for further replies.

RYAN_WANG

Newbie level 1
Joined Apr 14, 2012
Hi, all.

I am simulating a VCO followed by an ideal bandpass filter. The phase noise of the original VCO is about -122 dBc/Hz at 3 GHz.
Then I added a bandpass filter using a Cadence functional block (Q = 50, centered at 3 GHz). However, the phase noise after
filtering is only -75 dBc/Hz.
I am not sure what I did wrong.

Hope someone could help me with this problem.

Thanks !
 

My first guess would be some sort of simulation convergence glitch in your program.

BUT, it IS possible that the bandpass filter is reflecting harmonic energy back to the active device in the VCO and actually degrading the phase noise. Try inserting an ideal 50-ohm transmission line of various electrical lengths between the VCO and the bandpass filter, and see whether the predicted phase noise varies all over the place.
 

What about the input/output impedance of this ideal filter?
 

According to Leeson's phase-noise equation, doubling the loaded Q improves the phase noise by 6 dB.
Try a Q of 100 instead of 50 and see what happens. At the least, you will find out whether the low-Q filter added to the circuit is the reason.
Beware that Leeson's equation applies only between the 1/f flicker corner and the corner where the white (flat) noise floor starts.
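The 6 dB-per-doubling-of-Q behavior can be checked with a quick sketch of Leeson's model. This is a minimal illustration, not anyone's actual simulation setup: the noise floor value (-160 dBc/Hz) and the 1 MHz offset are assumed numbers chosen only to show the Q dependence in the 1/f^2 region.

```python
import math

def leeson_dbc_hz(f0, fm, q_loaded, floor_dbc=-160.0, fc=0.0):
    """Leeson phase-noise estimate in dBc/Hz.

    f0        carrier frequency (Hz)
    fm        offset from carrier (Hz)
    q_loaded  loaded Q of the resonator
    floor_dbc far-out white noise floor, 10*log10(F*k*T / (2*P_sig));
              -160 dBc/Hz here is an assumed placeholder value
    fc        1/f flicker corner (Hz); 0 disables the flicker term
    """
    floor = 10.0 ** (floor_dbc / 10.0)
    # Resonator shaping: dominates when fm << f0 / (2*Q), giving the 1/f^2 slope
    shaping = 1.0 + (f0 / (2.0 * q_loaded * fm)) ** 2
    flicker = 1.0 + (fc / fm if fc > 0.0 else 0.0)
    return 10.0 * math.log10(floor * shaping * flicker)

f0 = 3e9   # 3 GHz carrier, as in the thread
fm = 1e6   # assumed 1 MHz offset, well inside the 1/f^2 region
l_q50 = leeson_dbc_hz(f0, fm, 50)
l_q100 = leeson_dbc_hz(f0, fm, 100)
print(f"Q=50:  {l_q50:.1f} dBc/Hz")
print(f"Q=100: {l_q100:.1f} dBc/Hz")
print(f"improvement: {l_q50 - l_q100:.1f} dB")  # close to 6 dB
```

In the 1/f^2 region the shaping term scales as 1/Q^2, so doubling Q reduces the noise power by a factor of ~4, i.e. ~6 dB, which matches the rule of thumb above.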
 

