
Unwanted voltage drop across gate resistor.

Status
Not open for further replies.

Terminator3
Advanced Member level 3
Joined Feb 25, 2012
I am trying to make a tunable negative bias for an oscillator at 10 GHz. I am using a negative bias generator IC connected to a 10 kOhm potentiometer (the wiper pin goes through an additional 100 kOhm resistor to the gate stub; the other two pins go to GND and the -3 V generator).
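
Ideally, with no gate current, the series 100 kOhm resistor drops nothing and the gate voltage simply follows the potentiometer wiper. A minimal sketch of that ideal tuning range, using the values from the post (the `wiper_fraction` parameter is my own naming, not from the post):

```python
# Ideal gate bias vs. potentiometer wiper position.
# Assumes zero gate current, so the series 100 kOhm resistor drops nothing.
V_SUPPLY = -3.0  # negative bias generator output, volts

def gate_bias(wiper_fraction: float) -> float:
    """wiper_fraction = 0.0 at the GND end, 1.0 at the -3 V end."""
    return V_SUPPLY * wiper_fraction

print(gate_bias(0.5))  # mid-travel gives -1.5 V at the gate
```

This is the no-current assumption the post relies on; the observed 0.4 V drop means that assumption is being violated while the circuit oscillates.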

The 100 kOhm resistor is connected to the gate stub through a high-impedance line with a quarter-wave stub on it. There are also additional 1 nF and 10 pF capacitors to ground. Still, I observe a voltage drop of around 0.4 V across the 100 kOhm resistor. When the bias becomes more negative and oscillation stops, the voltage drop disappears, which makes me think the 10 GHz AC signal is somehow reaching the bias resistor, or something else is going wrong.
From what I learned, no current should flow into the gate, so the voltage drop across Rg should be zero.
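
For reference, the observed drop implies a small but nonzero DC gate current by Ohm's law, and the fact that it vanishes when oscillation stops is consistent with RF being rectified at the gate junction rather than a wiring fault. A quick sanity check with the values from the post:

```python
# Implied DC gate current from the observed drop (values from the post).
R_G = 100e3   # gate bias resistor, ohms
V_DROP = 0.4  # measured drop across R_G, volts

i_gate = V_DROP / R_G  # Ohm's law: I = V / R
print(f"Implied gate current: {i_gate * 1e6:.1f} uA")  # 4.0 uA
```

A few microamps of rectified gate current is plausible for a FET gate driven hard by its own oscillation, so the drop itself does not necessarily mean the stub filtering has failed.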

I thought I had done everything properly. The quarter-wave stub and the 10 pF capacitor should stop the 10 GHz signal from reaching the 100 kOhm resistor. I also made additional "test" quarter-wave stubs at the LO input of a mixer to check their effectiveness, and their length is surely right, as the mixer starts to work only after cutting them off.
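
As a cross-check on the stub dimensioning, the physical quarter-wave length at 10 GHz depends on the line's effective permittivity. A sketch of that calculation; the `EPS_EFF` value below is an assumed example (typical for a microstrip on a common RF laminate), not a value from the post, so substitute the real number for the actual substrate and line width:

```python
# Quarter-wave stub length at 10 GHz.
# EPS_EFF is an assumed example value, not from the post -- it depends
# on the actual substrate and the width of the high-impedance line.
C0 = 299_792_458.0  # speed of light in vacuum, m/s
F = 10e9            # design frequency, Hz
EPS_EFF = 2.9       # assumed effective permittivity of the line

lam_g = C0 / (F * EPS_EFF ** 0.5)  # guided wavelength on the line
stub_len = lam_g / 4               # electrical quarter wave
print(f"Quarter-wave stub length: {stub_len * 1e3:.2f} mm")
```

Note that a stub cut for the high-impedance feed line has a different effective permittivity (and thus length) than one cut for a 50 ohm line, which is one way a stub that tests fine at a mixer port can still be slightly off in the bias network.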

A long time ago I made a multiplier with a similar biasing scheme, and it looked OK.
 
