Noise performance of LNA after Calibre PEX

mtwieg

My design relies on several very large, low-noise OTAs which in the schematic view simulate with ~1 nV/√Hz of input-referred noise. But after parasitic extraction with PEX, the simulated noise jumps tremendously to 1.64 nV/√Hz. I'm only extracting resistance (at 1 MHz, parasitic capacitance isn't really relevant). I've found that if I extract only the PMOS input transistors, I still get about the same noise increase, which isn't surprising. If I convert the input-referred noise voltage to an input-referred noise resistance, my results suggest there's effectively over 50 Ω of resistance on the input of my LNA (I'll call this ΔRn for convenience: the gate resistance which accounts for the difference in noise between the schematic and PEX simulations).
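
For reference, here's the arithmetic behind that figure; a quick sanity check assuming T = 300 K and that the excess noise is split across two uncorrelated gate resistances, one per input device of the differential pair:

```python
# Convert the excess input-referred noise between schematic and PEX runs
# into an equivalent thermal noise resistance: Rn = vn^2 / (4kT).
k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # assumed simulation temperature, K

vn_sch = 1.0e-9    # schematic input-referred noise, V/sqrt(Hz)
vn_pex = 1.64e-9   # post-PEX input-referred noise, V/sqrt(Hz)

dvn2 = vn_pex**2 - vn_sch**2      # excess noise power density, V^2/Hz
dRn_total = dvn2 / (4 * k * T)    # total equivalent noise resistance
dRn_per_gate = dRn_total / 2      # split across the two input devices

print(f"total dRn    = {dRn_total:.0f} ohm")     # ~102 ohm
print(f"per-gate dRn = {dRn_per_gate:.0f} ohm")  # ~51 ohm, i.e. "over 50"
```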

The issue is that this is far higher than expected from my layout (I expect a few ohms at most). In fact, nothing I do to the layout changes ΔRn by much; it's always between 40 and 100 Ω. At this point I've thrown everything I can think of at the gate resistance, with little success: doubling the number of fingers, contacting the gates on both sides, using four rows of poly contacts on each side, using triple-layered metal for the gate interconnects... And I don't think source resistance is to blame either, since the voltage gain in PEX drops by only 0.1 dB, which isn't enough to explain the noise increase.
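
To show why I expect only a few ohms, here's the standard distributed-RC estimate of effective gate resistance; a back-of-the-envelope sketch in which the sheet resistance and geometry are illustrative guesses, not my actual process values:

```python
# Effective gate resistance of a multi-finger MOSFET (distributed line):
# contacting one end of a finger gives Rsh*(Wf/L)/3; contacting both ends
# gives Rsh*(Wf/L)/12. The nf fingers then combine in parallel.
def gate_res(rsh_poly, wf, l, nf, both_sides=True):
    per_finger = rsh_poly * (wf / l) / (12.0 if both_sides else 3.0)
    return per_finger / nf

rsh = 10.0   # ohm/sq, assumed silicided-poly sheet resistance
l   = 0.18   # um, assumed channel length
wf  = 4.0    # um, assumed finger width (total W = wf * nf)

print(gate_res(rsh, wf, l, nf=192))                    # ~0.1 ohm
print(gate_res(rsh, wf, l, nf=192, both_sides=False))  # ~0.4 ohm

# Scaling check: shrinking the device by N=8 at fixed finger width only
# reduces the parallel finger count, so Rg should rise by exactly 8x.
print(gate_res(rsh, wf, l, nf=24) / gate_res(rsh, wf, l, nf=192))  # 8.0
```

Even with pessimistic numbers this lands well under 1 Ω, which is why 40-100 Ω looks unphysical to me; the last line is also why I expected ΔRn to scale with N in the experiment below.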

In fact, if I scale the device (and its bias current) down by a factor of, say, N=8, ΔRn does not scale up by 8 as expected, but only by a much smaller factor. This suggests to me that the noise in PEX is not from effective gate resistance but from some other mechanism I can't deduce. Resistance in the N-well occurred to me, but adding more N-taps doesn't seem to help at all.

When I get back to the office I'll grab a screenshot of my layout for reference.

Any insight would be appreciated.
 

So are you saying that the gate resistance is too high? Do you know how much of your gate resistance is modeled in your device model (like BSIM)? Or is it all extracted as a parasitic resistance? If you add the rlayer keyword into your pex netlist statement, you can then see which parasitic resistance corresponds to which physical layer, and then you can look at your extracted netlist to see what the gate resistance is, and if it meets your expectations or not.
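
Once the layers are annotated, something like this quick script could total the extracted resistance per layer; a rough sketch that assumes, hypothetically, that the layer name appears as a `layer=` token on each resistor line (adjust the parsing to whatever your netlist actually emits):

```python
# Rough helper: total parasitic resistance per layer in an extracted netlist.
# Hypothetical line format assumed:  Rxxx node1 node2 12.34 $ layer=poly
# Note: a series sum is only a crude upper bound, not the network's
# effective resistance, but it's enough to spot a suspicious layer.
from collections import defaultdict

def sum_res_by_layer(netlist_path):
    totals = defaultdict(float)
    with open(netlist_path) as f:
        for line in f:
            tok = line.split()
            if not tok or not tok[0].upper().startswith("R"):
                continue
            try:
                value = float(tok[3])   # Rname n1 n2 value ...
            except (IndexError, ValueError):
                continue
            layer = "unknown"
            for t in tok[4:]:
                if t.lower().startswith("layer="):
                    layer = t.split("=", 1)[1]
            totals[layer] += value
    return dict(totals)

print(sum_res_by_layer("lna.pex.netlist"))  # hypothetical file name
```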

So are you saying that the gate resistance is too high?
I'm saying that the PEX results suggest the gate resistance is very high, much higher than it should be given my layout.
Do you know how much of your gate resistance is modeled in your device model (like BSIM)? Or is it all extracted as a parasitic resistance?
Good question. I've been told that poly gate resistance is included in the MOS device models, so it should already be factored into the schematic simulation results. But looking at my PEX netlist, I can see that it also includes parasitic resistors for the poly gates, which makes me wonder whether the gate resistance is being double-counted.

If you add the rlayer keyword into your pex netlist statement, you can then see which parasitic resistance corresponds to which physical layer, and then you can look at your extracted netlist to see what the gate resistance is, and if it meets your expectations or not.
This sounds very useful; could you give some details on how to do this? Is it something in the PEX inputs dialog?
 

I did a few more simple tests and discovered another mystery, one which may partially explain the existing one...

I ran two simulations of a simple PMOS differential pair. The first used total width = 768 µm with nf = 192 and one instance per PMOS. The second used an array of 8 instances per PMOS, each with total width = 96 µm and nf = 24, so the two setups have the same overall L, W, and number of fingers. Simulating both with just the schematic views, the two have almost exactly the same gm, but the first has significantly lower noise. I can't think of any reason this would be the case in real or simulated devices, yet for some reason the noise models say it is so.
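
To make my expectation explicit: with the usual long-channel expressions, the input-referred noise of a diff pair depends only on gm and on the effective gate resistance, and both come out identical whether the width sits in one instance or is split across eight. A quick sketch (γ, gm, and the geometry are assumed, illustrative values):

```python
import math

k, T = 1.380649e-23, 300.0
gamma = 2.0 / 3.0        # assumed long-channel thermal noise factor

def diffpair_vn(gm, rg):
    # Input-referred noise density of a diff pair: two channels + two gates.
    vn2 = 2 * 4*k*T*gamma/gm + 2 * 4*k*T*rg
    return math.sqrt(vn2)

def gate_res(rsh, wf, l, nf, n_par):
    # nf fingers per instance, n_par identical instances in parallel.
    return rsh * (wf / l) / 12.0 / nf / n_par

gm = 20e-3                      # assumed total gm per side, S (same both ways)
rsh, wf, l = 10.0, 4.0, 0.18    # illustrative sheet res. and geometry

# Case 1: one instance, nf = 192.  Case 2: 8 parallel instances, nf = 24.
rg1 = gate_res(rsh, wf, l, nf=192, n_par=1)
rg2 = gate_res(rsh, wf, l, nf=24,  n_par=8)
print(math.isclose(rg1, rg2))                       # True: same resistance
print(diffpair_vn(gm, rg1), diffpair_vn(gm, rg2))   # identical noise
```

Physically the two cases should be indistinguishable, which is why I suspect a modeling artifact (something geometry-dependent computed per instance) rather than a real effect.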

If, as that simulation suggests, a MOS device broken into smaller instances simulates with worse noise than a single large instance, that might explain why my earlier PEX results are poor: PEX breaks fingered devices into many single-finger devices, which would make the PEX results noisier, but not because of interconnect resistance.

Now the question is why I'm seeing this strange behavior in the modeled noise of the MOS devices. Frankly, I don't think such a trend can be real. So which result should I believe?
 
