idiot-proof adjustable voltage generator for near-zero-current-draw application?


quantized
Hi, I have an all-digital design that would benefit greatly from the ability to add fine-grained, in-the-field adjustable delays significantly less than one gate delay (FO4). These delays are not critical to functionality -- if they don't work, the chip won't be a total loss; it will simply underperform.

My plan is to include current-starved inverters where I need this functionality. I only need to delay the falling edge of the signal, so the inverter in question has an extra NMOS where the GND connection ought to be, and the gate of the extra NMOS is connected to an analog reference voltage; the lower the voltage, the slower the fall time of the inverter. The reference voltage is set via an (off-chip software) feedback loop; I do not need to be able to set absolute voltages or fall times -- I just need the ability to say "make it fall slower" and "make it fall more quickly" every few hundred milliseconds until I find the right fall time (which is probably going to be die-specific). I make these decisions based on the error rates observed in various results; again, all of this is done off-chip in easy-to-modify software.
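For concreteness, here is roughly what the software loop looks like; set_dac_code() and measure_error_rate() are placeholders for my actual test harness, and the 8-bit code is relative, not calibrated:

```python
# Minimal sketch of the off-chip tuning loop: a hill-climb over a
# relative bias code, halving the step each time a move stops helping.
# set_dac_code() and measure_error_rate() are hypothetical stand-ins
# for the real chip/test-harness interface.
import time

def tune_bias(set_dac_code, measure_error_rate,
              code=128, step=8, settle_s=0.3):
    set_dac_code(code)
    time.sleep(settle_s)               # wait for the bias node to settle
    best = measure_error_rate()
    direction = +1                     # +1 = "fall faster", -1 = "fall slower"
    while step >= 1:
        trial = max(0, min(255, code + direction * step))
        set_dac_code(trial)
        time.sleep(settle_s)
        err = measure_error_rate()
        if err < best:                 # the move helped; keep going
            code, best = trial, err
        else:                          # it didn't; reverse, then refine
            direction = -direction
            step //= 2
    return code, best
```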

So, my question is this: is there some very-hard-to-screw-up circuit a digital engineer like me can use to generate adjustable voltages on chip? Again, the voltages in question are only relative (not absolute), the current drawn from them is near zero (just gate leakage), and they never go below GND or above VDD. I'm slightly afraid of charge pumps because they can create voltages above VDD or below GND, which has scary latchup consequences.

I've seen all sorts of super-sophisticated designs for similar starved-inverter bias inputs in DLLs and PLLs, but I get the feeling there must be something a lot simpler I could be using. For example, I only need to change the voltage every few hundred milliseconds -- or even less often -- so putting an enormous capacitor on the generator's output ought to stabilize the voltage and filter out noise, right? The downside, of course, is that it takes longer to change the voltage, but that doesn't matter to me; even if it took an entire second to change the voltage, that would still be very useful.
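Here's a quick back-of-the-envelope check of the capacitor idea; the R and C values are illustrative guesses (a resistor-ladder tap impedance and an on-chip cap), not process numbers:

```python
# Settling time of the bias node behind a big filter capacitor.
# R_out and C_filter are assumed, illustrative values -- the point
# is only that even generous numbers settle far faster than the
# few-hundred-millisecond update rate I need.
import math

R_out    = 100e3    # ohms: assumed output impedance of the generator
C_filter = 100e-12  # farads: assumed on-chip filter capacitance

tau = R_out * C_filter                 # RC time constant
t_settle = tau * math.log(1 / 0.001)   # time to settle within 0.1%

print(f"tau = {tau * 1e6:.1f} us, 0.1% settling in {t_settle * 1e6:.1f} us")
# -> tau = 10.0 us, 0.1% settling in 69.1 us
```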

If all else fails I'll bring in the reference voltages from off-chip via dedicated analog pads, but in an ideal world I'd like to have far more separate adjustable voltages than there are pads on the chip.

So, if anybody can point me to a paper that describes the most idiot-proof-possible solution to this problem, I'd appreciate it. Again, I'm an experienced digital designer, but I have essentially zero experience with analog circuitry.

Thanks!
 

Can't you simply use a DAC, or a digital-switch-controlled output from a potentiometer (resistor chain)?
 

Can't you simply use a DAC, or a digital-switch-controlled output from a potentiometer (resistor chain)?

Hrm, the resistor-chain idea sounds like a pretty good one! I can make resistors pretty easily with unsilicided poly or ordinary diffusion (it's a non-silicided-diffusion process). Not a whole lot can go wrong with all-passive components, as long as I resample and adjust frequently enough.
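Here's a quick model of how I picture the string working; VDD and the tap count are placeholders, and the taps are treated as unloaded since the starved inverter's gate only draws leakage:

```python
# Unloaded resistor-string "DAC": N equal resistors from VDD to GND,
# with a digital mux picking one tap as the bias voltage. VDD and
# N_TAPS are illustrative, not process values.
VDD    = 1.2    # volts (assumed supply)
N_TAPS = 64     # unit resistors in the string

def tap_voltage(code):
    """Voltage at tap 'code' (0..N_TAPS); pure divider, no load."""
    assert 0 <= code <= N_TAPS
    return VDD * code / N_TAPS

for code in (0, 16, 32, 48, 64):
    print(f"code {code:2d} -> {tap_voltage(code):.3f} V")
# Resolution is VDD / N_TAPS (~19 mV here), and every tap is between
# GND and VDD by construction -- no charge-pump latchup worries.
```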

Thanks!
 

