sharkies
Member level 5
Let's consider a 1.0V supply current-steering 10-bit DAC in 90nm.
1 LSB will be approximately 1 mV (1.0 V / 2^10).
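As a quick sanity check, here is the arithmetic behind that number, assuming the full 1.0 V supply is used as the output swing:

```python
# Sketch: LSB size for a 10-bit DAC with a 1.0 V full-scale output swing
# (assumption: the whole 1.0 V supply range is the usable output range).
VDD = 1.0    # supply / full-scale voltage in volts
BITS = 10    # DAC resolution in bits

lsb = VDD / (2 ** BITS)  # one code step in volts
print(f"1 LSB = {lsb * 1e3:.3f} mV")  # -> 1 LSB = 0.977 mV
```

So one code step is just under 1 mV, which is why ground bounce of a millivolt or more matters.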
Now, if the current-steering DAC is driving an on-chip resistor, then the voltage across this resistor should change by 1 mV when the input code changes by 1 LSB.
If we consider a PMOS current cell for the DAC, the DAC resistor is connected to the current cell on one side and to the analog ground on the other.
However, the analog ground will inevitably fluctuate by more than 1 mV due to substrate noise and other disturbances.
How do people get around this problem? Does moving the resistor off-chip help, since the resistor would then be tied to a quieter ground reference? I am quite confused here.