donkehote
Newbie level 4
Hi folks,
This is my first post on this board and I'm hoping you guys can help out. I have a variable gain amp that draws 1 mA in the high gain condition and 40 mA in the low gain condition. I'm planning on using a potentiometer to supply the AGC control voltage, which ranges from 1.2 V (high gain) to 5 V (low gain). My question is really about potentiometers. My senior designer said that if I use a 5K potentiometer, then it needs to be able to handle 1 mA in the high gain condition and 4 mA in the low gain condition. If I use a 1K potentiometer, it needs to be able to handle 5 mA in the high gain condition and 4 mA in the low gain condition.
Can someone here explain why the current differs only in the high gain condition? Is it just that the total voltage drop across the potentiometer should be 5 V, in which case a 10K potentiometer would need to handle 0.5 mA? Or is it something else?
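To show where my 0.5 mA guess comes from, here's the quick arithmetic I did, assuming the full pot track simply sits across the 5 V supply as a divider (that assumption may be exactly what I'm getting wrong):

```python
# Bleed current through the pot track, assuming the whole track is
# connected across the 5 V supply: I = V / R (Ohm's law).
V_SUPPLY = 5.0  # volts across the full potentiometer track (assumption)

def pot_bleed_current_ma(r_pot_ohms):
    """Current through the pot track in milliamps."""
    return V_SUPPLY / r_pot_ohms * 1000.0

for r in (1_000, 5_000, 10_000):
    print(f"{r // 1000}k pot -> {pot_bleed_current_ma(r):.1f} mA")
# 1k pot -> 5.0 mA
# 5k pot -> 1.0 mA
# 10k pot -> 0.5 mA
```

The 5 kΩ and 1 kΩ results match the high gain numbers my senior designer quoted, which is why I suspect the pot's own divider current is the explanation, but I don't see where the fixed 4 mA low gain figure comes from.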