neazoi
Advanced Member level 6
I have a 100 µA analogue (needle) meter and I want to convert it into a 1 A meter so I can read currents up to 1 A. At 1 A it should read full scale (100 µA through the movement), at 0.5 A it should read 50 µA, and so on (a linear conversion).
How can I do it without adding much resistance to the DC path, so as not to lose voltage fed to the series circuit?
(The DC supply can be anything from 1.2 V to 20 V and the device draws 500-800 mA.)
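The usual way to rescale a moving-coil meter is a shunt resistor in parallel with the movement: the shunt carries almost all of the current, and its value is chosen so the coil sees exactly its full-scale current at the new range. Here is a minimal sketch of the arithmetic, assuming a coil resistance of 1 kΩ (the actual coil resistance of this meter is not given in the post and must be measured):

```python
def shunt_resistance(i_full_scale, i_meter_fs, r_meter):
    """Shunt that diverts the excess current around the meter coil.

    i_full_scale: desired full-scale reading of the new ammeter (A)
    i_meter_fs:   full-scale current of the movement (A)
    r_meter:      coil resistance of the movement (ohms)
    """
    # At full scale the movement drops V = i_meter_fs * r_meter across
    # itself; the shunt must carry the remaining current at that voltage.
    return (i_meter_fs * r_meter) / (i_full_scale - i_meter_fs)

# 100 uA movement rescaled to 1 A full scale, assuming a 1 kohm coil:
r_shunt = shunt_resistance(1.0, 100e-6, 1000.0)
print(round(r_shunt, 5))        # shunt value in ohms -> about 0.1 ohm
print(round(1.0 * r_shunt, 3))  # voltage burden at 1 A -> about 0.1 V
```

With these example numbers the shunt comes out near 0.1 Ω, so the combined meter drops only about 0.1 V at 1 A, which is small even against the 1.2 V minimum supply mentioned above. A lower-resistance coil would give a proportionally lower shunt value and burden.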