difference between analog and digital power supply

Hi,

High-precision analog circuits are sensitive to noise and spikes on the supply and ground lines. Digital circuits (especially when switching off higher inductive loads, or switching on large capacitances) generate fast current transitions and create ground shifts, supply-line spikes, or crosstalk onto other lines. That is why it is best design practice to separate the supplies of analog and digital circuits. Analog circuits often require low ripple on supply and ground, so linear regulators are better suited than DC/DC converters. To supply, for instance, high-precision sensor systems (or a microcontroller with an on-chip ADC) with both digital and analog rails, a mixed topology combining a DC/DC converter with a linear regulator is a good approach. You can either use multiple power supply ICs or a single-chip solution like the iC-DC from iC-Haus, which has a combined structure with two supply outputs (a block diagram can be found here: iC-Haus Homepage - product: iC-DC ).
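To put rough numbers on those "fast current transitions", here is a small back-of-the-envelope sketch. All values (trace inductance, current step, edge time, ADC resolution) are assumed example figures for illustration only, not taken from any datasheet:

```python
# Rough illustration: why fast digital edges disturb a shared supply/ground.
# All numbers below are assumed example values, not measurements.

L_trace = 10e-9   # assumed parasitic inductance of a shared ground/supply trace (10 nH)
di = 50e-3        # assumed current step when a digital output switches (50 mA)
dt = 2e-9         # assumed edge time (2 ns)

# V = L * di/dt : the spike the current transition induces across the shared trace
v_spike = L_trace * di / dt
print(f"Induced supply/ground spike: {v_spike * 1e3:.0f} mV")   # about 250 mV

# For comparison: one LSB of a 12-bit ADC with an assumed 3.3 V reference
v_lsb = 3.3 / 2**12
print(f"12-bit ADC LSB: {v_lsb * 1e3:.2f} mV")                  # about 0.8 mV
```

Even with these modest assumptions, the spike from one digital edge is a few hundred times larger than one ADC LSB, which is why sharing one supply and ground path between digital and precision analog sections quickly ruins the analog accuracy.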
 

