Welcome to EDAboard.com

difference between analog and digital power supply

Hi,

High-precision analog circuits are sensitive to noise and spikes on the supply and ground lines. Digital circuits, especially when they switch large inductive loads off or large capacitive loads on, generate fast current transients that cause ground shifts, supply-line spikes, and crosstalk onto other lines. That is why it is best design practice to separate the supplies of analog and digital circuits. Analog circuits often require low supply and ground ripple, so a linear regulator is a better choice than a DC/DC converter. To supply, for instance, a high-precision sensor system (or a microcontroller with an on-chip ADC) with both digital and analog rails, a mixed topology of a DC/DC converter followed by a linear regulator is a good approach. You can either use multiple power-supply ICs or a combined solution such as the iC-DC from iC-Haus, which provides two supply outputs in one structure (a block diagram is here: iC-Haus Homepage - product: iC-DC).
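To put rough numbers on the two effects above, here is a small sketch: the ground shift from a fast current transient follows V = L * di/dt, and the ripple left after a linear regulator follows from its PSRR in dB. All component values below (trace inductance, switching current, edge time, PSRR) are illustrative assumptions, not figures from this thread.

```python
def ground_bounce(inductance_h: float, di_a: float, dt_s: float) -> float:
    """Voltage spike V = L * di/dt induced across a shared ground/supply trace."""
    return inductance_h * di_a / dt_s


def ldo_output_ripple(input_ripple_v: float, psrr_db: float) -> float:
    """Ripple remaining after a linear regulator with the given PSRR (in dB)."""
    return input_ripple_v * 10 ** (-psrr_db / 20)


# Assumed: 10 nH of shared trace, 100 mA switched in 5 ns.
# -> 0.2 V of ground shift, far above the LSB of a precision ADC.
bounce = ground_bounce(10e-9, 0.1, 5e-9)

# Assumed: 50 mV of DC/DC switching ripple into an LDO with 60 dB PSRR.
# -> 50 uV of ripple on the analog rail, which is why the DC/DC + linear
# regulator combination works well for mixed-signal supplies.
ripple = ldo_output_ripple(0.05, 60.0)

print(f"ground bounce: {bounce:.3f} V")
print(f"post-LDO ripple: {ripple * 1e6:.1f} uV")
```

Even with these rough numbers, the point of the post is visible: the digital transient alone can shift a shared ground by hundreds of millivolts, while a linear post-regulator knocks converter ripple down by orders of magnitude.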
 
