
Choice between Buck and Regulator


tiwari.sachin

I need to generate 3.3V/2A from a 24V supply (28V max).

I am currently using

24V --> 5V (Buck) --> 3.3V (LDO)

There is a single part that uses 5V, while all the others run on 3.3V.

I was considering using a single DC-DC (buck) converter to generate 3.3V directly from 24V.

The customer (the project manager) feels that this might not be a good option, as the input would be at 95% of the buck's maximum rated input voltage, so dropping to 5V first and then to 3.3V is the better choice.

I am of the opinion that we can save a dollar or two by removing the LDO.


Will it really be a problem in the long run if we run the buck near its maximum input rating?
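
For a rough idea of what the LDO costs beyond its BOM price, here is the power budget I am comparing (a minimal sketch in Python; the full 2A on the 3.3V rail and the ~90% buck efficiency are my assumptions, not measured values):

Code:
# Rough loss comparison: 24V -> 5V buck -> 3.3V LDO versus a direct 24V -> 3.3V buck.
# Assumptions: worst-case 2A load on the 3.3V rail, ~90% buck efficiency.

V_5V, V_3V3 = 5.0, 3.3   # rail voltages (V)
I_LOAD = 2.0             # assumed worst-case 3.3V load current (A)
BUCK_EFF = 0.90          # assumed buck converter efficiency

# Option A: buck to 5V, then LDO to 3.3V.
# The LDO dissipates its full headroom times the load current as heat.
p_ldo = (V_5V - V_3V3) * I_LOAD                        # LDO dissipation (W)
p_buck_a = (V_5V * I_LOAD) / BUCK_EFF - V_5V * I_LOAD  # buck losses feeding the LDO (W)

# Option B: direct buck to 3.3V.
p_out = V_3V3 * I_LOAD
p_buck_b = p_out / BUCK_EFF - p_out                    # buck losses only (W)

print(f"Buck + LDO: {p_ldo:.2f} W (LDO) + {p_buck_a:.2f} W (buck) = {p_ldo + p_buck_a:.2f} W lost")
print(f"Direct buck: {p_buck_b:.2f} W lost")

With those assumed numbers the LDO alone dissipates about 3.4W at full load, so it is a thermal question as much as a cost one.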
 

Hi,

Your information is confusing.

The customer (the project manager) feels that this might not be a good option, as the input would be at 95% of the buck's maximum rated input voltage
The maximum rated input voltage of a buck converter does not depend on the output voltage (unless you are using different ICs).

If you are talking about different ICs, then just select one with a higher rated input voltage. Every manufacturer and distributor has online selection tools; use them.

I am of the opinion that we can save a dollar or two by removing the LDO.
How can you remove it?
If you use 24V --> 5V then you can't supply the 3.3V parts.
If you use 24V --> 3.3V then you can't supply the 5V part.
So what configuration are you talking about?

Will it really be a problem in the long run if we run the buck near its maximum input rating?
No. But be aware that 24V does not mean 24.000V, free of noise, spikes and variations.
Maybe it varies with the input voltage, maybe there are overshoots at power-on...
Thus it's a good idea to leave enough headroom... for every part: IC, capacitor...
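
To put rough numbers on that headroom (a minimal sketch; the 4V transient allowance, the 80% derating target and the candidate ratings are illustrative assumptions, not datasheet values):

Code:
# Headroom check: does the worst-case input stay comfortably below the
# converter's absolute-maximum input rating?

V_MAX_STEADY = 28.0   # stated maximum steady-state input (V)
V_TRANSIENT = 4.0     # assumed allowance for power-on overshoot / spikes (V)
DERATING = 0.80       # keep the worst case at or below 80% of the absolute maximum

v_worst = V_MAX_STEADY + V_TRANSIENT

for v_abs_max in (30.0, 36.0, 42.0):   # candidate IC absolute-maximum ratings (V)
    ok = v_worst <= DERATING * v_abs_max
    print(f"{v_abs_max:4.0f} V part: worst case is {v_worst / v_abs_max:.0%} of abs max"
          f" -> {'OK' if ok else 'too tight'}")

With those assumptions a 42V-rated part is the first one that leaves comfortable margin; a 30V-rated part is already exceeded by the transient estimate.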

Klaus
 

How can you remove it?
If you use 24V --> 5V then you can't supply the 3.3V parts.
If you use 24V --> 3.3V then you can't supply the 5V part.
So what configuration are you talking about?


I have only one part that needs 5V; its operating voltage range is 1.65V to 5.5V, so 3.3V should be fine.
 

Hi,

I have only one part that needs 5V; its operating voltage range is 1.65V to 5.5V, so 3.3V should be fine.
--> the part does not need 5V. It may be supplied with 3.3V.

Klaus
 

No. But be aware that 24V does not mean 24.000V, free of noise, spikes and variations.
Maybe it varies with the input voltage, maybe there are overshoots at power-on...
Thus it's a good idea to leave enough headroom... for every part: IC, capacitor...

Klaus


This is the concern. The input is a 24V/3A wall adapter.
There will be a switch connected between J1 and J2 as shown. What can I use to control overshoots at power-on (switch ON/OFF)?

[Attachment: 1.png]
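
To get a feel for how big a power-on overshoot can be, here is a minimal sketch of the input modelled as a series RLC driven by a step (the 1 µH wiring inductance, 10 µF input capacitance and 50 mΩ series resistance are purely assumed values, not taken from my board):

Code:
# An underdamped series RLC hit with a voltage step rings above the source voltage;
# this estimates the peak seen by the converter input when the switch closes.
import math

V_SOURCE = 24.0   # adapter voltage applied at switch-on (V)
L_WIRE = 1e-6     # assumed cable/trace inductance (H)
C_IN = 10e-6      # assumed converter input capacitance (F)
R_SER = 0.05      # assumed total series resistance (ohm)

alpha = R_SER / (2 * L_WIRE)            # damping factor (1/s)
omega0 = 1 / math.sqrt(L_WIRE * C_IN)   # undamped natural frequency (rad/s)
zeta = alpha / omega0                   # damping ratio

if zeta < 1:
    # Standard second-order step-response overshoot.
    overshoot = math.exp(-math.pi * zeta / math.sqrt(1 - zeta**2))
    print(f"Damping ratio {zeta:.2f}: peak input ~{V_SOURCE * (1 + overshoot):.1f} V")
else:
    print("Overdamped: no ringing above the source voltage")

With these assumed values the peak lands well above 28V, which is why the headroom/OVP question matters to me.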
 

Hi,

There are overvoltage protection devices as well as overvoltage protection circuits. Just do an internet search.

My recommendation is still: use a buck regulator with an appropriate input voltage range. There are more than enough available.

Klaus
 

You may have a valid point that a 3.3V converter is more straightforward for the majority of devices. Only one device needs 5V. But now someone ought to verify it (you seem to be a candidate). Test various options to compare cost, stability, parts count, etc.

Questions:

* As for the 5V load, how much current does it draw?

* How do you propose to derive 5V? From 24V? Or from a 3.3V converter? Which is more economical: a 5V converter as a middleman, or separate converters?

* Do your loads switch on and off, or change rapidly, creating glitches on either the 5V or 3.3V supply? Do you have devices which get upset by such glitches? Then the decision might be made for you, and you cannot use the customer's easy arrangement. Multiple power supplies may be needed to satisfy some types of devices.
 

You may have a valid point that a 3.3V converter is more straightforward for the majority of devices. Only one device needs 5V. But now someone ought to verify it (you seem to be a candidate). Test various options to compare cost, stability, parts count, etc.

Questions:

* As for the 5V load, how much current does it draw?


It's hardly 50 to 100mA (it's just a buzzer).

* How do you propose to derive 5V? From 24V? Or from a 3.3V converter? Which is more economical: a 5V converter as a middleman, or separate converters?

I don't need 5V at all.

* Do your loads switch on and off, or change rapidly, creating glitches on either the 5V or 3.3V supply? Do you have devices which get upset by such glitches? Then the decision might be made for you, and you cannot use the customer's easy arrangement. Multiple power supplies may be needed to satisfy some types of devices.


This is the thing that's bothering me. I cannot check it at the design phase and need an assembled PCB to do so. If it works, everyone is happy; if it doesn't, it all has to be redone, including the PCB design, and that will cost time and money.


Anyway... I am taking my chances and designing 24V to 3.3V directly. Let's see how the test goes :)
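
Before committing to the layout I also sanity-checked the conversion ratio against minimum on-time (a rough sketch; the 500kHz switching frequency and 100ns minimum on-time are example assumptions, not values from a specific part):

Code:
# Duty-cycle / minimum on-time feasibility check for a direct 24V (28V max) -> 3.3V buck.

F_SW = 500e3        # assumed switching frequency (Hz)
T_ON_MIN = 100e-9   # assumed controller minimum on-time (s)
V_OUT = 3.3         # output voltage (V)

for v_in in (24.0, 28.0):            # nominal and maximum input (V)
    duty = V_OUT / v_in              # ideal duty cycle, losses ignored
    t_on = duty / F_SW               # required on-time per switching cycle (s)
    status = "OK" if t_on >= T_ON_MIN else "below minimum on-time"
    print(f"Vin = {v_in:.0f} V: duty = {duty:.1%}, on-time = {t_on * 1e9:.0f} ns -> {status}")

Both cases leave plenty of margin over the assumed minimum on-time, so the single-stage step-down looks feasible on that front.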
 

Your decision to use a 3.3V converter sounds okay (in view of post #4).
However, might the customer-manager still prefer his arbitrary decision? Is he forgetting the losses due to the inefficiency of putting in a middleman converter stepping 5V down to 3.3V? It adds more burden on the 5V converter, which may fail prematurely.
He may pay attention to what you say if you let him hear how loud the buzzer is at 3.3V.
 

Thanks, guys. I appreciate your responses.


As I said, the customer (manager) is more concerned about the following...

if the input voltage crosses 28V....

if the buck fails... all the other parts (those running on 3.3V) will burn


If it is 24V --> 5V --> 3.3V, then the chance of burning the entire board is lower, since at most the buck fails... and my answer to that was: if only the buck fails, and it fails short instead of open, then chances are the 3.3V regulator will fail within seconds anyway.

He finally left saying... you are the lead... it's your call... so I am going with 24V to 3.3V (I don't see any reason why it would fail). We do not have overvoltage protection at the 24V input, and that is possibly the only weak point. We are not using OVP due to cost constraints, and a tested 24V adapter will always be supplied.
 

Hi,

if the input voltage crosses 28V....
I assume there are hundreds of different controllers specified for V_in > 28V.
--> Simply choose the right one.

if the buck fails... all the other parts (those running on 3.3V) will burn
No circuit will survive every imaginable failure. Not even the previous circuit would.
--> First check the requirements (ask the customer manager for them), then design your circuit according to them. Then there will be no reason for the circuit to fail.
If the circuit fails, then:
* maybe the requirement was not correct (not your fault)
* or there was another failure (mistreatment of the device, bad assembly or soldering...) (also not your fault)

Klaus
 

