
Power Supply Question

Status
Not open for further replies.

lucky6969b

I have a 520W power supply for my personal computer.
I used to have a 500W power supply for the same computer setup. If the mains supplies 13A of current, would the 520W power supply draw more energy from the mains than the 500W one while the current stays constant? Would the voltage level increase if the mains were a constant-current supplier?
Thanks
Jack
 

I didn't understand your question. Could you rephrase it?

13A on 220V mains is 2.86 kW of power.

You can't draw more current than the circuit resistance allows.
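As a quick sanity check of that figure (assuming the 220V mains voltage stated above):

```python
# Maximum power available from a 13A-fused outlet on 220V mains
voltage = 220   # volts (assumed, per the post above)
current = 13    # amps (fuse / wiring limit)

power = voltage * current
print(power)    # 2860 W, i.e. 2.86 kW
```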
 

I mean, how can a 13A mains supply enough energy to a 520W supply, a 500W supply, or even a 1000W one, when the energy supplied is finite?
Thanks
Jack
 

Your understanding of how current works needs adjusting.

A device will only draw as much current as it needs, according to its design. A 13A mains supply simply means that it can supply up to 13A, not that it must supply that much. The 13A limit is usually set by a 13A fuse and the safe limit of the wiring itself.

Voltage (as provided by a power supply, whether your 500W one or the mains itself) is the 'pushing force' that tries to push charge through things. Current is a measure of the flow of charge that gets pushed through a device connected across the voltage. The device will have some resistance to that flow of charge, either in a simple way (like a resistor) or in a more complicated way (an 'active' device like a PC, or a 500W power supply). That resistance, in conjunction with the voltage, determines how much charge actually flows.

So, while the source might be capable of providing a huge flow of charge (high current), the device connected will regulate the flow as it requires.

For simple things, like a heater element, the current will be determined by the resistance of the heater coil by a formula known as Ohm's law:

V = I * R

Where:
V = voltage (across the heating coil resistance, the power supply voltage)
R = the resistance (to the flow of current) of the heater coil, in ohms
I = the amount of current that gets to flow through the heater
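A quick worked example of that formula; the 230V supply and 23-ohm coil values here are made up purely for illustration:

```python
# Ohm's law: V = I * R, rearranged to I = V / R
voltage = 230.0     # supply voltage in volts (hypothetical)
resistance = 23.0   # heater coil resistance in ohms (hypothetical)

current = voltage / resistance
print(current)      # 10.0 A flows through the heater

power = voltage * current
print(power)        # 2300.0 W dissipated in the coil
```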

Your 500/520W power supplies are more complicated than that simple example, and don't follow Ohm's law, but the same principle of only taking what current they require still applies.

A 500W power supply, like the mains, can supply up to 500W safely (and remember, W = V * A). It will provide a fixed voltage and whatever it is powering will determine how much current is taken (and hence how much power in watts is consumed).

Likewise the 520W power supply will only supply what is needed, but it can supply an extra 20W if the powered device requires it. It will not provide all 520W unless it is demanded by the powered device, and will not draw more current from the mains than it itself requires.

So, the PC that is powered by the power supply will consume a certain amount of power, and this will determine how much current is taken from the power supply (since power = V * I). The power supply passes this power requirement up to the mains; that is its function. The power supply itself also consumes some power for its own functioning, and wastes some as heat because it is not 100% efficient at converting mains voltage to lower voltages. So, the total amount of power needed by the PC and the PSU together determines how much current is drawn from the mains.
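The chain described above can be sketched numerically; the 300W load and 85% efficiency figures below are hypothetical, chosen just to show that the mains current depends on the load, not the PSU's rating:

```python
# Mains current drawn by a PSU depends on what the PC actually consumes,
# plus the PSU's own conversion losses -- not on the 500W/520W label.
mains_voltage = 220.0   # volts (assumed, per the thread)
load_power = 300.0      # watts actually consumed by the PC (hypothetical)
efficiency = 0.85       # PSU conversion efficiency (hypothetical)

input_power = load_power / efficiency      # power pulled from the mains
mains_current = input_power / mains_voltage

print(round(input_power, 1))    # 352.9 W taken from the wall
print(round(mains_current, 2))  # 1.6 A -- far below the 13A fuse limit
```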
 
Last edited:

The current drawn by the circuit is set by its power consumption; the rest of the available current is simply not drawn at all (in layman's terms), so don't let it confuse you.
 

I think that you are incorrectly assuming that the mains can only supply 13A. I don't know why you say that. The mains is the electric company's supply, which is capable of delivering enormous currents. The factor limiting how much current you can take from it is the wires, circuit breakers, fuses, etc. that sit between the mains and your outlet.
For example, if the outlet is protected by a 13A fuse, the fuse will blow if you try to draw more current through it. But with a suitable electrical installation you can power devices that require much more than 13A.
Also, as was said before:
Remember that the mains is not a constant-current source. It is virtually an ideal voltage source (because of the "unlimited" current-delivery capability I mentioned above).
 
