
Significance of 220V Power Supply

Status
Not open for further replies.

purifier

In India, we have a 220V power supply, whereas in the US it's 110V... What is the significance of these values?

What is so special about 220V? Why not generate 180V or some other number?
 

Actually, 110V is used for safety, although it needs thicker cable because it carries twice the current.

But a 220V system is more economical.
 

Also, 220V systems typically use a 50Hz frequency, while 110V systems use 60Hz.
 

epp said:
Also, 220V systems typically use a 50Hz frequency, while 110V systems use 60Hz.

Voltage and frequency are completely different things. The reason for using 110V has already been stated above.

The frequency is different because 60Hz is said to be more efficient than 50Hz, but it's not such a big difference. Nikola Tesla, who developed much of the theory of AC generation and transmission, calculated that 60Hz would be the most efficient.
 

Is it possible for someone to give me a small proof of how 220V is more efficient than 110V?
 

Basically, stepping up to higher voltages requires less current flow for a given power output. Since the long power transmission cables have a fixed resistance, the lower the current, the less power is wasted as heat dissipation in the transmission cables. In long-range power transmission, transmitted voltages are in the range of 100kV.
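As a rough illustration of that point (assuming a purely resistive load and a made-up 1-ohm line resistance), the line loss for a fixed delivered power can be computed in a few lines of Python:

```python
# Illustrative sketch: I^2*R loss in a transmission line of fixed
# resistance when delivering the same power at different voltages.
# Assumes a resistive load (unity power factor); R = 1 ohm is made up.

def line_loss(power_w, voltage_v, line_resistance_ohm):
    """Return the I^2 * R loss in the line, with I = P / V."""
    current = power_w / voltage_v
    return current ** 2 * line_resistance_ohm

P = 2000.0   # 2 kW load
R = 1.0      # assumed line resistance, ohms

for v in (110.0, 220.0, 100_000.0):
    print(f"{v:>9.0f} V -> {line_loss(P, v, R):.4f} W lost in the line")
```

Doubling the voltage quarters the loss, and at 100kV the loss for the same 2kW is negligible, which is why long-distance transmission uses such high voltages.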
 

220V can be transferred more efficiently than 110V because less current is needed to transfer the same amount of energy.

Less current means less resistive loss in the conductors, unless you use thicker (more expensive) conductors.

Imagine running a 2kW heater. At 220V, this would draw 9.1 amps. At 110V, it draws 18.2 amps. Now imagine the supply wires have a resistance of 1 ohm. The 9.1A current would therefore heat the wire with 1x1x9.1 = 9.1 watts. That's 9.1 watts of power wasted. The same wire with 18.2A would heat the wire with 18.2W of wasted power.

Edit: this is WRONG!
Let this stand as a lesson never to try to be clever after sleep deprivation. The answer is correctly provided by Borber below.


It is the same reason electrical distribution lines use hundreds of thousands of volts. If they sent 220V (or 110V) in the big power lines, they would need to supply millions of amps! Imagine the waste heat with that.

FoxyRick.
 

The correct calculation of power loss for 1-ohm cables carrying 9.1A (at 220V) and 18.2A (at 110V) is 9.1 x 9.1 x 1 = 82.81W and 18.2 x 18.2 x 1 = 331.24W respectively, which is 4 times more. Power is P = I^2 x R.
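Borber's corrected figures can be checked numerically (R = 1 ohm is the assumed cable resistance from the heater example above):

```python
# Check P = I^2 * R for the two currents in the 2 kW heater example.
R = 1.0                  # assumed cable resistance, ohms

for i in (9.1, 18.2):    # amps drawn at 220 V and at 110 V
    print(f"{i:>5.1f} A -> {i ** 2 * R:.2f} W lost")

# Halving the voltage doubles the current and quadruples the loss:
print((18.2 / 9.1) ** 2)   # ratio of the two losses
```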
 

So, you mean to say more current = more heat = more losses, and that ultimately explains everything?
 

I am not sure that is the ultimate explanation. It's only one of the aspects. The decision about which system is used depends mainly on economic analysis.
 

Power is power; it doesn't matter if it's 110V @ 1A or 220V @ 0.5A. It sounded to me like the original question was asked to determine which is better. So my answer is a rhetorical question: which is better, foot/pound/Fahrenheit or meter/gram/Celsius? The answer to both questions is the same. (Let's not start a war on this question...)

The truth of the matter is that power generation and distribution are done at thousands of volts. Only at the termination (the end user's home, business, or street/block) is it stepped down to 110V or 220V, whatever is needed. The U.S. uses 220V in some cases, but it is less common. At home my clothes dryer requires a 220V line while the rest of the appliances require 110V. Power is delivered to the house as a split-phase supply (two 110V legs, 180 degrees apart), and in the circuit breaker box these legs are tapped to give the house wiring the proper voltage(s) for the appliances they are serving.
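A minimal numerical sketch of that split-phase arrangement (values are illustrative; real US service is nominally 120/240V): sampling two 110V RMS legs that are 180 degrees out of phase shows 110V from leg to neutral and 220V from leg to leg.

```python
import math

# Split-phase service: two 110 V (RMS) legs, 180 degrees apart.
# Leg-to-neutral gives 110 V; leg-to-leg gives 220 V.

def rms(samples):
    """Root-mean-square of a list of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

n = 1000  # samples over one full cycle
leg_a = [110 * math.sqrt(2) * math.sin(2 * math.pi * k / n) for k in range(n)]
leg_b = [-v for v in leg_a]                       # 180 degrees out of phase
leg_to_leg = [a - b for a, b in zip(leg_a, leg_b)]

print(round(rms(leg_a)))        # leg to neutral
print(round(rms(leg_to_leg)))   # leg to leg
```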
 

Oops, that's what I get for answering questions after 2 hours sleep!

Of course, it is I²R. Why did I do R²I ??? I was even thinking I²R as I wrote it.

Thank you Borber for pointing this out. I actually teach this stuff as well!

:oops:

I've just spent the last few hours soldering up a board. I hope I haven't made any similarly stupid mistakes on that. I did repeatedly have to check the transistor pinouts due to sleepiness.

Yours very embarrassed and sleepy,
FoxyRick.
 

In that case, why didn't they give us 180V instead of 220V? I agree that power is power, but there should be something special about 220V, and that something was pointed out by FoxyRick and Borber... Thought of any other reasons?
 

Here in Brazil we have a lot of different voltages at our houses.
For example, in my city we have both 115V and 230V @ 60Hz available. The 230V lines are composed of two 115V lines 180 degrees out of phase with each other. For normal appliances we use 115V (TVs, lamps, etc.). For high-current devices (electrical heaters, etc.) we use 230V.

Some cities only have 220V for domestic use. Others have 127V and 220V (derived from a three-phase network). So, I think there is no single voltage standard. The same is true around the world.

Also, in three-phase systems we can have 220V and 380V.
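Those 127V/220V and 220V/380V pairs both follow from the factor of sqrt(3) between phase voltage and line-to-line voltage in a three-phase wye system; a quick check:

```python
import math

# In a three-phase wye system, line-to-line voltage = phase voltage * sqrt(3).
# This is why 127 V phase gives ~220 V between phases, and 220 V gives ~380 V.

def line_to_line(phase_v):
    """Line-to-line voltage for a given phase (line-to-neutral) voltage."""
    return phase_v * math.sqrt(3)

print(round(line_to_line(127)))   # Brazil's 127 V / 220 V arrangement
print(round(line_to_line(220)))   # the nominal "380 V" system
```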
 

Transformers for 60Hz are smaller and lighter than for 50Hz.
Just a point!
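That point follows from the transformer EMF equation, V_rms = 4.44 x f x N x Phi_max: at a fixed voltage and turn count, the required peak core flux scales as 1/f, so a 60Hz core needs only about 83% of the flux (and roughly that much core cross-section) of a 50Hz one. A small sketch, where the 230V winding voltage and 100 turns are made-up illustrative numbers:

```python
# Transformer EMF equation: V_rms = 4.44 * f * N * Phi_max.
# Solving for peak flux shows it shrinks as frequency rises,
# which is why 60 Hz transformers can use smaller cores.

def peak_flux(v_rms, freq_hz, turns):
    """Peak core flux (Wb) required for a given RMS winding voltage."""
    return v_rms / (4.44 * freq_hz * turns)

phi_50 = peak_flux(230.0, 50.0, 100)   # illustrative winding: 230 V, 100 turns
phi_60 = peak_flux(230.0, 60.0, 100)
print(f"flux ratio 60Hz/50Hz = {phi_60 / phi_50:.3f}")
```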
 

Just have a look at

h**p://www.school-for-champions.com/science/ac_world_volt_freq.htm
 
