
Why is AC preferred?


joe1986

Why is AC preferred over DC?

Why do power stations supply AC power rather than sending DC directly?
 

Reasons AC is preferred over DC

At the beginning of electric power use, the world ran on DC, not AC.

But engineers then found that AC is much easier to generate than DC, and much easier to convert to different voltages, both for different applications and for long-distance transmission. So AC was chosen.
 

I think the ease of generation is about the same for DC and AC. The major reason AC is used is the ease of power distribution. The AC power-line frequency was chosen as the best compromise between transformer and electrical-machine losses and the cost of manufacturing.

In the 1960s engineers started to develop DC power-distribution systems, because they avoid large AC-related losses. Several very-high-voltage DC lines were built, up to 1.5 million volts, and some of them still work today. But they are such a small percentage of the world's power-distribution system that, practically speaking, we can consider that only AC is used.
 

For more info about HVDC, here's a fun and interesting brochure from Siemens. It includes a technical overview with lots of diagrams and photos, and some historic info.

"High Voltage Direct Current Transmission – Proven Technology for Power Exchange"
**broken link removed**
 

But wouldn't it be better to send DC directly, so that we wouldn't need power supplies to convert AC power into DC?
 

You have to consider the ohmic losses.

Say all the appliances in your home consume 1 kW at one instant and you power them with 12 V DC. The mains cable carrying that power to your house would then have to withstand a current of about 83 A. Multiply that by the number of homes in your town and I'm sure you will see the ridiculous amount of copper that would be needed, and the power lost in the system.
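Ram's arithmetic can be sketched in a few lines of Python. The 0.5 Ω cable resistance is an assumed round number for illustration, not a figure from the thread:

```python
# I^2*R loss for the same 1 kW load delivered at two different voltages.
# R_CABLE is an assumed illustrative value, not a real installation figure.
P_LOAD = 1000.0   # watts drawn by the house
R_CABLE = 0.5     # ohms, assumed total resistance of the feed cable

for volts in (12.0, 230.0):
    amps = P_LOAD / volts          # current the cable must carry
    loss = amps ** 2 * R_CABLE     # power wasted as heat in the cable
    print(f"{volts:5.0f} V -> {amps:6.1f} A, cable loss {loss:7.1f} W")
```

At 12 V the cable carries about 83 A and this assumed cable alone dissipates roughly 3.5 kW, more than the load itself; at 230 V it carries about 4.3 A and loses under 10 W.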


/Ram

Added after 1 hour 21 minutes:

I should add that aluminium is used in high-voltage power lines because, compared with copper, aluminium has about 65% of the conductivity by volume, though about 200% by weight.
And nowadays it is much cheaper, too.

Read more here: https://en.wikipedia.org/wiki/Aluminium_wire
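Ram's 65%-by-volume / 200%-by-weight comparison can be checked against standard handbook values for conductivity and density (the constants below are common textbook figures, not from the thread):

```python
# Compare aluminium and copper conductors per unit volume and per unit weight.
SIGMA_CU = 5.96e7   # S/m, conductivity of copper (handbook value)
SIGMA_AL = 3.77e7   # S/m, conductivity of aluminium (handbook value)
RHO_CU = 8.96       # g/cm^3, density of copper
RHO_AL = 2.70       # g/cm^3, density of aluminium

by_volume = SIGMA_AL / SIGMA_CU            # same-size wire
by_weight = by_volume * (RHO_CU / RHO_AL)  # same-mass wire

print(f"Al vs Cu: {by_volume:.0%} by volume, {by_weight:.0%} by weight")
```

This comes out to roughly 63% by volume and 210% by weight, close to the round numbers quoted above.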

/Ram
 

If we supplied homes with 230/110 V DC, we would still need power converters (buck regulators). But aren't they about as energy-efficient as transformers these days? (Why else does your computer use an SMPS instead of a transformer-based power supply?)
 

AC is easily converted in magnitude, and that is the biggest reason: it can be converted anywhere to any required voltage level. AC transformers also provide physical (galvanic) separation between our mains and the power-station lines.
Anyway, easy voltage conversion is the most common reason AC is preferred.
 

Another great advantage of AC is that when you break an AC circuit, the electric arc is extinguished at the first zero crossing of the voltage.

If you ran your house lights on 220 V DC, the light switches would wear out after some time due to arcing.
Not to mention the enormous problems the power companies would have in cutting the power for maintenance.

And then we have the millions of AC induction motors that run day in and day out...
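The zero-crossing argument is just timing: a mains sine wave crosses zero twice per cycle, so a breaker opening an AC circuit never waits more than half a cycle for a natural arc-quenching point, while DC has none. A minimal sketch:

```python
# Zero crossings of mains current: the natural arc-quenching points of AC.
for freq in (50, 60):
    crossings_per_second = 2 * freq  # two zero crossings per cycle
    max_wait_ms = 1000.0 / crossings_per_second
    print(f"{freq} Hz: {crossings_per_second} crossings/s, "
          f"at most {max_wait_ms:.2f} ms to the next one")

# A DC current has no zero crossings at all, so the arc must be forced out
# by other means (longer contact gaps, magnetic blow-out, etc.).
```

At 50 Hz that is 100 crossings per second, at most 10 ms apart; at 60 Hz, about 8.3 ms.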

/Ram
 

joe1986 said:
Why do power stations supply AC power rather than sending DC directly?

from :-
https://answers.yahoo.com/question/index?qid=20080719041415AABQ9SE

I assume you mean power transmission, as most automobile transmissions are in steel and aluminum.

AC allows you to use transformers, which can step the voltage up to very high values, which in turn reduces the current to a low value. Since line losses are due to current, this allows you to ship large amounts of power over long distances without significant losses.

With DC you can't easily change voltages, so you have to transmit at the low end-use voltage of 120 volts, and that you can only do for a few hundred yards.
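The quoted argument in numbers: for a fixed power P, stepping the voltage up by a factor N divides the line current by N and the I²R line loss by N². A sketch with assumed round numbers (10 MW plant, 10 Ω line, ideal 1:100 transformer; none of these figures are from the thread):

```python
# Stepping up the voltage before transmission: P = V*I, loss = I^2 * R.
P = 10e6        # watts to transmit (assumed 10 MW)
R_LINE = 10.0   # ohms, assumed resistance of the whole line

for v_line in (2.4e3, 240e3):   # before / after an ideal 1:100 step-up
    i_line = P / v_line         # transformer conserves power, cuts current
    loss = i_line ** 2 * R_LINE
    print(f"{v_line/1e3:6.1f} kV: {i_line:8.1f} A, "
          f"line loss {loss/1e6:8.3f} MW ({loss/P:.1%} of the load)")
```

At 2.4 kV the nominal I²R figure exceeds the plant's whole output, which is just the quote's point that low-voltage transmission fails beyond a few hundred yards; at 240 kV the loss drops to a fraction of a percent.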
 

Work on high-voltage DC transmission lines was done only for huge power lines that carry electrical energy from the power station to a local high-voltage distribution center. There the energy is converted to AC at a lower voltage, say about 35 kV, and distributed to the next-level distribution centers, where the voltage is reduced to 6 kV. From there the energy is distributed onward and finally reaches the customer as 220/380 or 120/208 V AC, depending on the country's standards.

So DC was used only for high-power, long-distance transmission, because in that case the power losses are significantly lower. The biggest problem was the insufficient reliability of the high-voltage power converters, as well as their high cost. I was involved in such work a little, about 30 years ago, and I am not sure what today's situation is, but it is probably still the same.

To XNOX_Rambo,

An electrical arc will not extinguish itself after the first zero crossing. This is one of the biggest problems in AC switching, and it usually requires special measures to prevent equipment damage from arcing. Another example is arc welding: the arc is easy to start, but to stop it you have to increase the arc length until it exceeds the air-ionization limit.
 

To RF-OM,

Yes, that's true when speaking of high-voltage breakers, e.g. SF6 breakers, but I was referring to "household" applications; presumably
the type that Joe and our fellow Edaboardians are most likely to encounter. :wink:

I myself have some experience of both AC and DC heavy duty contactors - and have had a close encounter
with the arc from a knife-blade breaker @ 700 VDC/20 A. It was pretty scary... 8O

As for welding:
"One disadvantage of AC, the fact that the arc must be re-ignited after every zero crossing, has been addressed with the invention
of special power units that produce a square wave pattern instead of the normal sine wave, eliminating low-voltage time after the
zero crossings and minimizing the effects of the problem."
See Weman, Klas (2003). Welding processes handbook. New York: CRC Press LLC. ISBN 0-8493-1773-8. p. 16
:?:

/Ram

PS As for HVDC - here is a good example of its use today, though it is already 20 years old:
**broken link removed**
 

To XNOX_Rambo,

I think the author you cited had in mind a more scientific than practical definition of re-igniting. Even in 50 Hz systems, ionization persists long enough after the zero crossing, and when the applied voltage exceeds the ionization level (say roughly 65 V) the arc becomes self-supporting again. Usually there is more than enough time. By the way, the arc-extinguishing problem mostly exists when we switch a circuit off, not when we switch it on. An arc can actually be self-supporting for a long time, sometimes seconds.

When I immigrated in 1991 it was impossible to find an engineering job, so I worked as a welder for several months, recalling what I had studied when I was young and using those skills again. So I know all this stuff in practice. Welding machines and processes are now one of my areas of consulting, although most of the time I work on spot-welding problems. Sometimes I use arc welding in my RF and microwave work to weld thermocouples or other parts together when welding is preferable to soldering, using special tiny electrodes for this work.


To echo47,

Thanks a lot. The picture is very realistic. I saw such cases a few times when I was an electrical-college student and we did our practical training at power plants and distribution centers. Such arcing can be very dangerous and can destroy electrical equipment quickly.
 

Wow, superb replies from all the members! One more question: why does the U.S. use 120 V AC whereas the rest of the world goes with 230 V AC? Of these two standards, which one do you think is better?
 

Simple!
Americans want to have their own standards...
 

Some years ago the UK had 240 V AC and the rest of Europe 220 V AC (at least the parts I'm familiar with),
but nowadays it's 230 V AC all over Europe.

In the seventies I lived for some time in Brazil, where they had 220 V AC but at 60 Hz, so the belt-driven record player
we brought with us from 50 Hz Europe had to have the pin on the motor shaft replaced with one of a
smaller diameter. Otherwise the records would play too fast. :D
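The pin swap follows from simple proportions: the motor-shaft pin acts as a pulley, the motor speed is proportional to the mains frequency, and the platter speed is proportional to motor speed times pin diameter. So to keep the platter speed, the 60 Hz pin needs 50/60 of the original diameter. A sketch (the 20 mm starting diameter is an assumed example):

```python
# Belt-drive turntable: platter speed ~ motor speed * pin diameter,
# and motor speed ~ mains frequency. Keep their product constant.
d_50hz = 20.0                  # mm, assumed pin diameter for 50 Hz mains
d_60hz = d_50hz * 50.0 / 60.0  # smaller pin compensates for the faster motor

print(f"60 Hz pin: {d_60hz:.2f} mm ({d_60hz / d_50hz:.0%} of the original)")
```

That is about 83% of the original diameter, which is why the replacement pin had to be smaller.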

All the differences have historic reasons, and it would be ridiculously costly to harmonize the world now.
Just think of the gazillions of plugs and wall sockets that would have to be replaced!

We'll make it right on the next planet. :wink:

/Ram
 

@XNOX_Rambo Nice one, buddy! What was the logic behind the Americans using 110 V AC when electricity was introduced? Couldn't the whole world have stuck with one common standard?
 

Way back, everybody was inventing their own thing and the concept of standardisation didn't exist.
It applies to all areas; just think of the different types of screw threads that still exist: https://en.wikipedia.org/wiki/Screw_thread

And lately we had the HD DVD vs. Blu-ray "battle", so it seems to be a never-ending story...

Here are some electric power links:
https://www.eei.org/industry_issues/industry_overview_and_statistics/history/index.htm
https://en.wikipedia.org/wiki/List_of_countries_with_mains_power_plugs,_voltages_and_frequencies

It might be that the USA picked 120 V AC because you stand a better chance of surviving it.
I once took 110 V AC (in a laboratory) from one hand to the other, i.e. across the heart, and although it was nasty it didn't injure me.
Had it been 220 V AC, I might not have made it to the Internet era...

/Ram
 

You seem to be a fantastic researcher! But isn't 110 V quite enough to harm a person?
 

