- 28th July 2012, 17:49 #1


## Effect of increasing frequency on voltage of transformer?

How does an increase in frequency (e.g. from 50 Hz to 60 Hz) affect the voltage (and current) on a transformer secondary? Will it result in an increase (or decrease) of the voltage at the secondary terminals?


- 28th July 2012, 20:17 #2


## Re: Effect of increasing frequency on voltage of transformer?

Increasing the frequency won't increase your voltage on the secondary side; simply put, the voltage depends on the number of turns on both sides.

If you have 100 turns on the primary and 10 on the secondary, then that is what you get: 10 to 1.

It might cause difficulties, however, because it changes the field strength in the core (and possibly its saturation), along with some other (minor) changes, for instance in skin effect.

With a change from 50 to 60 Hz, though, skin effect is no problem, and core saturation occurs only in the cheapest of the cheapest transformers.

(You might run into trouble if you try to push the frequency further up.)
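The point above can be sketched numerically: the turns ratio alone fixes the output voltage, while the peak core flux density falls as frequency rises. A minimal Python sketch of the transformer EMF equation, with all component values assumed purely for illustration:

```python
def secondary_voltage(v_primary, n_primary, n_secondary):
    """Ideal transformer: the output is set purely by the turns ratio."""
    return v_primary * n_secondary / n_primary

def peak_flux_density(v_rms, frequency, turns, core_area_m2):
    """Peak core flux density from the transformer EMF equation:
    B_peak = V_rms / (4.44 * f * N * Ac), SI units."""
    return v_rms / (4.44 * frequency * turns * core_area_m2)

# Illustrative 100:10 transformer on a 230 V primary (assumed values)
v_sec = secondary_voltage(230.0, 100, 10)            # 23 V at any frequency
b_50 = peak_flux_density(230.0, 50.0, 100, 0.01)     # ~1.04 T at 50 Hz
b_60 = peak_flux_density(230.0, 60.0, 100, 0.01)     # ~0.86 T at 60 Hz
# Flux density drops by the ratio 50/60 when frequency rises,
# moving the core further away from saturation.
```

This is why the 50-to-60 Hz move is benign: the voltage ratio is untouched and the core runs at lower flux.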

- 28th July 2012, 20:42 #3


## Re: Effect of increasing frequency on voltage of transformer?

400 Hz transformers benefit from lower-mass cores, as coupling is improved and impedance is raised. This is why the aircraft industry adopted that standard long ago. There is no going back on 50/60/400 Hz, as changes have a huge cost. For SMPS, however, cores above 1 MHz are starting to be used for transformers with a higher power/size ratio. An iron core is the most rugged for power transformers, but ferrite has much higher resistivity for lower cost and size requirements and is used exclusively for very high frequency transformers.

The best question deserves a better answer. Be kind to others. Ohm's Law of life... Tony Stewart, EE since 1975


- 28th July 2012, 21:57 #4


## Re: Effect of increasing frequency on voltage of transformer?

Going from 50 to 60 Hz will have no effect on the output voltage. The current rating is a bit more complicated. The power rating of a transformer is determined by the sum of the core losses and the resistive losses of the coils. Going from 50 to 60 Hz will reduce the core losses but won't change the resistive losses. But if you try to increase the current rating, you will find that the output voltage drops due to the resistance of the coils. Best practice is to just leave the current rating at the 50 Hz level.
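The current-rating point above can be illustrated with a toy calculation that models the transformer as an ideal source behind the windings' series resistance; the 12 V and 0.5 Ω figures below are made up for the example:

```python
def loaded_output_voltage(v_open_circuit, load_current, series_resistance):
    """Output voltage under load: the open-circuit voltage minus the
    resistive drop across the windings' total series resistance."""
    return v_open_circuit - load_current * series_resistance

# Hypothetical 12 V secondary with 0.5 ohm total series resistance:
v_rated = loaded_output_voltage(12.0, 2.0, 0.5)   # 11.0 V at the rated 2 A
v_pushed = loaded_output_voltage(12.0, 2.4, 0.5)  # 10.8 V if current is raised 20 %
```

Raising the current rating to "use up" the core-loss headroom just makes the resistive sag worse, which is why leaving the 50 Hz rating alone is the safe choice.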

Ed


- 29th July 2012, 10:59 #5


## Re: Effect of increasing frequency on voltage of transformer?

@walkura

You said that going from 50 Hz to 60 Hz won't have much effect, but what if we move from 50 Hz to 300 or 400 Hz?

Will it now affect the secondary voltage and current? If yes, then how? I mean, will the voltage (and/or current) increase or decrease?


- 29th July 2012, 13:37 #6


## Re: Effect of increasing frequency on voltage of transformer?

Hello,

With increasing frequency, the transformer's energy losses will tend to worsen. The skin effect within the winding reduces the cross-sectional area available for electron flow, increasing the effective resistance as the frequency goes up, so power is lost through resistive dissipation. Magnetic core losses will also increase: eddy currents and hysteresis effects become more severe. Regards

- 29th July 2012, 20:43 #7


## Re: Effect of increasing frequency on voltage of transformer?

@Yousuf21

The basics of it have all been said in the previous posts.

With increasing frequency, less of the copper area is used for conducting the current, and the temperature will rise because of this (the resistive losses due to the skin effect).

The core losses will also increase. Transformers for frequencies higher than 50 Hz usually have thinner laminations to reduce eddy currents, and more silicon added to the steel (to make it more resistive).

All of this has limits, which is why you will only see iron cores at relatively low frequencies.

If you do not change the (50 Hz) transformer itself (the number of turns etc.), your core might saturate or otherwise overheat for the reasons mentioned above.

Changing a transformer to 400 Hz is possible, but you would have to recalculate the number of turns and the wire diameter (and I would keep an eye on the core's laminations).

In itself, 400 Hz is no problem for iron cores; special iron core materials for frequencies above 50 Hz are produced in the Czech Republic, and I am sure they are produced elsewhere too (thinner laminations, more silicon added, etc.).

All you have to do is calculate the turns, wire diameter, and so on.

As an answer to your question: the voltage might decrease because the resistance of your windings increases compared to your load, and the current might go down for the same reason.

(So even though Faraday's law shows that the induced voltage per turn goes up with frequency, the power you get might go down because of all the increased losses.)

Best regards,

Walkura.

- 29th July 2012, 21:19 #8


## Re: Effect of increasing frequency on voltage of transformer?

As a first point, the problem should be clearly specified. I understand that the original question is asking about the implications of using a given 50 Hz transformer at 60 Hz with same input voltage. It's not asking how to redesign the transformer for a different frequency.

Interestingly, we hear different predictions of the core loss tendency. In fact we have contrary effects: core flux is reduced by the factor 50/60 (about 17 %), which tends to reduce the core losses; the higher frequency, on the other hand, tends to increase them. The total effect depends on the core alloy, laminate thickness and peak flux. I would rather expect a small core loss reduction in total, but the opposite tendency is possible as well. Due to the contrary contributions, the overall effect will be small in any case.

Copper resistance will increase by skin effect in principle. The copper skin depth at 50/60 Hz is about 9 mm, so the effect is relevant only for very thick windings: either high power or very low voltage transformers.
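The ~9 mm copper skin depth quoted above can be checked from the standard formula δ = √(2ρ/(ωμ)); a small Python sketch, with the room-temperature copper resistivity assumed:

```python
import math

def skin_depth_m(resistivity_ohm_m, frequency_hz, mu_r=1.0):
    """Skin depth delta = sqrt(2*rho / (omega * mu)) in metres."""
    mu = mu_r * 4e-7 * math.pi            # permeability, H/m
    omega = 2 * math.pi * frequency_hz    # angular frequency, rad/s
    return math.sqrt(2 * resistivity_ohm_m / (omega * mu))

RHO_COPPER = 1.68e-8  # ohm*m, copper at roughly 20 C (assumed)
d50_mm = skin_depth_m(RHO_COPPER, 50.0) * 1000   # ~9.2 mm
d60_mm = skin_depth_m(RHO_COPPER, 60.0) * 1000   # ~8.4 mm
```

Conductors much thinner than twice the skin depth carry current almost uniformly, which is why skin effect is negligible for ordinary mains windings.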

Additionally, increased leakage inductance will raise the reactive part of the series impedance, resulting in slightly reduced output voltage at nominal power.

Taking all effects together, it's no problem to use a low or medium power transformer (e.g. < 100 kVA) specified for 50 Hz at 60 Hz.

- 29th July 2012, 21:31 #9


## Re: Effect of increasing frequency on voltage of transformer?

Hello everyone!

I am just curious to know what will happen if a ferrite core transformer is used at a low frequency like 50/60 Hz.

I know that we can't use an iron core transformer [with silicon steel stampings] at high frequencies.

- 29th July 2012, 21:41 #10


## Re: Effect of increasing frequency on voltage of transformer?

"I am just curious to know what will happen if a ferrite core transformer is used for low frequency like 50/60Hz."

You **can** use it, but the saturation flux is lower by a factor of about 5, thus the windings per volt are much higher for the same core cross section.
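The factor-of-5 claim follows directly from the transformer EMF equation, since turns per volt scale inversely with peak flux density. A hedged sketch, assuming ~1.5 T for silicon steel and ~0.3 T for a typical power ferrite:

```python
def turns_per_volt(frequency_hz, b_peak_tesla, core_area_m2):
    """Turns needed per volt of winding EMF: N/V = 1/(4.44 * f * B * Ac)."""
    return 1.0 / (4.44 * frequency_hz * b_peak_tesla * core_area_m2)

AC = 1e-3  # assumed core cross section, m^2
tpv_iron = turns_per_volt(50.0, 1.5, AC)     # silicon steel near saturation
tpv_ferrite = turns_per_volt(50.0, 0.3, AC)  # ferrite saturates around 0.3 T
# tpv_ferrite / tpv_iron == 1.5 / 0.3 == 5: five times the turns per volt
```

The same equation explains why high switching frequencies shrink transformers: raising f lets you cut turns or core area in the same proportion.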


- 30th July 2012, 00:12 #11


## Re: Effect of increasing frequency on voltage of transformer?

As f increases, a switch to ferrite will minimize these losses, but at the cost of decreased B, as FvM indicated.

However, the efficiency gains from a higher frequency more than offset the lower B.

The higher frequency also allows for a smaller transformer: N and/or the core area Ac can decrease.

Eddy current losses are as follows:

P = (k B² ƒ² D²) / ρ

Where:

P is the eddy current loss, [W]

k is a constant depending on the shape of the core

B is the maximum induction, [gauss]

ƒ is the frequency, [Hz]

D is the thickness of the narrowest dimension of the core perpendicular to the flux, [cm]

ρ is the electrical resistivity, [ohm·cm]
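Using the standard eddy-loss form P ∝ B²ƒ²D²/ρ, we can compare the same core run at 60 Hz instead of 50 Hz with the same input voltage; the flux and lamination values below are assumed for illustration:

```python
def eddy_loss_relative(b_peak, frequency, lamination_thickness, resistivity, k=1.0):
    """Relative eddy-current loss, P = k * B^2 * f^2 * D^2 / rho.
    k absorbs geometry, so this is only useful for comparing two
    operating points of the same core, not for absolute watts."""
    return k * b_peak**2 * frequency**2 * lamination_thickness**2 / resistivity

# Same 50 Hz core rerun at 60 Hz with the same input voltage:
# B falls by 50/60 while f rises by 60/50, so B^2 * f^2 is unchanged.
p50 = eddy_loss_relative(1.2, 50.0, 0.35e-3, 5e-7)
p60 = eddy_loss_relative(1.2 * 50.0 / 60.0, 60.0, 0.35e-3, 5e-7)
```

At constant voltage, the B and f contributions to the eddy term cancel exactly, which is consistent with the earlier observation that the net core-loss change from 50 to 60 Hz is small.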

An iron core has orders of magnitude lower resistivity ρ than ferrite, and a higher Bmax, while losses increase with frequency squared.

You can still use ferrite for 50/60 Hz; the weight is around 20~70 VA per pound of ferrite, up to 1500 VA for large toroids.

So they run cooler, but tend to be bigger in diameter and flatter (lower profile) than iron core.

Weight may be comparable; I am not sure about cost.

Hysteresis core losses are small compared to eddy current losses. Ferrite materials were developed with narrow hysteresis loops; since hysteresis dissipation is proportional to the area enclosed by the hysteresis loop, the narrow loops greatly reduce the hysteresis losses.

"The proximity effect is caused by eddy currents induced in wires by the magnetic fields of currents in adjacent wires or adjacent layers of the coil. Proximity effect losses are greater than skin effect losses." ...

E = 4.44 B N Ac ƒ × 10⁻⁸

Where:

E = induced voltage, [V]

B = maximum induction, [gauss]

N = number of turns in the winding

Ac = cross section of the magnetic material, [cm²]

ƒ = frequency, [Hz]


- 31st July 2012, 17:48 #12


## Re: Effect of increasing frequency on voltage of transformer?


- 31st July 2012, 18:18 #13


## Re: Effect of increasing frequency on voltage of transformer?

Good point. I wanted to say that the increased *effect of* leakage inductance with rising frequency will raise the reactive part of the series impedance.

Xs = ωLs
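For example, with an assumed 10 mH leakage inductance, the reactance scales linearly with frequency:

```python
import math

def leakage_reactance(frequency_hz, leakage_inductance_h):
    """Series leakage reactance Xs = omega * Ls = 2*pi*f*Ls, in ohms."""
    return 2 * math.pi * frequency_hz * leakage_inductance_h

# Assumed 10 mH leakage inductance:
x50 = leakage_reactance(50.0, 10e-3)  # ~3.14 ohm
x60 = leakage_reactance(60.0, 10e-3)  # ~3.77 ohm, 20 % higher
```

This is the reactive series drop that slightly reduces the output voltage at nominal load as frequency rises.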

- 24th April 2013, 12:13 #14


## Re: Effect of increasing frequency on voltage of transformer?

I would like you to confirm whether I can use a 3-phase step-down transformer (input 415 V 3-phase AC, output 50 V 3-phase AC) after a variable frequency drive running at a fixed frequency of 150 Hz.

- 24th April 2013, 19:29 #15


## Re: Effect of increasing frequency on voltage of transformer?

It depends on the transformer part number and the THD of the load on the output. Mains transformers generally handle harmonics poorly, and 150 Hz is already the 3rd harmonic of 50 Hz, here used as the fundamental.

Please specify the above. With no part number, we can only guess that it will work if you derate the power and avoid high-THD loads (pulse loads like diode-capacitor power supplies).