Maker Pro

Efficiency of Transformer, increasing output current

In a standard (low-voltage, e.g. 40 V) transformer with the same number of
input turns as output turns, should the efficiency go up or down as the
output current increases?
 
PeteS

The efficiency will eventually go down for a number of reasons.

1. Copper losses will increase due to P = I^2R. Increase the current
and you increase the losses in the windings (notably by a square
factor).

2. The input and output windings will heat up, thereby increasing their
resistance (copper has a positive temperature coefficient). With a
higher resistance, you exacerbate the problem in (1).

3. Core and eddy current losses in the magnetic core will increase.
Increased output current (which implies increased input current) will
increase the magnetic flux density. As you increase it, the losses in
the core will increase, up to magnetic saturation, where you can
effectively get no more current (the maximum energy through a
transformer is limited by the magnetics as well as the winding limits).

That's a simple overview - there's a lot more to it, but those are the
highlights.
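
As a rough numerical illustration of point 1 (all numbers made up: a fixed
core loss, one lumped winding resistance, and the temperature effect of
point 2 ignored), a short Python sketch:

# Rough, assumed numbers only - just shows the shape of the efficiency curve.
V_OUT = 40.0        # output voltage in volts (from the OP's example)
R_WINDING = 0.5     # lumped primary + secondary resistance in ohms (assumed)
P_CORE = 5.0        # core (hysteresis + eddy current) loss in watts (assumed)

for i_out in (0.5, 1.0, 2.0, 4.0, 8.0, 16.0):
    p_out = V_OUT * i_out               # power delivered to the load
    p_copper = i_out ** 2 * R_WINDING   # I^2*R loss in the windings
    p_in = p_out + p_copper + P_CORE    # the input has to supply all of it
    print(f"{i_out:5.1f} A   efficiency = {p_out / p_in:.3f}")

The efficiency climbs at first and then falls away once the I^2*R term
starts to dominate.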

Cheers

PeteS
 
Larry Brasfield

PeteS said:
The efficiency will eventually go down for a number of reasons.

1. Copper losses will increase due to P = I^2R. Increase the current
and you increase the losses in the windings (notably by a square
factor).

2. The input and output windings will heat up, thereby increasing their
resistance (copper has a positive temperature coefficient). With a
higher resistance, you exacerbate the problem in (1).

Agree with 1 and 2.

3. Core and eddy current losses in the magnetic core will increase.
Increased output current (which implies increased input current) will
increase the magnetic flux density. As you increase it, the losses in
the core will increase, up to magnetic saturation, where you can
effectively get no more current (the maximum energy through a
transformer is limited by the magnetics as well as the winding limits).

It is not true that magnetizing losses increase with
output current. They actually go down a little.
This is because the increased IR drop in the primary
reduces the amount of flux change necessary in the
core to provide enough induced voltage to equal the
applied primary voltage adjusted by the IR drop.

For magnetization, you can think of the primary as a
more or less pure inductor [1] in series with the primary
resistance and in parallel with some real impedance
representing the eddy current, hysteresis, (and radiated)
losses. As the voltage across that inductor drops, so do
its losses.

[1. The inductor is usually non-linear, but its inductance
is a monotonic function of current.]
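
To put rough numbers on that picture, here is a small Python sketch of the
series-resistance / parallel-core-loss model. All component values are
assumed, and the magnetizing branch is treated as linear, which real cores
are not:

V_PRIMARY = 40.0   # applied primary voltage in volts
R_PRIMARY = 0.25   # primary winding resistance in ohms (assumed)
R_CORE = 300.0     # parallel resistance standing in for core loss, in ohms (assumed)

for i_load in (0.0, 2.0, 5.0, 10.0):         # load current referred to the primary
    v_core = V_PRIMARY - i_load * R_PRIMARY  # voltage left across the magnetizing branch
    p_core = v_core ** 2 / R_CORE            # core loss falls slightly as that voltage falls
    print(f"{i_load:4.1f} A   core loss = {p_core:.2f} W")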

....
 
Fritz Schlunder

Larry Brasfield said:
Agree with 1 and 2.


Also agree with 1 and 2. Item 1 is very real, makes a tangible and
significant impact, and should not be ignored. The influence of item 2,
however, is usually quite small for a normal transformer operated within a
normal temperature range. Since most loss and thermal-rise calculations are
somewhat approximate anyway, item 2 can often be ignored.

3. Core and eddy current losses in the magnetic core will increase.
Increased output current (which implies increased input current) will
increase the magnetic flux density. As you increase it, the losses in
the core will increase, up to magnetic saturation, where you can
effectively get no more current (the maximum energy through a
transformer is limited by the magnetics as well as the winding limits).

It is not true that magnetizing losses increase with
output current. They actually go down a little.
This is because the increased IR drop in the primary
reduces the amount of flux change necessary in the
core to provide enough induced voltage to equal the
applied primary voltage adjusted by the IR drop.

Agreed.


For magnetization, you can think of the primary as a
more or less pure inductor [1] in series with the primary
resistance and in parallel with some real impedance
representing the eddy current, hysteresis, (and radiated)
losses. As the voltage across that inductor drops, so do
its losses.

[1. The inductor is usually non-linear, but its inductance
is a monotonic function of current.]


I suppose that is one way to think about it. I think of it a little
differently. Ampere's Law would have you believe the magnetic flux density
B is proportional to the number of turns times the current flowing through
those turns in any given magnetic device. Since a transformer is a magnetic
device, it would seem logical that as the output current increases the
magnetic flux density in the core also increases. This would suggest at
some current level the transformer's core would saturate.

This is not the case, however, for a regular transformer (i.e. one that is
not a flyback transformer; they are different). What one must realize is that
a transformer has two or more independent windings on a single core. Ampere's
Law applies to the primary winding, and it also applies to the secondary. As
the load current on the secondary increases, you would expect more flux to be
generated in the core. However, as the primary current increases to supply
that secondary current, the primary winding also generates its own flux. If
you studiously apply the right-hand rule, you will find that the load-related
flux contributions of the primary and secondary are in opposite directions,
so they cancel each other out. As a consequence, the flux density in the core
is essentially independent of the transformer's load current. In effect, the
maximum power output rating of a transformer is limited by the winding
resistances, or by the total thermal dissipation allowed for a given
temperature rise. Practical transformers are thermal-dissipation limited long
before they are winding-resistance limited. In theory, a transformer made
with superconducting windings could be made very small and output an
outrageously huge amount of power (efficiently at that).
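
A quick back-of-the-envelope sketch of that cancellation, with made-up turns
counts and magnetizing current:

N_PRIMARY = 200
N_SECONDARY = 200        # 1:1, as in the OP's question
I_MAGNETIZING = 0.05     # small no-load primary current that sets the core flux (assumed)

for i_secondary in (0.0, 1.0, 5.0, 10.0):
    # The primary draws the reflected load current plus the magnetizing current.
    i_primary = i_secondary * N_SECONDARY / N_PRIMARY + I_MAGNETIZING
    # The load-related ampere-turns oppose each other, so only the magnetizing
    # ampere-turns are left to drive flux in the core.
    net_mmf = N_PRIMARY * i_primary - N_SECONDARY * i_secondary
    print(f"{i_secondary:5.1f} A load   net MMF = {net_mmf:.1f} ampere-turns")

The net MMF comes out the same at every load.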

Flyback transformers are different, and behave more like inductors than
regular transformers do. In a flyback transformer, current does not normally
flow through the primary and secondary windings at the same time; at any
given moment only one of them conducts. As a consequence you don't get the
flux-cancelling effect mentioned above for regular transformers. If you keep
increasing the load on a flyback transformer, it will eventually saturate the
core.
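
For contrast, a very rough flyback sketch (assumed values, discontinuous
conduction, all losses ignored): each cycle the primary stores
E = 0.5*L*Ipk^2 and dumps it into the load, so a heavier load means a higher
peak current and a higher peak flux density:

import math

L_PRIMARY = 500e-6   # primary inductance in henries (assumed)
F_SWITCH = 50e3      # switching frequency in hertz (assumed)
N_PRIMARY = 40       # primary turns (assumed)
A_CORE = 60e-6       # core cross-sectional area in square metres (assumed)
B_SAT = 0.3          # rough saturation flux density for a ferrite, in tesla

for p_out in (5.0, 20.0, 50.0, 100.0):
    i_peak = math.sqrt(2.0 * p_out / (L_PRIMARY * F_SWITCH))   # peak primary current
    b_peak = L_PRIMARY * i_peak / (N_PRIMARY * A_CORE)         # peak core flux density
    note = "  <-- saturated" if b_peak > B_SAT else ""
    print(f"{p_out:6.1f} W   Ipk = {i_peak:4.2f} A   Bpk = {b_peak:.2f} T{note}")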

As for the OP's original question of whether the efficiency improves with
increasing or decreasing load current: obviously at zero output current the
efficiency is zero, since any transformer wastes some idle power, primarily
due to core hysteresis and eddy-current loss. As you apply a
heavier and heavier load the efficiency continues to improve, up to a point.
At some point the I^2*R loss effect will start to dominate and thus the
efficiency will start to decrease again. In practice, where this peak of
efficiency occurs depends on the design of the transformer. Typical
transformers will often be designed to have maximum efficiency somewhat near
(though often a little below) their maximum rated continuous output current.
The efficiency peak is relatively broad.
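
Under the same rough assumptions as before (a core loss that does not change
with load, made-up numbers), the peak lands where the copper loss equals the
core loss, as a short sweep shows:

import math

V_OUT = 40.0     # output voltage in volts
R_WINDING = 0.5  # lumped winding resistance in ohms (assumed)
P_CORE = 5.0     # core loss in watts (assumed constant with load)

def efficiency(i_out):
    p_out = V_OUT * i_out
    return p_out / (p_out + i_out ** 2 * R_WINDING + P_CORE)

# Sweep the load current and find where the efficiency peaks.
currents = [0.01 * n for n in range(1, 2001)]
best = max(currents, key=efficiency)
print(f"peak efficiency {efficiency(best):.3f} at about {best:.2f} A")
print(f"sqrt(P_core / R) = {math.sqrt(P_CORE / R_WINDING):.2f} A")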
 
Bob Eldred

In a standard (low-voltage, e.g. 40 V) transformer with the same number of
input turns as output turns, should the efficiency go up or down as the
output current increases?

All transformers consume magnetizing current to energize the core. Some of
this current is reactive and some is resistive. The resistive portion involves
copper and core losses, mostly core losses when the current is low. These losses
occur whether there is a load on the transformer or not. If there is no load
on the transformer, and hence no output, there are still magnetizing losses, so
the efficiency is zero: Pout/Pin = 0/(small number) = 0. As you increase the
load, the output becomes higher and the copper losses also become higher but
the efficiency increases because there is now an output. The efficiency
continues to increase with load until it reaches some maximum where the
transformer is most efficient. Once that point is reached, the efficiency
will decrease with increasing load because the I^2*R copper losses grow
faster than the output, and heating exacerbates them further (higher
resistance with temperature). Any transformer can give several times its
rated output current for short durations; heat is what limits it. The flux
density in the core and the related magnetic losses are NOT a function of
load. They are a function of the primary voltage, not the load, by Faraday's
law. The secondary load current balances and is in the
opposite direction of the primary current. The flux density stays the same.
When the primary goes positive, current flows into the primary, by
convention. The same polarity winding on the secondary also goes positive
but the current flows out of that winding. The two currents times their
number of turns (amp-turns) balance, one in and the other out. This is reversed
every half cycle. That balance means that there is no net flux density
caused by the load.

To sum it up: the efficiency is zero with no load, is at its maximum when
the load is at or near the nameplate rating, and becomes low again when the
transformer is smoking.
Bob
 