Obviously! A transformer of the same rating gives more output when operated at 50 Hz instead of 60 Hz.
In previous posts we showed that, in an inductive circuit, the power factor decreases as the frequency increases. Consequently, the transformer output decreases.
Let’s consider the following example.
Suppose,
When Transformer operates on 50 Hz Frequency
Transformer = 100 kVA, R = 700 Ω, L = 1.2 H, f = 50 Hz.
XL = 2πfL = 2 × 3.1415 × 50 × 1.2 = 377 Ω
Impedance Z = √(R² + XL²) = √(700² + 377²) = 795 Ω
Power factor Cos θ = R/Z = 700/795 = 0.88
Transformer Output (Real Power)
= kVA × Cos θ
= 100 kVA × 0.88
= 88,000 W = 88 kW
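The 50 Hz calculation above can be sketched in a few lines of Python. This is a minimal sketch assuming the article's simple series R-L model (S = 100 kVA, R = 700 Ω, L = 1.2 H); the function name is just illustrative.

```python
import math

def transformer_output(s_va, r_ohm, l_henry, f_hz):
    """Return (XL, Z, power factor, real output in W) for a series R-L model."""
    xl = 2 * math.pi * f_hz * l_henry   # inductive reactance XL = 2*pi*f*L
    z = math.hypot(r_ohm, xl)           # impedance Z = sqrt(R^2 + XL^2)
    pf = r_ohm / z                      # power factor cos(theta) = R / Z
    return xl, z, pf, s_va * pf         # real power = kVA * cos(theta)

xl, z, pf, p = transformer_output(100e3, 700, 1.2, 50)
print(f"XL = {xl:.0f} ohm, Z = {z:.0f} ohm, pf = {pf:.2f}, P = {p/1000:.1f} kW")
```

Using the exact value of π the numbers come out slightly more precise than the hand calculation (XL ≈ 377 Ω, Z ≈ 795 Ω, pf ≈ 0.88, P ≈ 88 kW).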
Now,
When Transformer operates on 60 Hz Frequency
Transformer = 100 kVA, R = 700 Ω, L = 1.2 H, f = 60 Hz.
XL = 2πfL = 2 × 3.1415 × 60 × 1.2 = 452.4 Ω
Impedance Z = √(R² + XL²) = √(700² + 452.4²) = 833.5 Ω
Power factor Cos θ = R/Z = 700/833.5 = 0.839
Transformer Output (Real Power)
= kVA × Cos θ
= 100 kVA × 0.839
= 83,900 W = 83.9 kW
Now see the difference in real power (in watts):
88 kW − 83.9 kW = 4.1 kW = 4100 W
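Both cases and their difference can be checked side by side. Again a sketch under the article's assumptions (same 100 kVA unit, R = 700 Ω, L = 1.2 H at both frequencies); the helper name is illustrative.

```python
import math

def real_output_w(s_va, r_ohm, l_henry, f_hz):
    """Real-power output (W) of a series R-L model: S * cos(theta)."""
    xl = 2 * math.pi * f_hz * l_henry          # XL = 2*pi*f*L
    return s_va * r_ohm / math.hypot(r_ohm, xl)  # S * R / Z

p50 = real_output_w(100e3, 700, 1.2, 50)
p60 = real_output_w(100e3, 700, 1.2, 60)
print(f"50 Hz: {p50/1000:.1f} kW, 60 Hz: {p60/1000:.1f} kW, "
      f"difference: {(p50 - p60)/1000:.1f} kW")
```

The difference comes out at roughly 4.1 kW, matching the hand calculation above.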
If we repeat the same calculation for a power transformer, e.g. a 500 kVA unit, the difference is larger, as below.
(Assume R and L stay the same, so the power factors carry over; only the rating changes.)
Power Transformer Output (When operates on 50 Hz)
500 kVA × 0.88 = 440,000 W = 440 kW
Power Transformer Output (When operates on 60 Hz)
500 kVA × 0.839 = 419,500 W = 419.5 kW
Difference in real power (in watts):
440 kW − 419.5 kW = 20,500 W = 20.5 kW
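The same sketch scales directly to the 500 kVA case (hypothetical assumption, as in the article: R = 700 Ω and L = 1.2 H are kept unchanged and only the rating grows).

```python
import math

def real_output_w(s_va, r_ohm, l_henry, f_hz):
    """Real-power output (W) of a series R-L model: S * R / Z."""
    xl = 2 * math.pi * f_hz * l_henry
    return s_va * r_ohm / math.hypot(r_ohm, xl)

p50 = real_output_w(500e3, 700, 1.2, 50)   # about 440 kW
p60 = real_output_w(500e3, 700, 1.2, 60)   # about 420 kW
print(f"difference: {(p50 - p60)/1000:.1f} kW")
```

With unrounded power factors the gap is a little over 20 kW, in line with the rounded hand calculation.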