Transformer Rating: Definition & Why It Is in kVA

Learn what transformer rating means, how it is determined, and why transformers are rated in kVA instead of kW. Understand the role of losses and cooling in transformer rating and efficiency.

What is a Transformer Rating?

A transformer rating is the maximum apparent power, the product of rated voltage and rated current, that a transformer can handle safely and continuously. The rating is specified in volt-amperes (VA) or kilovolt-amperes (kVA) and is displayed on the transformer’s nameplate. Manufacturers design transformers for the required voltage and current, ensuring they operate efficiently within these limits.

The rating of a transformer is influenced by temperature rise, which is directly affected by power losses. To maintain safe operating conditions, transformers use cooling systems that help dissipate heat. The efficiency of the cooling system determines the transformer’s overall rating—the better the cooling, the higher the transformer rating.

For example, consider a 100 kVA transformer with a primary voltage of 11 kV and a secondary voltage of 415 V. This transformer is designed to handle a maximum apparent power of 100 kVA while maintaining safe operating conditions. If the cooling system is efficient, it can sustain this rating without overheating. However, inadequate cooling could reduce its effective capacity and lifespan.
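
To make these figures concrete, here is a minimal Python sketch that computes the full-load line currents for this 100 kVA unit, assuming a three-phase transformer; the function name and structure are illustrative rather than taken from any standard library.

```python
import math

def rated_current(kva: float, line_voltage: float) -> float:
    """Full-load line current (A) of a three-phase transformer:
    I = S / (sqrt(3) * V_line), with S in VA and V_line in volts."""
    return (kva * 1000) / (math.sqrt(3) * line_voltage)

# 100 kVA, 11 kV / 415 V distribution transformer
print(f"Primary full-load current:   {rated_current(100, 11_000):.2f} A")  # ~5.25 A
print(f"Secondary full-load current: {rated_current(100, 415):.1f} A")     # ~139 A
```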

Losses and Transformer Rating

A transformer experiences two primary types of losses:

  1. Core losses (constant losses): These depend on voltage (V) and occur due to hysteresis and eddy currents in the core.
  2. Ohmic losses (variable losses): These depend on current (I) and occur due to the resistance of the windings (I²R losses).

Since total losses depend on both voltage (V) and current (I), the transformer rating is based on apparent power (V × I), expressed in VA or kVA, rather than in kilowatts (kW). The losses, and therefore the heating that limits the transformer, are independent of the power factor of the load.

For example, consider a 500 kVA transformer. If the load operates at a low power factor (e.g., 0.7), the real power delivered is only 500 × 0.7 = 350 kW. However, the transformer still carries its rated current and experiences the same losses, so it must be rated in kVA to account for varying power factors.
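
The point can be shown with a short sketch. The core and copper loss figures below are assumed purely for illustration; the takeaway is that the delivered kW changes with power factor while the losses, which set the heating, do not.

```python
# Hypothetical loss figures for a 500 kVA transformer (illustrative only)
CORE_LOSS_KW = 1.2               # fixed by the applied voltage, not by the load
FULL_LOAD_COPPER_LOSS_KW = 5.5   # I²R loss at rated current
RATED_KVA = 500

for power_factor in (1.0, 0.9, 0.7):
    delivered_kw = RATED_KVA * power_factor                   # changes with pf
    total_loss_kw = CORE_LOSS_KW + FULL_LOAD_COPPER_LOSS_KW   # does not change
    print(f"pf = {power_factor}: delivers {delivered_kw:.0f} kW, "
          f"losses stay at {total_loss_kw:.1f} kW")
```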

Why is the Transformer Rating in kVA and Not kW?

A transformer supplies power to loads with varying power factors. For example, if a transformer operates at rated voltage and current but the load power factor is zero, it delivers zero real power (kW) to the load but still supplies its rated apparent power (kVA). Since transformers experience losses regardless of the load power factor, they are rated in kVA rather than kW.

The rated input power (kVA) at the primary winding equals the rated output power (kVA) at the secondary winding, plus losses. However, since transformers operate at very high efficiency, losses are often negligible, meaning:

Rated Input (kVA) ≈ Rated Output (kVA)

This means the kVA rating marked on the nameplate applies to both the primary and secondary windings. However, this rating represents the full load condition of the transformer.
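
As a rough check on how small the losses are relative to the rating, the sketch below reuses the hypothetical loss figures from the 500 kVA example above; the roughly 99% efficiency it yields is why a single nameplate kVA figure can serve both windings.

```python
def efficiency(output_kw: float, losses_kw: float) -> float:
    """Efficiency = useful output / (useful output + losses)."""
    return output_kw / (output_kw + losses_kw)

# 500 kVA at unity power factor with the assumed 6.7 kW of total losses
print(f"Full-load efficiency: {efficiency(500, 6.7):.1%}")  # ~98.7%
```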

For example, consider a 200 kVA transformer supplying a commercial building. If the building’s total load operates at a power factor of 0.8, the real power consumed is 200 × 0.8 = 160 kW. Despite delivering only 160 kW, the transformer is still rated at 200 kVA because its losses depend on voltage and current rather than power factor.
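
Viewed from the sizing side, the same arithmetic tells you how large a transformer a given load needs. A brief sketch, with an illustrative helper name:

```python
def required_kva(load_kw: float, power_factor: float) -> float:
    """Minimum transformer rating (kVA) needed to serve a real-power load."""
    return load_kw / power_factor

print(f"160 kW at pf 0.8 needs {required_kva(160, 0.8):.0f} kVA")  # 200 kVA
print(f"160 kW at pf 0.7 needs {required_kva(160, 0.7):.0f} kVA")  # ~229 kVA
```

In practice a designer would also add margin for future load growth, but the kW-to-kVA conversion above is the core of the sizing step.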

Read Next:

  1. Dry Type Transformer
  2. Different Types of Transformers
  3. Power Transformer
  4. Applications of Transformer
  5. Advantages and Disadvantages of Transformer
