Best Answer

Because a transformer is only in use when it is under load, and an actual power factor requires a load; the power factor is necessary if you are going to calculate kilowatts.
The Power equation is

P = V × I × cos φ

where cos φ is the power factor.

So the power depends on the load, but VA, or volt-amperes, is the unit used to rate how much apparent power the transformer can deliver before saturating. Transformers come in different shapes and sizes, and their VA ratings vary accordingly. A 1 kVA transformer is the equivalent of a 1000 VA transformer (very big); the 'k' in front of any unit means 1000, e.g. a 1k resistor is the same as a 1000 Ω resistor.
You might think it should be rated in kilowatts instead, since kVA looks algebraically identical to kilowatts.

Not all electrical loads consume power equal to voltage multiplied by current; that is only the case for purely resistive loads.

Some electrical loads can store energy over the cycle, causing the current and voltage to be shifted in phase relative to one another. The real power is therefore less than the product of the voltage and current supplied to the load.

A transformer must be sized so that each of its windings can handle a given operating voltage and a maximum current, and still supply the intended loads. Because of a load's phase shift between current and voltage, the current might be much greater than the real power divided by the voltage, so the windings must be sized according to the actual current, i.e. the kVA rating divided by the voltage.

A perfect transformer would have the same operating kVA on both sides, with voltage traded for current between the windings.
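As a rough numerical sketch of the power equation above (the 230 V supply and 36.87° phase angle here are arbitrary example values, not from the answer):

```python
import math

def apparent_power(v_rms, i_rms):
    """Apparent power S = V * I, in volt-amperes (VA)."""
    return v_rms * i_rms

def real_power(v_rms, i_rms, phase_deg):
    """Real power P = V * I * cos(phi), in watts."""
    return v_rms * i_rms * math.cos(math.radians(phase_deg))

# A 1 kVA transformer at full load on a 230 V supply:
i = 1000 / 230                      # full-load current in amperes
print(apparent_power(230, i))       # ~1000 VA, regardless of power factor
print(real_power(230, i, 36.87))    # ~800 W, since cos(36.87 deg) ~ 0.8
```

Note that the transformer delivers the full 1000 VA either way; only the real power (watts) shrinks as the phase angle grows.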

Wiki User

11y ago
More answers
Wiki User

9y ago

Because the generator has separate maximum ratings for voltage and current. The VA rating is the product of those ratings. This is independent of the load, which might have any power factor between 0 and 1.

Wiki User

10y ago

The alternator runs at a specified maximum voltage and a specified maximum current. Multiplying the two together gives the VA or kVA rating for the alternator.

For a resistive load with unity power factor, the kW rating is equal to the kVA rating. For other types of load, multiply the kVA by the power factor to find the kW rating. If in doubt, assume a power factor of 0.8 and check the alternator for overheating.
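A minimal illustration of that rule (the 50 kVA figure is just an example value):

```python
def kw_from_kva(kva, power_factor):
    """Real power (kW) = apparent power (kVA) x power factor."""
    return kva * power_factor

# A 50 kVA alternator:
print(kw_from_kva(50, 1.0))  # 50.0 kW for a resistive load at unity power factor
print(kw_from_kva(50, 0.8))  # 40.0 kW at the assumed 0.8 power factor
```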


Wiki User

9y ago

It's because these devices have a limiting voltage they work at, and a limiting current as well. The two are unrelated so they are multiplied together to give the kVA limit.

If a load is applied that has a poor power factor, so that the power in kW is a lot less than the kVA, the generator's (or transformer's) voltage and current limits must still be observed.


Wiki User

9y ago

A transformer has separate ratings for the working voltage and the working current. Multiply the two together and you get the VA rating, more commonly the kVA or MVA rating for power transformers.

The current rating is set by how much heat can be dissipated in the wire windings of the transformer, while the voltage rating depends on the heat generated in the magnetic core.


Wiki User

11y ago

Because in a transformer the kVA is the same for the primary and secondary, the total complex power (S) is constant. Giving the rating in kVA therefore describes both the primary and the secondary.

In generators, the kVA rating shows the total complex power developed by the generator, so generators are also rated in kVA.

Answer

Because their rating is based on the product of their rated voltage and rated current, and this is expressed in volt amperes not watts. To express their ratings in watts, it's necessary to know the power factor of the load they will supply and, of course, the manufacturer has no means of knowing this.
The kVA rating is approximately equal to the kW rating, but it is not exactly the same. In AC, voltage and current are not necessarily in phase. If they are in phase (i.e., synchronized), then watts = volts × amperes. Otherwise (if voltage leads or lags current), a power factor (the cosine of the angle between voltage and current) must be included as a factor.

Now, the probable reason they are rated in VA or a multiple such as kVA, as opposed to watts, is that they can provide voltage and current only up to certain limits. If a machine connected to the generator has a lot of inductance (causing the phase difference between voltage and current), then the generator can deliver less real power (watts) for the same voltage and current; but the real limitation of the generator is the voltage and the current.


Wiki User

7y ago

The short answer is because AC circuits are weird.

What it boils down to is that engineers have decided to use volt-amperes for talking about "apparent power", and watts for talking about "true power". True power is apparent power multiplied by a "power factor" which depends on the phase angle between the current and voltage in a circuit. For a purely resistive load, the phase angle is zero and true power is the same as apparent power. For a load which is partly resistive and partly reactive (most AC circuits), there will be a difference in phase, and the power factor is the cosine of the phase angle, so true power will be less than apparent power (there's also something called reactive power, which is measured in VAR for volt-ampere reactive, but let's not confuse things any more than they already are).
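The relationship between apparent, true, and reactive power described above can be sketched numerically (the 1000 VA and 60° figures are arbitrary example values):

```python
import math

def power_triangle(apparent_va, phase_deg):
    """Split apparent power (VA) into true power (W) and reactive power (VAR)."""
    phi = math.radians(phase_deg)
    true_w = apparent_va * math.cos(phi)        # P = S * cos(phi)
    reactive_var = apparent_va * math.sin(phi)  # Q = S * sin(phi)
    return true_w, reactive_var

p, q = power_triangle(1000, 60)
# p ~ 500 W, q ~ 866 VAR; sqrt(p^2 + q^2) recovers the 1000 VA apparent power
```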


Wiki User

14y ago

The power factor of the load on a transformer is not fixed; that is why the transformer's rating is not given in kW.


Wiki User

13y ago

Because in alternators the copper losses depend on the current (in amperes) and the iron losses depend on the voltage (in volts), alternators and transformers are always rated in kVA.


Wiki User

12y ago

kVA stands for kilovolt-ampere. Not only alternators but many other electrical machines have their output capacity rated in kVA.


Q: Why is the transformer rated in kVA?
Related questions

What is the kVA rating of a single-phase transformer?

This is the rated output of the transformer, obtained by multiplying the rated secondary voltage by the rated secondary current. And it's 'kV.A', not 'kva'.


Why are transformers rated in kVA?

The correct symbol for kilovolt amperes is 'kV.A', not 'kva'. A volt ampere is the product of the transformer's rated secondary voltage and its rated current. It is not rated in watts, because the transformer designer has no idea what sort of load will be applied to the transformer, and it is the load that determines the watts, not the transformer.


How do you Convert va to kva?

A kVA is 1000 VA. 'k' is kilo, which means 1000, similar to how a kilometre is 1000 metres. Transformers are usually rated in kVA, so a 45 kVA transformer is a 45 000 VA transformer.
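In code form the conversion is trivial; a one-line sketch:

```python
def va_to_kva(va):
    """1 kVA = 1000 VA, just as 1 km = 1000 m."""
    return va / 1000

print(va_to_kva(45000))  # 45.0, i.e. a 45 000 VA transformer is a 45 kVA transformer
```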


What is the current of a 45 kva transformer?

Presumably, you are asking what is the rated secondary current for a 45 kV.A (not 'kva') transformer? The answer depends on its rated secondary voltage. To obtain the rated secondary current, you divide the (apparent) power rating by its secondary rated voltage.


What is the full-load current of a 1600 kVA transformer?

It depends on the rated voltage. For a single-phase transformer, take 1600 kVA and divide by the voltage in kV to get the current in A; for a three-phase transformer, also divide by √3.
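A sketch of that calculation, assuming a 0.4 kV (400 V) secondary purely as an example; the single-phase formula is I = S/V, and for three-phase a further factor of √3 applies to the line voltage:

```python
import math

def full_load_current_single_phase(kva, kv):
    """Single-phase: I = S / V."""
    return kva / kv

def full_load_current_three_phase(kva, kv_line):
    """Three-phase: I = S / (sqrt(3) * V_line)."""
    return kva / (math.sqrt(3) * kv_line)

# A 1600 kVA transformer with a 0.4 kV (400 V) secondary:
print(round(full_load_current_single_phase(1600, 0.4)))  # 4000 A
print(round(full_load_current_three_phase(1600, 0.4)))   # 2309 A
```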


What is the kVA of a transformer that steps 480 V AC down to 120 V?

The kVA rating will be listed on the transformer's nameplate, which is usually on the front of the transformer. The 480v to 120v is irrelevant, because many transformers with different kVA ratings convert 480 volts to 120 volts. The kVA ratings can be different and thus affect the rated current through the transformer.


How are transformers rated?

kVA: k = kilo, V = voltage, A = amps (current).


Why are transformers rated in kVA instead of kW?

Transformers are rated in VA or kVA. That is because the voltage is limited by the power loss in the magnetic core, and the current is limited by the power loss in the resistance of the windings. The rated voltage times the rated current gives the transformer's rating in kVA.


How many amps will a 37.5 KVA transformer carry?

It depends on the rated voltage of its secondary.


How can you measure transformer kVA?

The capacity of a transformer is defined as the product of the voltage across it and the current flowing through it. As the current is measured in amperes and the voltage in volts, transformers are measured/rated in kVA.


Does a transformer convert the current from 240 volts to 120 volt?

Transformers are rated in kVA or VA (volt-amperes). They transform voltages from one value to another, and the current in a transformer varies inversely with the voltage. This is why transformers are rated in kVA, and smaller ones in VA.


Why is a transformer rated in kVA?

Because it's the product of the transformer's rated secondary voltage and its rated secondary current. The product of voltage and current, in AC, is the volt ampere. Incidentally, it's 'kV.A', not 'kva'.