The "200 watts" on an amplifier refers to its maximum audio output power if it were turned up all the way. It does not tell you how much electricity the amplifier draws. The amplifier actually draws more than it puts out, since some power is lost as heat, but we can't say how much more from the information above.
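The relationship can be sketched with a short calculation. The efficiency figures below are assumptions for illustration, not measured values for any particular amplifier:

```python
# Hypothetical illustration: an amplifier's electrical draw exceeds its
# rated audio output because some input power is lost as heat.
# The efficiency values here are assumptions, not measurements.

def input_power(output_watts, efficiency):
    """Electrical power drawn for a given audio output at a given efficiency."""
    return output_watts / efficiency

# A 200 W amplifier at an assumed 50% efficiency would draw about 400 W;
# at an assumed 85% efficiency, about 235 W.
print(round(input_power(200, 0.50)))  # 400
print(round(input_power(200, 0.85)))  # 235
```

The point is only that rated output sets a lower bound on the draw; the actual figure depends on the amplifier's efficiency.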


Wiki User

15y ago


Related Questions

How do you work out how many amps a forty watt bulb uses?

To calculate the amperage of a 40-watt bulb, you need to use the formula: Amps = Watts / Volts. If the bulb operates at 120 volts (standard for US households), the amperage will be 0.33 amps (40 watts / 120 volts).


How many amperes does a standard 60 watt light bulb use?

Watts = Volts x Amps, so Amps = Watts / Volts. On a 240 V mains, a 60 W bulb draws 0.25 amps. On a 12 V system (car/auto), a 60 W bulb draws 5 amps. On a 110 V mains, a 60 W bulb draws 0.55 amps.
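The same division works for any supply voltage. A minimal sketch:

```python
def amps(watts, volts):
    """Current drawn by a resistive load: I = P / V."""
    return watts / volts

# A 60 W bulb on different supply voltages:
for v in (240, 110, 12):
    print(f"{v} V: {amps(60, v):.2f} A")
# 240 V: 0.25 A
# 110 V: 0.55 A
# 12 V: 5.00 A
```

Note how the same bulb wattage draws far more current at low voltage, which is why automotive wiring is comparatively thick.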


How many amps does a 65 watt light bulb have at 120 volts?

The current drawn by a 65 watt light bulb at 120 volts is 65/120, or about 0.542 amps. This fraction of an ampere can also be stated as 541.67 milliamps.


How many amps does a 50 watt bulb use?

You just have to divide the watts by the voltage to find the amps. For example, a 50 watt bulb on a 120 V system draws 50/120, or about 0.42 amps.


How many watts does a 12volt X 65milliamp bulb use?

Watts = Amps x Volts. To convert milliamps to amps, move the decimal point three places to the left: 65 milliamps is 0.065 amps. So the bulb uses 12 x 0.065 = 0.78 watts.
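The conversion-then-multiply step can be sketched as:

```python
def watts_from_ma(volts, milliamps):
    """P = V x I, converting milliamps to amps first (divide by 1000)."""
    return volts * (milliamps / 1000)

# A 12 V, 65 mA bulb:
print(round(watts_from_ma(12, 65), 2))  # 0.78
```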


How many amps in a 50 watt light bulb?

To find the current in amps for a 50-watt light bulb, use the formula I = P / V, where I is the current in amps, P is the power in watts, and V is the voltage in volts. For example, if the bulb operates at 120 volts, the current is 50/120, which is approximately 0.42 amps. If the bulb operates at a different voltage, simply adjust the voltage in the formula accordingly.


How many amps does a 65 watt light bulb use?

Amps x volts = watts. So, assuming you are running on a 110 volt line, the answer is 65 watts / 110 volts = 0.59 amps.


How many ampere in 240 volt with 100 watts bulb?

To calculate the amperage, you can use the formula: Amperes = Watts / Volts. So for a 100-watt bulb connected to a 240-volt source, the amperage would be 0.42 Amps.


How many amp hours does a 23 watt bulb use?

The formula for finding amperage is I = W/E, that is, Amps = Watts / Volts. Amp-hours also depend on how long the bulb runs, so without the voltage and the running time this question cannot be answered.


How many amps in a 300watt bulb?

There are zero amps "in" a 300 watt bulb; current only flows when the bulb is connected to a supply. Watts are the product of amps times volts: W = A x V. To find amperage, use the equation A = W/V. If you know the voltage the 300 watt bulb runs on, divide the wattage by that voltage and you will get the amps.


What equation would you use to calculate how much current in AMPs is required to achieve the full 100 to output from a volt bulb?

The bulb is marked with its power (watts) and its voltage. Divide the watts by the volts and you have the amps.


How many amps does ATM use?

10 amps