This answer was found concerning computer power requirements, but the equation used at the end will produce the same result for whatever type of electronics you're trying to gauge the watts on.
- Meets ENERGY STAR requirements
- Line voltage: 100-125V AC or 200-240V AC
- Frequency: 50Hz to 60Hz, single phase
- Maximum current: 6.5A to 7.5A for 100-125V AC (low-voltage range), 3.5A for 200-240V AC (high-voltage range)
The maximum current occurs when starting up or when the fans and processor are at 100%. I can't remember the ENERGY STAR regs exactly, but at idle it is somewhere down around 20W. Add the consumption of the monitor to this; an LCD screen will draw far less than a large CRT.
Sustained use would be in the middle - I think the guess of 350W is pretty good for active use.
(Note: Watts = Volts x Amps, so 6.5A at 120V = 780 Watts, and that's at peak. For comparison, that's about eight 100-watt light bulbs, half a space heater, or a microwave.)
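If you want to plug in your own numbers, here is a minimal sketch of that Watts = Volts x Amps arithmetic in Python. The 6.5A, 120V, 20W, and 350W figures are the ones quoted above; everything else is illustrative:

```python
# Minimal sketch of Watts = Volts x Amps, using the figures quoted above.
# All values are illustrative, not measured.

def watts(volts: float, amps: float) -> float:
    """Power in watts from line voltage and current draw."""
    return volts * amps

peak = watts(120, 6.5)  # nameplate maximum: 780 W
idle = 20               # rough ENERGY STAR idle figure from above

print(f"Peak draw:  {peak:.0f} W (~{peak / 100:.0f} hundred-watt bulbs)")
print(f"Idle draw:  {idle} W")
print("Active use: ~350 W (middle-of-the-road estimate)")
```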
Generally speaking, a couple of amps will power most televisions. The actual current draw will vary with the size, type, and age of the set; an older tube-type set (vacuum tubes other than the picture tube) will certainly draw more. By looking at the manufacturer's information label, one should be able to find the wattage rating of the appliance, and by simply dividing that wattage rating by the line (or mains) voltage, one can work out roughly how much current the unit is rated to draw.
There is no one answer to this. It depends on the actual design of the TV. The TV should indicate somewhere (probably on the back) how many amps it will pull.
If that information is not there, you would have to measure the effective resistance of the TV by connecting a multimeter across the two prongs of the unplugged power cable and seeing how many ohms it reads.
To then calculate the amps, assuming the TV is meant to be plugged into standard 120V service, take the voltage (120V) and divide it by the resistance you measured.
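As a rough sketch of both estimation methods described above (nameplate watts divided by line voltage, and Ohm's law on a measured resistance), with hypothetical example numbers. Keep in mind a real TV's power supply is not a simple resistor, so the resistance method gives a ballpark at best:

```python
# Two rough ways to estimate a TV's current draw, per the answer above.
# The 150 W label and 96 ohm reading are hypothetical example values.

LINE_VOLTS = 120.0  # standard North American mains

def amps_from_label(label_watts: float, volts: float = LINE_VOLTS) -> float:
    """Method 1: divide the nameplate wattage by the line voltage."""
    return label_watts / volts

def amps_from_resistance(ohms: float, volts: float = LINE_VOLTS) -> float:
    """Method 2: Ohm's law, I = V / R, using the measured resistance."""
    return volts / ohms

print(f"150 W label          -> {amps_from_label(150):.2f} A")
print(f"96 ohm meter reading -> {amps_from_resistance(96):.2f} A")
```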
100 amps
A deep freezer can draw between 6 and 8 times its running amps on start-up, depending on the model and size of the freezer. For example, a freezer that runs at 6 amps could draw between 36 and 48 amps when starting up.
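A quick sketch of that start-up estimate; the 6x-8x multipliers and the 6A running figure come straight from the answer above:

```python
# Estimate a deep freezer's start-up (inrush) current from its running amps,
# using the 6x-8x range quoted above. Values are illustrative.

def startup_amps(running_amps: float, low: float = 6.0, high: float = 8.0):
    """Return the (low, high) start-up current range."""
    return running_amps * low, running_amps * high

lo, hi = startup_amps(6.0)
print(f"A 6 A freezer may surge to roughly {lo:.0f}-{hi:.0f} A at start-up.")
```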
Each 32-watt bulb in a 48-inch fluorescent light typically draws around 0.27 amps. Therefore, a two-bulb setup would draw approximately 0.54 amps in total.
An arc welder can draw anywhere from 20 to 200 amps, depending on the specific type and size of the welder being used. Smaller welders typically draw around 20-30 amps, while larger industrial welders can draw 200 amps or more.
To calculate the amperage, you can use the formula: Amps = Watts / Volts. In this case, a 400-watt heater cartridge at 240 volts would draw 1.67 amps.
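The same Amps = Watts / Volts arithmetic in code, with the equivalent element resistance (R = V^2 / W) added as a cross-check; the 400W and 240V values are the ones from the answer:

```python
# Amps = Watts / Volts for the 400 W, 240 V heater cartridge above,
# plus the equivalent resistance R = V^2 / W as a cross-check.

volts, power_watts = 240.0, 400.0

amps = power_watts / volts        # 1.67 A, matching the answer
ohms = volts ** 2 / power_watts   # 144 ohm equivalent element resistance

print(f"Current:    {amps:.2f} A")
print(f"Resistance: {ohms:.0f} ohm (check: {volts / ohms:.2f} A via I = V/R)")
```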
23
Amps as in amplifiers? That depends on how many speakers you have. Or amps as in current draw? Again, that depends on your power needs, your power amps, etc.
1100 watts, or about ten amps, plus another 3 to 4 amps for the turntable, light, and fan.
25
1.25 amps
Nothing at all. Increasing the cranking amps will not harm anything. The starter will only draw the amps it needs.
As a continuous load, your TVs have to add up to 12 amps or less. There will be a rating plate on each TV. At about 3 amps per TV, that would allow 4.
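A small sketch of that continuous-load budgeting; the 12A limit and the ~3A-per-TV figure are the ones from the answer above:

```python
# How many TVs fit on a circuit if the continuous load must stay at or
# below 12 A, assuming ~3 A per TV as in the answer above.

import math

CONTINUOUS_LIMIT_A = 12.0  # continuous-load budget from the answer
AMPS_PER_TV = 3.0          # rough per-TV figure; check each rating plate

max_tvs = math.floor(CONTINUOUS_LIMIT_A / AMPS_PER_TV)
print(f"Up to {max_tvs} TVs at {AMPS_PER_TV} A each.")  # -> 4
```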
It depends on the voltage; I think at 110V it's 4 amps per hp.
Twenty amps by itself is zero watts; you are missing one value. W = Amps x Volts, so the answer depends on the voltage of the circuit (or, equivalently, on the resistance and the current drawn).
About 1.75 amps for a 12V 21W bulb (21W / 12V = 1.75A).
Unanswerable. It depends on the number of lamps and their individual power consumptions.
The spec for ignition-off draw (all components off, key removed) is 0.035 amps or less. It can take 5 minutes for the computers to "time out" and go to "sleep".