This answer was found concerning computer power requirements, but the equation used at the end will produce the same result for whichever type of electronics you're trying to gauge the wattage of.
# Meets ENERGY STAR requirements
# Line voltage: 100-125V AC or 200-240V AC
# Frequency: 50Hz to 60Hz, single phase
# Maximum current: 6.5A to 7.5A for 100-125V AC (low-voltage range), or 3.5A for 200-240V AC (high-voltage range)
The maximum current would be drawn when starting up or when the fans and processor are at 100%. I can't remember the ENERGY STAR regs, but at idle it is somewhere down around 20W. Add the consumption of the monitor to this; an LCD screen will use far less than a large CRT.
Sustained use would be in the middle - I think the guess of 350W is pretty good for active use.
(Note: Watts = Volts x amps, so 6.5A at 120 V = 780 Watts -- that's at peak. Comparatively, this is like 8 light bulbs or 1/2 of a space heater or a microwave.)
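The peak-power arithmetic above can be sketched in a few lines. This is a minimal illustration, assuming the 6.5A / 120V figures from the spec and a traditional 100-watt incandescent bulb for the "8 light bulbs" comparison:

```python
# Power in watts from line voltage and current draw: P = V * I.

def watts(volts: float, amps: float) -> float:
    """Apparent peak power in watts from volts and amps."""
    return volts * amps

peak = watts(120, 6.5)            # 6.5 A at 120 V = 780 W at peak
bulb_equivalent = peak / 100      # roughly eight 100 W incandescent bulbs
print(peak, round(bulb_equivalent))
```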
Generally speaking, a couple of amps will power up most televisions. The actual current draw will vary with the size, type, and age of the set. Certainly an older tube-type set (one using vacuum tubes other than the picture tube) will draw more. By looking at the manufacturer's information label, one should be able to find the wattage rating of the appliance. And by simply dividing the wattage rating by the line (or mains) voltage, one can discover about how much current the unit is rated to draw.
There is no one answer to this. It depends on the actual design of the TV. The TV should indicate somewhere (probably on the back) how many amps it will pull.
If that information is not there, you can get a rough estimate by measuring the resistance of the TV: with the set unplugged, connect a multimeter across the two prongs of the power plug and read how many ohms it shows. (This is only approximate, since a powered-on set's effective impedance differs from its cold resistance.)
To then estimate the amps, assuming that the TV is meant to be plugged into standard 120V service, you would take the voltage (120V) and divide it by the resistance that you measured (Ohm's law: I = V / R).
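The resistance-based estimate above is a direct application of Ohm's law. A minimal sketch, where the 60-ohm reading is purely a hypothetical example value, not from the original answer:

```python
# Rough current estimate from a cold-resistance measurement: I = V / R.
# A TV's power supply is not a simple resistor, so treat this as a
# ballpark figure only.

def estimated_amps(volts: float, ohms: float) -> float:
    """Estimated current draw from mains voltage and measured resistance."""
    return volts / ohms

# Hypothetical example: a 60-ohm reading on standard 120 V service.
print(estimated_amps(120, 60))   # -> 2.0
```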
A TV, or other electrical device, doesn't use "watts per hour", it uses "watts", which is already a unit of power (energy / time). The amount of power used varies widely, depending on size and technology; for a specific TV set, look on the back for the electrical specifications.
The average TV draws around 1-2 amps of current when in use. However, this can vary based on the size and model of the TV.
There is a label on your TV that tells you the amperage; just look on the back or in your owner's manual.
It depends on the type and age of the TV set, but a general answer is between 100 and 300 watts.
100 amps
A deep freezer can draw between 6 to 8 times its running amps on start-up, depending on the model and size of the freezer. For example, if a freezer runs at 6 amps, it could draw between 36 to 48 amps when starting up.
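The start-up (inrush) range above is just the running current times the 6x-8x multiplier. A small sketch under that assumption, using the 6-amp freezer from the example:

```python
# Start-up (inrush) current range for a compressor appliance, using the
# 6x to 8x multiplier described above. Multipliers vary by model.

def startup_range(running_amps: float, low: float = 6, high: float = 8):
    """Return the (min, max) start-up current estimate in amps."""
    return running_amps * low, running_amps * high

print(startup_range(6))   # -> (36, 48)
```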
Each 32-watt bulb in a 48-inch fluorescent light typically draws around 0.27 amps. Therefore, a two-bulb setup would draw approximately 0.54 amps in total.
An arc welder can draw anywhere from 20 to 200 amps, depending on the specific type and size of the welder being used. Smaller welders typically draw around 20-30 amps, while larger industrial welders can draw upwards of 200 amps or more.
To calculate the amperage, you can use the formula: Amps = Watts / Volts. In this case, a 400-watt heater cartridge at 240 volts would draw 1.67 amps.
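The formula in this last answer (Amps = Watts / Volts) is the inverse of the earlier one, and can be checked with the same kind of one-liner. A minimal sketch using the 400W / 240V heater-cartridge figures:

```python
# Current draw from rated wattage and line voltage: I = P / V.

def amps(watts: float, volts: float) -> float:
    """Current in amps from power in watts and voltage in volts."""
    return watts / volts

print(round(amps(400, 240), 2))   # -> 1.67
```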