This answer was found concerning computer requirements, but the equation used at the end will produce the same results for whatever type of electronics you're trying to gauge the wattage of.
# Meets ENERGY STAR requirements
# Line voltage: 100-125V AC or 200-240V AC
# Frequency: 50Hz to 60Hz, single phase
# Maximum current: 6.5A (low-voltage range) or 7.5A (high-voltage range) for 100-125V AC, 3.5A for 200-240V AC
The maximum current would be drawn when starting up or when the fans and processor are at 100%. I can't remember the ENERGY STAR regs exactly, but at idle it is somewhere down around 20W. Add the consumption of the monitor to this; an LCD screen will draw far less than a large CRT.
Sustained use would be in the middle - I think the guess of 350W is pretty good for active use.
(Note: Watts = Volts x amps, so 6.5A at 120 V = 780 Watts -- that's at peak. Comparatively, this is like 8 light bulbs or 1/2 of a space heater or a microwave.)
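The peak-power arithmetic above can be sketched in a few lines (the helper name is just for illustration):

```python
def watts(volts, amps):
    """Power in watts from line voltage and current draw (P = V * I)."""
    return volts * amps

# Peak draw at the low-voltage maximum from the spec above:
peak = watts(120, 6.5)
print(peak)  # 780.0 W
```

The same function works for any appliance whose voltage and current rating you can read off the label.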
Generally speaking, a couple of amps will power up most televisions. The actual current draw will vary with the size, type, and age of the set. Certainly an older tube-type set (vacuum tubes other than a picture tube) will draw more. By looking at the manufacturer's information label, one should be able to discover the wattage rating of the appliance. And by simply dividing the wattage rating by the line (or mains) voltage, one can discover roughly how much current the unit is rated to draw.
There is no one answer to this. It depends on the actual design of the TV. The TV should indicate somewhere (probably on the back) how many amps it will pull.
If that information is not there, you would have to estimate the effective resistance of the TV by hooking a multimeter to the two ends of the unplugged power cable and seeing how many ohms it reads.
To then calculate the amps, assuming that the TV is meant to be plugged into standard 120V service, you would take the voltage (120V) and divide it by the resistance that you measured. Keep in mind this is only a rough estimate: a set's effective resistance while operating differs from its unpowered reading.
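That division is just Ohm's law. A minimal sketch, where the 60-ohm reading is an invented example measurement, not a figure from any real TV:

```python
def amps_from_resistance(volts, ohms):
    """Ohm's law: I = V / R."""
    return volts / ohms

# Hypothetical example: the unplugged set measures 60 ohms end to end
# on 120 V mains service.
print(amps_from_resistance(120, 60))  # 2.0 A
```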
100 amps
A deep freezer can draw between 6 to 8 times its running amps on start-up, depending on the model and size of the freezer. For example, if a freezer runs at 6 amps, it could draw between 36 to 48 amps when starting up.
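The start-up surge range described above is a simple multiplication; a sketch of that arithmetic (the 6x-8x multipliers come from the answer above, the function name is just for illustration):

```python
def startup_surge(running_amps, low=6, high=8):
    """Start-up surge range: roughly 6 to 8 times the running current."""
    return running_amps * low, running_amps * high

# A freezer that runs at 6 A:
print(startup_surge(6))  # (36, 48)
```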
Each 32-watt bulb in a 48-inch fluorescent light typically draws around 0.27 amps. Therefore, a two-bulb setup would draw approximately 0.54 amps in total.
An arc welder can draw anywhere from 20 to 200 amps, depending on the specific type and size of the welder being used. Smaller welders typically draw around 20-30 amps, while larger industrial welders can draw upwards of 200 amps or more.
To calculate the amperage, you can use the formula: Amps = Watts / Volts. In this case, a 400-watt heater cartridge at 240 volts would draw 1.67 amps.
I assume you are talking about the newest Samsung 65", the one with the curved screen. That draws 276 watts which, using watts=volts x amps and assuming your voltage is 115v, gives you 2.4 amps.
23
Amps as in amplifiers? That depends on how many speakers you have. Or amps as in current draw? Again, it depends on your power needs, your power amps, etc.
About 1100 watts, or roughly ten amps, plus another 3 to 4 amps for the turntable, light, and fan.
It depends on how many amps each TV draws. The continuous load should be no more than 80% of the breaker rating; on a 15-amp breaker, that is 12 amps. If an average TV draws 2.5 amps, that would be 4 TVs. Look for a rating plate on each TV and add the currents up.
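The breaker arithmetic above can be sketched as follows (the 80% continuous-load factor is from the answer above; the function name is just for illustration):

```python
def max_units_on_breaker(breaker_amps, unit_amps, load_factor=0.8):
    """How many identical loads fit on a breaker at the given
    continuous-load factor (commonly 80% of the breaker rating)."""
    continuous_limit = breaker_amps * load_factor
    return int(continuous_limit // unit_amps)

# 15 A breaker, TVs drawing 2.5 A each:
print(max_units_on_breaker(15, 2.5))  # 4
```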
The amperage of an oven is governed by its total wattage and the voltage of the supply feeding it.
It is drawing 0.06 amps.
1 AMP
It would be at least 250 amps, maybe 300 amps.