1.25 A
The bulb with the lowest resistance. Current = Volts / Resistance
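A minimal sketch of that point, with made-up resistance values: at a fixed supply voltage, the lowest-resistance bulb draws the largest current.

```python
# Hypothetical illustration: at a fixed supply voltage, the lowest
# resistance draws the largest current (I = V / R).
voltage = 120.0  # assumed supply, volts
bulb_resistances = {"bulb_a": 240.0, "bulb_b": 144.0, "bulb_c": 96.0}  # ohms (made-up values)

currents = {name: voltage / r for name, r in bulb_resistances.items()}
print(currents)                          # bulb_c draws the most current
print(max(currents, key=currents.get))   # 'bulb_c' -- the lowest-resistance bulb
```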
The total current in a circuit consisting of six operating 100 watt lamps connected in parallel to a 120 volt source is 5 amperes. Since power is volts times amps, take 600 watts (100 times 6) and divide by 120 volts to get 5 amps.
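The same arithmetic as a short script, using the numbers from the answer above:

```python
# Six 100 W lamps in parallel on a 120 V source.
lamp_power_w = 100.0
lamp_count = 6
source_volts = 120.0

total_power_w = lamp_power_w * lamp_count       # 600 W
total_current_a = total_power_w / source_volts  # P = V * I  ->  I = P / V
print(total_current_a)                          # 5.0 amperes
```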
Volts = Current x Resistance, so Current = Volts / Resistance. With 24 volts across 2 ohms the draw will be 12 amps. Your batteries will fail quickly, if not spectacularly.
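The arithmetic from that answer, spelled out:

```python
# 24 V across a 2 ohm load.
volts = 24.0
resistance_ohms = 2.0
current_a = volts / resistance_ohms   # I = V / R
print(current_a)                      # 12.0 amps -- a heavy draw for most batteries
```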
In general, increased resistance will lower current draw. See Ohm's law (V = IR).
Impossible to say without knowing the voltage
20 A, by using V = IR.
You need to know the amperage to determine how many volts correspond to 20 watts (Volts = Watts / Amps).
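A sketch of that point; the example currents below are assumptions, not given in the question:

```python
# 20 W alone doesn't fix the voltage; you also need the current.
power_w = 20.0
for current_a in (0.5, 1.0, 2.0):         # hypothetical draws
    volts = power_w / current_a           # P = V * I  ->  V = P / I
    print(current_a, "A ->", volts, "V")  # 40 V, 20 V, 10 V
```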
Using Ohm's Law (E = I R): Voltage = Current x Resistance, or rearranged, R = E / I. 115 volts / 8 amperes = 14.375 ohms. This is exact for DC and is usually close enough for AC.
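The same calculation as code:

```python
# R = E / I for the numbers in the answer above.
volts = 115.0
amps = 8.0
resistance_ohms = volts / amps
print(resistance_ohms)   # 14.375 ohms
```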
A 10 watt bulb is defined by its supply voltage and the resulting current. To make the math simple, suppose you have a 10 watt incandescent bulb designed to work at 20 volts. That means it will draw 1/2 amp, since Watts = Volts x Amps. The resistance of the bulb is then Volts / Amps, so in this case the resistance of the bulb is 40 ohms. So our mythical bulb has a resistance of 40 ohms with 20 volts across it. Now if we put two of these bulbs in series with the same 20 volts, we have a total resistance of 80 ohms supplied by 20 volts, and the circuit will draw 1/4 amp. This lower current will cause the bulbs to be dimmer.
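The worked example above, step by step (20 V and 10 W are the answer's illustrative values, not a real bulb rating):

```python
volts = 20.0
bulb_power_w = 10.0

current_one_bulb = bulb_power_w / volts        # 0.5 A   (P = V * I)
bulb_resistance = volts / current_one_bulb     # 40 ohms (R = V / I)

series_resistance = 2 * bulb_resistance        # 80 ohms, two bulbs in series
series_current = volts / series_resistance     # 0.25 A -- half the current, dimmer bulbs
print(current_one_bulb, bulb_resistance, series_current)
```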
Twenty amps by itself is zero watts; you are missing one value: W = Amps x Volts. Put another way, the power depends on the current draw and the resistance of the circuit.
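A minimal sketch of both points, with assumed values for the missing quantities:

```python
amps = 20.0

# If you know the voltage:  P = V * I
volts = 12.0                          # assumed
print(volts * amps)                   # 240 W

# If you instead know the resistance:  P = I^2 * R
resistance_ohms = 0.6                 # assumed
print(amps ** 2 * resistance_ohms)    # 240 W
```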
It depends on how many volts it has.
If the question asks how 13.5 volts can be supplied to a device that draws 20 amps (nominally), the supply responds to the setpoint selected (13.5 volts). Left on its own, the supply's output voltage would change as the dynamic resistance of the device it feeds changes, which is exactly what we don't want. So the supply changes the amount of current it delivers as the resistance of the load changes, and this keeps the applied voltage fixed at 13.5 volts. How does that work? For a given resistance, a constant voltage produces a fixed amount of current draw. As the resistance goes down due to thermal effects, the supply delivers more current to maintain the 13.5 volts. In this way the supply accomplishes voltage regulation. It's classic Ohm's law: volts equal current times resistance. If voltage is to remain constant, then current times resistance must remain constant, and the only way that can happen is for current to go up as resistance goes down. As the device heats up and its dynamic resistance decreases, the current it "demands" to keep the supplied voltage at 13.5 volts goes up. The supply does all this automatically.
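A rough sketch of that regulation idea; the load resistance values below are made up for illustration:

```python
# The setpoint stays 13.5 V, and the delivered current rises
# as the load's resistance falls (I = V / R).
setpoint_volts = 13.5
load_resistance_over_time = [0.9, 0.8, 0.7, 0.675]   # ohms, dropping as the device warms

for r in load_resistance_over_time:
    current = setpoint_volts / r        # supply adjusts current to hold the voltage
    print(f"R = {r:.3f} ohm -> I = {current:.1f} A, V stays {setpoint_volts} V")
```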
Check the current draw that is on the label of the ballast.