Because that is what Ohm's law states: current = voltage / resistance. With voltage held fixed, a larger resistance means a smaller current.
Here's one way to think about it. Resistance is the degree of difficulty in moving charge from one place to another. If you increase that resistance, it is harder to move the charge. Current is the rate of flow of that charge, so the current goes down.
If resistance increases and voltage stays the same, then current decreases. Ohm's Law: Current equals Voltage divided by Resistance.
Ohm's Law says that Voltage = Current x Resistance (Load). Therefore Current = Voltage / Resistance, so as resistance decreases current increases, and as resistance increases current decreases.
Their relationship depends only on the voltage dropped across that resistor; voltage equals resistance times current, so increasing the current for a given voltage requires a decrease in the resistance, and vice versa.
The physical equation governing voltage is V = IR, where V is voltage, I is current, and R is resistance. If V remains constant while R is increased, I or current must decrease. Increasing the resistance in a circuit is simply introducing a material that further resists or impedes the electron flow (current), thus current decreases.
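A quick numeric sketch of the relationship above (the 12 V supply and the resistor values are arbitrary, chosen only for illustration):

```python
# Illustrative sketch of Ohm's law: I = V / R.
# With V held constant, increasing R makes I smaller.

def current(voltage, resistance):
    """Ohm's law: I = V / R."""
    return voltage / resistance

V = 12.0  # volts, held constant
for R in (2.0, 4.0, 6.0, 12.0):  # ohms, increasing
    print(f"R = {R:5.1f} ohm -> I = {current(V, R):.2f} A")
```

Running this prints a current of 6 A at 2 ohms, falling to 1 A at 12 ohms: the same voltage pushes less current through a larger resistance.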
If instead the current is held constant while the resistance increases, the voltage across the resistor will increase, since V = IR.