Let’s say you have a 9V battery and you want to use it to power a red LED. You look at the LED specifications and you see that it requires a forward voltage of 2V and a current of 20 mA. If you connect the LED directly to the battery, what happens? You will get some light, if you are lucky, and then your LED is likely to be fried.
Why is that? Your LED can’t survive 9V! Nor can it survive a current of more than 20mA. Well, if your voltage/current is only slightly higher than it should be, your LED may survive for a while, but it will not last as long as it would if used properly.
The solution to that problem is simple. You simply need to add a resistor in series with your LED. Because your LED uses 2V, that leaves 7V to be handled by your resistor.
This is where you use Ohm’s law to figure out how much resistance your resistor should have.
Ohm’s law says V = RI, so Voltage (in Volts) = Resistance (in Ohms) X Current (in Amps).
It can be rewritten as R = V/I.
In our case, we have R = 7V / 0.020A = 350 Ohms. You may have a hard time finding a 350 Ohms resistor because it’s not a standard value. You should use the next larger standard value you can get, which is a 360 Ohms resistor. By using that resistor, you just ensured that no more than 20mA of current will flow through your LED.
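The calculation above can be sketched in a few lines of Python. The values are taken from the example; the 360 Ohms figure is the next standard E24 value above 350 Ohms.

```python
supply_v = 9.0         # battery voltage
led_forward_v = 2.0    # LED forward voltage
led_current_a = 0.020  # desired LED current (20 mA)

# Voltage left over for the resistor: 9V - 2V = 7V
resistor_v = supply_v - led_forward_v

# Ohm's law: R = V / I
ideal_r = resistor_v / led_current_a  # 350 Ohms

# Round up to the next standard value so the current stays at or below 20 mA.
standard_r = 360.0
actual_current_ma = resistor_v / standard_r * 1000  # about 19.4 mA
```

Note that rounding the resistance up, rather than down, is what guarantees the current stays under the LED’s 20 mA limit.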
We are not done yet! The next step is to make sure your resistor’s power rating is appropriate. If it is too small, your resistor will generate too much heat and will eventually die. There are different resistor power ratings available, including: 1/8 Watt, 1/6 Watt, 1/4 Watt, 1/2 Watt, … Usually, the larger the power rating, the larger the resistor.
To find out what we need, we can use the electric power formula: P = VI, so
Power (in Watts) = Voltage (in Volts) X Current (in Amps).
In our example, that would result in:
P = 7 Volts X 0.020A = 0.14 Watts.
So, you would need a resistor of 1/6 Watt or more for that circuit.
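The power-rating check can be sketched the same way. The list of common ratings below matches the ones mentioned above; picking the smallest rating above the actual dissipation is one reasonable rule of thumb.

```python
resistor_v = 7.0   # voltage across the resistor (from the example)
current_a = 0.020  # current through the resistor (20 mA)

# Electric power formula: P = V * I
power_w = resistor_v * current_a  # 0.14 Watts

# Common resistor power ratings, in Watts.
ratings_w = [1/8, 1/6, 1/4, 1/2]

# Smallest rating that exceeds the actual dissipation: 1/6 W (about 0.167 W).
suitable = min(r for r in ratings_w if r > power_w)
```

A 1/8 Watt resistor (0.125 W) would be undersized here, which is why the example calls for 1/6 Watt or more.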