Noob here. I'm trying to understand what's going on in these circuits, which I've drawn up to represent real-world circuits I've created:
http://i.imgur.com/ZyyVPHO.png
I have a 130R resistor wired in series with an LED. The ammeter shows 22.55 mA. When I add a second LED in parallel, it goes up to 23.73 mA, and with four LEDs in parallel it reads 24.69 mA. (The LEDs in my real-world circuits have a forward voltage of 2.4 V and a forward current of 20 mA, so my multimeter measurements differ slightly from the simulator's, but the principle is the same.)
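As a sanity check, I also tried applying Ohm's law across the resistor by itself, ignoring the LEDs entirely. (The 5 V supply and ~2.0 V LED drop below are numbers I'm assuming for illustration; neither appears in my schematic.)

Code:
# Ohm's law across the series resistor alone: whatever voltage the
# LED doesn't drop appears across the resistor.
SUPPLY_V = 5.0    # assumed supply voltage (not shown in my schematic)
LED_VF = 2.0      # assumed forward drop of the simulated LED
R_OHMS = 130.0    # the series resistor from my schematic

i_amps = (SUPPLY_V - LED_VF) / R_OHMS
print(f"{i_amps * 1000:.2f} mA")  # prints 23.08 mA

That single number lands close to all three of my ammeter readings, no matter how many LEDs are in parallel, which is part of what confuses me.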
My expectation was that each LED would draw a specific amount of current, such that the total current drawn would be:
Code:
I = forward current of a single LED * number of LEDs wired in parallel
By that logic, the first circuit should have drawn ~20 mA and the four-LED circuit ~80 mA. That is apparently not the case: my ammeter readings start around 22-23 mA and creep up only slightly each time I add an LED in parallel.
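Here is that expectation lined up against my actual readings, as a quick Python sketch (the measured numbers are just copied from above):

Code:
# Expected total current if each LED drew its full 20 mA,
# vs. the totals my ammeter actually showed.
EXPECTED_PER_LED_MA = 20.0
measured_ma = {1: 22.55, 2: 23.73, 4: 24.69}  # readings from above

for n, measured in measured_ma.items():
    expected = EXPECTED_PER_LED_MA * n
    print(f"{n} LED(s): expected {expected:.0f} mA, measured {measured:.2f} mA")

# 1 LED(s): expected 20 mA, measured 22.55 mA
# 2 LED(s): expected 40 mA, measured 23.73 mA
# 4 LED(s): expected 80 mA, measured 24.69 mA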
Why is this happening, and what concepts do I need to learn to understand what is going on?