Hi everyone,
I have a question about the voltage drop across a resistor with a small resistance, i.e. less than 10 ohms.
My simple circuit to test this is a voltage source in series with a resistor. The resistor has a value of 4 ohms and is rated at 1W.
If I set the voltage source to 1 V, I expect a 1 V drop across the resistor. However, with the 4 ohm resistor the measured drop is only 0.6 V. If I use a 100 ohm resistor instead, it works as it should: a 1 V drop.
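For what it's worth, here is a quick sanity check I did: if the source (or its leads) had some internal resistance Rs, Ohm's law says the resistor and Rs would form a voltage divider. The Rs value below is just a guess to see what would explain my readings, not something I've measured:

```python
# Sanity check: with a source internal/lead resistance Rs, the drop across R
# is set by the voltage divider  Vr = Vs * R / (R + Rs).
# Rs is an assumed value chosen to see if it matches my measurements.
Vs = 1.0   # source setting, volts
Rs = 2.7   # assumed source/lead resistance, ohms (a guess, not measured)

for R in (4.0, 100.0):
    Vr = Vs * R / (R + Rs)
    print(f"R = {R:6.1f} ohm -> predicted drop = {Vr:.2f} V")
```

With that guess it predicts about 0.60 V across 4 ohms and roughly 0.97 V across 100 ohms, which looks a lot like what I'm seeing.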
Can anyone please explain to me why this is happening?
Thanks,
Chris