Maker Pro

help with LM317T current limiter circuit

I made a simple circuit using the LM317T to regulate the current through an LED. The voltage drop between ADJ and Vout is supposed to be 1.25 V, and the current is limited by a resistor placed between the ADJ and output terminals, where current out = 1.25 / resistance.
http://www.instructables.com/id/Super-simple-high-power-LED-driver/
But when I measure the voltage across the output pins of the LM317 I get 0.73 V between ADJ and Vout, and 2.35 V dropped across the LM317 in total, meaning my LED isn't getting enough voltage to light up as brightly as it should. The supply is 5.3 volts and the LED is only seeing 2.8 volts across it when I measure it with my multimeter, the rest being dropped across the LM317T. The LED is specified for 3.5 volts and should be a lot brighter than it is. Also, when I measure the current out of the LM317 I get 125 mA, as I should based on the formula, until I connect the LED; then it drops to 50 mA.
Can anyone offer any suggestions as to what I'm doing wrong? Thanks.
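The limiter formula from the question can be sanity-checked in a couple of lines of Python. The 10 Ω programming resistor is an assumption, not stated in the post; it is chosen because it matches the 125 mA measured with the LED disconnected:

```python
V_REF = 1.25  # volts, nominal LM317 ADJ-to-OUT reference voltage

def limit_current(r_ohms):
    """Current in amps that an LM317 current limiter sets: I = 1.25 / R."""
    return V_REF / r_ohms

# Assuming a 10-ohm programming resistor (hypothetical value that matches
# the 125 mA measured with the LED disconnected):
print(limit_current(10) * 1000)  # → 125.0 (mA)
```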
LM317.jpg
 
In case it isn't obvious, I'm just starting to learn the basics of circuitry and don't really know as much as I should.
 
If you're dealing with something like an LED, a regulator IC is really a bit extravagant. A single resistor should be enough to limit current to the LED.

In order to work out the value you need, you should know the voltage you're working with and the voltage drop across the LED (this should be in the spec, usually about 2 V). Subtract this value from your starting voltage and use Ohm's law (V = IR) to work out the resistance based on the required current. If you're already using a current limiter I assume you know this. Standard LEDs are usually around 20 mA. I see you're using 3 V, so if we take those assumed values (don't use these if your specs are different) then R = V / I = (3 - 2) / 0.02 (current in amps) = 1 / 0.02 = 50 ohms. That's your resistor value, but you'll want to go slightly above that - probably nearer 70 ohms - to account for tolerances etc.
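The calculation above, written as a small Python helper. The 3 V supply, 2 V LED drop and 20 mA are the assumed values from the post, not measured figures:

```python
def led_resistor(v_supply, v_led, i_led):
    """Series resistor for an LED via Ohm's law: R = (Vsupply - Vled) / I."""
    return (v_supply - v_led) / i_led

# Assumed values from the post: 3 V supply, 2 V LED drop, 20 mA
r = led_resistor(3.0, 2.0, 0.020)
print(r)  # → 50.0 ohms; pick the next standard value above for margin
```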
 
Gonzo - after further tinkering and some help from elsewhere, it turns out you are exactly right. Good to have confirmation though. I've got it working properly now (I think), thanks.

The reason I'm using the LM317 is that it should allow me to power LEDs up to about 3 watts and will allow me to use PWM for dimming. I've made basic LED torches before using just a resistor, but wanted to do something a bit more involved with this project. I'm making an LED table lamp with a dimmer.
 

(*steve*)

¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
To expand on GonzoEngineer's explanation, the LM317 does indeed maintain a voltage difference between the output and ADJ pins of around 1.25 V, but it also requires about 2 V between the IN and OUT pins to allow it to operate.

When connected as a current limiter, the 1.25 V sets the current limit (by selecting a resistor which will drop 1.25 V at the required current), and the excess voltage is dropped across the IN and OUT pins of the regulator.

However, when the voltage differential across the IN and OUT pins falls to about 2 volts (the actual value depends on current, temperature, and random variation between devices), the voltage across the resistor cannot be maintained and the current falls.

To put it simply, you need about 3.25 (say 3.5) volts MORE from your supply than the maximum voltage across your load.

So to light your LED at full current (assuming it will have about 3.2 volts across it), you would need a power supply of at least 3.5 + 3.2 = 6.7 volts.
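The headroom rule above can be sketched out as follows. The 2.0 V dropout and 0.25 V safety margin are approximate figures taken from the explanation (1.25 + 2 = 3.25, rounded up to 3.5):

```python
def min_supply(v_load, v_ref=1.25, v_dropout=2.0, margin=0.25):
    """Minimum supply voltage for an LM317 current limiter:
    load voltage + 1.25 V reference + ~2 V dropout, plus a safety margin."""
    return v_load + v_ref + v_dropout + margin

v = min_supply(3.2)  # about 6.7 V for an LED with 3.2 V across it
print(round(v, 2))
```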

I don't know whether you're doing this to learn about voltage regulators, or to power the LED.

If it's the latter then as long as your power supply is fairly constant in voltage, Raven Luni provides an appropriate answer.

If you're learning about regulators, or you want to light the LED from a varying voltage (say from 3 to 30 volts) then your approach is sound.

There are voltage regulators which have a lower dropout voltage (called, unsurprisingly, low-dropout regulators) and these may reduce the overhead to 1.25 + 0.5 volts or even less.

The drawback with low dropout regulators is that they are less stable and may require a trick or two to make them operate correctly. I've not tried them as constant current sources and I'd have to check the datasheet(s) to see if the manufacturer has any specific recommendations.

A third alternative is to create a discrete constant current source. These are not perfectly accurate, but may exhibit a lower voltage drop (at least in part because the reference voltage is 0.7 volts). Note that this one has an input to switch the LEDs on and off. This can be connected to V+ to keep them on all the time, or to the collector of the top transistor to make it a 2 terminal device. The 4.7M resistor on the input seems too high for my liking, especially if it is connected directly between the base and collector of the top transistor.

edit: ah! A high-power LED. Yes, a constant current source is good. But consider switchmode devices. For one LED the regulator will dissipate almost 50% of the power used by your circuit, and this will get worse as the input voltage rises (although it will improve as you add more LEDs in series).
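That "almost 50%" figure follows from how a linear (series) regulator works: the same current flows through everything, so the fraction of input power reaching the load is just Vload / Vsupply. A rough estimate, using the 6.7 V supply and 3.2 V LED figures from earlier in the thread:

```python
def linear_efficiency(v_supply, v_load):
    """Fraction of input power reaching the load in a linear regulator.
    The series pass element carries the full load current, so efficiency
    is simply Vload / Vsupply; the rest is dissipated as heat."""
    return v_load / v_supply

eff = linear_efficiency(6.7, 3.2)  # example figures from this thread
print(f"{eff:.0%}")  # → 48% - the regulator and sense resistor burn the rest
```

A switchmode (buck) LED driver avoids this by converting the excess voltage rather than burning it, typically reaching 85-95% efficiency.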
 
But consider switchmode devices.

Thanks for your advice, I will. Is there a simple example you could recommend, please?

I don't know whether you're doing this to learn about voltage regulators, or to power the LED.

A bit of both, but I chose this design as it looked very simple, I could see how it worked (more or less), and it can be easily pulse-width modulated.

So to light your LED at full current (assuming it will have about 3.2 volts across it), you would need a power supply of at least 3.5 + 3.2 = 6.7 volts.

I will look for a 7 to 8 volt battery pack and/or wall adapter I can use, thanks.
 