Maker Pro

[HELP] 10w and 50w LED on 220v schematic

Since this is my first post here, I would like to say hello to everybody.

I have one problem. Recently I bought a 10W LED. It looks like the picture on the left, and on the right is its internal connection.
[Images: the 10W LED module and its internal connection diagram]


I tested it on a DC power supply:
at 9V it draws 1100 mA
at 10V it draws 1300 mA
at 11V it draws 1600 mA (a lot of heat)

I want to connect this to 220V with a transformerless power supply, but I can't find a connection diagram for that. I need help modifying the schematic below to output at least 1A.
[Image: transformerless 220V LED supply schematic]


http://www.marcspages.co.uk/tech/6103.htm
I was looking there at how to calculate the capacitor and the resistors, but I'm getting some strange results.
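Here is my attempt at the calculation, assuming the standard dropper formula I = V × 2πfC, so C = I / (2πfV), with 220V / 50Hz mains and ignoring the LED's forward voltage (small next to 220V). Please correct me if this is wrong:

```python
import math

V_MAINS = 220.0  # RMS mains voltage (V), assumed
FREQ = 50.0      # mains frequency (Hz), assumed

def dropper_capacitance(target_current_a: float) -> float:
    """Series capacitance (F) whose reactance passes the target RMS current.

    I = V / Xc with Xc = 1 / (2*pi*f*C), hence C = I / (2*pi*f*V).
    """
    return target_current_a / (2 * math.pi * FREQ * V_MAINS)

for amps in (0.02, 0.05, 1.0):
    print(f"{amps * 1000:6.0f} mA -> {dropper_capacitance(amps) * 1e6:6.2f} uF")

# Output:
#     20 mA ->   0.29 uF
#     50 mA ->   0.72 uF
#   1000 mA ->  14.47 uF
```

Is 14 µF really right for 1A? An X2-rated mains capacitor that big seems enormous.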

In the future I want to buy a 100W LED (35V, 3000mA) and connect it to 220V as well. I know it would be a lot easier with a transformer, but the downsides are that it would be more expensive and much heavier. I will use this as a portable light, which is why I don't want a transformer. Any help will be appreciated.
Thank you.
 

(*steve*)

¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
My very strong advice is: DO NOT try to drive these LEDs like this.

Get a proper LED driver. There are many available. You probably want one that is rated for between 1A and 1.2A.

The LED will need to be connected to a heatsink.

The advantage of using a proper driver is that you do not risk a deadly electric shock if you touch the LED.

edit: proper drivers contain a transformer, but it is small and lightweight.
 
LEDs can't regulate their own current draw, and from a quick check online your LED chip is rated for a maximum of 1050 mA. Your power supply sets the current the LEDs receive, not the other way round. This is why drivers are used: they provide a fixed current for the LEDs and, as pointed out above, also stop you risking an electric shock from 220V.
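To put numbers on that, your own three readings already show how steep the curve is. A rough sketch, just recomputing from the values you posted:

```python
# Measured points from the first post: (volts, amps)
readings = [(9.0, 1.10), (10.0, 1.30), (11.0, 1.60)]

# Dynamic resistance r = dV/dI between successive readings
for (v1, i1), (v2, i2) in zip(readings, readings[1:]):
    r = (v2 - v1) / (i2 - i1)
    print(f"{v1:.0f} V -> {v2:.0f} V: r = {r:.1f} ohm")

# Output:
# 9 V -> 10 V: r = 5.0 ohm
# 10 V -> 11 V: r = 3.3 ohm
```

With only a few ohms of dynamic resistance, half a volt of supply drift swings the current by well over 100 mA, which is exactly why you want a driver that fixes the current rather than the voltage.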
 

KrisBlueNZ

Sadly passed away in 2015
As well as being dangerous, a direct line-powered LED supply is not practical for currents above around 50 mA. I've simulated a circuit with 1A output and it's very impractical. The input capacitor would have to be extremely large, a large amount of smoothing capacitance would be required to reduce ripple even if a pi filter is used, and the dissipation in the input protection resistor would be extremely high. There are ICs available that can be used to build a mains-powered (off-line) LED driver, although you will have to wind your own pulse transformer. Google off-line LED driver IC.
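To put a rough number on the smoothing capacitance: the reservoir capacitor has to carry the full load current between charging peaks, so C ≈ I·Δt/ΔV. A back-of-envelope sketch (the 1 V ripple target is my assumption, not a figure from the simulation):

```python
# Reservoir capacitor estimate: C = I * dt / dV
LOAD_CURRENT = 1.0  # A, the target LED current
RIPPLE = 1.0        # V, assumed acceptable ripple
DT = 0.010          # s, half-cycle at 50 Hz (full-wave rectified)

c = LOAD_CURRENT * DT / RIPPLE
print(f"Smoothing capacitance needed: {c * 1e6:.0f} uF")
# -> Smoothing capacitance needed: 10000 uF
```

Ten thousand microfarads is a physically large electrolytic even at low voltage, which is why I call the direct approach very impractical.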

Edit: Or look on eBay. There's sure to be lots of options available there. But as with all eBay electronics purchases, caveat emptor! (Buyer beware!)
 

(*steve*)

¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Another issue (yep, there's more) is that the current is determined by the frequency.

Since noise is predominantly higher frequency than the mains, the effect is that noise comes straight through.

If there is a high-energy noise event, the LED will be subjected to a large current spike. The same is true at switch-on, unless you can time it to be at or near a zero crossing of the mains. The resistor (R2) limits the switch-on current to about 660 mA in this circuit, but that resistor would probably be orders of magnitude smaller if you wanted 1A from it.
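For illustration, here's the worst case of switching on right at the mains peak, when the uncharged capacitor looks like a short circuit (the 470 ohm value for R2 is inferred back from the ~660 mA figure above, not read from the schematic):

```python
import math

V_RMS = 220.0                  # mains RMS voltage
V_PEAK = V_RMS * math.sqrt(2)  # ~311 V at the worst-case switch-on instant
R2 = 470.0                     # ohms, inferred from the ~660 mA figure

print(f"Peak mains voltage: {V_PEAK:.0f} V")
print(f"Worst-case switch-on surge through R2: {V_PEAK / R2 * 1000:.0f} mA")
# -> Peak mains voltage: 311 V
# -> Worst-case switch-on surge through R2: 662 mA
```

Scale the circuit for 1A and R2 shrinks dramatically, so a badly-timed switch-on hits the LED with a far larger spike.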
 
Thank you for the answers. I will have to use LED drivers after all. I found a 10W driver on eBay for $5.50 and a 100W one for $24. I think I will go with that.
And how much heat should I expect from the 100W LED? If I use a PC cooler like this, will that be enough? I'm guessing I don't need an extra heatsink for the driver since it already has one.
[Image: fan-cooled CPU cooler for an Intel socket 478 Pentium 4]
 

(*steve*)

¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Assume you'll get 10W of heat from the 10W LED and 100W from the 100W LED.

Then decide on the temperature you want to be the maximum that the LED will operate at. Let's say you pick 75 degrees C.

Then determine the maximum ambient temperature that the unit will be exposed to. Let's say it's 40 degrees C.

The difference is the temperature rise you have to play with. In this case it's 35 degrees C.

Divide this by the LED power. So, for the 10W LED you will get 3.5 degrees C per Watt.

Heatsinks are rated in terms of degrees C per Watt. You need to find one rated at (or preferably a little better than) 3.5 degC/W.

If the ambient temperature will only occasionally be 40C, then this may be sufficient. If it will *always* be 40C then you should allow extra margin (lower degC/W rating) to allow for dust build-up on the heatsink.

For the 100W LED, the same calculation gives 0.35 degC/W, and a passive heatsink that good would be enormous, unless you go for a forced-air solution. The one you show above is probably what you're after (and I've seen similar used for 100W LEDs).
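The whole calculation fits in a few lines if you want to play with the numbers:

```python
def heatsink_rating(led_power_w, max_led_temp_c=75.0, max_ambient_c=40.0):
    """Maximum thermal resistance (degC/W) the heatsink may have.

    Available rise = max LED temperature - max ambient temperature.
    Required rating = rise / power dissipated.
    """
    return (max_led_temp_c - max_ambient_c) / led_power_w

for watts in (10, 100):
    print(f"{watts:3d} W LED -> need {heatsink_rating(watts):.2f} degC/W or better")

# Output:
#  10 W LED -> need 3.50 degC/W or better
# 100 W LED -> need 0.35 degC/W or better
```

0.35 degC/W is forced-air territory, which is why a fan-cooled CPU cooler like the one you show is the sensible route for the 100W part.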

The problem with dust is far greater for forced-air heatsinks. For maximum reliability you may wish to have a thermal sensor on the heatsink that reduces the LED current if the heatsink ever gets too warm (better a dim light than a dead LED and no light at all).

p.s. that device you show above *is* a heatsink. The metal plate that the LEDs are on is simply a way to efficiently get the heat from the LED die to the heatsink.
 