Maker Pro

LED driver chip question

Hi, yet another question about my LED cube design. Here is an excerpt from the MBI5026 data sheet (attached below).

My VLED will be 5 V and the LEDs are rated at 3.0-3.6 V, 20 mA (that's the Vf in the schematic, I guess).

So I guess I need a voltage drop of about 1 V to get Vds into the safe range? Is that right? (Simple math, I know, I just want to be sure.)

Adding one resistor per LED isn't very attractive to me because I have 800+ LEDs.
What about the diode option? Do I need a Zener diode? A normal diode gives about a 0.6 V drop; will that do? If I do need a Zener, what value?

Or should I regulate the 5 V, 20 A supply down to 4 V to power all 800+ LEDs?
Thanks again, steve
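
(For reference, a quick sketch of the headroom arithmetic, using the numbers from this post; the 0.4-1 V "safe" Vds window is taken from the poster's reading of the datasheet excerpt, not verified here.)

```python
# Rough headroom check for the MBI5026 outputs (values from the thread;
# confirm the recommended Vds window against the actual datasheet).
V_LED = 5.0                  # LED supply voltage (V)
VF_MIN, VF_MAX = 3.0, 3.6    # LED forward voltage range (V)

# Voltage left across the driver output (Vds) with no extra drop:
vds_hi = V_LED - VF_MIN      # 2.0 V (lowest-Vf LED)
vds_lo = V_LED - VF_MAX      # 1.4 V (highest-Vf LED)
print(f"Vds with 5 V supply: {vds_lo:.1f} V to {vds_hi:.1f} V")

# Dropping roughly 1 V externally would bring that to ~0.4-1.0 V,
# the range the post reads as "safe" in the datasheet excerpt.
extra_drop = 1.0
print(f"Vds with {extra_drop} V dropped externally: "
      f"{vds_lo - extra_drop:.1f} V to {vds_hi - extra_drop:.1f} V")
```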
 

Attachments

  • MBI5026.jpg
LEDs are current driven, not voltage driven. The MBI5026 has a fixed output channel current; set it for your LED specs and no resistor is needed...
 

I realize I need to set the current too, by putting the correct resistor on the R-EXT pin, but from my reading of the attached spec it looks like I need to add a voltage drop if I plan on using 5 V for VLED. Am I reading the spec wrong?
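
(A rough sketch of how an R-EXT value might be worked out for a constant-current sink driver like this; the reference voltage and gain factor below are assumed placeholders, not values confirmed from the MBI5026 datasheet, so check the real formula there.)

```python
# Hypothetical R-EXT calculation for a constant-current LED sink driver.
# V_REF and GAIN are ASSUMED placeholder values - read the actual numbers
# (and the exact formula) from the MBI5026 datasheet before using this.
V_REF = 1.26   # internal reference voltage at the R-EXT pin (assumed, V)
GAIN = 15      # ratio of output current to R-EXT current (assumed)

def rext_for_current(i_out_a: float) -> float:
    """Return the R-EXT value (ohms) that would set the given output current."""
    return V_REF * GAIN / i_out_a

print(f"R-EXT for 20 mA per channel: ~{rext_for_current(0.020):.0f} ohms")
```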
 
steveeeee,

CocaCola is correct. No separate resistor or other voltage dropper is needed on each LED. Your driver IC does the work. It actually regulates the current for each LED.

There might be some advantage in setting the LED supply voltage to an optimum value different from +5V. If there is more voltage drop across the driver IC than necessary, it will dissipate more power and get hotter.

But if the LED supply voltage is too low, the driver IC will not be able to draw the full set current through the LEDs, so they will be dimmer, and perhaps unevenly so.
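
(To put rough numbers on that tradeoff, a small sketch using the figures from this thread; the minimum Vds needed for regulation is assumed from the datasheet excerpt and should be checked.)

```python
# How the LED supply voltage trades driver dissipation against headroom.
# The minimum-Vds figure is assumed from the datasheet excerpt discussed
# in the thread; check the real value before relying on it.
I_LED = 0.020                # set channel current (A)
VF_MIN, VF_MAX = 3.0, 3.6    # LED forward voltage range (V)
VDS_MIN = 0.4                # assumed minimum Vds for proper regulation (V)

for v_led in (5.0, 4.5, 4.0, 3.8):
    vds_least = v_led - VF_MAX   # least headroom (highest-Vf LED)
    vds_most = v_led - VF_MIN    # most drop across the driver (lowest-Vf LED)
    p_per_ch = vds_most * I_LED  # worst-case power burned in the driver, per channel
    ok = "regulates" if vds_least >= VDS_MIN else "may dim / lose regulation"
    print(f"VLED={v_led:.1f} V: Vds {vds_least:.1f}-{vds_most:.1f} V, "
          f"up to ~{p_per_ch*1000:.0f} mW/channel in the driver ({ok})")
```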

Ted
 

I think that is what the data sheet is saying: if Vds is above the 0.4-1 V range and the current is close to max on all 16 outputs, it may push the chip's dissipation too high.

I'll play with it and measure current consumption, I guess. My feeling is to put a diode in the VLED supply to drop VLED to about 4.3 V, so Vds will be closer to its recommended range.
 
The chip can dissipate at least 1.7 W, depending on the package you are using. At 20 mA per LED, that is 320 mA total with all 16 outputs on. Dropping a full 2 V would only dissipate 0.64 W, less than half of what the chip is capable of, so I don't see how you would need to drop the voltage externally. If you were trying to drive the max of 90 mA per channel, you would have a problem, but not at 20 mA.
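
(Spelling out that arithmetic as a quick check; the 1.7 W figure is as quoted above, not verified against the datasheet here.)

```python
# Sanity check of the dissipation estimate above (figures from the thread;
# the 1.7 W package limit is as quoted, not independently verified).
CHANNELS = 16
I_LED = 0.020            # current per channel (A)
V_DROP = 2.0             # assumed worst-case voltage across each output (V)
P_MAX = 1.7              # quoted package dissipation limit (W)

i_total = CHANNELS * I_LED      # 0.32 A with all 16 outputs on
p_chip = i_total * V_DROP       # 0.64 W dissipated in the driver
print(f"Total sink current: {i_total*1000:.0f} mA")
print(f"Chip dissipation at {V_DROP} V drop: {p_chip:.2f} W "
      f"({p_chip/P_MAX:.0%} of the {P_MAX} W limit)")
```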

Bob
 