I have 16 feet of RGB LED strip, cut into four 4-foot segments that run physically in parallel and are wired in parallel. I've been having an issue where some channels light up as soon as the circuit is powered, and the MOSFETs become very hot within seconds. I found a workaround (when connected to an Arduino): performing an analogWrite(0) on each channel at startup. After that, during normal operation the MOSFETs are barely warm to the touch, even at full duty cycle. First question: what could be causing this?
I'm now trying to drive these LEDs from a Raspberry Pi 3 instead of the Arduino, but when I connect the circuit to the corresponding GPIO pins, I get severe flicker on all channels (roughly 10 Hz), and I'm not sure how to resolve that either. Other LEDs did not flicker on the same pins, so I know it's something in my design.
This is the circuit design I used: https://cdn-learn.adafruit.com/asse...dium800/led_strips_ledstripfet.gif?1448059609 except the MOSFETs I have are NTD3055L104G.