Maker Pro

Driving a high-frequency IR emitter

I'm looking for some help on driving IR emitters at high frequencies with a duty cycle of roughly 5-10%. This is for a lab experiment where I'm looking at the response of photosynthetic bacteria to pulsed IR light. The IR emitters I've been browsing on Digi-Key draw about 50 mA at 1.5 V. I don't plan on running the emitters in parallel, so they shouldn't draw more current than that. Initially I would like to drive the emitters with 1 MHz square waves at a 5% duty cycle, but later I would like to do 2 MHz at a 10% duty cycle.
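To put numbers on it, here's a rough sketch (in Python, just for the arithmetic) of the pulse timing those settings imply; the values are only the ones quoted above and would change if I adjust the frequency or duty cycle:

```python
# Rough pulse-timing numbers for the two operating points mentioned above.
def pulse_timing(freq_hz, duty):
    period_s = 1.0 / freq_hz   # full period of the square wave
    on_s = period_s * duty     # time the emitter is actually on
    return period_s, on_s

for freq, duty in [(1e6, 0.05), (2e6, 0.10)]:
    period, on = pulse_timing(freq, duty)
    print(f"{freq / 1e6:.0f} MHz at {duty * 100:.0f}% duty: "
          f"period = {period * 1e9:.0f} ns, on-time = {on * 1e9:.0f} ns")
# 1 MHz at 5% duty: period = 1000 ns, on-time = 50 ns
# 2 MHz at 10% duty: period = 500 ns, on-time = 50 ns
```

So in both cases the emitter only has to be on for about 50 ns per pulse.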

I had in mind a system where a square-wave generator would control a transistor, which would switch the IR emitters on and off. The problem is I don't really know which wave generators or transistors would (i) work reliably at those frequencies, and (ii) work well with each other. Also, are there drawbacks to using the same power source for both the square-wave generator and the IR emitters? Finally, can I build such a circuit without a microcontroller (since I don't really know how to program them yet)?
 
Yes, that would be correct. Do I have my hopes set too high? As long as the emitters have an ON time of at least 20 nanoseconds, it's OK if there's some delay in the rise and fall times of the signal.
 
I would go lower than that, say around 10 ns. You'll need a fast transistor to switch the IR emitter, and you'll need to current-limit it with, say, a resistor. That resistor will form an RC time constant with the capacitance of the device, which also needs to be considered.
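Just to give a feel for the numbers, here's a rough sketch of that RC estimate; the resistor and capacitance values are placeholders, not figures from any particular datasheet:

```python
# Rough RC time-constant check: the current-limiting resistor together with the
# device capacitance sets how fast the pulse edges can be.
# Placeholder values -- substitute numbers from the actual datasheets.
R_limit = 100.0     # ohms, assumed current-limiting resistor
C_device = 25e-12   # farads, assumed emitter junction capacitance

tau = R_limit * C_device   # RC time constant
t_rise = 2.2 * tau         # approximate 10%-90% rise time of a single RC stage

print(f"tau  = {tau * 1e9:.1f} ns")     # 2.5 ns with these placeholder values
print(f"rise = {t_rise * 1e9:.1f} ns")  # ~5.5 ns with these placeholder values
```

The point is simply that the bigger the resistor or the capacitance, the slower your edges get, so check it against your shortest on-time.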
Adam
 
OK then. Will an RF transistor like the SS9018 work for this purpose? It's rated for up to 50 mA. The other popular RF transistor I see on Digi-Key is the KSP10, but that one doesn't list a maximum current rating. I intend to connect four IR emitters in series. Power will be supplied by a DC power supply set to 6 V.
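For what it's worth, here's my voltage budget as I understand it (the saturation-voltage figure is only a guess, and the real forward drops will vary from part to part):

```python
# Voltage budget for four emitters in series on a 6 V supply.
V_supply = 6.0
V_led = 1.5       # volts per emitter, from the figure quoted above
n_leds = 4
V_ce_sat = 0.2    # volts, assumed transistor saturation drop (a guess)

V_string = n_leds * V_led
V_headroom = V_supply - V_string - V_ce_sat   # what's left for a series resistor
print(f"LED string: {V_string:.1f} V, left over for a resistor: {V_headroom:.1f} V")
```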

I'm not terribly familiar with the RC time constant concept, but from some googling it seems to be the product of the device's capacitance and the value of the current-limiting resistor. Does this mean I should keep the base current as low as possible so that I don't have to use a resistor with a high value? How low can the base current go before a transistor like the SS9018 won't pass any collector current at all?
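Here's how I'm picturing the base-drive arithmetic, in case my reasoning is off somewhere (the gain figure is a guess; the SS9018 datasheet gives a range rather than a single value):

```python
# My (possibly wrong) picture of the base-drive arithmetic.
# hFE is a guess; the SS9018 datasheet gives a range, not a single value.
I_c = 0.050      # amps, target collector current through the LED string
hfe = 50         # assumed minimum current gain
V_drive = 5.0    # volts, assumed high level from the square-wave generator
V_be = 0.7       # volts, typical base-emitter drop

I_b_min = I_c / hfe              # minimum base current to support 50 mA of collector current
I_b = 3 * I_b_min                # overdrive a little so the transistor switches quickly
R_base = (V_drive - V_be) / I_b  # base resistor that sets this drive current

print(f"I_b(min) = {I_b_min * 1e3:.1f} mA, with overdrive = {I_b * 1e3:.1f} mA")
print(f"R_base  ~= {R_base:.0f} ohms")
```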
 
The SS9018 only has a rating of 50 mA; I thought you needed 200 mA or more? You can generally get away with much higher current by pulsing, as you are planning to do, but you have to be careful not to overdo it, as you could still damage the component.

You are right about the transistor, and yes, you could set up one circuit to work like this. But generally we give ourselves some headroom so that we don't have to set up each PCB individually, and so that we don't damage the LEDs or the transistor because of variations in transistor gain or LED forward voltage drop.

Doing it the way you suggest gives little if any protection to the circuitry, and if temperature changes the component properties, the situation can get even worse, up to the point where the circuit is destroyed.

The other important point, when not using a constant-current circuit, is why a resistor should be used: the resistor only allows a certain maximum current through it and will protect the LEDs if they try to draw more current as they warm up. In my opinion, constant current is the best way to drive an LED.

A constant-current drive does not care about the LED's forward voltage drop, so when you fit a replacement LED it will push the same current through the new one, giving you a better chance of getting nearly the same light output if the LEDs are reasonably matched.
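As a rough starting point, here's the sizing arithmetic for the simplest constant-current sink I'd suggest, an NPN with a fixed base reference and an emitter resistor; the reference voltage and current below are only example values:

```python
# Rough sizing for a simple constant-current sink: an NPN transistor with a
# fixed base reference and an emitter resistor.  Example values only.
I_led = 0.050    # amps, target LED current
V_base = 2.0     # volts, assumed base reference (e.g. from a divider or reference)
V_be = 0.7       # volts, typical base-emitter drop

# The emitter sits roughly one V_BE below the base, so the emitter resistor
# sets the current almost independently of the LED forward voltage.
R_e = (V_base - V_be) / I_led
P_r = I_led ** 2 * R_e   # power dissipated in the emitter resistor

print(f"R_e ~= {R_e:.0f} ohms, dissipating about {P_r * 1e3:.0f} mW")
```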


Adam
 