Maker Pro

Laser detecting plate

My sensor should be a plate (30 cm x 30 cm) that gives out the position of a laser spot falling on it.
Essentially, I am searching for a large position-sensitive device (PSD), which I couldn't find on the market.
One option I have to detect the laser position is to use a NoIR camera with a Raspberry Pi.
A more rugged option is to use an array of LDRs to detect the laser position, but the muxing gets complicated.
Is there any other commercially available sensor in the form of a plate that detects the position of a laser falling on it?
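For the LDR-array route, the muxing logic itself is straightforward even if the wiring is not: select one row at a time, read each column through an ADC, and keep the brightest cell. A minimal sketch of that scan (the `read_cell` interface is a hypothetical stand-in for the actual mux/ADC wiring):

```python
def scan_ldr_matrix(read_cell, rows, cols):
    """Scan a row/column-multiplexed LDR matrix and return the
    (row, col) of the brightest cell.

    read_cell(r, c) is a placeholder for selecting row r via a
    demultiplexer, column c via an analog mux, and reading the
    light level from an ADC (higher value = brighter).
    """
    best, best_val = None, float("-inf")
    for r in range(rows):
        for c in range(cols):
            v = read_cell(r, c)
            if v > best_val:
                best, best_val = (r, c), v
    return best

# Example with a simulated 3x3 matrix; cell (1, 1) is lit.
grid = [[0, 1, 2],
        [3, 9, 4],
        [5, 6, 7]]
print(scan_ldr_matrix(lambda r, c: grid[r][c], 3, 3))
```

Resolution over 30 cm x 30 cm is the catch: a usable grid needs hundreds of LDRs, which is why the camera route tends to win.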
 

hevans1944

Hop - AC8NS
In the mid-1960s I was working at the Air Force Weapons Laboratory in Albuquerque, NM, trying to build a Shack-Hartmann plate wavefront analyzer for infrared weapons-grade lasers. This required sampling the output laser beam with a beam sampler, sending the sampled beam to the wavefront analyzer, and then using the wavefront analyzer outputs to control deformable sections of a mirror located in the resonant cavity of the laser.

Someone writing his graduate thesis at university thought they had discovered that a small insulating substrate, coated with a thin bismuth film, could be used with four op-amps to locate the centroid intensity position of an infrared laser beam as a pair of analog x-y co-ordinates. From this humble beginning the laboratory tried to construct a wavefront analyzer consisting of about sixteen or so plano-convex zinc selenide infrared lenses arranged in a hexagonal array that focused the sampled laser beam onto individual x-y detectors, each with four op-amps wired in near the detectors.
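For reference, the readout from such a four-electrode detector boils down to normalized difference sums over the four signals; a minimal numeric sketch of that computation (illustrative only, not the lab's actual analog circuit):

```python
def beam_position(top, bottom, left, right):
    """Estimate the normalized x-y centroid of a beam from the four
    photocurrents of a four-electrode (lateral-effect / quadrant)
    position detector.

    The electrode naming and scaling here are illustrative; real
    devices need per-unit calibration.  Outputs range from -1 to +1,
    with (0, 0) meaning the beam is centered.
    """
    total = top + bottom + left + right
    if total == 0:
        raise ValueError("no signal on detector")
    x = (right - left) / total
    y = (top - bottom) / total
    return x, y

# Beam centered: all four electrodes see equal current.
print(beam_position(1.0, 1.0, 1.0, 1.0))   # (0.0, 0.0)
```

In the analog version, the sums and differences are formed by the four op-amps and the division is done downstream.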

Deviations of the focused beamlets from the centers of the detectors were a measure of the wavefront "tilt" at each location. With some heavy-duty mathematics and some hand waving, the detector outputs could be placed in a negative feedback loop controlling a deformable mirror located in the laser cavity, thereby correcting for the wavefront tilt and making a better plane wave for imaging onto a target. I have no idea whether this actually worked for weapons-grade lasers or not, but it does work for astronomers by using "guide stars" instead of a laser to "clean up" atmospheric distortions and provide better "seeing" with their telescopes.

Variations on this scheme have been reported in the open literature for years, but AFAIK there has never been a market for large-format x-y position-sensing arrays... except perhaps in the astronomical community where they find use in mapping star positions. For real-time position sensing it is hard to beat an ordinary CCD camera, but you are on your own in converting the video pixels into x-y co-ordinates. Photosensitive plates used by astronomers have very high resolutions and very large number of pixels, but they are quite expensive. You need to carefully specify exactly what you are trying to DO before spending a lot of money pursuing this path. You might also want to investigate the field of photogrammetry which is devoted to extracting precise data from imagery.
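Converting video pixels into x-y co-ordinates usually means computing an intensity-weighted centroid over the frame. A minimal sketch with NumPy (assuming a grayscale frame; thresholding, lens distortion, and calibration to real-world units are left out):

```python
import numpy as np

def intensity_centroid(frame):
    """Return the intensity-weighted (x, y) centroid of a 2-D
    grayscale frame, in pixel coordinates."""
    frame = np.asarray(frame, dtype=float)
    total = frame.sum()
    if total == 0:
        raise ValueError("empty frame")
    ys, xs = np.indices(frame.shape)  # row (y) and column (x) index grids
    return (xs * frame).sum() / total, (ys * frame).sum() / total

# A single bright pixel at row 3, column 5 puts the centroid at x=5, y=3.
frame = np.zeros((8, 8))
frame[3, 5] = 10
print(intensity_centroid(frame))
```

Weighting by intensity gives sub-pixel resolution when the laser spot spans several pixels, which is exactly what the quad-detector scheme above did in analog form.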
 
Wow! Thank you Sir Hevans1944, it has been an absolute pleasure to know your insights. My heartfelt thank you for still sharing your expertise.

I have decided to go with the NoIR camera on a Raspberry Pi and to use an IR laser. I would put a piece of old film roll over the camera, which would pass only IR light into it. I would get X, Y and Z after calibrating it; there are some resources on how to do that. The system need not be very accurate for my project.

This IR LED tracking has already been done with a normal webcam after removing its IR filter. This is the link to the freeware, FreeTrack: https://www.free-track.net/english/freetrack/comment-ca-marche.php
It not only tracks the head orientation, but also gives out the position.

I am thinking of developing this from scratch: finding X, Y and depth from the pixel size of the laser spot on the plate. I would find the position and orientation of the plate by attaching IR LEDs to it. Once I know the position and orientation of the plate, I can track the IR laser position by subtracting the previous image from the current one. Then I can compute the X, Y position of the laser w.r.t. the plate.
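The subtract-the-previous-image step can be sketched as a simple frame difference: any pixels that got brighter between frames are assumed to belong to the laser spot (the threshold and frame shapes here are illustrative, and real frames would need the plate's LED markers masked out first):

```python
import numpy as np

def laser_spot(prev_frame, frame, threshold=30):
    """Locate a newly appeared bright spot (e.g. an IR laser dot)
    by differencing consecutive grayscale frames.

    Returns ((x, y) pixel centroid, spot area in pixels), or None
    if nothing brightened.  The apparent area can serve as a rough
    depth cue after calibration.
    """
    diff = np.asarray(frame, dtype=int) - np.asarray(prev_frame, dtype=int)
    mask = diff > threshold          # pixels that got brighter
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return (xs.mean(), ys.mean()), int(mask.sum())

# A 3x3 bright block appearing at rows 3-5, columns 5-7.
prev = np.zeros((10, 12), dtype=np.uint8)
cur = prev.copy()
cur[3:6, 5:8] = 255
print(laser_spot(prev, cur))   # ((6.0, 4.0), 9)
```

The same centroid-plus-area output also feeds the depth-from-spot-size idea: farther plates give smaller apparent spots, so area maps (after calibration) to distance.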

The reason I was reluctant to use a Raspberry Pi and NoIR camera is the cost. One of my requirements is to make it as cost-efficient as I can, so I was exploring every avenue; hardware seemed easier at first, but it's not. The RPi with the NoIR camera is the easier and more cost-efficient option as far as I know now.

Keep up the good work Sir,
Have a good day
Tim
 

hevans1944

Hop - AC8NS
Not sure what z-axis "information" you expect to extract from your IR image. Most IR lasers will cause the image to "bloom" above a certain power level, and this pretty much destroys any x-y information by smearing the IR pixel out over several hundred adjacent pixels. We used to use a special heat-sensitive thin film coated onto a paddle to "see" CO2 laser beams, which are typically 10.6 μm wavelength. This thin film would fluoresce under exposure to long-wave (safe) ultraviolet light until its temperature rose above ambient, at which point it would quit fluorescing. So, it was applied to a thin strip of aluminum to allow the heat produced by absorption of the laser beam to be conducted away. On either side of the centroid of the beam the strip continued to fluoresce, so the beam appeared as a dark brown area on the strip surrounded by a soft green or yellow-green glow, best viewed in a darkened room.

It was not very sensitive, IIRC, but it did allow us to poke around the optics table to "find" the invisible infrared laser beams. One problem we had was burning holes in the fluorescent paper with our five watt laser. An RPI attached to an ordinary CCD camera would suffice for viewing this paddle since everything is seen under visible light. Problem is, the paddle is not very large (you would need several of them laid side-by-side to obtain a 30cm x 30cm area) and IIRC they are a bit pricey.

The only other technology that appears to come close to what you want to do is the micro-bolometer focal-plane detector. This device has revolutionized the near-IR imaging field, replacing expensive liquid-nitrogen-cooled HgCdTe and PbSnTe infrared detectors with detector arrays operating at room temperature. This makes truly affordable and portable near-infrared imaging possible. I don't think this would be a DIY project just yet... Integrated circuit manufacturing techniques are required to build the micro-bolometer elements, as well as construct electronic read-outs that operate (pretty much) at commercial TV line rates. So far, the market has supported hand-held imagers but not large-format cameras. Well, I am sure our national security agencies have their "pick of the litter" options when it comes to exploiting large-format infrared imagery, and eventually this technology may become de-classified and filter down to the commercial market if a need arises for it, but for now your choices are limited.
 
The bloom can be avoided by image processing, and I am not planning to use high-power IR lasers. My application is a simple game for training people to localise sound while blindfolded, similar to a dart game. I would move the sound source and play a sound, and they should pinpoint the source with an IR laser pen while blindfolded. The feedback would be audio, such as "bullseye", 10, 20, 30, etc. I am planning to use a laser source of less than 1 mW.
If the laser is not safe, my backup plan is to drop it and track the orientation and position of the sound source and the user's hand by attaching three IR LEDs to them; otherwise I will put retroreflective material on the plate and the hand and track them using the IR camera with an IR source illuminating them. In real time, when the user's hand/head aligns with the sound source, it would give feedback.
By the way, I am building this for visually handicapped children, to help them localise sound sources with good accuracy.
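The dart-style audio feedback could be driven by the distance between the laser hit and the sound-source position on the plate; a minimal sketch (the ring radii and score labels are placeholders, not measured values):

```python
import math

def score_feedback(hit, target, rings=(2.0, 6.0, 10.0, 14.0)):
    """Map the distance (in cm) between the laser hit and the sound
    source position on the 30 cm plate to a dart-style audio cue.

    `rings` holds the outer radius of each scoring ring; the radii
    and labels below are illustrative placeholders.
    """
    dist = math.hypot(hit[0] - target[0], hit[1] - target[1])
    labels = ("bullseye", "30", "20", "10")
    for radius, label in zip(rings, labels):
        if dist <= radius:
            return label
    return "miss"

# Dead center on the target gives the bullseye cue.
print(score_feedback((15.0, 15.0), (15.0, 15.0)))   # bullseye
```

The returned label would then just be handed to a text-to-speech or sample-playback call for the audio cue.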

Thank you for your insights,
It's much appreciated.
Have a good day Sir Hevans1944.
 