Archimedes' Lever said:
"Can't work"? You're an idiot. The first IR imagers were CCD. They
came out before CMOS image planes were even around.
I'm with John on this one. The first CCDs were largely in the visible
with a strong peak sensitivity in the *near* IR and with a very big push
to get them out to much longer wavelengths on somewhat exotic materials.
I knew an astronomy imaging group that were given new military chips to
test because they could turn one into a fully working prototype camera
way faster and cheaper than the approved contractors.
Rough graphs of the typical spectral sensitivity of silicon CCDs with
front-illuminated and back-thinned construction are online at:
http://www.andor.com/learn/digital_cameras/?docid=315
Their response does not extend much beyond 1000nm, which is still an
order of magnitude short of the ~10um IR wavelengths needed for
thermal imaging at ambient temperatures.
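That order-of-magnitude gap follows directly from Wien's displacement
law, which puts the peak blackbody emission for everyday temperatures
around 10um. A quick sketch (my own illustration, not from the thread;
the constant and function name are just for this example):

```python
# Wien's displacement law: lambda_peak = b / T
# b is Wien's displacement constant in um*K.
B_WIEN = 2898.0

def peak_wavelength_um(temp_c):
    """Peak blackbody emission wavelength (um) for a temperature in Celsius."""
    return B_WIEN / (temp_c + 273.15)

for t in (0, 25, 100):
    print(f"{t:3d} C -> peak emission near {peak_wavelength_um(t):.1f} um")
```

For the whole 0-100C range the peak sits between roughly 8 and 11um,
so a sensor that dies at 1um simply never sees ambient thermal emission.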
Eventually they did get longer-wavelength CCDs working, and then
development stopped at a particular point. As the man said, "what we
have is good enough to see what *we* need to see". Astronomers were a
bit disappointed that after that they had to pay for their own chip
R&D. It didn't stop terahertz sensors eventually being made, though.
Strangely, the terahertz image of my favourite object Cass A appears
to have been removed from the web.
As for the lenses, they are there specifically to narrow the spectral
response.
OK. So you found a site that gave you a good picture of what passes
through Germanium. So what?
So what? The FP device did as well. All that was needed was a bit of
gain increase after placing the filter.
That only gets you into the near infrared. The thermal band for
objects in the 0-100C temperature range is much more tricky and
generally involves exotic doped materials, germanium lenses and
cunning optical design, since the emissions from the casing start to
be almost as bright as the target. Some form of multistage
thermoelectric cooling, or LN2, is usually employed.
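To see why the casing is such a problem, compare Planck spectral
radiances at 10um for a warm scene against the camera body itself.
A rough sketch (my own numbers for illustration; 310K target and
300K casing are assumed, not from the thread):

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def radiance(wavelength_m, temp_k):
    """Planck spectral radiance, W / (m^2 sr m)."""
    x = H * C / (wavelength_m * K * temp_k)
    return 2.0 * H * C**2 / wavelength_m**5 / math.expm1(x)

wl = 10e-6                       # 10 um, mid thermal band
target = radiance(wl, 310.0)     # ~37 C scene
casing = radiance(wl, 300.0)     # ~27 C uncooled camera body
print(f"target/casing radiance ratio at 10 um: {target / casing:.2f}")
```

The target comes out only about 15-20% brighter than the casing, so
without cooling (or careful optical design) the instrument is largely
imaging its own glow.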
Regards,
Martin Brown